Managing and Enhancing Performance Benchmarks

  • Conference paper
  • First Online:
Advances in Dependability Engineering of Complex Systems (DepCoS-RELCOMEX 2017)

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 582)

Included in the following conference series: DepCoS-RELCOMEX

Abstract

The paper deals with the problem of computer performance evaluation based on program benchmarks. We outline the drawbacks of existing techniques and propose a significant enhancement that uses the available performance counters. This new approach required the development of supplementary tools to control benchmarking processes and to combine them with various measurements. In particular, it resulted in several extensions of the open-source Phoronix Test Suite. The usefulness of our approach has been verified in numerous experiments covering various benchmarks and system configurations. We present the measurement methodology and the interpretation of the obtained results, which confirm their practical significance.
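
As an illustration of the counter-based enhancement outlined above, the short Python wrapper below couples a benchmark run with Linux hardware performance counters. It is a minimal sketch, not the authors' Phoronix Test Suite extension: it assumes Linux `perf` and an installed Phoronix Test Suite are available, and the event list and the pts/compress-gzip test profile are arbitrary examples.

```python
#!/usr/bin/env python3
"""Illustrative wrapper: run a Phoronix Test Suite benchmark under `perf stat`
so that hardware performance counters are recorded alongside the benchmark run.
The test profile and counter list below are examples, not the paper's setup."""

import subprocess
import sys

# Example hardware events; availability depends on the CPU and kernel.
EVENTS = "cycles,instructions,cache-misses,branch-misses"

# Example test profile; any installed Phoronix Test Suite profile can be used.
TEST_PROFILE = "pts/compress-gzip"


def run_benchmark_with_counters(test: str, events: str, outfile: str) -> int:
    """Launch the benchmark via `perf stat`, writing the counter summary to `outfile`."""
    cmd = [
        "perf", "stat",
        "-e", events,          # comma-separated list of events to measure
        "-o", outfile,         # write the counter summary to a file
        "--",                  # everything after this is the benchmarked command
        "phoronix-test-suite", "batch-benchmark", test,
    ]
    return subprocess.call(cmd)


if __name__ == "__main__":
    status = run_benchmark_with_counters(TEST_PROFILE, EVENTS, "counters.txt")
    sys.exit(status)
```

The counter summary that perf writes to counters.txt can then be correlated with the benchmark score reported by the test suite for the same run.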

Author information

Corresponding author

Correspondence to Janusz Sosnowski.

Copyright information

© 2018 Springer International Publishing AG

About this paper

Cite this paper

Maleszewski, J., Sosnowski, J. (2018). Managing and Enhancing Performance Benchmarks. In: Zamojski, W., Mazurkiewicz, J., Sugier, J., Walkowiak, T., Kacprzyk, J. (eds) Advances in Dependability Engineering of Complex Systems. DepCoS-RELCOMEX 2017. Advances in Intelligent Systems and Computing, vol 582. Springer, Cham. https://doi.org/10.1007/978-3-319-59415-6_28

Download citation

  • DOI: https://doi.org/10.1007/978-3-319-59415-6_28

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-59414-9

  • Online ISBN: 978-3-319-59415-6

  • eBook Packages: Engineering, Engineering (R0)
