Evaluating the accuracy of Java profilers

Published: 05 June 2010

Abstract

Performance analysts profile their programs to find methods that are worth optimizing: the "hot" methods. This paper shows that four commonly-used Java profilers (xprof, hprof, jprofile, and yourkit) often disagree on the identity of the hot methods. If two profilers disagree, at least one must be incorrect. Thus, there is a good chance that a profiler will mislead a performance analyst into wasting time optimizing a cold method with little or no performance improvement.
This paper uses causality analysis to evaluate profilers and to gain insight into the source of their incorrectness. It shows that these profilers all violate a fundamental requirement for sampling-based profilers: to be correct, a sampling-based profiler must collect samples randomly.
We show that a proof-of-concept profiler, which collects samples randomly, does not suffer from the above problems. Specifically, we show, using a number of case studies, that our profiler correctly identifies methods that are important to optimize; in some cases other profilers report that these methods are cold and thus not worth optimizing.
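The abstract's core requirement, that samples be collected randomly rather than at fixed intervals, can be illustrated with a short sketch. The snippet below is not the paper's profiler; it is a minimal, hypothetical illustration of one standard way to randomize sampling: drawing each inter-sample delay from an exponential distribution (via inverse-transform sampling), so that samples form a Poisson process and cannot stay in lockstep with periodic program behavior the way a fixed-period timer can.

```java
import java.util.Random;

// Sketch (illustrative only): randomized sampling delays for a
// sampling-based profiler. A fixed-period timer can synchronize with
// periodic program behavior and systematically over- or under-sample
// certain methods; exponentially distributed delays avoid that.
public class RandomizedSampler {

    // Inverse-transform sampling: if u is uniform on [0, 1), then
    // -mean * ln(1 - u) is exponentially distributed with the given mean.
    static double nextExponentialDelayMillis(Random rng, double meanMillis) {
        return -meanMillis * Math.log(1.0 - rng.nextDouble());
    }

    public static void main(String[] args) {
        Random rng = new Random(42);
        double meanMillis = 10.0;
        int samples = 100_000;

        // Empirically check that the delays average out to the target mean.
        double sum = 0.0;
        for (int i = 0; i < samples; i++) {
            sum += nextExponentialDelayMillis(rng, meanMillis);
        }
        System.out.printf("observed mean delay: %.2f ms%n", sum / samples);
    }
}
```

In a real profiler the delay would be fed to a timer that interrupts the program and records the current stack; the point here is only the scheduling of the interrupts, not the sample-collection mechanism itself.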




Published In

PLDI '10: Proceedings of the 31st ACM SIGPLAN Conference on Programming Language Design and Implementation, June 2010, 514 pages. ISBN 9781450300193. DOI: 10.1145/1806596.

Also published in ACM SIGPLAN Notices, Volume 45, Issue 6 (PLDI '10), June 2010, 496 pages. ISSN 0362-1340, EISSN 1558-1160. DOI: 10.1145/1809028.

Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. bias
      2. observer effect
      3. profiling


      Conference

PLDI '10

      Acceptance Rates

      Overall Acceptance Rate 406 of 2,067 submissions, 20%


