
DOI: 10.1145/3342195.3387554

Analyzing system performance with probabilistic performance annotations

Published: 17 April 2020

Abstract

To understand, debug, and predict the performance of complex software systems, we develop the concept of probabilistic performance annotations. In essence, we annotate components (e.g., methods) with a relation between a measurable performance metric, such as running time, and one or more features of the input or the state of that component. We use two forms of regression analysis: regression trees and mixture models. Such relations can capture non-trivial behaviors beyond the more classic algorithmic complexity of a component. We present a method to derive such annotations automatically by generalizing observed measurements. We illustrate the use of our approach on three complex systems---the ownCloud distributed storage service; the MySQL database system; and the x264 video encoder library and application---producing non-trivial characterizations of the performance. Notably, we isolate a performance regression and identify the root cause of a second performance bug in MySQL.
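Outside the authors' toolchain, the core idea can be sketched in a few lines: measure a cost metric for a component across values of an input feature, then generalize the samples into a closed-form annotation. In the minimal sketch below, the component, the comparison-count cost metric (a deterministic stand-in for running time), and the log-log power-law fit are all illustrative assumptions; the paper itself uses running time with regression trees and mixture models.

```python
import math

def quadratic_dedup(xs):
    """Toy annotated component: O(n^2) duplicate removal, instrumented to
    count comparisons as a deterministic stand-in for running time."""
    out, comparisons = [], 0
    for x in xs:
        duplicate = False
        for y in out:
            comparisons += 1
            if y == x:
                duplicate = True
                break
        if not duplicate:
            out.append(x)
    return out, comparisons

def fit_power_law(samples):
    """Generalize (feature, cost) measurements into cost ~ c * n^k by
    least-squares fitting a line in log-log space; (k, c) is the annotation."""
    lx = [math.log(n) for n, _ in samples]
    ly = [math.log(cost) for _, cost in samples]
    mx, my = sum(lx) / len(lx), sum(ly) / len(ly)
    k = sum((a - mx) * (b - my) for a, b in zip(lx, ly)) / \
        sum((a - mx) ** 2 for a in lx)
    c = math.exp(my - k * mx)
    return k, c

# Measure the component at several input sizes, then derive the annotation.
samples = []
for n in (50, 100, 200, 400, 800):
    _, cost = quadratic_dedup(list(range(n)))
    samples.append((n, cost))
k, c = fit_power_law(samples)
print(f"annotation: cost(n) ~ {c:.3f} * n^{k:.2f}")  # exponent close to 2
```

The recovered exponent generalizes the observations into an annotation relating cost to the input feature n; the paper's regression trees additionally split the feature space so that different regimes of a component's behavior get different relations.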

References

[1]
Zdravko Botev, Joseph Grotowski, and Dirk Kroese. 2010. Kernel Density Estimation via Diffusion. The Annals of Statistics 38 (Nov. 2010).
[2]
Marc Brünink and David S. Rosenblum. 2016. Mining Performance Specifications. In Proc. 24th ACM SIGSOFT Symposium on Foundations of Software Engineering (FSE '16). ACM, New York, NY, USA, 39--49.
[3]
Emilio Coppa, Camil Demetrescu, and Irene Finocchi. 2012. Input-Sensitive Profiling. In Proc. 2012 PLDI (PLDI '12). ACM, New York, NY, USA, 89--98.
[4]
Emilio Coppa, Camil Demetrescu, Irene Finocchi, and Romolo Marotta. 2014. Estimating the Empirical Cost Function of Routines with Dynamic Workloads. In Proc. 2014 IEEE/ACM International Symposium on Code Generation and Optimization (CGO '14). ACM, New York, NY, USA, 230:230--230:239.
[5]
Michael D. Ernst, Jeff H. Perkins, Philip J. Guo, Stephen McCamant, Carlos Pacheco, Matthew S. Tschantz, and Chen Xiao. 2007. The Daikon System for Dynamic Detection of Likely Invariants. Science of Computer Programming 69, 1--3 (Dec. 2007), 35--45.
[6]
Cormac Flanagan, K. Rustan M. Leino, Mark Lillibridge, Greg Nelson, James B. Saxe, and Raymie Stata. 2002. Extended Static Checking for Java. In Proc. 2002 ACM SIGPLAN Conference on Programming Language Design and Implementation (PLDI '02). ACM, 234--245.
[7]
Simon F. Goldsmith, Alex S. Aiken, and Daniel S. Wilkerson. 2007. Measuring Empirical Computational Complexity. In Proceedings of the 6th Joint Meeting of the European Software Engineering Conference and the ACM SIGSOFT Symposium on The Foundations of Software Engineering (ESEC-FSE '07). ACM, New York, NY, USA, 395--404.
[8]
Susan L. Graham, Peter B. Kessler, and Marshall K. Mckusick. 1982. Gprof: A Call Graph Execution Profiler. In Proc. 1982 ACM SIGPLAN Symposium on Compiler Construction (SIGPLAN '82). ACM, New York, NY, USA, 120--126.
[9]
Jianmei Guo, Krzysztof Czarnecki, Sven Apel, Norbert Siegmund, and Andrzej Wasowski. 2013. Variability-Aware Performance Prediction: A Statistical Learning Approach. In Proc. 28th IEEE/ACM International Conference on Automated Software Engineering (ASE '13). IEEE Press, Piscataway, NJ, USA, 301--311.
[10]
Matthias Hauswirth, Peter F. Sweeney, Amer Diwan, and Michael Hind. 2004. Vertical Profiling: Understanding the Behavior of Object-Oriented Applications. In Proc. 2004 OOPSLA (OOPSLA '04). ACM, New York, NY, USA, 251--269.
[11]
Johannes Henkel and Amer Diwan. 2003. Discovering Algebraic Specifications from Java Classes. In Proc. 17th European Conference on Object-Oriented Programming (ECOOP '03). 431--456.
[12]
Johannes Henkel and Amer Diwan. 2004. A Tool for Writing and Debugging Algebraic Specifications. In Proc. 26th ICSE (ICSE '04). IEEE Computer Society, Washington, DC, USA, 449--458. http://dl.acm.org/citation.cfm?id=998675.999449
[13]
Kenneth Hoste, Aashish Phansalkar, Lieven Eeckhout, Andy Georges, Lizy K. John, and Koen De Bosschere. 2006. Performance Prediction Based on Inherent Program Similarity. In Proc. 15th International Conference on Parallel Architectures and Compilation Techniques (PACT '06). ACM, New York, NY, USA, 114--122.
[14]
JProfiler. 2019. https://www.ej-technologies.com/products/jprofiler/overview.html.
[15]
MySQL Server. 2019. https://github.com/mysql/mysql-server.
[16]
Todd Mytkowicz, Amer Diwan, Matthias Hauswirth, and Peter F. Sweeney. 2009. Producing Wrong Data Without Doing Anything Obviously Wrong!. In Proc. 14th ACM International Conference on Architectural Support for Programming Languages and Operating Systems (ASPLOS XIV). ACM, New York, NY, USA, 265--276.
[17]
Sharon E. Perl and William E. Weihl. 1993. Performance Assertion Checking. In Proc. 14th ACM Symposium on Operating Systems Principles (SOSP '93). ACM, New York, NY, USA, 134--145.
[18]
Pin - A Dynamic Binary Instrumentation Tool. 2019. https://software.intel.com/en-us/articles/pintool.
[19]
runkit. 2019. http://php.net/manual/en/book.runkit.php.
[20]
George R. Terrell and David W. Scott. 1992. Variable Kernel Density Estimation. The Annals of Statistics 20, 3 (Sept. 1992), 1236--1265.
[21]
The DWARF Debugging Standard. 2019. http://dwarfstd.org.
[22]
Eno Thereska, Bjoern Doebel, Alice X. Zheng, and Peter Nobel. 2010. Practical Performance Models for Complex, Popular Applications. In Proc. 2010 ACM SIGMETRICS International Conference on Measurement and Modeling of Computer Systems (SIGMETRICS '10). ACM, New York, NY, USA, 1--12.
[23]
Jeffrey S. Vetter and Patrick H. Worley. 2002. Asserting Performance Expectations. In Proc. 2002 ACM/IEEE Conference on Supercomputing (SC '02). IEEE Computer Society Press, Los Alamitos, CA, USA, 1--13.
[24]
x264. 2019. https://www.videolan.org/developers/x264.html.
[25]
Dmitrijs Zaparanuks and Matthias Hauswirth. 2012. Algorithmic Profiling. In Proc. 2012 PLDI. 67--76.



Information & Contributors

Information

Published In

cover image ACM Conferences
EuroSys '20: Proceedings of the Fifteenth European Conference on Computer Systems
April 2020
49 pages
ISBN:9781450368827
DOI:10.1145/3342195
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 17 April 2020


Author Tags

  1. dynamic analysis
  2. instrumentation
  3. performance
  4. performance analysis

Qualifiers

  • Research-article

Conference

EuroSys '20: Fifteenth EuroSys Conference 2020
April 27 - 30, 2020
Heraklion, Greece

Acceptance Rates

EuroSys '20 Paper Acceptance Rate 43 of 234 submissions, 18%;
Overall Acceptance Rate 241 of 1,308 submissions, 18%


Article Metrics

  • Downloads (Last 12 months): 32
  • Downloads (Last 6 weeks): 1
Reflects downloads up to 17 Oct 2024


Cited By

  • (2024) Robust Resource Bounds with Static Analysis and Bayesian Inference. Proceedings of the ACM on Programming Languages 8, PLDI (June 2024), 76--101. DOI: 10.1145/3656380
  • (2023) Enabling BPF Runtime policies for better BPF management. Proc. 1st Workshop on eBPF and Kernel Extensions (Sept. 2023), 49--55. DOI: 10.1145/3609021.3609297
  • (2023) Network-Centric Distributed Tracing with DeepFlow: Troubleshooting Your Microservices in Zero Code. Proc. ACM SIGCOMM 2023 Conference (Sept. 2023), 420--437. DOI: 10.1145/3603269.3604823
  • (2023) With Great Freedom Comes Great Opportunity: Rethinking Resource Allocation for Serverless Functions. Proc. Eighteenth European Conference on Computer Systems (May 2023), 381--397. DOI: 10.1145/3552326.3567506
  • (2022) An Empirical Study on the Impact of Deep Parameters on Mobile App Energy Usage. 2022 IEEE International Conference on Software Analysis, Evolution and Reengineering (SANER) (March 2022), 844--855. DOI: 10.1109/SANER53432.2022.00103
  • (2022) On the Effectiveness of Bisection in Performance Regression Localization. Empirical Software Engineering 27, 4 (April 2022). DOI: 10.1007/s10664-022-10152-3
