DOI: 10.1145/3491204.3527489 · ICPE Conference Proceedings · Short Paper

Beware of the Interactions of Variability Layers When Reasoning about Evolution of MongoDB

Published: 19 July 2022

Abstract

With commits and releases, hundreds of tests are run under varying conditions (e.g., on different hardware and with different workloads) to help understand the evolution of software performance and guard against regressions. We hypothesize that performance is sensitive not only to the evolution of the software itself, but also to the different variability layers of its execution environment, spanning the hardware, the operating system, the build, and the workload processed by the software. Leveraging the MongoDB dataset, our results show that changes in hardware and workload can drastically impact performance evolution and should therefore be taken into account when reasoning about performance. An open problem resulting from this study is how to manage these variability layers in order to test the performance evolution of a software system efficiently.



Published In

ICPE '22: Companion of the 2022 ACM/SPEC International Conference on Performance Engineering
July 2022, 166 pages
ISBN: 9781450391597
DOI: 10.1145/3491204

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. deep software variability
  2. software evolution

Qualifiers

  • Short-paper

Conference

ICPE '22

Acceptance Rates

ICPE '22 Paper Acceptance Rate: 14 of 58 submissions, 24%
Overall Acceptance Rate: 252 of 851 submissions, 30%

