
DOI: 10.1145/2517208.2517209

Family-based performance measurement

Published: 27 October 2013

Abstract

Most contemporary programs are customizable. They provide many features that give rise to millions of program variants. Determining which feature selection yields optimal performance is challenging because of the exponential number of variants. Predicting the performance of a variant based on previous measurements has proved successful, but it induces a trade-off between measurement effort and prediction accuracy. We propose the alternative approach of family-based performance measurement to reduce the number of measurements required for identifying feature interactions and for obtaining accurate predictions. The key idea is to create a variant simulator (by translating compile-time variability to run-time variability) that can simulate the behavior of all program variants. We use it to measure the performance of individual methods, trace methods to features, and infer feature interactions based on the call graph. We evaluate our approach by means of five feature-oriented programs. On average, we achieve an accuracy of 98%, with only a single measurement per customizable program. Our observations suggest that this approach opens avenues of future research in different domains, such as feature-interaction detection and testing.
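The core idea of the abstract can be illustrated with a small, hypothetical sketch: feature flags that would normally be fixed at compile time become run-time booleans, so one instrumented program (the "variant simulator") can exercise any variant by flipping flags instead of recompiling, while per-method timings are recorded and traced back to features. All names (`FEATURES`, `timed`, the example methods) are illustrative assumptions, not the paper's actual implementation.

```python
import time

# Hypothetical feature model: compile-time options turned into run-time flags.
FEATURES = {"base": True, "cache": False}

# Method name -> accumulated seconds, later attributed to features.
TIMINGS = {}

def timed(feature):
    """Guard a method with a run-time feature flag and record its cost."""
    def wrap(fn):
        def inner(*args, **kwargs):
            if not FEATURES.get(feature, False):
                return None  # feature disabled in this simulated variant
            start = time.perf_counter()
            result = fn(*args, **kwargs)
            TIMINGS[fn.__name__] = (
                TIMINGS.get(fn.__name__, 0.0) + time.perf_counter() - start
            )
            return result
        return inner
    return wrap

@timed("base")
def compute(n):
    return sum(i * i for i in range(n))

@timed("cache")
def cached_compute(n):
    # Call-graph edge from the "cache" feature to the "base" feature:
    # such edges are what allow feature interactions to be inferred.
    return compute(n)

compute(10_000)
FEATURES["cache"] = True      # flip a flag instead of building a new variant
cached_compute(10_000)
print(sorted(TIMINGS))        # ['cached_compute', 'compute']
```

In this sketch a single instrumented run replaces one compiled variant per measurement; iterating over flag assignments simulates the whole product family without rebuilding.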





Published In

GPCE '13: Proceedings of the 12th international conference on Generative programming: concepts & experiences
October 2013
198 pages
ISBN:9781450323734
DOI:10.1145/2517208


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. family-based analysis
  2. featurehouse
  3. performance prediction

Qualifiers

  • Research-article

Conference

GPCE'13
Sponsor:
GPCE'13: Generative Programming: Concepts and Experiences
October 27 - 28, 2013
Indianapolis, Indiana, USA

Acceptance Rates

GPCE '13 paper acceptance rate: 20 of 59 submissions, 34%.
Overall acceptance rate: 56 of 180 submissions, 31%.


Article Metrics

  • Downloads (Last 12 months)7
  • Downloads (Last 6 weeks)3
Reflects downloads up to 12 Nov 2024


Cited By

  • (2023) HINNPerf: Hierarchical Interaction Neural Network for Performance Prediction of Configurable Systems. ACM Transactions on Software Engineering and Methodology 32(2): 1-30. DOI: 10.1145/3528100
  • (2021) White-Box Analysis over Machine Learning: Modeling Performance of Configurable Systems. 2021 IEEE/ACM 43rd International Conference on Software Engineering (ICSE), pages 1072-1084. DOI: 10.1109/ICSE43902.2021.00100
  • (2021) White-Box Performance-Influence Models: A Profiling and Learning Approach. 2021 IEEE/ACM 43rd International Conference on Software Engineering (ICSE), pages 1059-1071. DOI: 10.1109/ICSE43902.2021.00099
  • (2020) ConfigCrusher: Towards white-box performance analysis for configurable systems. Automated Software Engineering. DOI: 10.1007/s10515-020-00273-8
  • (2019) Challenges and Insights from Optimizing Configurable Software Systems. Proceedings of the 13th International Workshop on Variability Modelling of Software-Intensive Systems, pages 1-2. DOI: 10.1145/3302333.3302335
  • (2019) PLUS. Proceedings of the 41st International Conference on Software Engineering: New Ideas and Emerging Results, pages 77-80. DOI: 10.1109/ICSE-NIER.2019.00028
  • (2019) On the Relation of Control-Flow and Performance Feature Interactions. Empirical Software Engineering 24(4): 2410-2437. DOI: 10.1007/s10664-019-09705-w
  • (2018) Feature-Family-Based Reliability Analysis of Software Product Lines. Information and Software Technology 94: 59-81. DOI: 10.5555/3163583.3163679
  • (2018) A Classification of Product Sampling for Software Product Lines. Proceedings of the 22nd International Systems and Software Product Line Conference - Volume 1, pages 1-13. DOI: 10.1145/3233027.3233035
  • (2018) Quantifying Structural Attributes of System Decompositions in 28 Feature-Oriented Software Product Lines. Empirical Software Engineering 21(4): 1670-1705. DOI: 10.1007/s10664-014-9336-6
