
DOI: 10.1145/3597926.3605232
Research article · Open access

Automatic Testing and Benchmarking for Configurable Static Analysis Tools

Published: 13 July 2023

Abstract

Static analysis is an important tool for detecting bugs in real-world software. The advent of numerous analysis algorithms with their own tradeoffs has led to the proliferation of configurable static analysis tools, but their complex, undertested configuration spaces are obstacles to their widespread adoption. To improve the reliability of these tools, my research focuses on developing new approaches to automatically test and debug them. First, I describe an empirical study that helps to understand the performance and behavior of configurable taint analysis tools for Android. The findings of this study motivate the development of ECSTATIC, a framework for testing and debugging that goes beyond taint analysis to test any configurable static analysis tool. The next steps for this research involve the automatic creation of real-world benchmarks for static analysis with associated ground truths and analysis features.
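For readers unfamiliar with the taint analyses the abstract refers to, the following toy Python sketch illustrates the core source-to-sink reasoning that tools like FlowDroid automate. This is a hypothetical simplification, not part of ECSTATIC or any tool's actual implementation: a program is modeled as a flat list of assignments and sink calls, and taint propagates from designated source expressions through assignments until it reaches a sink.

```python
# Toy sketch of source-to-sink taint tracking, the core idea behind
# Android taint analyzers such as FlowDroid. Hypothetical model: a
# program is a list of ('assign', lhs, rhs_vars) and ('sink', name,
# arg_vars) tuples; results of "source" calls are considered tainted.

def find_taint_flows(statements, sources):
    """Return the names of sinks reached by tainted data."""
    tainted = set()
    flows = []
    for stmt in statements:
        if stmt[0] == 'assign':
            _, lhs, rhs_vars = stmt
            if rhs_vars & (tainted | sources):
                tainted.add(lhs)        # taint propagates to the lhs
            else:
                tainted.discard(lhs)    # overwriting with clean data kills taint
        else:  # ('sink', name, arg_vars)
            _, name, arg_vars = stmt
            if any(a in tainted for a in arg_vars):
                flows.append(name)      # tainted data reaches a sink: a leak
    return flows

# Example: the device ID (source) flows through an intermediate
# variable into an outgoing SMS (sink).
program = [
    ('assign', 'dev_id', {'getDeviceId'}),   # dev_id = getDeviceId()
    ('assign', 'msg', {'dev_id'}),           # msg = "id=" + dev_id
    ('sink', 'sendTextMessage', ['msg']),    # sendTextMessage(msg)
]
print(find_taint_flows(program, sources={'getDeviceId'}))  # ['sendTextMessage']
```

Real analyzers perform this propagation over full program representations, and their configuration options (context-, flow-, field-, and object-sensitivity, lifecycle modeling, and so on) trade precision against cost; that configuration space is exactly what this research tests and debugs.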


Cited By

  • (2025) DeVAIC: A tool for security assessment of AI-generated code. Information and Software Technology, 177, Article 107572. https://doi.org/10.1016/j.infsof.2024.107572
  • (2024) Characterizing and Detecting Program Representation Faults of Static Analysis Frameworks. Proceedings of the 33rd ACM SIGSOFT International Symposium on Software Testing and Analysis, 1772–1784. https://doi.org/10.1145/3650212.3680398


Published In

ISSTA 2023: Proceedings of the 32nd ACM SIGSOFT International Symposium on Software Testing and Analysis
July 2023
1554 pages
ISBN:9798400702211
DOI:10.1145/3597926
This work is licensed under a Creative Commons Attribution 4.0 International License.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. benchmarking
  2. configurable static analysis
  3. debugging
  4. testing


Conference

ISSTA '23

Acceptance Rates

Overall Acceptance Rate 58 of 213 submissions, 27%


Article Metrics

  • Downloads (Last 12 months)346
  • Downloads (Last 6 weeks)53
Reflects downloads up to 21 Nov 2024

