DOI: 10.1145/3611643.3616262

research-article
Comparison and Evaluation on Static Application Security Testing (SAST) Tools for Java

Published: 30 November 2023

Abstract

Static application security testing (SAST) plays a significant role in the software development life cycle (SDLC). However, it is challenging to comprehensively evaluate the effectiveness of SAST tools and determine which are best suited to detecting vulnerabilities. In this paper, based on well-defined criteria, we first selected seven free or open-source SAST tools from 161 existing tools for further evaluation. Using both synthetic and newly constructed real-world benchmarks, we evaluated and compared these SAST tools from comprehensive perspectives such as effectiveness, consistency, and performance. While the SAST tools perform well on synthetic benchmarks, our results indicate that they detect only 12.7% of real-world vulnerabilities. Even when the detection capabilities of all tools are combined, most vulnerabilities (70.9%) remain undetected, especially those involving improper resource control and insufficiently neutralized input/output. Notably, although the tools already include detection rules for these vulnerability categories, their results still fall short of expectations. The findings of our comprehensive study provide guidance on tool development, improvement, evaluation, and selection for developers, researchers, and potential users.
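For readers unfamiliar with the vulnerability categories the abstract highlights, the following is a minimal illustrative sketch (not drawn from the paper's benchmarks; class and method names are hypothetical) of an insufficiently neutralized input, here path traversal (CWE-22), alongside a simple mitigation of the kind SAST rules are designed to check for:

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class PathTraversalDemo {
    // Hypothetical upload directory used only for illustration.
    static final Path BASE = Paths.get("/srv/app/uploads");

    // Vulnerable: attacker-controlled input such as "../../../etc/passwd"
    // escapes BASE once the path is normalized. SAST tools typically flag
    // this pattern as CWE-22 (improper limitation of a pathname).
    static Path resolveUnsafe(String userInput) {
        return BASE.resolve(userInput);
    }

    // Mitigated: normalize the resolved path, then verify it is still
    // contained within BASE before using it.
    static Path resolveSafe(String userInput) {
        Path candidate = BASE.resolve(userInput).normalize();
        if (!candidate.startsWith(BASE)) {
            throw new IllegalArgumentException("path escapes base directory");
        }
        return candidate;
    }
}
```

The study's observation is that even for well-known categories like this one, where tools ship explicit detection rules, real-world variants often go undetected.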

Supplementary Material

Video (fse23main-p176-p-video.mp4)



Published In

ESEC/FSE 2023: Proceedings of the 31st ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering
November 2023, 2215 pages
ISBN: 9798400703270
DOI: 10.1145/3611643

Publisher

Association for Computing Machinery, New York, NY, United States

Author Tags

  1. Benchmarks
  2. Empirical study
  3. Static application security testing

Qualifiers

  • Research-article

Funding Sources

  • the National Key R&D Program of China
  • National Research Foundation, Singapore, and the Cyber Security Agency under its National Cybersecurity R&D Programme

Conference

ESEC/FSE '23

Acceptance Rates

Overall Acceptance Rate 94 of 522 submissions, 18%


Cited By

  • (2024) An Empirical Study of Static Analysis Tools for Secure Code Review. Proceedings of the 33rd ACM SIGSOFT International Symposium on Software Testing and Analysis, 10.1145/3650212.3680313, 691-703. Online publication date: 11-Sep-2024.
  • (2024) Silent Taint-Style Vulnerability Fixes Identification. Proceedings of the 33rd ACM SIGSOFT International Symposium on Software Testing and Analysis, 10.1145/3650212.3652139, 428-439. Online publication date: 11-Sep-2024.
  • (2024) A Comprehensive Study on Static Application Security Testing (SAST) Tools for Android. IEEE Transactions on Software Engineering, 10.1109/TSE.2024.3488041, 1-18. Online publication date: 2024.
  • (2024) Just another copy and paste? Comparing the security vulnerabilities of ChatGPT generated code and StackOverflow answers. 2024 IEEE Security and Privacy Workshops (SPW), 10.1109/SPW63631.2024.00014, 87-94. Online publication date: 23-May-2024.
  • (2024) Navigating (in)Security of AI-Generated Code. 2024 IEEE International Conference on Cyber Security and Resilience (CSR), 10.1109/CSR61664.2024.10679468, 1-8. Online publication date: 2-Sep-2024.
  • (2024) Insights from Running 24 Static Analysis Tools on Open Source Software Repositories. Information Systems Security, 10.1007/978-3-031-80020-7_13, 225-245. Online publication date: 15-Dec-2024.
