DOI: 10.1145/3463274.3463808
Short paper
Open access

Open Data-driven Usability Improvements of Static Code Analysis and its Challenges

Published: 21 June 2021

Abstract

Context: Software development is moving towards a practice where data about development is gathered systematically in order to improve that practice, for example, in the tuning of static code analysis. So far, however, this kind of data gathering has happened primarily within organizations, which is unfortunate as it tends to favor larger organizations with more resources for maintaining developer tools.

Objective: Over the years, open source has brought many benefits, and open data has recently seen considerable development. We see this as an opportunity for cross-organization community building, and we ask to what extent views on using and sharing open-source developer tools carry over to open data-driven tuning of software development tools.

Method: An exploratory study with 11 participants, divided into 3 focus groups, discussing the use and sharing of static code analyzers and of data about these analyzers.

Results: While using and sharing open-source code (analyzers, in this case) is perceived positively as part of modern software development practice, sharing data is met with skepticism and uncertainty. Developers are concerned about threats to the company brand, exposure of intellectual property, legal liabilities, and the extent to which data is specific to a particular organization's context.

Conclusions: Sharing data in software development is different from sharing data about software development. We need to better understand how to provide solutions for sharing software development data in a way that reduces risk and enables openness.
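
The abstract only gestures at what data-driven tuning of static code analysis could look like in practice. As an illustration only (not taken from the paper), here is a minimal Python sketch under the assumption of a hypothetical CSV export of analyzer alerts with check_id and fixed columns: it aggregates the fix rate per check and flags checks that developers rarely act on as candidates for tuning or suppression.

    # Illustrative sketch, not from the paper: estimate which static-analysis
    # checks developers act on, as one possible form of data-driven tuning.
    # The input format (alerts.csv with check_id and fixed columns) is assumed.
    import csv
    from collections import defaultdict

    def fix_rate_per_check(path):
        """Return {check_id: fraction of its alerts that were fixed}."""
        fixed = defaultdict(int)
        total = defaultdict(int)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                total[row["check_id"]] += 1
                fixed[row["check_id"]] += row["fixed"] == "true"
        return {check: fixed[check] / total[check] for check in total}

    if __name__ == "__main__":
        rates = fix_rate_per_check("alerts.csv")  # hypothetical export file
        noisy = sorted(c for c, r in rates.items() if r < 0.10)
        print("Rarely acted-on checks (tuning candidates):", noisy)

Aggregates like these per-check fix rates, rather than raw alerts, are one hedged example of data that could plausibly be shared across organizations with less of the sensitivity the participants raise.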


Published In

EASE '21: Proceedings of the 25th International Conference on Evaluation and Assessment in Software Engineering
June 2021
417 pages
ISBN: 9781450390538
DOI: 10.1145/3463274
This work is licensed under a Creative Commons Attribution 4.0 International License.

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. data-driven software development
  2. open data
  3. static code analysis

Qualifiers

  • Short-paper
  • Research
  • Refereed limited

Funding Sources

  • Vinnova
  • The Swedish Foundation for Strategic Research
  • The Swedish Research Council

Conference

EASE 2021

Acceptance Rates

Overall acceptance rate: 71 of 232 submissions (31%)
