DOI: 10.1145/3121264.3121265
Public Access

Mining mobile app markets for prioritization of security assessment effort

Published: 05 September 2017

Abstract

Like any other software engineering activity, assessing the security of a software system entails prioritizing resources and minimizing risks. Techniques ranging from manual inspection to automated static and dynamic analysis are commonly employed to identify security vulnerabilities prior to the release of the software. However, none of these techniques is perfect: static analysis is prone to producing many false positives and false negatives, while dynamic analysis and manual inspection are unwieldy in terms of both required time and cost. This research aims to improve these techniques by mining relevant information from vulnerabilities found in app markets. The approach relies on the fact that many modern software systems, in particular mobile software, are developed using rich application development frameworks (ADFs), which allows us to raise the level of abstraction for detecting vulnerabilities and thereby makes it possible to classify the types of vulnerabilities encountered in a given category of application. By coupling this information with the severity of the vulnerabilities, we are able to improve the efficiency of static and dynamic analyses and to target manual effort at the riskiest vulnerabilities.
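
The approach described in the abstract couples how often a vulnerability type is observed in a given app-market category with how severe that type is, and uses the combination to decide where assessment effort should go first. The snippet below is a minimal sketch of that idea, not the paper's actual algorithm; the category names, vulnerability types, counts, and severity values are hypothetical, and "severity" stands in for a CVSS-style score.

```python
# A minimal, illustrative sketch of category-aware risk ranking; this is NOT the
# paper's implementation, and the vulnerability types, counts, and severity
# scores below are hypothetical placeholders.
from collections import Counter

# Hypothetical mined data: vulnerability types observed in apps of each
# app-market category.
observations = {
    "finance": ["intent_spoofing", "weak_crypto", "weak_crypto", "ssl_misuse"],
    "games": ["intent_spoofing", "privilege_escalation"],
}

# Hypothetical CVSS-like severity scores (0.0-10.0) per vulnerability type.
severity = {
    "intent_spoofing": 6.8,
    "weak_crypto": 7.4,
    "ssl_misuse": 5.9,
    "privilege_escalation": 8.2,
}

def rank_vulnerabilities(category):
    """Rank vulnerability types for a category by (relative frequency x severity)."""
    counts = Counter(observations[category])
    total = sum(counts.values())
    risk = {vuln: (n / total) * severity[vuln] for vuln, n in counts.items()}
    return sorted(risk.items(), key=lambda item: item[1], reverse=True)

if __name__ == "__main__":
    for vuln, score in rank_vulnerabilities("finance"):
        print(f"{vuln}: expected risk {score:.2f}")
```

A ranking of this kind could then be used to direct manual review at the highest-scoring vulnerability types for a category and to configure static and dynamic analyses to prioritize the corresponding checks.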


Cited By

  • (2023) Multi-device, Robust, and Integrated Android GUI Testing: A Conceptual Framework. Testing Software and Systems, 115–125. https://doi.org/10.1007/978-3-031-43240-8_8. Online publication date: 19-Sep-2023.
  • (2021) User Interface Matters: Analysing the Complexity of Mobile Applications from a Visual Perspective. Procedia Computer Science, 191, 9–16. https://doi.org/10.1016/j.procs.2021.07.039. Online publication date: 2021.


Information

Published In

WAMA 2017: Proceedings of the 2nd ACM SIGSOFT International Workshop on App Market Analytics
September 2017
25 pages
ISBN: 9781450351584
DOI: 10.1145/3121264
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. Mining App Market
  2. Security Vulnerability
  3. Software Analysis

Qualifiers

  • Research-article

Conference

ESEC/FSE'17

Article Metrics

  • Downloads (last 12 months): 103
  • Downloads (last 6 weeks): 10
Reflects downloads up to 08 Dec 2024
