Establishing a baseline for measuring advancement in the science of security: an analysis of the 2015 IEEE security & privacy proceedings

Published: 19 April 2016

Abstract

To help establish a more scientific basis for security science, which will enable the development of fundamental theories and move the field from being primarily reactive to primarily proactive, it is important for research results to be reported in a scientifically rigorous manner. Such reporting will allow for the standard pillars of science, namely replication, meta-analysis, and theory building. In this paper we aim to establish a baseline of the state of scientific work in security through the analysis of indicators of scientific research as reported in the papers from the 2015 IEEE Symposium on Security and Privacy. To conduct this analysis, we developed a series of rubrics to determine the completeness of the papers relative to the type of evaluation used (e.g. case study, experiment, proof). Our findings showed that while papers are generally easy to read, they often do not explicitly document some key information like the research objectives, the process for choosing the cases to include in the studies, and the threats to validity. We hope that this initial analysis will serve as a baseline against which we can measure the advancement of the science of security.
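The abstract describes scoring each paper against a rubric tied to its evaluation type (case study, experiment, proof, and so on) and checking which expected reporting items are explicitly documented. The following minimal Python sketch illustrates one way such a completeness score could be computed; the rubric items, class names, and the example paper data are hypothetical assumptions for illustration, not the authors' actual instrument.

from dataclasses import dataclass, field

@dataclass
class Rubric:
    """Checklist of reporting items expected for one evaluation type (illustrative)."""
    evaluation_type: str              # e.g. "case study", "experiment", "proof"
    items: list[str] = field(default_factory=list)

@dataclass
class PaperAssessment:
    """Records which rubric items a reviewed paper explicitly documents."""
    title: str
    evaluation_type: str
    reported: set[str] = field(default_factory=set)

    def completeness(self, rubric: Rubric) -> float:
        """Fraction of the rubric's items the paper explicitly reports."""
        if rubric.evaluation_type != self.evaluation_type or not rubric.items:
            return 0.0
        hits = sum(1 for item in rubric.items if item in self.reported)
        return hits / len(rubric.items)

# Hypothetical rubric for case studies, echoing items the abstract mentions.
case_study_rubric = Rubric(
    evaluation_type="case study",
    items=[
        "research objectives",
        "case selection process",
        "data collection procedure",
        "threats to validity",
    ],
)

# Hypothetical assessment of one paper.
paper = PaperAssessment(
    title="Example S&P 2015 paper",
    evaluation_type="case study",
    reported={"research objectives", "data collection procedure"},
)

print(f"Completeness: {paper.completeness(case_study_rubric):.0%}")  # prints "Completeness: 50%"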

Published In

HotSos '16: Proceedings of the Symposium and Bootcamp on the Science of Security
April 2016
138 pages
ISBN: 9781450342773
DOI: 10.1145/2898375

Publisher

Association for Computing Machinery, New York, NY, United States

Author Tags

  1. literature review
  2. science of security

Qualifiers

  • Research-article

Conference

HotSoS '16: HotSos 2016 Science of Security
April 19-21, 2016
Pittsburgh, Pennsylvania

Acceptance Rates

Overall Acceptance Rate 34 of 60 submissions, 57%
