DOI: 10.1145/2610384.2610406
DOM-based test adequacy criteria for web applications

Published: 21 July 2014

Abstract

To assess the quality of web application test cases, web developers currently measure code coverage. Although code coverage has traditionally been a popular test adequacy criterion, we believe it alone is not adequate for assessing the quality of web application test cases. We propose a set of novel DOM-based test adequacy criteria for web applications. These criteria aim at measuring coverage at two granularity levels, (1) the percentage of DOM states and transitions covered in the total state space of the web application under test, and (2) the percentage of elements covered in each particular DOM state. We present a technique and tool, called DomCovery, which automatically extracts and measures the proposed adequacy criteria and generates a visual DOM coverage report. Our evaluation shows that there is no correlation between code coverage and DOM coverage. A controlled experiment illustrates that participants using DomCovery completed coverage related tasks 22% more accurately and 66% faster.
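The two granularity levels described above reduce to simple ratios over the crawled state space: states and transitions covered out of the total state-flow graph, and elements checked within a single DOM state. The following is a rough illustrative sketch only, not DomCovery's actual implementation; all function and variable names here are hypothetical.

```python
# Illustrative sketch of the DOM coverage ratios from the abstract.
# Assumes the total state space (from a crawler) and the states,
# transitions, and elements exercised by the test suite are known.

def state_coverage(covered_states, all_states):
    """Percentage of DOM states in the state space reached by the tests."""
    return 100.0 * len(set(covered_states)) / len(set(all_states))

def transition_coverage(covered_transitions, all_transitions):
    """Percentage of DOM state transitions exercised by the tests."""
    return 100.0 * len(set(covered_transitions)) / len(set(all_transitions))

def element_coverage(checked_elements, state_elements):
    """Percentage of elements in one particular DOM state that the tests checked."""
    state_elements = set(state_elements)
    return 100.0 * len(set(checked_elements) & state_elements) / len(state_elements)

# Example: 3 of 4 DOM states reached, 2 of 5 transitions exercised.
print(state_coverage({"s1", "s2", "s3"}, {"s1", "s2", "s3", "s4"}))  # 75.0
print(transition_coverage(
    {("s1", "s2"), ("s2", "s3")},
    {("s1", "s2"), ("s2", "s3"), ("s1", "s3"), ("s3", "s4"), ("s2", "s4")},
))  # 40.0
```

The point of the second level is visible in the last function: a suite can reach a DOM state (counted at level 1) while asserting on only a fraction of its elements (penalized at level 2).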





Published In

ISSTA 2014: Proceedings of the 2014 International Symposium on Software Testing and Analysis
July 2014
460 pages
ISBN: 9781450326452
DOI: 10.1145/2610384

Publisher

Association for Computing Machinery, New York, NY, United States



Author Tags

  1. DOM
  2. Test adequacy criteria
  3. coverage
  4. web applications

Qualifiers

  • Research-article

Conference

ISSTA '14
Acceptance Rates

Overall acceptance rate: 58 of 213 submissions (27%)




Article Metrics

  • Downloads (last 12 months): 13
  • Downloads (last 6 weeks): 1
Reflects downloads up to 14 Dec 2024


Cited By

  • (2023) Code Coverage Criteria for Asynchronous Programs. Proceedings of the 31st ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering. DOI: 10.1145/3611643.3616292, pp. 1307–1319. Online publication date: 30-Nov-2023.
  • (2023) PTDETECTOR: An Automated JavaScript Front-end Library Detector. 2023 38th IEEE/ACM International Conference on Automated Software Engineering (ASE). DOI: 10.1109/ASE56229.2023.00049, pp. 649–660. Online publication date: 11-Sep-2023.
  • (2023) QExplore: An exploration strategy for dynamic web applications using guided search. Journal of Systems and Software, 195 (111512). DOI: 10.1016/j.jss.2022.111512. Online publication date: Jan-2023.
  • (2021) Mutation Analysis for Assessing End-to-End Web Tests. 2021 IEEE International Conference on Software Maintenance and Evolution (ICSME). DOI: 10.1109/ICSME52107.2021.00023, pp. 183–194. Online publication date: Sep-2021.
  • (2021) An automated model-based approach to repair test suites of evolving web applications. Journal of Systems and Software, 171 (110841). DOI: 10.1016/j.jss.2020.110841. Online publication date: Jan-2021.
  • (2021) Web Test Automation: Insights from the Grey Literature. SOFSEM 2021: Theory and Practice of Computer Science. DOI: 10.1007/978-3-030-67731-2_35, pp. 472–485. Online publication date: 11-Jan-2021.
  • (2020) Comparing Coverage Criteria for Dynamic Web application: An Empirical Evaluation. Computer Standards & Interfaces (103467). DOI: 10.1016/j.csi.2020.103467. Online publication date: Aug-2020.
  • (2019) Systematic Mapping on Quality in Web Application Testing. 2019 1st International Informatics and Software Engineering Conference (UBMYK). DOI: 10.1109/UBMYK48245.2019.8965472, pp. 1–5. Online publication date: Nov-2019.
  • (2019) Exploring output-based coverage for testing PHP web applications. Automated Software Engineering, 26:1, pp. 59–85. DOI: 10.1007/s10515-018-0246-5. Online publication date: 1-Mar-2019.
  • (2018) An experience report on applying software testing academic results in industry. Empirical Software Engineering, 23:4, pp. 1959–1981. DOI: 10.1007/s10664-017-9570-9. Online publication date: 1-Aug-2018.
