DOI:10.1145/503209.503244
Article

Coverage criteria for GUI testing

Published: 01 September 2001

Abstract

A widespread recognition of the usefulness of graphical user interfaces (GUIs) has established their importance as critical components of today's software. GUIs have characteristics different from traditional software, and conventional testing techniques do not directly apply to GUIs. This paper's focus is on coverage criteria for GUIs, important rules that provide an objective measure of test quality. We present new coverage criteria to help determine whether a GUI has been adequately tested. These coverage criteria use events and event sequences to specify a measure of test adequacy. Since the total number of permutations of event sequences in any non-trivial GUI is extremely large, the GUI's hierarchical structure is exploited to identify the important event sequences to be tested. A GUI is decomposed into GUI components, each of which is used as a basic unit of testing. A representation of a GUI component, called an event-flow graph, identifies the interaction of events within a component, and intra-component criteria are used to evaluate the adequacy of tests on these events. The hierarchical relationship among components is represented by an integration tree, and inter-component coverage criteria are used to evaluate the adequacy of test sequences that cross components. Algorithms are given to construct event-flow graphs and an integration tree for a given GUI, and to evaluate the coverage of a given test suite with respect to the new coverage criteria. A case study illustrates the usefulness of the coverage report in guiding further testing and shows an important correlation between event-based coverage of a GUI and statement coverage of the underlying code.
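To make the event-flow-graph idea concrete, here is a minimal, illustrative Python sketch (not the paper's algorithms): it models a hypothetical "File" component's event-flow graph as an adjacency map and measures event coverage and length-2 event-sequence coverage for a test suite. The event names and the coverage function are invented for illustration only.

```python
# Illustrative sketch, not the paper's algorithms.
# An event-flow graph as an adjacency map: an edge u -> v means event v
# may immediately follow event u within the component.
from itertools import pairwise  # Python 3.10+

# Hypothetical events of a small "File" menu component.
event_flow_graph = {
    "open_menu": {"new", "save", "close_menu"},
    "new": {"open_menu"},
    "save": {"open_menu"},
    "close_menu": set(),
}

def coverage(efg, test_suite):
    """Return (event coverage, event-pair coverage) of a suite of event sequences."""
    all_events = set(efg)
    all_pairs = {(u, v) for u, succs in efg.items() for v in succs}
    seen_events, seen_pairs = set(), set()
    for seq in test_suite:
        seen_events.update(e for e in seq if e in all_events)
        seen_pairs.update(p for p in pairwise(seq) if p in all_pairs)
    return (len(seen_events) / len(all_events),
            len(seen_pairs) / len(all_pairs))

# One test case exercising 3 of 4 events and 3 of 5 event pairs.
suite = [["open_menu", "new", "open_menu", "close_menu"]]
ev_cov, pair_cov = coverage(event_flow_graph, suite)
print(f"event coverage: {ev_cov:.0%}, event-pair coverage: {pair_cov:.0%}")
```

In the same spirit, longer event sequences and sequences that cross components (via the integration tree) would be checked against inter-component criteria; the sketch only shows the intra-component case.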





Published In

ESEC/FSE-9: Proceedings of the 8th European software engineering conference held jointly with 9th ACM SIGSOFT international symposium on Foundations of software engineering
September 2001
329 pages
ISBN:1581133901
DOI:10.1145/503209
  • Conference Chairs:
  • A. Min Tjoa,
  • Volker Gruhn
  • ACM SIGSOFT Software Engineering Notes, Volume 26, Issue 5
    Sept. 2001
    329 pages
    ISSN:0163-5948
    DOI:10.1145/503271
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. GUI test coverage
  2. GUI testing
  3. component testing
  4. event-based coverage
  5. event-flow graph
  6. integration tree

Qualifiers

  • Article

Conference

ESEC/FSE01

Acceptance Rates

ESEC/FSE-9 Paper Acceptance Rate 29 of 137 submissions, 21%;
Overall Acceptance Rate 112 of 543 submissions, 21%



Bibliometrics & Citations


Article Metrics

  • Downloads (last 12 months): 61
  • Downloads (last 6 weeks): 8
Reflects downloads up to 21 Nov 2024



Cited By

  • (2024) "Augmented testing to support manual GUI-based regression testing: An empirical study". Empirical Software Engineering 29:6. DOI:10.1007/s10664-024-10522-z. Online publication date: 17-Aug-2024.
  • (2023) "Semantic Similarity-Based Mobile Application Isomorphic Graphical User Interface Identification". Mathematics 11:3 (527). DOI:10.3390/math11030527. Online publication date: 18-Jan-2023.
  • (2023) "Automata-Based Trace Analysis for Aiding Diagnosing GUI Testing Tools for Android". Proceedings of the 31st ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering, pages 592-604. DOI:10.1145/3611643.3616361. Online publication date: 30-Nov-2023.
  • (2023) "Incremental Testing in Software Product Lines—An Event Based Approach". IEEE Access 11, pages 2384-2395. DOI:10.1109/ACCESS.2023.3234186. Online publication date: 2023.
  • (2022) "VITAS: Guided Model-based VUI Testing of VPA Apps". Proceedings of the 37th IEEE/ACM International Conference on Automated Software Engineering, pages 1-12. DOI:10.1145/3551349.3556957. Online publication date: 10-Oct-2022.
  • (2022) "A taxonomy of metrics for GUI-based testing research: A systematic literature review". Information and Software Technology 152 (107062). DOI:10.1016/j.infsof.2022.107062. Online publication date: Dec-2022.
  • (2021) "CAT: Change-focused Android GUI Testing". 2021 IEEE International Conference on Software Maintenance and Evolution (ICSME), pages 460-470. DOI:10.1109/ICSME52107.2021.00047. Online publication date: Sep-2021.
  • (2021) "Construction of GUI Elements Recognition Model for AI Testing based on Deep Learning". 2021 8th International Conference on Dependable Systems and Their Applications (DSA), pages 508-515. DOI:10.1109/DSA52907.2021.00075. Online publication date: Aug-2021.
  • (2021) "Software Quality Enhancement Using Hybrid Model of DevOps". Intelligent Systems, pages 281-288. DOI:10.1007/978-981-16-2248-9_28. Online publication date: 22-Jul-2021.
  • (2020) "Models in Graphical User Interface Testing: Study Design". 2020 Turkish National Software Engineering Symposium (UYMS), pages 1-6. DOI:10.1109/UYMS50627.2020.9247072. Online publication date: 7-Oct-2020.
