GATE: game-based testing environment

Published: 21 May 2011
DOI: 10.1145/1985793.1986000

Abstract

In this paper, we propose a game-based public testing mechanism called GATE. The purpose of GATE is to harness the abundant human resources on the Internet to increase the effectiveness of software testing and improve test adequacy. GATE facilitates public testing in three main steps: 1) decompose the test-criterion satisfaction problem into many smaller sub-model satisfaction problems; 2) construct a game for each individual sub-model and present the games to the public through web servers; 3) collect public users' action-sequence data and convert it into real test cases that are guaranteed to cover the inadequately tested elements.
A preliminary study on the apache-commons-math library shows that 44% of its branches have not been adequately tested by state-of-the-art automatic test-generation techniques. Among these branches, at least 42% are decomposable by GATE into smaller sub-problems. These elements naturally become potential targets for GATE's public game-based testing.
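To make the three-step mechanism concrete, below is a minimal, hypothetical Java sketch (apache-commons-math, the subject of the preliminary study, is a Java library). Every name in it (GateSketch, hardBranch, SubProblem, ActionSequence) is an illustrative assumption, not part of GATE's actual system; it only shows how a hard-to-cover branch condition could be restated as a game-like sub-problem, and how a player's recorded actions could be mapped back into a covering test input.

    import java.util.List;

    // A minimal, hypothetical sketch of the pipeline described in the
    // abstract. All names here are illustrative assumptions, not GATE's API.
    public class GateSketch {

        // A branch that automated generators often fail to cover: it is
        // reached only when the inputs satisfy a non-trivial constraint.
        static int hardBranch(int x, int y) {
            if (x * x + y * y == 25 && x > y) { // target branch
                return 1;
            }
            return 0;
        }

        // Step 1 (assumed encoding): the branch condition is restated as a
        // small sub-problem that a game can present to players.
        record SubProblem(String description) {}

        // Step 3 (assumed mapping): a player's recorded moves are mapped
        // back to program inputs for an executable test case.
        record ActionSequence(List<Integer> moves) {
            int[] toTestInputs() {
                return new int[] { moves.get(0), moves.get(1) };
            }
        }

        public static void main(String[] args) {
            SubProblem goal = new SubProblem(
                "pick two numbers whose squares sum to 25, the first larger");
            System.out.println("game goal: " + goal.description());

            // Suppose a player solved the sub-problem by choosing 4 and 3.
            ActionSequence recorded = new ActionSequence(List.of(4, 3));
            int[] in = recorded.toTestInputs();

            // The derived test covers the target branch by construction,
            // because the player's solution satisfies the branch condition.
            System.out.println("branch covered: " + (hardBranch(in[0], in[1]) == 1));
        }
    }

In the full mechanism, step 2 would serve such games to the public through web servers, and step 3 would emit real test cases against the library under test; the arithmetic constraint above merely stands in for the sub-model satisfaction problems the paper describes.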


Cited By

  • (2016) On the gamification of human-centric traceability tasks in software testing and coding. 2016 IEEE 14th International Conference on Software Engineering Research, Management and Applications (SERA), pp. 193-200. https://doi.org/10.1109/SERA.2016.7516146. Online publication date: Jun-2016.
  • (2012) Bodhi. Proceedings of the 2012 IEEE Sixth International Conference on Software Security and Reliability Companion, pp. 168-173. https://doi.org/10.1109/SERE-C.2012.35. Online publication date: 20-Jun-2012.


Information & Contributors

Information

Published In

ICSE '11: Proceedings of the 33rd International Conference on Software Engineering
May 2011
1258 pages
ISBN:9781450304450
DOI:10.1145/1985793
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 21 May 2011


Author Tags

  1. code coverage
  2. human computation
  3. testing

Qualifiers

  • Extended-abstract

Conference

ICSE '11: International Conference on Software Engineering
May 21 - 28, 2011
Waikiki, Honolulu, HI, USA

Acceptance Rates

Overall Acceptance Rate 276 of 1,856 submissions, 15%
