
SS-BDD: Automated Acceptance Testing for Spreadsheets

Published: 19 September 2016

Abstract

Current Spreadsheet Applications, such as Excel and Google Sheets, provide numerous built-in facilities, including arithmetic, financial, and statistical operations, as well as conditional expressions. Thus, users with little or no formal training in programming can use Spreadsheet Applications to implement their own Spreadsheet Programs. In fact, Spreadsheet Applications have become one of the most popular end-user programming environments today. However, these applications also make it easy to introduce errors into Spreadsheet Programs. Minor mistakes in formulas can mislead decision-making processes, resulting in substantial costs to organizations. In general, end-user programmers are unaware of the potential risks that the uncontrolled construction of Spreadsheet Programs can cause. Therefore, a major focus of this paper is to offer an automated approach that makes programmers aware of introduced faults, so that they can build high-quality Spreadsheet Programs. In particular, we propose SS-BDD, a framework for building and running spreadsheet test scenarios, which relies on Behavior Driven Development (BDD). We used SS-BDD to test three different Spreadsheet Programs. Our experience shows that SS-BDD can be used to build end-user-friendly test scenarios that achieve high fault-detection effectiveness.
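To illustrate the idea behind BDD-style acceptance testing of spreadsheet formulas, the following is a minimal, hypothetical sketch in plain Python. It does not reproduce SS-BDD's actual scenario syntax; the cell names and values are invented for illustration, and the Given/When/Then structure is expressed as comments.

```python
# A toy "spreadsheet" modeled as a dict from cell addresses to values.
# The Given/When/Then comments mirror the structure of a BDD scenario.

# Given: input cells A1 and A2 hold two monthly expense values
sheet = {"A1": 120.0, "A2": 80.0}

# When: cell B1 computes the average of A1:A2 (the formula under test)
sheet["B1"] = (sheet["A1"] + sheet["A2"]) / 2

# Then: the expected value appears in B1, otherwise the scenario fails
assert sheet["B1"] == 100.0
print("scenario passed")
```

In a BDD tool, each Given/When/Then step would be written in natural language and bound to step definitions that populate cells, trigger recalculation, and check expected outputs, which is what makes such scenarios readable by end users.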


Published In

SAST '16: Proceedings of the 1st Brazilian Symposium on Systematic and Automated Software Testing
September 2016
154 pages
ISBN:9781450347662
DOI:10.1145/2993288

In-Cooperation

  • SBC: Sociedade Brasileira de Computação

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Behavior Driven Development
  2. Software Testing Tool
  3. Spreadsheet Testing

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Funding Sources

  • CNPq

Conference

SAST '16

Acceptance Rates

SAST '16 Paper Acceptance Rate: 15 of 34 submissions, 44%
Overall Acceptance Rate: 45 of 92 submissions, 49%
