

Collecting Feedback during Software Engineering Experiments

  • Original Article
  • Published: 2005

Empirical Software Engineering

Abstract

Objective: To improve the qualitative data obtained from software engineering experiments by gathering feedback during experiments.

Rationale: Existing techniques for collecting quantitative and qualitative data from software engineering experiments do not provide sufficient information to validate or explain all our results. Therefore, we would like a cost-effective and unobtrusive method of collecting feedback from subjects during an experiment to augment other sources of data.

Design of study: We formulated a set of qualitative questions that might be answered by collecting feedback during software engineering experiments. We then developed a tool to collect such feedback from experimental subjects. This feedback-collection tool was used in four different experiments, and we evaluated the usefulness of the feedback obtained in the context of each experiment. The feedback data were triangulated with other sources of quantitative and qualitative data collected for the experiments.

Results: We have demonstrated that the collection of feedback during experiments provides useful additional data to: validate the data obtained from other sources about solution times and quality of solutions; check process conformance; understand problem-solving processes; identify problems with experiments; and understand subjects’ perception of experiments.

Conclusions: Feedback collection has proved useful in four experiments, and we intend to use the feedback-collection tool in a range of other experiments to further explore the cost-effectiveness and limitations of this technique. A systematic study is also needed to understand more fully the impact of the feedback-collection tool on subjects’ performance in experiments.
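To make the core technique concrete, the sketch below shows one minimal way such feedback collection could work: at fixed intervals, the subject is interrupted with short questions, and the timestamped answers are appended to a log for later triangulation with solution-time and quality data. This is purely illustrative and is not the authors' tool; the questions, sampling interval, and log format are all assumptions.

    # Illustrative sketch only -- not the authors' feedback-collection tool.
    # It models the basic idea: periodically ask the subject short questions
    # and log timestamped answers for later triangulation with other data.
    import json
    import time
    from datetime import datetime, timezone

    # Hypothetical questions and sampling interval.
    QUESTIONS = [
        "What are you working on right now?",
        "Have you encountered any problems since the last prompt?",
    ]
    INTERVAL_SECONDS = 15 * 60

    def collect_feedback(log_path: str = "feedback_log.jsonl") -> None:
        """Ask each question once; append timestamped answers to the log."""
        with open(log_path, "a", encoding="utf-8") as log:
            for question in QUESTIONS:
                answer = input(f"{question}\n> ")
                record = {
                    "timestamp": datetime.now(timezone.utc).isoformat(),
                    "question": question,
                    "answer": answer,
                }
                log.write(json.dumps(record) + "\n")

    if __name__ == "__main__":
        while True:  # prompt the subject at each sampling point
            collect_feedback()
            time.sleep(INTERVAL_SECONDS)

Logging one JSON record per line keeps each answer paired with its timestamp, so the feedback can be aligned against independently recorded solution times when analysing an experiment.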



Author information


Corresponding author

Correspondence to Amela Karahasanović.


About this article

Cite this article

Karahasanović, A., Anda, B., Arisholm, E. et al. Collecting Feedback during Software Engineering Experiments. Empir Software Eng 10, 113–147 (2005). https://doi.org/10.1007/s10664-004-6189-4
