
Exploring Performance Assurance Practices and Challenges in Agile Software Development: An Ethnographic Study

Published in: Empirical Software Engineering

Abstract

Background

Agile principles play a pivotal role in modern software development. Unfortunately, the assessment of non-functional software properties, such as performance, can be challenging in Agile Software Development (ASD). The Agile mindset tends to favor functional development over non-functional quality assurance. In addition, frequent code changes and software releases make the use of classical performance assurance approaches impractical.

Objective

This paper investigates the current practices, problems and challenges of performance assurance in a real-world ASD context. To the best of our knowledge, this is the first empirical study that specifically investigates performance assurance in the daily work of ASD.

Method

Through a six-month industry collaboration with a large software organization that adopts ASD, we investigated practical and management problems in handling performance assurance activities. The research was conducted as an ethnographic study, building knowledge through participatory observation, unstructured interviews and reviews of documentation.

Results

The study shows that the case organization still relies on a waterfall-like approach for performance assurance. This approach proved inadequate for ASD, leading to sub-optimal management of performance assessment activities. We distilled three key challenges in improving the performance assurance process: (i) managing performance assessment activities, (ii) continuous performance assessment and (iii) defining the performance assessment effort.

Conclusions

The assessment of software performance in the context of ASD is still far from flawless. The lack of guidelines and well-established practices induces the adoption of approaches that can be obsolete and inadequate for ASD. Further research is needed to improve performance management in this context and to enable effective continuous performance assessment.
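To make the notion of continuous performance assessment concrete (the paper does not prescribe a specific implementation; the function names, workload and tolerance below are purely illustrative assumptions), a CI pipeline might time a benchmark on each build and compare the result against a stored baseline, flagging changes beyond a tolerance:

```python
import time
import statistics


def measure(fn, repeats=30):
    """Time fn over several repeats and return the median runtime in seconds."""
    samples = []
    for _ in range(repeats):
        start = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)


def check_regression(current, baseline, tolerance=0.20):
    """Report a regression if the current median exceeds the baseline by more
    than the given tolerance (20% here, a hypothetical threshold)."""
    return current > baseline * (1 + tolerance)


# Hypothetical workload standing in for a real application benchmark.
def workload():
    sum(i * i for i in range(10_000))


baseline = measure(workload)  # in practice, loaded from a previous build's run
current = measure(workload)
print("regression detected:", check_regression(current, baseline))
```

Using the median over repeated runs dampens timing noise; real continuous-assessment setups typically add statistical tests or change-point detection on top of such raw comparisons.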


Figures 1–6 are available in the full text.


Notes

  1. 15th State of Agile Survey. https://bit.ly/3azEj5r

  2. Due to the sensitivity of the results presented herein, the organization chose to remain anonymous. Therefore, in this paper, we use fictitious names for the company and the product.

  3. 15th State of Agile Survey. https://bit.ly/3azEj5r

  4. Atlassian Confluence https://www.atlassian.com/software/confluence

  5. Microsoft Azure DevOps Server, https://azure.microsoft.com/it-it/services/devops/server/

  6. Selenium WebDriver, https://www.selenium.dev

  7. Test plans, Azure DevOps Service. https://bit.ly/3aRlmdv

  8. User story, Agile Alliance. https://bit.ly/369HxtY

  9. Azure DevOps Service, bug management. https://bit.ly/3thSZgZ

  10. Microsoft .NET. https://bit.ly/2Muc3If

  11. .NET Core, ThreadPool Starvation. http://bit.ly/3ozYnII

  12. Nonfunctional Requirements - Scaled Agile Framework. https://bit.ly/3ohrZun

  13. Definition of Done, Agile Alliance. http://bit.ly/2YbdkWS


Acknowledgements

This work was supported by the project “Software Performance in Agile/DevOps context” (funded within the Programma Operativo Nazionale Ricerca e Innovazione 2014–2020) and by “Territori Aperti” (a project funded by Fondo Territori Lavoro e Conoscenza CGIL, CISL and UIL). I would like to thank Vittorio Cortellessa for making this industry collaboration possible, and for the useful suggestions and comments that helped improve the paper.

Author information

Corresponding author

Correspondence to Luca Traini.

Additional information

Communicated by: Tse-Hsun (Peter) Chen, Weiyi Shang, Cor-Paul Bezemer, Andre van Hoorn and Catia Trubiani

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This article belongs to the Topical Collection: Software Performance

Rights and permissions

Reprints and permissions

About this article


Cite this article

Traini, L. Exploring Performance Assurance Practices and Challenges in Agile Software Development: An Ethnographic Study. Empir Software Eng 27, 74 (2022). https://doi.org/10.1007/s10664-021-10069-3
