DOI:10.1145/3463274.3463841

Challenges when Applying Repertory Grid Technique for Software Practices

Published: 21 June 2021

Abstract

The Repertory Grid Technique (RGT) has been applied within the software engineering domain to investigate a variety of topics, including architectural knowledge, team-level tacit knowledge, and project success mechanisms. The technique is based on Personal Construct Theory (PCT) and is claimed to be suitable for gaining a deep understanding of people's perspectives on a topic. The essence of RGT is a consideration of similarities and differences, for example, between different project instances. In this paper, we describe a case study in which we applied the technique with the aim of eliciting practitioners' viewpoints on contextual factors for situated software practices. We interviewed twelve practitioners in three organisations. We found that the RGT approach was challenging to implement for several reasons: participants had difficulty in choosing specific instances of a software practice, identifying similarities and differences tended to be problematic, and causal pathways were not always easy to establish. Our contributions are the highlighting of the challenges that may occur when implementing this technique, an analysis of the issues encountered, and some possible mitigation approaches. These may serve as support for SE researchers considering an RGT-based study.
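
To make the mechanics the abstract alludes to concrete, the sketch below shows one generic way a repertory grid can be represented in Python: elements named by the participant (here, hypothetical project instances), bipolar constructs elicited by comparing triads of elements, and a ratings matrix linking the two. This is an illustrative sketch, not code or data from the paper; the element names, construct poles, and 1-5 rating scale are all assumptions.

from dataclasses import dataclass, field
from itertools import combinations

@dataclass
class Construct:
    # A bipolar construct elicited from the participant.
    emergent_pole: str    # e.g. "fixed scope"
    implicit_pole: str    # e.g. "evolving scope"

@dataclass
class RepertoryGrid:
    elements: list[str]                           # instances named by the participant
    constructs: list[Construct] = field(default_factory=list)
    ratings: dict[tuple[int, int], int] = field(default_factory=dict)  # (construct, element) -> 1..5

    def triads(self):
        # Element triads presented with the question:
        # "In what way are two of these alike, and different from the third?"
        return combinations(range(len(self.elements)), 3)

    def rate(self, construct_idx: int, element_idx: int, score: int) -> None:
        if not 1 <= score <= 5:
            raise ValueError("this sketch assumes a 1-5 rating scale")
        self.ratings[(construct_idx, element_idx)] = score

# Hypothetical usage: three project instances supplied by one practitioner.
grid = RepertoryGrid(elements=["Project A", "Project B", "Project C"])
grid.constructs.append(Construct("fixed scope", "evolving scope"))
for triad in grid.triads():
    print([grid.elements[i] for i in triad])      # triads shown during elicitation
grid.rate(0, 0, 2)                                # Project A leans towards "fixed scope"

Typical RGT analyses then cluster elements or constructs from the ratings matrix; nothing in this sketch is specific to the study reported in the paper.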

Published In

EASE '21: Proceedings of the 25th International Conference on Evaluation and Assessment in Software Engineering
June 2021
417 pages
ISBN:9781450390538
DOI:10.1145/3463274
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 21 June 2021

Author Tags

  1. Industry Study
  2. Personal Construct Theory
  3. Repertory Grid Technique
  4. Software Development Context

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

EASE 2021

Acceptance Rates

Overall Acceptance Rate: 71 of 232 submissions (31%)
