DOI: 10.1145/3524842.3527957
How heated is it?: understanding GitHub locked issues

Published: 17 October 2022

Abstract

Although issues of open source software are created to discuss and solve technical problems, conversations can become heated, with discussants getting angry and/or agitated for a variety of reasons, such as poor suggestions or violations of community conventions. To prevent discussions from becoming heated, and to mitigate those that do, platforms like GitHub have introduced the ability to lock issue discussions that violate the code of conduct or other community guidelines. Despite some early research on locked issues, there is a lack of understanding of how communities use this feature and of potential threats to validity for researchers relying on a dataset of locked issues as an oracle for heated discussions. To address this gap, we (i) quantitatively analyzed 79 GitHub projects that have at least one issue locked as too heated, and (ii) qualitatively analyzed all issues locked as too heated in those 79 projects, a total of 205 issues comprising 5,511 comments. We found that projects behave differently when locking issues: while 54 projects locked less than 10% of their closed issues, 14 projects locked more than 90% of their closed issues. Additionally, locked issues tend to have a similar number of comments, participants, and emoji reactions to non-locked issues. Of the 205 issues locked as too heated, one-third do not contain any uncivil discourse, and only 8.82% of the analyzed comments are actually uncivil. Finally, we found that the locking justifications provided by maintainers do not always match the label used to lock the issue. Based on our results, we identify three pitfalls to avoid when using GitHub locked-issues data, and we provide recommendations for researchers and practitioners.
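The lock label the study relies on is exposed by GitHub's REST API as the `active_lock_reason` field on issue records (its values include "too heated", "off-topic", "resolved", and "spam"). As a minimal sketch of how a researcher might select issues locked as too heated, assuming the issue records have already been fetched as JSON-style dictionaries (the sample records below are hypothetical):

```python
def too_heated_issues(issues):
    """Return the subset of issue records locked with the 'too heated' reason.

    Each record is expected to be a dict shaped like a GitHub REST API
    issue object, with `locked` (bool) and `active_lock_reason` (str or
    None) fields.
    """
    return [
        issue for issue in issues
        if issue.get("locked") and issue.get("active_lock_reason") == "too heated"
    ]

# Hypothetical sample records standing in for API responses.
sample = [
    {"number": 101, "locked": True,  "active_lock_reason": "too heated"},
    {"number": 102, "locked": True,  "active_lock_reason": "off-topic"},
    {"number": 103, "locked": False, "active_lock_reason": None},
]

heated = too_heated_issues(sample)
print([issue["number"] for issue in heated])  # → [101]
```

Note that, per the paper's findings, filtering on this label alone is not a reliable oracle for incivility: a third of the issues locked as too heated contained no uncivil discourse at all.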


Cited By

  • (2024) A First Look at Self-Admitted Miscommunications in GitHub Issues. Proceedings of the 39th IEEE/ACM International Conference on Automated Software Engineering Workshops, 118--127. DOI: 10.1145/3691621.3694942
  • (2024) Predicting open source contributor turnover from value-related discussions: An analysis of GitHub issues. Proceedings of the IEEE/ACM 46th International Conference on Software Engineering, 1--13. DOI: 10.1145/3597503.3623340
  • (2024) Incivility detection in open source code review and issue discussions. Journal of Systems and Software 209. DOI: 10.1016/j.jss.2023.111935
  • (2023) Towards Understanding Emotions in Informal Developer Interactions: A Gitter Chat Study. Proceedings of the 31st ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering, 2097--2101. DOI: 10.1145/3611643.3613084


Published In

MSR '22: Proceedings of the 19th International Conference on Mining Software Repositories
May 2022, 815 pages
ISBN: 9781450393034
DOI: 10.1145/3524842

In-Cooperation

  • IEEE CS

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. civility
  2. github locked issues
  3. heated discussions
  4. incivility

Qualifiers

  • Research-article

Funding Sources

  • Natural Sciences and Engineering Research Council of Canada

Conference

MSR '22


