DOI: 10.1145/3452918.3458796
research-article
Public Access

Moderation Visibility: Mapping the Strategies of Volunteer Moderators in Live Streaming Micro Communities

Published: 23 June 2021

Abstract

Volunteer moderators actively engage in online content management, such as removing toxic content and sanctioning anti-normative behaviors in user-governed communities. The synchronicity and ephemerality of live-streaming communities pose unique moderation challenges. Based on interviews with 21 volunteer moderators on Twitch, we mapped out 13 moderation strategies and presented them in relation to the bad act, enabling us to categorize them from proactive and reactive perspectives and to identify communicative and technical interventions. We found that the act of moderation involves both highly visible, performative activities in the chat and invisible activities involving coordination and sanctioning. The juxtaposition of real-time individual decision-making with collaborative discussions, together with the dual nature of moderators' visible and invisible activities, provides a unique lens into a role that relies heavily on both the social and the technical. We also discuss how the affordances of live streaming contribute to these unique activities.




Published In

IMX '21: Proceedings of the 2021 ACM International Conference on Interactive Media Experiences
June 2021
331 pages
ISBN: 978-1-4503-8389-9
DOI: 10.1145/3452918


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. content moderation
  2. live streaming
  3. moderation strategies
  4. volunteer moderators
  5. workflow

Qualifiers

  • Research-article
  • Research
  • Refereed limited


Conference

IMX '21

Acceptance Rates

Overall acceptance rate: 69 of 245 submissions, 28%



Article Metrics

  • Downloads (last 12 months): 362
  • Downloads (last 6 weeks): 57
Reflects downloads up to 12 Nov 2024


Cited By

  • (2024) Metaverse risks and harms among US youth: Experiences, gender differences, and prevention and response measures. New Media & Society. https://doi.org/10.1177/14614448241284413. Online publication date: 12-Oct-2024.
  • (2024) The psychology of volunteer moderators: Tradeoffs between participation, belonging, and norms in online community governance. New Media & Society. https://doi.org/10.1177/14614448241259028. Online publication date: 29-Jul-2024.
  • (2024) Chillbot: Content Moderation in the Backchannel. Proceedings of the ACM on Human-Computer Interaction 8, CSCW2, 1–26. https://doi.org/10.1145/3686941. Online publication date: 8-Nov-2024.
  • (2024) Towards a Design Framework for Data-Driven Game Streaming: A Multi-Stakeholder Approach. Proceedings of the ACM on Human-Computer Interaction 8, CHI PLAY, 1–28. https://doi.org/10.1145/3677107. Online publication date: 15-Oct-2024.
  • (2024) Labeling in the Dark: Exploring Content Creators’ and Consumers’ Experiences with Content Classification for Child Safety on YouTube. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 1518–1532. https://doi.org/10.1145/3643834.3661565. Online publication date: 1-Jul-2024.
  • (2024) Meditating in Live Stream: An Autoethnographic and Interview Study to Investigate Motivations, Interactions and Challenges. Proceedings of the ACM on Human-Computer Interaction 8, CSCW1, 1–33. https://doi.org/10.1145/3637417. Online publication date: 26-Apr-2024.
  • (2024) Content Moderation Justice and Fairness on Social Media: Comparisons Across Different Contexts and Platforms. Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, 1–9. https://doi.org/10.1145/3613905.3650882. Online publication date: 11-May-2024.
  • (2024) Third-Party Developers and Tool Development For Community Management on Live Streaming Platform Twitch. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–18. https://doi.org/10.1145/3613904.3642787. Online publication date: 11-May-2024.
  • (2024) “People are Way too Obsessed with Rank”: Trust System in Social Virtual Reality. Computer Supported Cooperative Work (CSCW). https://doi.org/10.1007/s10606-024-09498-7. Online publication date: 4-May-2024.
  • (2023) Hate Raids on Twitch: Understanding Real-Time Human-Bot Coordinated Attacks in Live Streaming Communities. Proceedings of the ACM on Human-Computer Interaction 7, CSCW2, 1–28. https://doi.org/10.1145/3610191. Online publication date: 4-Oct-2023.
