
Conceptualizing and Rethinking the Design of Cross-platform Creator Moderation

Renkai Ma, College of Information Sciences and Technology, Pennsylvania State University, United States, renkai@psu.edu

My doctoral research explores how content creators experience creator moderation across different platforms through mixed methods. Through qualitative methods such as semi-structured interviews, my prior work has built an understanding of the socioeconomic implications of creator moderation for creators, the fairness and bureaucracy challenges creators face, and what transparency design requires to address these challenges. My proposed future work will first quantitatively identify which algorithmic moderation designs affect creators' perceived transparency, fairness, and accountability of creator moderation across different platforms. Then, I will co-design with creators and commercial moderators from different platforms to rethink how creator moderation can take creators' interests into account. My dissertation aims to contribute to the HCI and CSCW fields by conceptualizing the notion of creator moderation and detailing design considerations for empowering creators.

CCS Concepts: • Human-centered computing → Collaborative and social computing

KEYWORDS: Creator moderation, content moderation, content creators

ACM Reference Format:
Renkai Ma. 2023. Conceptualizing and Rethinking the Design of Cross-platform Creator Moderation. In Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems (CHI EA '23), April 23-28, 2023, Hamburg, Germany. ACM, New York, NY, USA, 5 pages. https://doi.org/10.1145/3544549.3577049

1 RESEARCH SITUATION

I am a third-year, full-time Ph.D. candidate in the Informatics program at The Pennsylvania State University's College of Information Sciences and Technology, where I am advised by Dr. Yubo Kou. I expect to complete my doctoral degree around May or summer 2024. I have not attended a doctoral consortium at any previous SIGCHI conference. The themes of my research are HCI, CSCW, and, specifically, creator moderation.

2 CONTEXT AND MOTIVATION

Recent years have witnessed the growth of the "creator economy," where independent content creators rely on platforms such as YouTube, Twitch, Instagram, and TikTok to gain a fanbase and income and further develop content creation as a profession. As 50 million people identify themselves as creators [12] and two million have made content creation their full-time job [17], creators usually want to establish branding [13,32,38] that is competitive and distinguishable from that of other creators [20] in order to sustain their careers [11]. Against this backdrop, it is common for creators to work across platforms to diversify their income and content creation [17,36].

But being a creator does not mean absolute freedom of content creation, and monetizing content does not guarantee secure self-employment. That is because creators must go through creator moderation [29,30], where multiple governance mechanisms exercise control beyond creators' content or speech to their careers, such as income [7], visibility [6], and audience engagement. Oftentimes, creator moderation negatively impacts creators. For example, creators complained that Facebook cut their income compared to the amount they had typically earned from videos [8], and creators from racial and sexual minority groups complained that their advertising income on YouTube was unfairly reduced compared with that of other creators [1,2,34].

However, while HCI and CSCW researchers have gained more understanding of users' experiences with content moderation (e.g., [9,19,22,23]), we still lack systematic knowledge of creators' experiences with creator moderation. For example, researchers have uncovered various challenges users encounter when they experience content moderation: (1) opacity of moderation decision-making [15,18,35], (2) unfairness of moderation decisions [22,40,41], (3) procedural obstacles to appealing moderation decisions [5,25], and more. But do creators encounter similar challenges in creator moderation systems? If so, how do they react to or handle these challenges?

In particular, as more creators now work on more than one platform, relatively little work has examined whether and how current moderation designs work for creators across platforms. Aiming for end-user advocacy and empowerment, HCI and CSCW researchers have started to design fairer, more transparent, and more contestable moderation mechanisms (e.g., [14,24,41,43]). But it remains unknown whether and how those designs fit creators' interests or workflows across platforms. Communication researchers have touched on cross-platform creators in terms of how their branding labor is shaped by certain platform affordances [13,27,32,38]. But still, from a design perspective, it is not fully understood whether and how those designs, including creator moderation, take creators' interests and voices into account across platforms.

So, informed by prior work from different disciplines, my doctoral research aims to answer three primary research questions:

RQ1: What does creator moderation entail?

RQ2: How do creators experience creator moderation?

RQ3: What creator moderation design can better work with creators across platforms?

3 WORK TO DATE

My work is underway but has yet to fully answer all RQs. Specifically, with four completed publications [28-31], I have conducted a series of case studies on YouTube first to gain an initial understanding of creator moderation, because YouTube is currently the largest video-sharing platform and was among the first to allow users to monetize their content [21]. Next, I plan to conduct two more studies to better conceptualize creator moderation and its design across platforms such as YouTube, Twitch, Instagram, and TikTok, as media outlets have reported that, as of 2022, these four platforms host more creators in the US than others [17,26,37]. In the following two subsections, I detail how my published work and proposed studies address the research questions sequentially.

3.1 Study 1: Creator Moderation's Socioeconomic Implications

This study [28] explores what creator moderation entails and how creators interact with it (partially answering RQ1). Through a thematic analysis of online discussion data collected from r/youtube, one of the largest YouTube-related online communities, we identified that, beyond content moderation, creator moderation exerts socioeconomic impacts on creators. That is, creators on YouTube encounter algorithmic opacity in moderation decisions, as the platform largely implements algorithms (e.g., machine learning) in creator moderation. Such opacity rendered creators' video creation work precarious. Thus, creators strove to cope with the precarity of creator careers by gaining and applying knowledge of creator moderation and by diversifying income through multiple crowdfunding platforms. This study lays solid groundwork for my future work of diving deeper into how creators go through and make sense of creator moderation design.

3.2 Study 2: Fairness and Bureaucracy Challenges in Creator Moderation

Through 28 semi-structured interviews with creators who had experienced moderation on YouTube, we examined how creators experience creator moderation, which oftentimes challenged their career development (partially answering RQ2). We found that creators formed fairness perceptions based on the different contexts of the moderation decisions they received [29] and encountered bureaucracy in the moderation system [30]. First, creators perceived unfairness when they encountered (1) unequal moderation treatment revealed through cross-comparisons, (2) decisions, processes, and system actions that were inconsistent with one another or with content policies, and (3) a lack of voice in multiple algorithmic visibility decision-making processes. Second, I found that creators on YouTube would first experience algorithmic bureaucracy, where YouTube's moderation system fails to adapt its decision-making to creators' novel and localized content, and then organizational bureaucracy, where creators are not in a privileged position to appeal moderation decisions. These two sets of findings have deepened our understanding of creators' experiences with creator moderation and depicted the typical creator moderation phases: content rule articulation, rule enforcement or moderation decision-making, and moderation re-examination (e.g., appeal).

3.3 Study 3: Transparency Design of Creator Moderation

As prior researchers have largely viewed enhanced transparency as an approach to combating bureaucracy [33] and fairness [3,42] challenges, we also recognize the importance of transparency in creator moderation. So, drawing on the prior understanding of creator moderation phases, this study addresses what transparency design requires in different phases, given creators' moderation experiences on YouTube (partially answering RQ3). We found that creators hoped the moderation system would present moderation decisions saliently, explain moderation rationales thoroughly, afford effective communication with human agents hired by the platform, and offer learning opportunities for working better with creator moderation. This study shows that creator moderation needs to maintain dynamic moderation transparency, balancing transparency efforts against negative moderation impacts in order to value creators' labor.

4 FUTURE DISSERTATION WORK

In my future work, I will map the conceptual understanding of creator moderation gained from my prior work onto understanding how moderation design can empower creators across platforms (i.e., creators who self-identify as working on at least two platforms). Studies 4 and 5 aim to fully answer RQ1-RQ3.

4.1 Study 4: Improving Cross-platform Creator Moderation Design by Identifying Creators’ Interests

This study will identify what types of ex-ante, proactive algorithmic decisions related to creators' best interests affect creators' perceived transparency, fairness, and accountability of creator moderation across platforms. Prior research, including mine, has uncovered that content creators are primarily concerned about their interests and careers in terms of content visibility or reach [4,10,29], income [7,28], and audience engagement [16,32] gained on platforms. Thus, I will design a controlled experiment with a series of scenario-based questions about ex-ante, proactive algorithmic moderation involving these interests to inquire about creators' perceptions of moderation and the rationales behind those perceptions. I will analyze survey responses both qualitatively and statistically, and the results will include (1) who the creators preferring ex-ante, proactive algorithmic moderation are, (2) how effectively such moderation can improve creators' perceived transparency, fairness, and accountability of creator moderation, and (3) what algorithmic moderation designs can work better with creators who hold different perspectives.
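To illustrate one possible form of the planned statistical analysis, below is a minimal sketch that fits a linear mixed-effects model to survey responses, testing whether scenario type (ex-ante proactive vs. reactive moderation) shifts creators' perceived fairness while accounting for repeated measures per participant. The file name, column names, and factor levels here are illustrative assumptions, not the finalized study design.

    # Hypothetical sketch: linear mixed-effects model relating scenario type
    # to perceived fairness, with a random intercept per participant.
    import pandas as pd
    import statsmodels.formula.api as smf

    # Assumed long format: one row per participant x scenario, with columns
    # participant_id, scenario_type ("ex_ante" / "reactive"), platform,
    # and perceived_fairness (a 7-point Likert rating).
    df = pd.read_csv("survey_responses.csv")

    model = smf.mixedlm(
        "perceived_fairness ~ C(scenario_type) * C(platform)",
        data=df,
        groups=df["participant_id"],
    )
    result = model.fit()
    print(result.summary())

Perceived transparency and accountability could be modeled analogously, and an ordinal (cumulative-link) model could be substituted if treating Likert ratings as interval data proves unjustified.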

4.2 Study 5: Co-designing Cross-platform Creator Moderation with Creators and Commercial Moderators

In this study, I aim to conduct participatory design (PD) [39] workshops to systematically and holistically elicit a better understanding of what creator moderation means to creators and what creator moderation structures can better involve their voices and interests. The PD workshop procedure includes identifying creators and commercial moderators who meet certain inclusion criteria, preparing materials, exploring participants' work, a co-design activity, and an evaluation activity with focus group discussion. I expect that co-designing with creators and commercial moderators will empower both groups of actors under platform governance to reconcile conflicts with platforms, pointing toward better creator moderation structures.

5 CURRENT AND EXPECTED CONTRIBUTIONS

My doctoral research aims to offer two primary contributions to the HCI and CSCW fields. First, my research contributes the conceptualization of creator moderation and of how it is contextualized across different platforms. Second, as HCI research values end-user advocacy, my research contributes design and policy implications that detail how to advocate for and empower content creators in cross-platform creator moderation.

ACKNOWLEDGMENTS

This work is partially supported by NSF grant no. 2006854. I am grateful to the 49 content creators who have participated in our work and shared their stories with us. Lastly, I sincerely thank my advisor, Dr. Yubo Kou, for his generous guidance and tremendous support.

REFERENCES
