2 Context and Motivation
Recent years have witnessed the growth of the “creator economy,” in which independent content creators rely on platforms such as YouTube, Twitch, Instagram, and TikTok to build a fanbase, earn income, and develop content creation as a profession. With 50 million people identifying themselves as creators [12] and two million making content creation their full-time job [17], creators typically seek to establish branding [13, 32, 38] that is competitive and distinguishable from that of other creators [20] in order to sustain their careers [11]. Against this backdrop, creators commonly work across platforms to diversify their income and content creation [17, 36].
Yet being a creator does not mean absolute freedom of content creation, and monetizing content does not guarantee secure self-employment. This is because creators are subject to creator moderation [29, 30], in which multiple governance mechanisms extend control beyond creators’ content or speech to their careers, including their income [7], visibility [6], and audience engagement. Oftentimes, creator moderation negatively impacts creators. For example, creators complained that Facebook cut their income relative to what they had typically earned from videos [8], and creators from racial and sexual minority groups reported that their advertising income on YouTube was unfairly reduced compared with that of other creators [1, 2, 34].
While HCI and CSCW researchers have gained a deeper understanding of users’ experiences with content moderation (e.g., [9, 19, 22, 23]), we still lack systematic knowledge of creators’ experiences with creator moderation. Researchers have uncovered various challenges users encounter under content moderation: (1) the opacity of moderation decision-making [15, 18, 35], (2) the unfairness of moderation decisions [22, 40, 41], and (3) procedural obstacles to appealing moderation decisions [5, 25], among others. But do creators encounter similar challenges with creator moderation systems? If so, how do they react to or handle these challenges?
Moreover, as more creators now work on more than one platform, relatively little work has examined whether and how current moderation designs serve creators across platforms. Aiming for end-user advocacy and empowerment, HCI and CSCW researchers have begun designing fairer, more transparent, and more contestable moderation mechanisms (e.g., [14, 24, 41, 43]), but it remains unknown whether and how these designs fit creators’ interests or workflows across platforms. Communication researchers have touched on cross-platform creators in terms of how their branding labor is shaped by particular platform affordances [13, 27, 32, 38]. Still, from a design perspective, it is not fully understood whether and how platform designs, including creator moderation, take creators’ interests and voices into account across platforms.
Informed by prior work from these different disciplines, my doctoral research aims to answer three primary research questions:
RQ1: What does creator moderation entail?
RQ2: How do creators experience creator moderation?
RQ3: How can creator moderation be designed to better work with creators across platforms?
3 Work to Date
My work is underway but has yet to fully answer all three RQs. In four completed publications [28–31], I have conducted a series of case studies on YouTube to gain an initial understanding of creator moderation, as YouTube is currently the largest video-sharing platform and was among the first to allow users to monetize their content [21]. I then plan to conduct two more studies to better conceptualize creator moderation and its design across platforms such as YouTube, Twitch, Instagram, and TikTok, as media outlets have reported that, as of 2022, these four platforms host more creators than any others in the US [17, 26, 37]. In the following subsections, I detail how my published work and proposed studies address the research questions in turn.
3.1 Study 1: Creator Moderation's Socioeconomic Implications
This study [28] explores what creator moderation entails and how creators interact with it (partially answering RQ1). Through a thematic analysis of online discussion data collected from r/youtube, one of the largest YouTube-related online communities, we identified that, beyond content moderation, creator moderation exerts socioeconomic impacts on creators. Specifically, creators on YouTube encounter algorithmic opacity in moderation decisions because the platform largely implements algorithms (e.g., machine learning) in creator moderation, and this opacity rendered their video creation work precarious. Creators thus strove to cope with career precarity by gaining and applying knowledge of creator moderation and by diversifying their income through multiple crowdfunding platforms. This study lays solid ground for my subsequent work diving deeper into how creators go through and make sense of creator moderation design.
3.2 Study 2: Fairness and Bureaucracy Challenges in Creator Moderation
Through 28 semi-structured interviews with creators who had experienced moderation on YouTube, we examined how creators experience creator moderation, which often challenged their career development (partially answering RQ2). We found that creators formed fairness perceptions based on the contexts of the moderation decisions they received [29] and encountered bureaucracy in the moderation system [30]. First, creators perceived unfairness when they encountered (1) unequal moderation treatment revealed through cross-comparisons; (2) decisions, processes, or system actions that were inconsistent with one another or with content policies; and (3) a lack of voice in multiple algorithmic visibility decision-making processes. Second, I found that creators on YouTube would first experience algorithmic bureaucracy, where YouTube’s moderation system fails to adapt its decision-making to creators’ novel and localized content, and then organizational bureaucracy, where creators are not in a privileged position to appeal moderation decisions. These two sets of findings deepen our understanding of creators’ experiences with creator moderation and delineate the typical phases of creator moderation: content rule articulation, rule enforcement (i.e., moderation decision-making), and moderation re-examination (e.g., appeals).
3.3 Study 3: Transparency Design of Creator Moderation
As prior researchers have largely viewed enhanced transparency as an approach to combating challenges of bureaucracy [33] and fairness [3, 42], we likewise recognize the importance of transparency in creator moderation. Drawing on the creator moderation phases identified above, this study addresses what transparency design requires in each phase, given creators’ moderation experiences on YouTube (partially answering RQ3). We found that creators wanted the moderation system to present moderation decisions saliently, explain moderation rationales thoroughly, afford effective communication with human agents employed by the platform, and offer learning opportunities for working better with creator moderation. This study shows that creator moderation needs to maintain dynamic transparency that balances transparency efforts against negative moderation impacts in order to value creators’ labor.