An Analysis of Censorship on TikTok


In the book Selfie Democracy, Losh (2022) discusses how the internet and social media were once seen as revolutionary tools that would help create a more equitable and politically transparent society for all of us. She specifically discusses how big tech companies flourished with little to no government intervention during both the Obama and Trump administrations. Though social media has given a voice to many marginalized people, it is far from the utopian space for free democratic dialogue it was once imagined to be. Losh (2022) warns against an overly optimistic view of social media and calls for more government regulation to help curb many of the issues that have cropped up with the proliferation of social media. For example, Marwick & Lewis (2017) discuss ways the alt-right has used social media and the internet to gain power, such as memes, trolling and harassment, bots, and deliberately spreading misinformation. Foer (2018) also discusses the rise of ‘deepfakes’ and the negative consequences these videos may have for democracy and society. Kenyon (2016) further outlines how misinformation and conspiracy theories have flourished on social media.


Censorship & Moderation Policies 

Another issue facing social media is censorship and content moderation policies that some argue are not administered properly. Many people think that social media platforms censor too much content, while others believe they need to do more to censor misinformation and hateful content (Novaic, 2020). Research has found that many issues with content moderation stem from the fact that “The artificial intelligence tools that automate the process of moderating and enforcing community standards on the sites don’t recognize the intent or background of those doing the posting” (Novaic, 2020). Though there have been issues with content moderation on all social media sites, in this essay I will specifically analyze content moderation on TikTok, due to its popularity and its especially young user base. It is, in fact, “the second most used social media app below Youtube, with 67% of U.S. teens saying they use it” (Vogels, Gelles-Watnick, & Massarat, 2022). TikTok currently uses both AI and human moderators to review content (McIntyre, Bradbury, & Perrigo, 2022). ByteDance, the Chinese company that owns TikTok, claims that none of the American-made content is moderated in China. However, “former U.S. employees said moderators based in Beijing had the final call on whether flagged videos were approved. The former employees said their attempts to persuade Chinese teams not to block or penalize certain videos were routinely ignored” (Harwell & Romm, 2019).


Censorship of Marginalized Groups & Activists

The platform’s deficient content moderation has sometimes led to certain marginalized groups or activists having their posts or accounts taken down. For example, during the Black Lives Matter protests, “posts with the hashtags #BlackLivesMatter and #GeorgeFloyd were shown to have zero views” at one point when they actually had many more (Shead, 2020). Some creators claimed that their videos had been “taken down, muted or hidden from followers” after posting Black Lives Matter-related content (McCluskey, 2020). Furthermore, Ohlheiser (2021) discusses how the TikTok creator Ziggi Tyler found that “When he tried to enter certain phrases in his bio, some of them—’Black lives matter,’ ‘supporting black people,’ ‘supporting black voices,’ and ‘supporting Black success’—were flagged as inappropriate content. But white versions of the same phrases were acceptable”. When questioned about this, a TikTok representative told Tyler that what he “was seeing was a result of an automatic filter set to block words associated with hate speech. The system, it said, was ‘erroneously set to flag phrases without respect to word order'” (Ohlheiser, 2021).

Moreover, internal documents from the early stages of the app show that moderators were instructed “to suppress posts created by users deemed too ugly, poor, or disabled” in order to attract new users to the app (Biddle, Ribeiro, & Dias, 2020). However, Josh Gartner, a U.S. TikTok spokesperson, states that those policies are no longer in effect. Other research conducted by Köver & Reuter (2019) found that moderators were also being told to flag videos by people with disabilities, queer people, and fat people. The reasoning given for this decision was apparently to protect these “at risk” individuals from bullying and harassment on the site (Köver & Reuter, 2019). Additionally, people have found that at one point TikTok had blocked the phrase ‘Asian women’ and the hashtag #intersex (Ohlheiser, 2021). Videos about the 2019–2020 Hong Kong protests and the hashtag #TiananmenSquare have also been systematically removed (Harwell & Romm, 2019).



Conclusion

TikTok has quickly become an extremely popular platform among youth and adults alike. In fact, “It more than doubled its worldwide user base between 2019 and 2021 (291.4 million to 655.9 million)” (Insider Intelligence, 2022). Additionally, “A small but growing share of U.S. adults say they regularly get news on TikTok. This is in contrast with many other social media sites, where news consumption has either declined or stayed about the same in recent years” (Matsa, 2022). Therefore, as social media companies like TikTok grow, we must be aware of how their content moderation affects both what we see and what we do not see online. Furthermore, Day (2021) discusses how some people have resorted to coded language in order to prevent TikTok from taking down their videos. Day (2021) states that this is a problem because “social media is an especially popular tool for activism and resource sharing.” Ohlheiser (2021) argues that “part of the answer is one of the most longstanding stories in tech: hire, and listen to, people from a diversity of backgrounds.” However, she, alongside many other scholars, states that, ultimately, there needs to be greater transparency in how these companies rank and moderate content as a whole. Losh (2022) further argues that the government needs to hold social media companies legally responsible for their actions. As she states, “We don’t just need digital literacy; we need digital rights” (p. 252).


References


Biddle, S., Ribeiro, P. V., & Dias, T. (2020, Nov 15). Invisible Censorship: TikTok Told Moderators to Suppress Posts by “Ugly” People and the Poor to Attract New Users. The Intercept.


Day, F. (2021, Dec 10). Are Censorship Algorithms Changing TikTok’s Culture? OneZero.


Foer, F. (2018, May). The Era of Fake Video Begins. The Atlantic.


Harwell, D., & Romm, T. (2019, September 17). Don’t look for the Hong Kong protests on TikTok. You won’t find them. The Sydney Morning Herald.


Harwell, D., & Romm, T. (2019, Nov 5). Inside TikTok: A culture clash where U.S. views about censorship often were overridden by the Chinese bosses. The Washington Post.


Insider Intelligence. (2022, June 1). TikTok users worldwide (2020–2025). Insider Intelligence.

Kenyon, G. (2016, January 9). The Man Who Studies the Spread of Ignorance. BBC. 


Köver, C., & Reuter, M. (2019, February 12). TikTok curbed reach for people with disabilities. Netzpolitik.org.


Losh, E. (2022). Selfie Democracy: The New Digital Politics of Disruption and Insurrection. The MIT Press. 


Marwick, A., & Lewis, R. (2017, May 15). Media Manipulation and Disinformation Online. Data & Society.


Matsa, E. (2022, October 21). More Americans are getting news on TikTok, bucking the trend on other social media sites. Pew Research Center. 


McCluskey, M. (2020, July 22). These TikTok Creators Say They’re Still Being Suppressed for Posting Black Lives Matter Content. Time.


McIntyre, N., Bradbury, R., & Perrigo, B. (2022, October 20). Behind TikTok’s Boom: A Legion of Traumatised, $10-A-Day Content Moderators. The Bureau of Investigative Journalism.


Novaic, I. (2020, August 8). Censorship on social media? It’s not what you think. CBS News. 


Ohlheiser, A. (2021, July 13). Welcome to TikTok’s endless cycle of censorship and mistakes. MIT Technology Review.


Shead, S. (2020, June 2). TikTok apologizes after being accused of censoring #BlackLivesMatter posts. CNBC.


Vogels, E., Gelles-Watnick, R., & Massarat, N. (2022, August 10). Teens, Social Media and Technology 2022. Pew Research Center.
