This article is about censorship by TikTok itself. For censorship of TikTok by governments and organizations, see Censorship of TikTok.
There are reports of TikTok censoring political content related to China and other countries as well as content from minority creators. TikTok says that its initial content moderation policies, many of which are no longer applicable, were aimed at reducing divisiveness and were not politically motivated.
TikTok's content moderation policies have been criticized as non-transparent. Internal guidelines against the promotion of violence, separatism, and "demonization of countries" could be used to prohibit content related to the 1989 Tiananmen Square protests and massacre, Falun Gong, Tibet, Taiwan, Chechnya, Northern Ireland, the Cambodian genocide, the 1998 Indonesian riots, Kurdish nationalism, and ethnic conflicts between black and white people or between different Islamic sects. A more specific list banned criticism of world leaders, past and present, from Russia, the United States, Japan, North and South Korea, India, Indonesia, and Turkey.[1][2]
In September 2019, The Washington Post reported allegations from former U.S. employees that TikTok censored content sensitive to Beijing as well as political discussions unrelated to China. Topics such as Donald Trump and the 2019–2020 Hong Kong protests were noticeably rarer on TikTok than on other platforms. TikTok said it would replace its Beijing-based moderation with regional teams given greater autonomy over content moderation.[3][4] On 27 November 2019, TikTok temporarily suspended the account of Feroza Aziz after she posted a video (disguised as a makeup tutorial) that drew attention to the Xinjiang internment camps.[5][6] TikTok later apologized, reinstated the account, and said it had been flagged because of a joke about Osama bin Laden in another of her posts.[7] In July 2020, TikTok took down a video about the Xinjiang internment camps after it gained millions of views. It is available again, with over six million views as of June 2024. The video's creator has also reported other instances in which she was banned or restricted, including from livestreaming, after speaking about government or politics.[8][9]
TikTok's policies ban content related to a specific list of foreign leaders, such as Vladimir Putin, Donald Trump, Barack Obama, and Mahatma Gandhi, on the grounds that it can stir controversy and attacks on political views.[10] Its policies also ban content critical of Turkish president Recep Tayyip Erdoğan and content considered supportive of Kurdish nationalism.[11] TikTok was reported to have censored users who supported the Citizenship Amendment Act protests in India and those who promoted peace between Hindus and Muslims.[12]
In March 2020, internal documents leaked to The Intercept revealed that moderators had been instructed to censor political speech in livestreams, banning those who harmed "national honor" or who broadcast streams about "state organs such as police".[13][14][15] In response to censorship concerns, TikTok's parent company hired K&L Gates, including former U.S. Congressmen Bart Gordon and Jeff Denham, to advise it on its content moderation policies.[16][17][18]
ByteDance and TikTok said their early guidelines were global and aimed at reducing online harassment and divisiveness while the platforms were still growing. Those guidelines have since been replaced by versions customized by local teams for users in different regions. The company invited UK lawmakers to examine its algorithm.[22][23][24]
In January 2021, TikTok banned Trump-related content deemed to be inciting violence.[25] On 3 February, it was praised by Russian officials for cooperating in the removal of "forbidden" content, mostly related to protests in Russia.[26][27] A March 2021 study by the Citizen Lab found that TikTok did not politically censor searches but was inconclusive about whether posts were censored.[28][29]
In February 2022, the German newspaper Frankfurter Allgemeine Zeitung reported that automatic subtitles in videos containing terms such as "reeducation camp," "internment camp," or "labor camp" were replaced with asterisks.[30] TikTok is also reported to operate a questionable filtering system in Germany that bans words related to Nazism, such as "Auschwitz".[31]
In response to the 2022 Russian invasion of Ukraine, TikTok banned new Russian posts and livestreams.[32][33][34] Tracking Exposed, a user data rights group, identified what was likely a technical glitch that was exploited by pro-Russia posters. It stated that although this and other loopholes were patched by TikTok before the end of March, the initial failure to correctly implement the restrictions, combined with the effects of the Kremlin's "fake news" laws, contributed to the formation of a "splinternet ... dominated by pro-war content" in Russia.[35][36]
A 2023 paper by the Internet Governance Project at Georgia Institute of Technology concluded that TikTok is "not exporting censorship, either directly by blocking material, or indirectly via its recommendation algorithm."[37]
In March 2023, basketball player Enes Kanter Freedom was banned from TikTok after repeated warnings, but his account was restored when TikTok CEO Shou Zi Chew testified before the U.S. Congress. TikTok said it stood by previous strikes against Freedom but that a moderation error had pushed his account over the threshold for a ban. After regaining his account, Freedom said he would continue criticizing the Chinese government on the platform. At the time, TikTok in the United States featured many videos that would have been censored within China, including hashtags related to Uyghur treatment (278 million views) as well as #TiananmenSquare (18 million views) and #FreeTibet (13 million views).[38] In May, the Acton Institute was suspended after it promoted videos about the imprisonment of Jimmy Lai and the Chinese government's crackdown on the pro-democracy camp in Hong Kong.[39] The suspension drew "deep concern" from lawmakers on the United States House Select Committee on Strategic Competition between the United States and the Chinese Communist Party.[40]
During the 2023 Israel–Hamas war, TikTok was accused of refusing to run ads by family members of Israelis taken hostage by Hamas.[41] TikTok was also accused by Malaysia's minister of communications, Fahmi Fadzil, of suppressing pro-Palestinian content. The company stated that it banned content praising Hamas and had removed more than 775,000 videos and 14,000 livestreams.[42][43]
A December 2023 study by Rutgers University researchers working under the name Network Contagion Research Institute (NCRI) found a "strong possibility that content on TikTok is either amplified or suppressed based on its alignment with the interests of the Chinese government."[44] Commenting on the study, The New York Times stated, "[a]lready, there is evidence that China uses TikTok as a propaganda tool. Posts related to subjects that the Chinese government wants to suppress — like Hong Kong protests and Tibet — are strangely missing from the platform."[45] The researchers subsequently found that TikTok removed the ability to analyze hashtags of sensitive topics.[46] TikTok said it restricted the number of hashtags that can be searched under its Creative Center because it was "misused to draw inaccurate conclusions".[47][48]
A historian from the Cato Institute stated that there were "basic errors" in the Rutgers University study and criticized the uncritical news coverage that followed, arguing that the study compared data from before TikTok even existed to show that the app has fewer hashtags about historically sensitive topics, distorting the findings.[49][47]
In August 2024, Bloomberg reported that the Rutgers University NCRI released a new report based on user journey data.[50] By searching for four keywords—Uyghur, Xinjiang, Tibet, and Tiananmen—on TikTok, YouTube, and Instagram, the researchers found that TikTok’s algorithm displayed a higher percentage of positive, neutral, or irrelevant content related to China’s human rights abuses compared to both Instagram and YouTube.[50] The researchers also found that users spending three hours or more daily on the app were significantly more positive about China’s human rights records than non-users. TikTok dismissed NCRI’s study, stating it does not reflect the real user experience.[50]
Minority groups
LGBTQ+ and disabled people
In 2019, The Guardian reported that TikTok's efforts to provide locally-sensitive moderation had resulted in the removal of content that could be perceived as being positive towards LGBTQ+ people or LGBTQ+ rights (such as same-sex couples holding hands) in countries such as Turkey.[11]
In December 2019, TikTok admitted that it aimed to "reduce bullying" in the comments of videos by artificially reducing the viral potential of videos its algorithm identified as being made by LGBTQ+ people.[51] That same month, the German website Netzpolitik.org reported that TikTok also artificially reduced the viral potential of videos its algorithm identified as being made by "fat people [and] people with facial disfigurement, autism, Down syndrome, [or] disabled people or people with some facial problems". Those affected may not have their video shown outside of their native country or have it appear on the "For You" page, TikTok's personalized algorithmic homepage feed.[51] According to The Verge, some lesbians on TikTok jokingly refer to themselves as "le dolla bean", referring to the spelling "le$bian" used to avoid TikTok removing the video. Technology historian Mar Hicks told The Verge that "it became this whole joke because things that have the word 'lesbian' in them were either getting flagged for the deletion or causing the users' accounts to get in trouble".[52]
In 2020, TikTok was accused of censoring transgender users after reports that their videos were being removed or muted.[53] The BBC reported that the LGBTQ+ charity Stonewall said such actions had "sent a damaging message to young trans people using the platform for support". TikTok issued a statement claiming that they "categorically do not remove any content based on expression of gender identity".[54]
In September 2020, the Australian Strategic Policy Institute reported that certain LGBTQ+ hashtags had been restricted in Bosnia, Russia, and Jordan. TikTok admitted restricting hashtags in certain countries, citing local laws for some restrictions and, for others, the hashtags' primarily pornographic use. TikTok also claimed that some hashtags had been moderated by mistake and the issue subsequently fixed, and that other hashtags alleged to have been censored had never actually been used by video creators.[55]
In May 2021, American intersex activist Pidgeon Pagonis reported that the "intersex" hashtag had become unavailable on TikTok for the second time. TikTok told The Verge that the tag had been removed by mistake and was subsequently restored in both instances; the repeated removals led to public speculation about whether the hashtag was being censored.[52]
TikTok has since apologized and instituted a ban on anti-LGBTQ ideology, with exceptions for places such as China, the Middle East, and parts of Europe where additional censorship laws may apply.[52][56][55]
Black people
Users who post about protests against racism, or against racism generally, have reported changes in the reach of their content, such as videos showing up less frequently or not at all.[57]
On May 7, 2020, in honor of the upcoming birthday of Malcolm X on May 19, TikTok user Lex Scott encouraged viewers to protest TikTok's suppression of African-American creators by changing their profile pictures to the black power fist symbol, following black creators, and unfollowing creators who did not support the initiative. This was termed the #ImBlackMovement. Thousands of TikTok users followed suit, and the hashtag #BlackVoicesHeard reached over 6 million views by the morning of May 19.[58]
After the murder of George Floyd on May 25, 2020, sparked racial unrest in the United States and protests around the world, TikTok creators claimed that TikTok was deliberately suppressing videos that used the hashtags #BlackLivesMatter and #GeorgeFloyd, with these videos appearing to receive no views. TikTok released a statement apologizing, saying that a technical glitch had caused the display error and that the hashtags had received over 2 billion views.[59] Hicks argued that LGBTQ+ people and people of color have found the guidelines enforced "wildly differently", meaning their content is suppressed or removed for supposed violations while reports of harassment from other users go unaddressed: "Not only is it hurting their ability to speak and be seen on the app, but it's also allowing them to get attacked and have hate speech thrown their way."[52] He told CNN that he welcomed TikTok's public pledge of support to the Black community after the 2020 police killing of George Floyd and that he applied to the company because he felt its corporate values "really resonated with me."[60] The phrase "Black Lives Matter" and several related phrases were labeled as inappropriate content.[61]
In 2021, TikTok apologized and vowed to do better after an app that called for Black creators to be treated more fairly was suspended amid accusations of censorship and content suppression. TikTok has since apologized for racism, but many Black creators say little has changed.[62]
Commentary
According to Hicks, creators on TikTok feel that they have to be overly cautious about what they post "because the rules change at any given moment [and] there's no transparency".[52] Hicks said that the sudden disappearance of tags, intentional or not, has "incredibly problematic effects and negative effects on communities that are already marginalized and erased". The lack of clarity around content removal and moderation on TikTok is an ongoing frustration for the app's users. TikTok has community guidelines, but there is no public list of specific words and phrases that are banned, and it is not clear how much moderation is done algorithmically versus by actual people.[52]
Censorship on Douyin
China heavily regulates how TikTok's sister app Douyin is used by minors in the country, especially after 2018.[63] Under government pressure, ByteDance introduced parental controls and a "teenage mode" that shows only whitelisted content, such as knowledge sharing, and bans pranks, superstition, dance clubs, and pro-LGBT content.[a][56]
References
^ Strictly legal explainers are still available on topics such as same-sex marriage.
^ Faddoul, Marc; Romano, Salvatore; Rama, Ilir; Kerby, Natalie; Giorgi, Giulia (13 April 2022). "Content Restrictions on TikTok in Russia following the Ukrainian War" (PDF). Tracking Exposed. Archived (PDF) from the original on 16 December 2022. Retrieved 26 November 2022. cannot be solely attributed to TikTok's content restriction policies. The 'fake news' law ... is likely to have also increased the level of self-censorship ... likely to be a technical glitch ... these loopholes and tried to patch them
^ Mueller, Milton; Farhat, Karim. "TikTok and US national security" (PDF). Internet Governance Project. Archived (PDF) from the original on 29 March 2023. Retrieved 11 July 2023.
^ Ryan, Fergus; Fritz, Audrey; Impiombato, Daria (2020). "TikTok and WeChat: Curating and controlling global information flows". TikTok and WeChat: 04–24. JSTOR resrep26120.5.