“Unalive.”
“Accountant.”
“R’d.”
“Yt.”
“Bw” or “Bm.”
The above words are easily decipherable to any avid TikTok user. “Unalive” translates to suicide. “Accountant” refers to any form of sex worker or OnlyFans user. “R’d” means raped. “Yt” signifies white people, while “Bw” and “Bm” mean black women and black men. There are many more code words than those listed above, all the result of users trying to avoid being “shadow banned.”
A shadow ban occurs when a social media platform restricts a user’s account, either by suppressing their content or removing the account entirely. Unlike a normal ban or suppression, however, the platform does not notify the user that they have been punished.
On TikTok, each user has a personal video feed—the For You Page, or FYP—that the algorithm has curated for them. The algorithm monitors which videos a user likes, comments on, watches entirely, or skips past. If you want to become a popular creator on the app or go viral, the goal is to get your videos onto as many people’s FYPs as possible. But not every video has an equal opportunity to land there. TikTok’s algorithm watches for any post that could violate the platform’s Community Guidelines, the do’s and don’ts of the app. If a post violates these guidelines, the video is removed from the platform entirely. If a user violates them repeatedly, TikTok will either restrict or ban the account.
And yet, TikTok explains that it also considers the “breadth of our audience” to “determine what content is eligible for algorithmic promotion.” To translate—even if a user’s video does not violate the Community Guidelines, TikTok still might not promote it. This is a shadow ban—putting a user in an arbitrary internet time-out. Because the company never tells users that their accounts have been shadow banned, the practice fosters paranoia and confusion. Users grow cautious not to break the unspoken rules of TikTok, hence their reliance on the abbreviations listed above.
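To make the distinction concrete, here is a deliberately simplified sketch—written in Python, and in no way TikTok’s actual code—of how a feed-eligibility check can punish silently. Only the explicit guideline violation triggers a notification; the “not eligible for promotion” branch leaves the creator in the dark. The specific checks (a banned word, an “audience score”) are invented stand-ins for whatever signals the real system uses.

# Toy model of feed moderation. Purely illustrative; the rules below
# are invented stand-ins, not TikTok's real classifiers.

def violates_guidelines(video):
    # Stand-in for a Community Guidelines check.
    return "banned_word" in video["caption"]

def eligible_for_promotion(video):
    # Stand-in for the opaque "breadth of our audience" judgment.
    return video.get("audience_score", 0.0) > 0.5

def moderate(video):
    """Decide a video's fate and whether the creator ever hears about it."""
    if violates_guidelines(video):
        # Explicit ban: the video is removed and the creator is notified.
        return {"on_platform": False, "promoted": False, "creator_notified": True}
    if not eligible_for_promotion(video):
        # Shadow ban: the video stays up but is never pushed to FYPs,
        # and no notice is ever sent.
        return {"on_platform": True, "promoted": False, "creator_notified": False}
    # Normal case: the video competes for space on users' FYPs.
    return {"on_platform": True, "promoted": True, "creator_notified": False}

print(moderate({"caption": "harmless dance video", "audience_score": 0.2}))
# -> {'on_platform': True, 'promoted': False, 'creator_notified': False}

From the creator’s side, the middle outcome is indistinguishable from simply having posted an unpopular video—which is exactly why the practice breeds paranoia.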
How did shadow banning become so integral to TikTok?
In September 2016, the Chinese technology company ByteDance launched Douyin, the app that would go global as TikTok a year later. It was a platform for users to post dancing and lip-syncing videos or short comedy sketches under fifteen seconds. TikTok filled a social media need—a space for light-hearted, silly videos and memes. By early 2019, the app had been downloaded over a billion times by users all over the globe.
As other businesses faltered or collapsed in the face of the pandemic, TikTok experienced enormous growth, especially among U.S. users. In June 2020, TikTok had 92 million active users in the U.S., dwarfing the 11 million who used it in January 2018. By July 2020, TikTok had 689 million active users around the world. Since then, it has only kept growing. Users still post choreographed dance routines to bad music or comedy sketches, like in the olden days of 2019. But now popular creators can pay their rent by posting high-production content. Brands and companies, from Ryanair to Chipotle, make sure their social media teams maintain a viral TikTok page. And, like the other social media giants, TikTok has faced scrutiny and lawsuits from the U.S. government questioning its data-harvesting practices.
But perhaps the most significant change in TikTok has been the transformation of its purpose. The pandemic brought the app an influx of new users, bored on their couches and deprived of real-life interaction. As a result, TikTok became a hub for conversation, information, and news—not just memes. From users trying to prove that Covid was a myth by licking public toilets to accounts reacting to the murder of George Floyd and the resulting BLM protests, TikTok now had millions of people trying to wedge in their own opinions and experiences.
So, what to do when faced with a tidal wave of users and content? TikTok now confronted the same challenges as other social media platforms: Can you have unchecked free speech on the internet? What responsibility does a platform have in establishing parameters around millions of different conversations?
All social media platforms have tried to develop safeguards to keep themselves out of trouble. One of the most relied-upon tactics is shadow banning. Instagram, Twitter, and even Craigslist have been accused of shadow banning users. At times it seems completely random. And at times the reasons are more sinister. In 2019, TikTok blocked a 17-year-old user after she posted a video criticizing China’s treatment of Uighur Muslims. TikTok later reinstated her account after public pressure but denied using a shadow ban. In 2020, the app admitted to restricting LGBTQ+ language in Russia and Arabic-speaking countries, explaining it as an act of “localized” moderation. In the same year, TikTok may have penalized black creators during the height of the BLM protests—even after they stopped discussing the movement.
Most absurdly, in 2020, The Intercept obtained TikTok policy documents showing the platform shadow banning users deemed “ugly or poor.” The documents also justified the suppression of users who had an “abnormal body shape,” “ugly facial looks,” and an “obvious beer belly,” as well as videos where the “shooting environment is shabby and dilapidated.”
It is understandable that TikTok would want to remove or restrict accounts posting content that promotes crime, sexual grooming, or assault—the big “no-no’s” of its guidelines. The issue is that the algorithm goes beyond this and suppresses accounts with no rhyme or reason, failing to notify the individual user even though, according to TikTok’s own rules, it has an obligation to do so. In few of these examples were creators actively promoting content that violated TikTok’s Community Guidelines or committing foul play.
Shadow banning is a temporary solution to the question that consumes all social media companies: How can we moderate the activity on our platforms and maintain a healthy internet ecosystem? But if shadow banning proves anything, it’s that arbitrary, wide-scale censorship cannot be the answer to the internet’s problems. It seldom works for long. Prolific social media users will always figure out a way to say what they want to say and post what they want to post. The human brain can come up with enough synonyms to get its message across. Despite TikTok’s efforts, the algorithm cannot control what its users talk about—nor should it.
I deleted TikTok a few weeks ago. Even with all the safeguards the app has tried to put in place, my FYP continually dragged me into spaces of the internet where I simply did not want to be. As someone who has dealt with depression, anxiety, and poor body image—as most Gen Zers have—I felt that my phone was only amplifying my worst inner thoughts no matter how much I asked it to stop. Instead of trying to tell my FYP what I wanted to see, I just left. The content it showed me didn’t make much sense. I still wasn’t having fun. I know myself better than TikTok could ever know me, and the same goes for the rest of us.
In the end, the answer of how to maintain a healthy internet space does not lie in Community Guidelines, shadow banning, or whatever policies a platform might come up with. The answer lies in the individuals who fuel the whole mess. We need to be our own moderators, conscious of our words and aware of who could potentially hear them. We should reserve our voices for when they will be best heard, rather than shouting out to a sea of strangers.
Don’t rely on a robot to understand who you are, what you’re trying to say, or what you’re allowed to see. Just sign off.
Beatrice Frum is a student at Occidental College studying Comparative Literature and a community manager at Persuasion.