
Technology

TikTok pushes harmful content to users as often as every 39 seconds

In one case, TikTok’s algorithm recommended suicidal content to a researcher posing as a teen as little as 2.6 minutes after joining.

TikTok | Shutterstock

December 15, 2022 5:41pm

Updated: February 19, 2023 2:18pm

Chinese social media platform TikTok recommends potentially harmful content, such as videos about self-harm and eating disorders, to some teenage users within minutes of joining, according to a new report.

In the study, conducted by the Center for Countering Digital Hate, researchers set up TikTok accounts posing as 13-year-olds, the youngest age allowed on the platform, and engaged with videos about body image and mental health by watching and liking them, reports CBS News.

TikTok’s algorithm quickly connected the “users” to content depicting self-harm, with suicidal content recommended as little as 2.6 minutes after an account was registered.

Eating disorder video clips were recommended within as little as eight minutes. Over the course of the study, researchers found 56 TikTok hashtags for eating disorder videos with more than 13.2 billion views.

"The new report by the Center for Countering Digital Hate underscores why it is way past time for TikTok to take steps to address the platform's dangerous algorithmic amplification," James P. Steyer, Founder and CEO of Common Sense Media, told CBS.

"TikTok's algorithm is bombarding teens with harmful content that promote suicide, eating disorders, and body image issues that is fueling the teens' mental health crisis."

TikTok and other social media platforms have been criticized for harming young users’ mental health, especially teen girls.

Psychologists concluded that a spike in young girls with Tourette-like verbal and motor tics was spurred, in part, by the consumption of TikTok videos of people who said they had Tourette syndrome. Videos containing the hashtag #tourettes had 5.6 billion views in total, up from just 1.25 billion in January 2021.

A spokesperson for TikTok challenged the methodology of the study and said the company “regularly” consults with health experts and deletes any videos that violate its policies.

Lawmakers this week announced legislation that seeks to ban TikTok from operating in the U.S. over national security and data privacy concerns.