TikTok has a Nazi problem

ISD had reported this account, along with 49 others, to TikTok in June for violating the platform's policies against hate speech, promotion of violence against protected groups, promotion of hateful ideologies, celebration of violent extremists, and Holocaust denial. In all cases, TikTok found no violations, and all of the accounts were initially allowed to remain active.

A month later, TikTok had banned 23 of the accounts, indicating that the platform does remove at least some violating content and channels over time. Before being removed, the 23 banned accounts had collectively amassed at least 2 million views.

The researchers also created new TikTok accounts to understand how Nazi content is promoted to new users by TikTok’s powerful algorithm.

Using an account created in late May, the researchers watched 10 videos from a network of pro-Nazi users and visited 10 pro-Nazi accounts, occasionally clicking on comment sections but stopping short of any real engagement such as a like, comment, or bookmark. When the researchers then looked at the For You feed within the app, it took the algorithm only three videos to suggest one showing a World War II-era Nazi soldier overlaid with a chart of U.S. murder rates, with perpetrators broken down by race. Later, a video of an AI-translated Hitler speech appeared, superimposed over a recruitment poster for a white nationalist group.

Another account tracked by ISD researchers was served even more extremist content in its main feed, with 70 percent of videos featuring self-identified Nazis or Nazi propaganda. After the account followed several pro-Nazi accounts in order to access content on privately set channels, the TikTok algorithm prompted it to follow other Nazi accounts as well. All of the first 10 accounts that TikTok suggested to this account used Nazi symbols or keywords in their usernames or profile photos, or featured Nazi propaganda in their videos.

“This is not surprising in any way,” says Abbie Richards, a misinformation researcher specializing in TikTok. “These are things we found over and over again. I’ve certainly found them in my own research.”

In 2022, Richards wrote about white supremacist and accelerationist extremist content on the platform, including the case of neo-Nazi Paul Miller, who, while serving a 41-month sentence on firearms charges, appeared in a TikTok video that garnered more than 5 million views and 700,000 likes in the three months before it was removed.

Markus Boesch, a researcher based at the University of Hamburg who monitors TikTok, tells WIRED that the report’s findings are “not very surprising,” and he doesn’t expect TikTok to do anything to fix the problem.

“I don’t know exactly where the problem is,” Boesch says. “TikTok says it has about 40,000 content moderators, and such obvious policy violations should be easy to spot. Still, because of the sheer volume [of content] and the ability of bad actors to adapt quickly, I believe the entire misinformation problem cannot ultimately be solved, whether with AI or with more moderators.”

TikTok says it has completed a mentorship program with Tech Against Terrorism, a group that seeks to disrupt terrorists’ online activity and that helps TikTok identify online threats.

“Despite the proactive steps taken, TikTok remains a target for exploitation by extremist groups as its popularity grows,” Adam Hadley, executive director of Tech Against Terrorism, told WIRED. “The ISD study shows that a small number of violent extremists can wreak havoc on large platforms due to adversarial asymmetry. This report therefore underscores the need for cross-platform threat intelligence supported by better AI-powered content moderation. The report also reminds us that Telegram must also be held accountable for its role in the online extremist ecosystem.”

As Hadley noted, the report’s findings suggest that the company’s current policies have significant flaws.

“I’ve always described TikTok as a messaging platform when it comes to people on the far-right,” Richards said. “More than anything, it’s just about repetition. It’s about being exposed to the same hateful narrative over and over again, because at a certain point you start to believe things after you see them enough, and they really start to influence your worldview.”
