TikTok Tightens Crackdown On QAnon, Will Ban Accounts That Promote Disinformation

TikTok says it is banning all accounts that share content related to the QAnon conspiracy theory, hardening its previous policy on the far-right movement.

Updated 10:14 p.m. Monday ET

TikTok is toughening its stance against the QAnon conspiracy theory, expanding its ban to all content or accounts that promote videos advancing baseless ideas from the far-right online movement.

The action hardens the video-sharing app's previous enforcement against QAnon, which targeted specific hashtags that QAnon supporters have used to spread unfounded theories. Now, users who share QAnon-related content on TikTok will have their accounts deleted from the app.

"Content and accounts that promote QAnon violate our disinformation policy and we remove them from our platform," a TikTok spokesperson said in a statement to NPR. "We've also taken significant steps to make this content harder to find across search and hashtags by redirecting associated terms to our Community Guidelines."

TikTok's sweeping action against QAnon comes just as Facebook, Twitter, YouTube and other technology giants have announced bans on content from the Trump-supporting conspiracy theory. QAnon started in October 2017 and has amassed an enormous following online thanks largely to social media companies.

"There should be recognition of a thing that is good and significant, even if it's long overdue," said Angelo Carusone, president of the liberal nonprofit watchdog group Media Matters for America. "TikTok is recognizing that by the nature of the QAnon movement, you can't just get rid of their communities, the content itself is the problem."

Earlier this month, Media Matters identified more than a dozen hashtags TikTokkers used to spread QAnon conspiracy theories about President Trump's positive coronavirus test, false beliefs about Democratic presidential candidate Joe Biden and videos questioning the reality of the pandemic.

"We're talking about hundreds of millions of video views just for a limited segment of QAnon communities that we identified," Carusone said.

TikTok, which has 100 million monthly active users in the U.S., quietly announced the expanded ban against QAnon in a statement to Media Matters, where it garnered little attention. A TikTok spokesperson confirmed the policy to NPR on Saturday. A company official said it has been TikTok's internal policy since August.

Hany Farid, a UC Berkeley computer science professor who is a member of TikTok's committee of outside content moderation experts, said there is tension within social networks over how to respond to misinformation without also amplifying the underlying theories.

"When you ban it, you give it credibility. You give it attention," Farid told NPR.

"But the movement got big enough and dangerous enough that people were looking at the landscape and saying, 'Yeah, this is completely out of control,' " he said. "Were they slow to do it? Probably. But platforms get criticized when they act too quickly. So there is a dilemma there."

TikTok uses a mix of artificial intelligence and thousands of human content moderators to try to curb troubling content. The Chinese-owned app is best-known for viral dance challenges and comedic performances.

According to TikTok's Community Guidelines, misinformation that "causes harm to individuals, our community or the larger public" is prohibited on the site, including medical misinformation, which QAnon has engaged in by pushing false notions about the deadly coronavirus.

Carusone of Media Matters said misinformation accounts on TikTok have been clever about avoiding detection by hijacking otherwise-benign hashtags, creating new lightly coded variations of hashtags, and using other strategies to evade efforts to curb the content.

"The test of this policy will be how much it affects the creation and germination of new QAnon content on TikTok," Carusone said. "If you know your video is going to be eliminated before it has a chance to spread, you're less likely to spend time polluting the TikTok pool."

The future of TikTok in the U.S. remains uncertain. A federal judge last month temporarily halted a Trump administration attempt to shut down the app. But a separate order from the White House for TikTok to divest from its Beijing owner or cease operations remains in place, with a deadline of Nov. 12 for TikTok to find an American buyer or close down its U.S. operations.

Trump officials cite national security concerns with TikTok's China-based corporate owner, ByteDance, but TikTok has long dismissed the effort as a crusade to score political points. The company says U.S. user data is controlled by an American-led team and that the Chinese government has never requested access to the data.

Copyright 2020 NPR. To see more, visit https://www.npr.org.