TikTok dumps QAnon channels, following Twitter’s crackdown

The takedown spotlights tech firms' fraught approach to stifling disinformation.
(TikTok / Solen Feyissa / Flickr)

The only thing social media companies can seemingly agree upon when it comes to moderating content on their platforms is that QAnon crosses the line.

TikTok has removed a number of hashtags associated with the far-right conspiracy theory group, limiting the spread of the group that the FBI has described as a domestic terrorism threat. The company has made it more difficult for users to search for popular hashtags, reportedly including “QAnon” and “QAnonTruth,” among others, following a similar announcement from Twitter that it would remove 7,000 accounts and limit 150,000 more.

QAnon has pushed the unfounded conspiracy theory that President Donald Trump is fighting a “deep state” of government officials, celebrities and business leaders who secretly work as child sex traffickers and control global order. Its supporters frequently harass Trump critics, while believers have been linked to real-world acts of violence throughout the country. One supporter engaged in an armed standoff with police in 2018 at the Hoover Dam, while another is accused of killing a New York City mafia figure who he believed was a member of the deep state.

Sixty-six current or former congressional candidates for the 2020 elections have backed the theory in some form.

One TikTok hashtag which had more than 80 million views on July 22 had zero views by July 23, according to ABC News and Storyful, a social media analysis firm. A TikTok spokesperson told CyberScoop the content violated its community guidelines, and that the firm is taking steps to make it more difficult to use the search and hashtag functions to find disinformation.

“Over the past few months, especially since coronavirus, [QAnon] has been going mega-viral on TikTok,” said Alex Newhouse, digital research lead at the Center on Terrorism, Extremism, and Counterterrorism at Middlebury Institute. “So the disruption on TikTok is potentially more impactful than the one on Twitter because they essentially took out one of the biggest pathways toward consuming content.”

Digital extremism researchers typically view the communities they study in three phases: recruitment, radicalization and organization, when members begin carrying out coordinated attacks in the real world.

Broadly, QAnon’s Twitter community exists in the radicalization phase, in which members amplify each other’s messaging and work to bring hate speech and antisemitism into the mainstream. On TikTok, however, the QAnon community is still working to recruit new members, typically by “sanding off some of their younger edges to get more people in,” Newhouse said.

After Twitter’s action on Tuesday, TikTok began removing an undisclosed number of hashtags, making it so that searches for common QAnon phrases would turn up no results. While TikTok has not removed any of the objectionable content, and the company’s algorithm will continue to push some videos, researchers say they are encouraged by the site’s action at a time when QAnon members have aggressively sought to attract younger, perhaps more naive social media members.

Accounts that still exist on TikTok demonstrate the youth-oriented recruitment approach. One popular account, hosted by a young man with a “Make America Great Again” hat on backwards, publishes videos that typically begin with personal anecdotes or a joke, before trying to poke holes in established scientific theories, or criticizing the media and popular celebrities, usually in a conversational tone.

That such videos still exist, though, spotlights tech firms’ mixed approach to removing hate speech and political messaging meant to amplify destructive conspiracies. Facebook has removed some QAnon-affiliated activity, though much of the conspiracy’s activity persists in groups focused on issues like questioning vaccines and masks to fight illnesses like COVID-19, researchers said.

Prior to Twitter’s takedown, Reddit took perhaps the most aggressive approach against QAnon when, in 2018, it banned a number of communities, including one with more than 71,000 followers, focused on the conspiracy. (QAnon supporters moved to Reddit after the conspiracy moved out of the underground discussion site 8chan.)

Even after Reddit removed QAnon communities for inciting violence in 2018, though, the site has recently become a hub of conversation focused on the false notion that online furniture retailer Wayfair was linked to a child-trafficking scheme. Rather than leaving Reddit, QAnon users moved from their banned forums to Reddit’s r/conspiracy forum, a community with more than 1.3 million followers at press time.

Reddit did not respond to a request for comment on this story.

“There’s definitely more to be done but I think we should recognize a good thing when we see it,” Newhouse said.

Written by Jeff Stone

Jeff Stone is the editor-in-chief of CyberScoop, with a special interest in cybercrime, disinformation and the U.S. justice system. He previously worked as an editor at the Wall Street Journal, and covered technology policy for sites including the Christian Science Monitor and the International Business Times.