TikTok says it will do more to slow spread of misinformation videos

The video-sharing app says that if its fact-checking process can't verify the information in a video, users will get a warning about sharing it.
TikTok (<a href="https://flic.kr/p/2jjP6YL">Solen Feyissa</a> / Flickr)

TikTok says it will do more to slow down the sharing of information that can’t be completely fact-checked but potentially could be “inauthentic, misleading, or false.”

The video-sharing giant says in a blog post that it will put banners over content that “has been reviewed but cannot be conclusively validated.” The clip’s creator will be notified, and users will then get an “Are you sure you want to share this video?” prompt before being able to amplify it to their followers. The videos may also be made ineligible for TikTok’s For You page, a section that drives a significant amount of traffic on the app.

TikTok says that it has been partnering with fact-checkers at PolitiFact, Lead Stories, and SciVerify and removing videos with information that is demonstrably false. The plan to put badges on potentially problematic content is geared toward situations where “fact checks are inconclusive or content is not able to be confirmed, especially during unfolding events,” the company says.

The moves come as the social media industry in general is facing continued pressure to stem misinformation, even as its major players spent much of 2020 touting their efforts to quell inauthentic behavior in advance of the U.S. elections. TikTok launched its own elections integrity page in October and clarified its hate speech policy in August. Like other social media companies, TikTok also targeted content touting the QAnon conspiracy theory.

TikTok was one of several technology firms that took broad steps to limit content related to the Jan. 6 riot at the U.S. Capitol, which was driven by false claims that President Joe Biden didn’t win the election.

A spokesperson for TikTok told The Verge that its fact-checking generally focuses on topics such as elections, vaccines, and climate change. House Democrats wrote to Facebook, Google, and Twitter this week asking them to do more about misinformation surrounding COVID-19 vaccines.

“We believe that media literacy is crucial to enhancing the online experience for everyone and are continuing to invest in product experiences that help promote an authentic and welcoming community,” writes Gina Hernandez, TikTok’s product manager for trust and safety, in the blog post.

TikTok says that in tests, the new labeling reduced the rate at which flagged videos were shared by 24 percent, “while likes on such unsubstantiated content also decreased by 7%.”

With about 50 million daily active users, TikTok is a prime target for straight-up scams, too. The company took action last year against bogus ads for diet pills, mobile apps, and other inauthentic campaigns after cybersecurity researchers from Tenable warned about the fraud.

The Chinese-owned company is still involved in a legal fight with the U.S. government over a 2020 executive order by President Donald Trump that effectively would ban TikTok from appearing in U.S. app stores. The Biden administration has not yet signaled whether it will drop that fight. For now, the new president is “firmly committed to making sure that Chinese companies cannot misappropriate and misuse American data,” Press Secretary Jen Psaki has said.
