Topline
TikTok unveiled new guidelines on Wednesday to tackle misinformation on its platform, announcing that it would flag unsubstantiated videos and limit their spread when its fact-checking partners are unable to verify the accuracy of the content.
Key Facts
In a blog post on Wednesday, TikTok said it will not recommend videos to people’s “For You” feeds if fact checks of their content are inconclusive or their veracity can’t be confirmed, thereby limiting their reach.
TikTok acknowledged the new rule is likely to affect videos about unfolding events, where accurate fact-checking can be difficult.
Viewers who come across a flagged video will see a banner above it warning that the content is unverified.
The video’s uploader will also be notified that their video has been flagged as unverified and carries the warning label.
Viewers attempting to share a flagged video will see a popup reminding them that the video is unverified, at which point they can cancel or choose to share anyway.
The new policy will first roll out in the U.S. and Canada starting Wednesday, and will gradually launch across the globe in the “coming weeks.”
Tangent
TikTok relies on third-party partners for fact-checking; in the U.S., these partners include PolitiFact, Lead Stories and SciVerify, which assess videos about elections and other civic processes, public health, science and other issues.
Key Background
TikTok and other social media companies have stepped up their moderation efforts since Election Day. Days after the election, TikTok moved to take down several videos spreading misinformation about voter fraud. The video-sharing platform cracked down on videos of Donald Trump making false claims about the election results after violent Trump supporters stormed the Capitol Building in an attempt to halt the certification of Joe Biden’s victory in the presidential election. The company also removed several hashtags, including #stormthecapitol and #patriotparty, to reduce the reach of content associated with them. However, TikTok’s latest move appears to acknowledge the limits of fact-checking misinformation, especially during unfolding events. The new guidelines therefore allow the company to limit the spread of questionable videos without taking them down. TikTok has long maintained that it sees itself as the home of fun, light-hearted content, but over the past few months the company has, sometimes begrudgingly, attempted to tackle political misinformation.
Further Reading
TikTok to flag and downrank ‘unsubstantiated’ claims fact checkers can’t verify (TechCrunch)