While negotiations over the app’s sale in the US continue, TikTok has today released its latest transparency report, which provides specific detail on the videos that TikTok’s team has removed, for varying reasons, over the past six months.
And there are some notable points – first off, reflecting the app’s rising popularity, TikTok removed far more videos in the first six months of 2020 than it did in the preceding reporting period.
As per TikTok:
“In the first half of 2020 (January 1 – June 30), 104,543,719 videos were removed globally for violating our Community Guidelines or Terms of Service, which is less than 1% of all videos uploaded on TikTok.”
In its previous report, TikTok said that it had removed 49,247,689 videos between July and December 2019 – so it more than doubled its content removals in the first six months of 2020.
But in that report, TikTok also noted that removals represented less than 1% of all videos uploaded in the period. Taking TikTok’s figures at face value, that means more than 10.4 billion video clips were uploaded in the first half of 2020 (104.5 million is 1% of roughly 10.45 billion), a massive amount.
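If you want to check that math yourself, here’s a quick back-of-envelope calculation – a minimal sketch in Python, using only the removal figures from TikTok’s two reports and the “less than 1%” ceiling from TikTok’s own wording:

```python
# Figures reported in TikTok's last two transparency reports
removed_h1_2020 = 104_543_719  # January 1 - June 30, 2020
removed_h2_2019 = 49_247_689   # July - December 2019

# TikTok says removals were "less than 1%" of all uploads, so total
# uploads must exceed removals x 100 - roughly 10.45 billion clips
min_uploads_h1_2020 = removed_h1_2020 * 100
print(f"Implied H1 2020 uploads: more than {min_uploads_h1_2020:,}")

# Growth in removals versus the preceding reporting period
growth = removed_h1_2020 / removed_h2_2019
print(f"Removals grew {growth:.2f}x period-over-period")
```

Running this shows implied uploads of more than 10,454,371,900 clips, and removals growing roughly 2.12x over the prior period – which is where the “more than doubled” figure comes from.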
The increase in removals is reflected in the regional stats – here’s a comparison of removals for the top four regions, based on TikTok’s last two reports.
In some ways, the rise in removals is a concern, but again, the data more likely reflects increased usage, with the app holding its position at the top of the download charts and adding more and more users throughout the year. Which is why several companies were willing to spend billions to acquire it.
In terms of why TikTok removed content, it’s provided this chart, though the shading used to differentiate each category is slightly confusing.
Yeah, it’s hard to say which is the top reason based on that color comparison alone, but the slightly darker shades at the top of the list mark the key reasons for content removal.
The vast majority of these, TikTok says, were removed before anybody saw them, while its automated detection systems removed millions more violating clips on top of those figures:
“As a result of the coronavirus pandemic, we relied more heavily on technology to detect and automatically remove violating content in markets such as India, Brazil, and Pakistan. Of the total videos removed, 10,698,297 were flagged and removed automatically for violating our Community Guidelines. Those videos are not reflected in the chart above.”
Which likely suggests that TikTok’s systems are improving in this respect, though it is also worth noting that the platform remains under investigation in several regions over concerns related to how it protects younger users.
This is also an important chart, given the ongoing concerns around TikTok potentially sharing user data with the Chinese Government:
This is TikTok’s listing of the requests it received from government agencies to remove or restrict content in the first half of 2020. As you can see, there are no requests from China at all – which makes sense, given that TikTok itself isn’t available in China (China has its own version of the app called ‘Douyin’). Theoretically, though, this listing would also need to reflect any push from the CCP to limit the spread of, say, content relating to the Hong Kong protests or the Tiananmen Square massacre, both of which have reportedly been restricted in the app.
That’s not reflected in this data, which could indicate that China’s influence over the app is not as significant as some have suggested – or that TikTok is simply not reporting such requests in full.
But TikTok has faced one particularly significant content issue of late: a video depicting a man’s suicide circulated through its network, exposing many users to graphic footage.
TikTok says that this video was being shared as part of a coordinated effort by bad actors looking to test its systems:
“Through our investigations, we learned that groups operating on the dark web made plans to raid social media platforms including TikTok, in order to spread the video across the internet. What we saw was a group of users who were repeatedly attempting to upload the video to our platform.”
TikTok’s still working to remove the clip, and it’s also called on other social platforms to help it coordinate responses to targeted attacks of this sort. The other platforms already do this – Facebook, Google, LinkedIn, Microsoft, Reddit, Twitter and YouTube all work together and share information to help each other combat misinformation and other forms of web misuse, and they’re looking to improve those combined efforts to maximize response.
TikTok’s not part of that group yet – but maybe, with the platform seemingly securing its future in the US, it too will soon have a seat at the table in such discussions.
You can read TikTok’s latest Transparency Report here.