TikTok says it removed more than 49 million videos that broke its rules between July and December 2019.
About a quarter of those videos were removed for containing adult nudity or sexual activity, the company said in its latest transparency report.
The video-sharing app also revealed that it had received around 500 requests for data from governments and police, and had complied with 480 of them.
The United States has suggested it is “looking into” whether to ban Chinese-owned apps.
On Monday, US Secretary of State Mike Pompeo suggested that downloading TikTok would put “private information of citizens in the hands of the Chinese Communist Party.”
He added that the United States government was considering whether to ban Chinese-owned applications: “We are taking this very seriously. We are certainly looking at it,” he said, in a Fox News interview.
The government in India has already banned the app, citing cyber security concerns.
TikTok is owned by the Chinese firm ByteDance. The app is not available in China, but ByteDance operates a similar app, called Douyin, which is available.
TikTok said it had not received any requests for data from the Chinese government or police, nor any requests from the Chinese government to remove content.
On Thursday, the Wall Street Journal published a report that suggested the business was considering establishing a new headquarters, outside of China.
TikTok told the BBC in a statement: “As we consider the best way forward, ByteDance is evaluating changes in the corporate structure of its TikTok business. We remain fully committed to protecting the privacy and security of our users as we build a platform that inspires creativity and brings joy to hundreds of millions of people around the world.”
US authorities are examining whether TikTok has complied with a 2019 agreement aimed at protecting the privacy of children under the age of 13.
TikTok says it offers a limited experience, with additional security and privacy features, for users under 13.
According to the TikTok transparency report:
- 25.5% of the deleted videos contained adult nudity or sexual activity
- 24.8% broke its child safety policies, such as showing a minor involved in criminal activity or featuring harmful imitative behavior
- 21.5% showed illegal activities or “regulated goods”
- 3% were removed for harassment or intimidation
- Fewer than 1% were removed for hate speech or “inauthentic behavior”
The TikTok transparency report also revealed:
- The 49 million deleted videos represented less than 1% of the videos uploaded between July and December 2019
- 98.2% of deleted videos were flagged by machine-learning systems or moderators before being reported by users
TikTok only launched in 2017, and because it’s so new, we know much less about the platform than we do about Facebook, for example.
This report provides at least a little detail about the type of content it removes.
Much of the recent scrutiny has focused on hate and extremism on platforms like TikTok, with fewer column inches devoted to sexual content or the safety of minors.
However, about half of the deleted videos fell into those two categories.
What we don’t know, of course, is how much harmful content was missed by its moderators and machines.