Report: TikTok deleted around 49 million videos for content violations in six months between July and December 2019, according to the company’s latest transparency report, published on Thursday.
TikTok took down 49 million videos for content violations in six months
The video-sharing app also revealed it had received about 500 requests for data from governments and law enforcement agencies in 26 countries during the second half of 2019, and had complied with about 480 of them.
Less than 1% of all videos published on the platform were removed for content violations. About a quarter of those videos were deleted for containing adult nudity or sexual activity.
Other reasons included alcohol and drug taking, violence, and self-harm or suicide. Less than 1% of the removed videos violated TikTok’s policies on hate speech, integrity and authenticity, and dangerous individuals and organizations. Of all removed videos, 89.4% were taken down before they received any views.
“Any information request we receive is carefully reviewed for legal sufficiency to determine, for example, whether the requesting entity is authorized to gather evidence in connection with a law enforcement investigation or to investigate an emergency involving imminent harm. If we believe that a report isn’t legally valid or doesn’t violate our standards, we may not action the content,” TikTok said.
India, where the app was banned last week, had the most videos removed: 16,453,360, roughly four times as many as any other country. It was followed by the U.S. with 4,576,888, Pakistan with 3,728,162, the United Kingdom with 2,022,728 and Russia with 1,258,853. None of the government data requests came from China or Hong Kong.
According to TikTok’s transparency report:
- 25.5% of the deleted videos contained adult nudity or sexual acts.
- 24.8% broke its child-protection policies, such as implicating a child in a crime or containing harmful imitative behavior.
- 21.5% showed illegal activities or regulated goods.
- 3% were removed for harassment or bullying.
- Less than 1% were removed for hate speech or inauthentic behavior.
- The 49 million deleted videos represented less than 1% of videos uploaded between July and December 2019.
- 98.2% of the deleted videos were spotted by machine learning systems or moderators before being reported by users.
TikTok did not disclose how many videos were taken down by human moderators and how many were removed by automated software.
The report offers at least a little detail about the kinds of content TikTok takes down. There has been a lot of focus recently on hate and extremism on platforms such as TikTok, but fewer column inches devoted to sexual content or the safety of minors.
TikTok has launched trust and safety hubs in Dublin, Singapore and Mountain View, California, as part of an effort to provide a more local approach to content moderation.