TikTok’s pledge to take immediate action against child predators has been challenged by an investigation. A BBC Panorama investigation has claimed that TikTok is slow to act against adults who engage children in sexually explicit conversations, even when those conversations are flagged by users.

The allegations follow similar claims made last year by a previous BBC investigation, suggesting that TikTok is still failing to address the perception that its network facilitates child predators.

TikTok slow to act on predators

With the help of a 23-year-old woman who produces TikTok videos for an internet search company, the investigative program created a mock account of a 14-year-old girl. Pictures of the 23-year-old were edited to make her look younger, while her posts were given hashtags such as #schoollife to indicate that she was under the age of consent.

Her TikTok account was soon followed by a number of older men, one of whom, a 34-year-old, sent her a sexually explicit message even after the woman had told him she was 14.

TikTok has a zero-tolerance policy against grooming behaviors

The team then reported the user and his comments to TikTok. However, the social media company took action only after BBC Panorama contacted it and provided details of its investigation, some four days after the initial report.

Our moderators did not intervene in the first instance because the child’s account had not made it clear the offending posts had been received via TikTok’s direct messaging (DM) facility. We have a duty to respect the privacy of our users. A report about a user’s account or comments will not generally trigger a review of direct messages. Because the report was made against the account in general, not the specific direct messages, no action was taken.

TikTok Response

Parents need to be aware of the risks involved with letting their children use TikTok

The thing with TikTok is it’s fun, and I think whenever someone is having fun they’re not recognizing the dangers. Predators might be looking to groom a child, to exploit them, to get them to do something that could be harmful to them.

Lorin LaFave – Breck Foundation

Sex-image message

TikTok does not allow an account to receive or send direct messages if the user registers as being under 16. But many of its youngest members get around this by lying about their date of birth when they join.

To simulate this, Panorama registered the account with a 16-year-old’s birth date, but the profile’s biography stated that its owner was a 14-year-old girl. The part was played by the journalism graduate who creates TikTok videos for an internet search company.

Following the investigation, TikTok has now banned two accounts

Following the completion of BBC Panorama’s investigation, two accounts, as well as the devices used with them, have been permanently banned by TikTok.

TikTok says it is continuously working to make TikTok a hostile environment for predatory behavior. It claims to be the only platform that disables direct messaging for under-16s, that allows direct messages between over-16s only when they agree to follow each other, and that prohibits the sharing of images and videos via direct messaging, regardless of age.

According to a former moderator who spoke to Panorama, TikTok’s Chinese headquarters made key decisions regarding content moderation, leaving him and his colleagues largely powerless to address problems such as sexual predation.

It felt like not very much was being done to protect people. Beijing were hesitant to suspend accounts or take stronger action on users who were being abusive. They would pretty much only ever get a temporary ban of some form, like a week or something. Moderators, at least on the team I was on, did not have the ability to ban accounts. They had to ask Beijing to do that. There were a lot of clashes with Beijing.

Panorama Investigation

TikTok’s algorithm could actively put young users in danger. The algorithm will feed you what you interact with, so if you’re looking at a lot of kids dancing sexually, and you interact with that, it’s going to give you more kids dancing sexually.

Some users interact with that stuff because it is their interest. Maybe it’s a predator who sees these kids doing that, and that’s their way of, you know, engaging with the kids.

TikTok says it has a team of 10,000 moderators in 20 countries, and that no content is now moderated in China

To test this claim, BBC Panorama set up another fictitious account, this time for a 36-year-old man. Whenever he was presented with images of young girls in school uniform, the “man” liked them and watched the videos to the end. Within half an hour, Panorama reports, his “For You” page was filled with images of under-age teens.

TikTok states that its Community Guidelines make it clear that it doesn’t allow the sexualization of minors on its platform, with sexualized content being blocked from appearing in the For You feed.

We use a combination of technology and moderation teams to identify and remove content that breaches our guidelines.

TikTok Spokesperson

It’s also worth pointing out that TikTok isn’t necessarily a special case in this respect. A similar complaint can be made against other social media platforms, with YouTube being another network that has been found to feed potentially provocative content to possible predators. This is a very hard problem to solve: even if platforms remove tags from such videos, recommendation algorithms still tend to surface videos that have been liked together.
