TikTok announced a new set of Community Guidelines on December 15 that aim to strengthen its existing policies in areas like harassment, dangerous acts, self-harm and violence, alongside the introduction of four new features similarly focused on the community’s well-being. These include updated resources for those struggling with self-harm or suicide, opt-in viewing screens that hide distressing content, a text-to-voice feature to make TikTok more accessible and an expanded set of COVID-19-related resources.
While many of the topics were already covered by TikTok’s Community Guidelines ahead of today’s changes, the company said the updates add more specifics to each of the areas based on what behavior it’s seen on the platform, heard through community feedback and received via input from experts such as academics, civil society organizations and TikTok’s own Content Advisory Council.
One update is to the guidelines related to suicide and self-harm, which now incorporate feedback and language used by mental health experts to avoid normalizing self-injury behaviors. Specifically, TikTok’s policy on eating disorder content has added considerations aimed at prohibiting the normalization or glorification of dangerous weight loss behaviors.
Stronger policies on bullying and harassment now further detail the types of content and behaviors that aren’t welcome on TikTok, including doxxing, cyberstalking and a more extensive policy on sexual harassment. This one is particularly interesting, given that there have been some cases of TikTok users figuring out where anti-masker nurses worked, and at least one incident led to a nurse being placed on leave. It’s unclear how TikTok will approach this sort of “doxxing” behavior, however, as it didn’t involve publishing a home address, only alerting an employer.
Another update expands the guidelines around TikTok’s dangerous acts policy to more explicitly limit, label or remove content depicting dangerous acts and challenges. Through a new “harmful activities” section added to its minor safety policy, TikTok reiterates that content promoting dangerous dares, games and other acts that may jeopardize the safety of youth is prohibited.
“Keeping our community safe is a commitment with no finish line. We recognize the responsibility we have to our users to be nimble in our detection and response when new kinds of content and behaviors emerge. To that end, we’ll keep advancing our policies, developing technology to automatically detect violative content, building features that help people manage their TikTok presence and content choices, and empowering our community to help us foster a trustworthy environment. Ultimately, we hope these updates enable people to have a positive and meaningful TikTok experience.” — TikTok
TikTok also updated its policy around dangerous individuals and organizations to focus on the issue of violent extremism. The new guidelines now describe in greater detail what’s considered a threat or incitement to violence, as well as the content that will be prohibited. This one is timely, too, as many Trump supporters have been pushing for a new civil war or other violence as a result of Trump losing the U.S. presidential election.
The policies out today also include changes that address more recent user behavior, like the calls for violence following the election.