TikTok has introduced a new system to notify users when their videos are taken down, along with reasons and helpful links

In August, TikTok started testing a new notification system to bring more clarity around video content removals. The tests began in the US, and TikTok has now announced that it will roll out the feature to users all over the world.

Under the new system, whenever TikTok's moderators remove a video, the user who posted it will receive a notification. The notification states the reason for the removal, the specific violation of the app's guidelines, and links to the Community Guidelines. Additionally, if the removed content points towards self-harm or suicide, TikTok's notification will present the user with links to support resources they can reach out to, including the nonprofit Befrienders.org. Suicide hotline numbers will not be included, but users will be directed to contact local law enforcement agencies. By the looks of it, the friends of such users may also be contacted.

This is surely a welcome update. Previously, when TikTok took a video down, users were often left confused and angry because they did not understand why their content had been removed. Unaware of what they had done wrong, they would repeat the same mistakes and face the disappointment of further removals.

Now, with this notification system, users will at least know the reason, and they will also have a chance to go through the Community Guidelines again to avoid repeating those mistakes next time.

Secondly, directing at-risk users towards support resources is also a good step. Facebook and Instagram similarly partner with several nonprofit organizations and associations that offer help, mental health awareness, and support to such users. These efforts by various apps are especially significant now, amid a global pandemic in which depression, anxiety, and other mental and emotional disorders are surfacing more than ever before.

TikTok has been working to adopt stricter content moderation policies, perhaps because it has been in the line of fire for one reason or another since last year. In the first half of this year, TikTok reported removing 104.5 million videos for guideline violations.

Whatever the motivation, the app appears to be taking content moderation seriously, and that is a good thing.
