TikTok removed more than 7 million accounts in the first quarter of 2021 because they likely belonged to underage children

The short-form video app announced that it deleted more than 7 million accounts during the first quarter of 2021. The sole reason for removing those accounts was that they were possibly maintained by users under 13 years of age. At the start of 2021, the platform said it would update its privacy policies, since the app is intended only for users aged 13 and over, while reports indicate that roughly a third of its regular users in the United States are under 14.

TikTok is now taking serious action against underage users, as the platform has been heavily criticized over its content moderation and the number of minors using the app. The company has made clear that only people over 13 may use the platform, and any account found uploading material while misrepresenting the user's real age will be terminated. The app also provides a separate experience for younger children, in which accounts are set to private and users cannot upload videos or comment on other people's videos. According to child-protection groups and some experts, however, some underage users easily evade these rules by entering a false date of birth.

TikTok is treating the matter seriously: two years ago it was fined about $5.7 million for violating children's privacy, and it does not want to repeat that violation. In the middle of last year, the app launched a pairing feature that lets parents link their profile to their children's accounts, check how much time their kids spend on the platform, and see who is interacting with them. Toward the end of the year it added further controls that let parents decide what content their kids can discover. These were good steps toward protecting the privacy and rights of young users.

In total, the platform says it has terminated the accounts of more than 11 million users for breaking its rules. It also removed approximately 62 million videos, about 1% of all videos uploaded to the app, for violating its community standards during the first quarter of 2021. The platform stated that its automated tools identified 91% of the rule-breaking videos, including violations involving child safety, harassment, and other graphic content, before any user reported them. Most of the removed videos were posted from the United States, followed by Pakistan and Russia. The platform has also launched a security program that rewards people who spot flaws and vulnerabilities.

The platform said it received almost 33 valid vulnerability reports and fixed almost 29 of them, and it now plans to publish such reports every quarter, which is a good step.

Photo Illustration by Nikolas Kokovlis/NurPhoto via Getty Images