TikTok Called Out By Children’s Rights Activists For Lack Of Privacy And Security Features For Younger Audiences

TikTok is being called out as a dangerous app for younger audiences by various groups advocating for children's rights. According to them, the platform fails to provide kids with the same level of safety and protective features that it commonly offers adults.

The news comes after the groups carried out a comprehensive study of leading social media apps, including WhatsApp, Instagram, and TikTok. These three were selected based on their popularity with younger generations.

The researchers looked at some of the world's leading markets, including Brazil, the US, the UK, and Indonesia. That is where they found that these apps discriminate between their audiences, paying more attention to adults than to children.

The study examined each app's privacy policy, terms and conditions, and default settings, and an overall trend emerged across a number of leading global markets.

Children's experiences differed considerably from one nation to another. The research, conducted by Fairplay, highlights how TikTok had far bigger problems than the others.

The app was singled out by more than 40 digital rights groups and children's activists who felt the time had come for TikTok to take the matter seriously and make significant amendments. The changes they suggested include safety-focused designs for kids applied globally, rather than only in specific places like Europe.

In case you're wondering why Europe was singled out, that's because it was the first to notice the issue, and its regulators were therefore the first to take immediate action to safeguard the digital rights of kids.

After reviewing some alarming pointers from the report, the groups were quick to join hands and issue a letter to the platform's CEO, advising him to address a number of design discrepancies evident in the study's findings.

Some discrepancies relate to how the platform offers what it calls age-appropriate experiences for youngsters, such as setting their accounts to private by default. This was seen in nations like the UK and in parts of the EU.

However, in other places, 17-year-olds' accounts defaulted to public.

Meanwhile, there were alarming findings in some countries where TikTok failed to explain its policies in language that minors could easily understand. Similarly, the study pointed out that TikTok struggled with transparency regarding what exactly the minimum age to use the app is.

This makes it much harder for youngsters to know whether they're eligible to make full use of the application.

And the fact that most of the app's users live outside Europe clearly shows that the rules need to change soon, so the majority of users can enjoy a safe and age-appropriate experience while using the app.

This study is further evidence that more needs to be done in the world of social media to ensure children are consistently protected online and given the same degree of security as their adult counterparts.

The common trend here is that apps appear more concerned with commercial targets, like increased engagement, that come at the expense of youngsters.

When asked to summarize the report's findings, Fairplay said there is a clear gap in regulation, one that leaves many young people at risk from the app's business model.


Read next: A new study points out the biggest threat to the potential of TikTok as it lacks massive earnings for creators compared to rivals