Children Exposed to Inappropriate Content Immediately After Creating Social Media Accounts, Study Reveals

SOCIAL MEDIA ISN’T SAFE FOR KIDS… at least according to a recent report that found children are being exposed to inappropriate online content, sometimes immediately after setting up their social media accounts.

For this report, researchers gathered details from real teenagers (aged 13-17) about their likes and dislikes and created avatar accounts based on those details. Shockingly, it didn’t take long for these accounts to start receiving a flood of inappropriate content.

According to Abi Perry, a researcher at Revealing Reality (the firm that conducted the research), the fake profiles were exposed to graphic self-harm imagery, including pictures of cuts and razors. They were also shown material promoting extreme diets.

On top of that, the avatar accounts were approached by unknown adults soon after being created.

The research was commissioned by the Children's Commissioner for England and the 5Rights Foundation, an organization that prioritizes children’s safety.

Tony Stower, director of external engagement at the 5Rights Foundation, said that in the offline world there are rules and protections that stop children from watching R18 content or buying pornography, knives, and alcohol.

Stower added that these protections don’t exist online and stressed that they must be built into the digital services children use today.

The research also found that children are being bombarded with age-specific advertising. For example, they may see details about different college courses while, at the same time, self-harm or sexual content is served to them.

Findings from the research hit home for Ian Russell, whose 14-year-old daughter Molly took her own life after coming across graphic suicide content online.

Russell said that all these platforms care about is profit and that they are designed to keep users engaged for as long as possible. He added that this disregard for people's safety online, especially young people's, has to end.

An age-appropriate design code comes into force in September, giving the Information Commissioner’s Office (ICO) the authority to fine and penalize services that fail to build into their design the new safety standards aimed at protecting the data of users under 18.

The report mentioned Facebook, Instagram, and TikTok.

A Facebook spokesperson, in a statement, agreed that apps should be designed to prioritize young people’s safety. The spokesperson added that Facebook doesn’t allow content promoting pornography or self-harm and that the company has been working hard to keep teens safe, for example by not allowing adults to send direct messages to teens who don’t follow them.

The spokesperson argued that drawing conclusions about the full teen experience on Instagram from only a few avatar accounts was not a strong methodology. Moreover, the posts highlighted in the report weren’t recommended to the avatar accounts; they were searched for or followed. Finally, several examples in the study dated from before Facebook began offering support to people searching for content related to self-harm and eating disorders.

A TikTok spokesperson also responded to the report, saying the platform had taken “industry-leading steps” to promote a safe and age-appropriate experience for teens. They also stated that in the first quarter of 2021 alone, TikTok removed 62 million videos for violating its Community Guidelines, with 82% of those clips taken down before receiving a single view.

