New Internal Document Says Meta Knew Instagram Was Forcing Girls Towards Dangerous Content Creation

A new, previously unpublished internal document is raising alarm bells about Meta, suggesting the company prioritized its gains over the safety of young female users on Instagram.

The document claims Meta, formerly known as Facebook, was pushing girls toward dangerous content on the app and knew exactly what it was doing.

The document goes on to highlight that in 2021, an Instagram employee conducted an internal investigation into eating disorder content. To do so, the employee created a fake account posing as a 13-year-old girl searching for health and weight-loss tips.

Shockingly, the app steered the account toward graphic and inappropriate content, and its recommendations were all of the same sort, including accounts named skinny binge and apple core anorexic. As the names suggest, this material was beyond dangerous for someone that age.

A series of other internal documents, meanwhile, showed that Facebook employees had serious concerns about the company's own research, which found that Instagram made 1 in 3 teenage girls feel worse about their bodies.

Teens using the app also ended up feeling more depressed and anxious than their peers. That is one reason a legal adviser who read the Facebook Papers, published by a whistleblower last year, is now working with thousands of families to bring lawsuits against social media companies for this negligence and the damage it has caused.

The families hope to target Meta and a few other companies with legal cases worth many millions of dollars. The adviser added, however, that it's less about compensation for the damage and more about forcing policy changes so such instances don't happen again.

Speaking in a recent public interview, the adviser said that whenever these firms face a choice about young users' safety, they consistently make the wrong one and put profits first. That, he argued, is why social media apps are genuinely hurting young kids.

He likewise believes these products are addictive by design: the longer kids stay online, the more revenue the firms make, no matter how harmful the content is in the long run. The take-home message is that Meta knew kids would be seeing damaging and disturbing content online. It was never a coincidence or an error; it was a matter of how the app is designed, and that's scary.

Moreover, further research shows that these apps are built in a manner that ends up evading parental controls. This is why stricter measures are needed for verifying both age and user identity. The technology for this already exists, which raises the question of why the apps aren't using it.


Read next: Meta Crowned The Fastest Growing Brand For The Year 2022