The Impact Of Social Media On Young Minds: Apps Warned Against Providing Children With Adult Friend Suggestions By Default

How social media apps affect the minds and lives of young users remains the subject of a huge debate. And while regulators are trying everything possible to limit the online grooming of children, they have yet to see much success.

But as the fight continues, communications watchdog Ofcom says popular platforms need to do more, including ending the practice of suggesting adult 'friends' to children by default.

The warning forms an integral part of the regulator's new guidance, which aims to help tech apps fall in line with the Online Safety Act.

The guidance covers how apps should handle unlawful content, with online child abuse at the top of the list. Remember, according to data put out by the regulator, one in ten children aged 11 to 18 has received explicit images, without any safeguards in place to stop it. As you can imagine, that figure is worrisome.

Moreover, the draft codes were rolled out to make apps understand their obligations under the Online Safety Act, which treats grooming, fraud, and abuse as illegal and unacceptable.

Now, Ofcom wants to hear more on this front, including what today's leading tech giants, with their huge user bases, have to say on the matter and what plans they have in place to curb such behavior.

Remember, much of the guidance being put forward has to do with curbing online grooming. Therefore, the biggest apps are expected to make some mighty changes to their default settings so that children are no longer shown friend suggestions from adults by default, a route that groomers routinely exploit.

The safeguards for children would also ensure that no details about a child's data are made public through their profile or content, and would stop them from receiving messages from strangers who are not on their contact list.

Ofcom also laid down further ground rules to improve how apps operate and to put an end to the risks linked to social media today. And the bigger the platform, the greater the risk associated with it, the regulator added.

Ofcom says moderation teams must have adequate resources to tackle the matter, and that filing complaints should be an easy task. Similarly, all accounts run on behalf of a terrorist organization should be removed immediately, while search results for keywords tied to fraud, such as stolen passwords, should also be targeted for removal.

Any content that includes URLs of child abuse sites should also be deleted, and search engines must be tasked with identifying websites that put abuse material at the forefront. Such websites should not be indexed, the regulator warned.

Last but not least, all search engine users should have an easy means of reporting search suggestions that point toward unlawful actions or content.

Ofcom concluded by mentioning that it wants apps to use hash-matching technology so that abusive material is flagged and never missed. Hash-matching converts a picture into a numerical fingerprint dubbed a hash and compares it against databases of hashes produced from known CSAM pictures. When there is a match, a known abusive picture has been detected.
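To give a rough sense of the idea, here is a minimal sketch in Python of exact hash-matching using the standard library's SHA-256. This is purely illustrative, not Ofcom's or any platform's actual system: production tools such as Microsoft's PhotoDNA use perceptual hashing, which tolerates resizing and re-encoding, and they match against hash databases maintained by child-safety organizations. The file name and the `KNOWN_HASHES` set below are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known abusive images.
# In a real system this list would be supplied by a child-safety
# organization, never hard-coded into the app.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def file_hash(path: Path) -> str:
    """Compute the SHA-256 hex digest of a file's raw bytes."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        # Read in chunks so large files need not fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_abusive(path: Path) -> bool:
    """Return True if the file's hash appears in the known-hash database."""
    return file_hash(path) in KNOWN_HASHES

if __name__ == "__main__":
    # Hypothetical upload path; a match means a known image was detected.
    upload = Path("uploaded_image.jpg")
    if upload.exists() and is_known_abusive(upload):
        print("Match: known abusive image detected; block and report.")
    else:
        print("No match against the known-hash database.")
```

Note that exact hashing fails if an image changes by even a single byte, which is precisely why deployed systems lean on perceptual hashes that stay stable under small edits.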

So as you can tell, the expectations are plenty, but people want to see action happening sooner rather than later, because when children are involved, there is no time to waste.

