YouTube Faces Severe Criticism For Recommending Self-Harm Videos Again

Shortly after YouTube banned harmful content, its recommendation algorithm did the opposite: similar videos containing graphic images of self-harm showed up in users' recommended sections.

A recent report by The Telegraph shows that YouTube has again been recommending self-harm content to users as young as 13. Almost a dozen such videos were reportedly being suggested to users, despite the platform's strict policy against content that promotes dangerous pranks or self-harming stunts.

YouTube already has a large team of moderators, and when the problem was reported to it again, it responded quickly by taking down two of the videos flagged by The Telegraph; one titled "My huge extreme self-harm scars" remained available, though.

YouTube also issued a statement in response to the report, once again asserting that it monitors violent content and that nothing of the sort will be allowed to stay on the platform. Even so, the company is still struggling to identify and remove all such videos.
Related: Has YouTube finally planned an effective defense strategy against Dislike Mobs?

YouTube in hot water for recommending 'self-harm' and 'suicide' videos with graphic images

The Telegraph also reported recommendations such as "how to self-harm tutorial," "self-harming girls," and "self-harming guide," which were removed promptly after the complaint.

It is a complicated situation, as people going through a hard time may turn to YouTube for information or advice. The company is well aware of the issue and wants to ensure the platform is not used to encourage dangerous behavior. However, the ongoing policy changes and stricter enforcement against such content will take time to make an impact and overhaul the platform.

For now, whenever a user searches for terms like "suicide" or "self-harm," YouTube also displays the option to call or text the National Suicide Prevention Lifeline (likely limited to the U.S. region).

Read Next: Instagram Failed To Curb Self-Harming Content, Admitted The Head Of Social Media Platform