Facebook’s Instagram failed in its responsibility to remove self-harm posts

The National Society for the Prevention of Cruelty to Children (NSPCC), a children’s charity, said that Facebook has significantly failed in its responsibility to remove self-harm content.

Facebook’s own internal figures revealed that between April and June 2020, removals of self-harm and suicide-related material fell by 80 percent compared with the previous quarter.

During the Covid-19 pandemic, government restrictions meant that most content moderators could not leave their homes. Facebook claimed that removing self-harm material remained its priority.

Figures released on Thursday showed that, as lockdown restrictions were lifted and moderators began returning to work, the number of removals climbed back towards pre-pandemic levels.

Tara Hopkins, Chief of Public Policy for Instagram, said in a statement: “We are doing our best to keep Instagram a safe platform for users, and we can confirm that from July to September we removed 1.3 million pieces of suicide and self-harm content, over 95 percent of which we found proactively.” She continued: “We’ve been transparent about the impact of Covid-19 on our content-review capacity, so we are encouraged by these latest figures, which show we are now taking action on even more content thanks to our improved technology.”

She added, "We are working closely with experts to improve our policies, and we are talking with regulators and governments on how we can expand more of our technology to UK and EU so that we can promptly locate and eradicate harmful suicidal and self-harming posts.”

After the death of teenager Molly Russell, Facebook pledged to take down more images, photographs, and even cartoons depicting self-harm and suicide.

The NSPCC, however, warned that the drop in takedowns has left young people at even greater risk of self-harm and suicide during the pandemic.

The social media platform responded by saying that “even with this decrease, we prioritized and took action on the most harmful content within this category.”

Chris Gray is a former Facebook moderator who is currently engaged in a legal dispute with the company.

He told one media outlet that he was not surprised at all: “Every worker has been sent home; no one is in the office. Who do you think will do the work?”

This leaves the automated systems in charge. But in some cases they miss posts, even when the authors themselves have added trigger warnings indicating that the photos include blood, scars, and other forms of self-harm.

Mr. Gray says the technology clearly cannot deal with all of this.

“We can see that when the humans are out, it’s proper chaos. On other platforms there’s much more self-exploitation, harmful posts and other material like that, and nobody is there to handle it.”

