
Concerns Arise As Newly Leaked Document Suggests Facebook Underreports Child Abuse

Facebook's own moderators have reportedly raised repeated concerns about the company's underreporting of child abuse.

The newly leaked document is said to be a training paper for Facebook's content moderators. It has raised serious questions about whether the social platform could be obscuring the true scale of sexual abuse of children.

The accusations come via a newly released report by the New York Times. The document outlines directions for content moderators to err on the side of treating subjects as adults when making age judgments.

Sources say the practice was flagged as a problem by the social network's own moderators in the past, but their objections were dismissed by senior company executives.

Experts believe the central concern is how moderators actually go about assessing a person's age, as the criteria they are given are now in question.

But what happens when a person's age is not evident from the image? That is where the controversy lies, because the outcome of that judgment can have serious consequences.

Images in which a person appears to be a minor must be reported to the NCMEC, the National Center for Missing and Exploited Children, which handles such cases and forwards reports to law enforcement agencies.

However, images judged to involve adults are not reported to outside authorities. They are simply removed from the platform if they violate the network's rules.

The New York Times rightly notes that judging a subject's age from a picture alone is unreliable. The deeper issue, according to the leaked document, is that Facebook moderators are told to rely on a method of estimating a person's stage of puberty that dates back nearly 50 years.
This methodology was never designed for age estimation, and since moderators are instructed to skip reporting pictures they deem to show adults, experts believe many child abuse images are going unreported and slipping through.

Experts have also raised concerns that moderators, who work for Facebook on a contract basis rather than full-time, may have only a few seconds to make an age judgment, yet bear the risk of being blamed for a wrong decision.

Facebook has defended the policy by arguing that erring on the side of adults protects users' privacy and avoids false reports, which it says could hamper authorities' ability to investigate real abuse cases.

Clearly, not everyone shares Facebook's reasoning. Apple and TikTok, for instance, take the opposite approach: their moderators report images whenever they are unsure of a subject's age. Many feel that is the right way forward.


