Dark Side Of Meta’s Algorithm: Company’s Execs Failed To Disable Facebook’s Recommendations Fueling Child Exploitation

A shocking new report is shedding light on the dark side of tech giant Meta’s algorithms.

The company’s leading executives are being called out by a former staffer who says they knew exactly what the Facebook app’s ‘For You’ recommendations were capable of.

Yet instead of addressing how the feature was fueling child exploitation at an alarming rate, they pressed ahead with their plans, fully aware of the consequences.

The allegations come from David Erb and date back to 2018. At the time, he was an engineering director at the company, leading a team responsible for keeping users safe on the app and flagging anything that put them at risk.

When the team noticed inappropriate conversations proliferating between adults and minors on the app, they set out to find the root cause. They discovered that the algorithm generating the company’s ‘For You’ recommendations was the main channel through which adults were targeting youngsters, and with remarkable ease.

The situation was worse than many had expected, as The Wall Street Journal noted in its recent coverage. Pedophiles were preying on young users, and in some cases adults went as far as soliciting explicit images from teenagers, offering incentives like cash or threatening to leak the pictures.

During that period, executives at Facebook’s parent firm were discussing plans to encrypt messages on Facebook so that users’ data would remain private. Encryption was repeatedly floated as a solution to the problem, but what they failed to realize is that it would further mask the behavior of child predators.

Erb was among the few raising his voice on the matter. He says he went back to colleagues at Meta, informed them of the wrongdoing taking place, and offered advice on how to combat the rate of child exploitation on the app.

His biggest recommendation? The company should limit, if not entirely remove, the recommendation feature.

Following that conversation, Erb was vocal about the need for the team to take a bold step and halt ‘For You’ recommendations so that minors would no longer be targeted by adults. The fact that company leadership denied his request was, again, eyebrow-raising.

In the end, the tech giant went ahead with the decision to encrypt messages. Erb, meanwhile, was removed from his key role and moved to another department. He resigned soon after, leaving Facebook toward the end of 2018, according to his LinkedIn profile.

The report was first published by The Wall Street Journal, and since it came out, many critics have wondered what Meta’s frame of mind was in dismissing a decision that could have protected minors on the app.

The findings open a new Pandora’s box about the dark side of Meta’s algorithm, but the company has yet to comment on the report.

