Apple considered removing Facebook from its App Store after a report revealed the platform was being used by drug cartels and human traffickers

Apple was prepared to take strong action against Facebook on the basis of a 2019 BBC report, according to The Wall Street Journal.

In 2019, the BBC published a detailed report documenting the illegal activities being conducted on Facebook as a result of the company's negligence, and at the time Apple threatened to remove the social network from its App Store. On September 16, 2021, the WSJ published a report examining in depth how the platform was being used for human trafficking, and how Facebook chose to respond. The Wall Street Journal emphasized that even though the company knew about the problem, its response was weak and unnecessarily slow.

The report drew on a thorough review of internal company documents. It showed that much of the activity was being run through a large market in the Middle East, whose operators used Facebook as their primary means of finding clients. Posing as employment agencies, these traffickers advertised victims as domestic workers, regardless of the workers' own will. In short, people were being sold as domestic workers on Facebook without their consent, and the company chose to stay silent.

While the rivalry between the two tech giants, Facebook and Apple, is long-standing, this does not appear to be a case of prejudice.

What's more shocking is that Facebook was already aware of the malpractices taking place under its wing. Both Instagram and Facebook were used in these activities, yet the platform remained silent. Alongside the BBC report, another document surfaced in which a Facebook researcher asked whether the platform knew about the problem, and Facebook acknowledged that it did. As far back as 2018, before that report was published, Facebook had been made aware of the recruitment, facilitation, and exploitation involved in the human trafficking taking place on its platform.
Equally disappointing is the fact that Facebook's AI detection systems cannot moderate content in languages other than the most common ones, which explains why running these operations in other languages is not a hard task. Moderation in those languages falls to human reviewers, and this gap has created a loophole that allows harmful behavior to go unchecked.

By now it is quite evident that Facebook lacks moral grounding when it comes to operating its social network.

Read next: Facebook Is Implementing A New Policy To Crack Down On Harmful Movements On The Platform