A Report Suggests That Facebook's Content Moderation Efforts Are Insufficient

According to the latest report published by the NYU Stern Center for Business and Human Rights, Facebook needs to end its outsourcing practice so that content moderation receives the attention and resources it deserves. The company should also increase the number of moderators and improve their working conditions.

Facebook has faced criticism for years over its handling of misinformation and other dangerous content on its platform. The criticism has intensified in recent days, and Facebook's reputation has suffered further after Mark Zuckerberg declined to remove posts by President Donald Trump that appear to violate Facebook's policies.

Paul Barrett, the deputy director of the Stern Center and the report's principal author, said he wants to highlight how Facebook has relegated moderation to a secondary role by relying on underpaid contractors in remote locations. Barrett noted a surprising connection between the outsourcing issue and the problems the company has faced in what the report calls ‘at-risk regions.’

The report also suggests that all the major social platforms face similar content moderation issues. Facebook has around 15,000 moderators, most of whom work for third-party vendors. In comparison, Google and YouTube have about 10,000 moderators, and Twitter has roughly 1,500.

According to the study, Facebook has partnered with sixty journalism organizations for fact-checking. However, these numbers are grossly inadequate given the volume of content disseminated on the platform every day.

Barrett said in an interview that Facebook has a strategy for expansion but lacks a parallel strategy for ensuring that its services are not misused. Users and Facebook’s AI systems flag more than three million items each day, and Facebook reports an error rate of about 10 percent among content moderators spread across some twenty sites. This means the company makes roughly 300,000 moderation mistakes every day. Content moderators have been largely marginalized, even though they are critical to keeping Facebook’s platform usable.
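For context, here is a minimal sketch of the arithmetic behind that estimate, assuming the roughly three million daily flags and 10 percent error rate cited above (the variable names are illustrative, not taken from the report):

```python
# Back-of-the-envelope estimate of daily moderation errors,
# using the figures cited in the report.
flagged_items_per_day = 3_000_000   # items flagged by users and AI systems each day
moderator_error_rate = 0.10         # roughly 10 percent, per Facebook's own reporting

estimated_errors_per_day = flagged_items_per_day * moderator_error_rate
print(f"Estimated moderation mistakes per day: {estimated_errors_per_day:,.0f}")
# Prints: Estimated moderation mistakes per day: 300,000
```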

Because moderators are physically distant, Facebook often fails to grasp the gravity of the content they review in locations like Myanmar, where pro-government forces have used the platform to spread propaganda. Barrett interviewed various former moderators, and the report suggests Facebook pays outsourced content moderators in developing countries far less than full-time workers in Silicon Valley.

The report suggests that Facebook should bring moderation in-house and double the number of moderators. The company also needs to appoint a high-ranking executive to oversee moderation and invest more in moderation for ‘at-risk countries.’ The report further states that Facebook should expand its fact-checking efforts to combat misinformation and support government regulation.

Barrett stated that implementing these measures may be expensive for Facebook; however, he is optimistic that the company may take some steps in this direction. Facebook has also begun to admit that the AI systems it uses to flag content often fail to properly understand the content’s context.


