Offensive content will not be completely removed from Facebook, but its content moderators will be taken care of

With so many disturbing and sad stories circulating in the world, it is hard to believe that anyone has never come across triggering content on social media. On Facebook alone, millions of videos and images are shared that have the potential to affect users' mental health.

Imagine: if an ordinary Facebook user can barely bear watching such content once or twice a month, what about Facebook's content moderators, whose job consists of watching such content all the time? Hundreds of reports show how grueling content moderation can be. Recently, The Verge published an article unveiling the secret lives of Facebook moderators in America.

Facebook currently employs over 30,000 content moderators, which means 30,000 minds being affected by the platform's most harrowing content. Mark Zuckerberg has expressed concern about using technology the right way to curb this content and to help moderators live better lives by providing them with support.

Over 1.59 billion users visit Facebook daily, which makes content moderation difficult at that scale. Even though Facebook is trying hard to keep abusive and disturbing content off its platform, it still appears every now and then.

To keep Facebook a healthy platform, thousands of moderators are assigned to watch disturbing content, including solicitation, pornography, racism, and sexual imagery, so that they can identify it as abusive and take it down.


Considering the nature of the work, mental health issues and stress are to be expected. So what is Facebook doing to deal with these negative outcomes of the job? Mark Zuckerberg has said that the company is trying to provide the best support its content moderators can get, including counselling and therapy.

Facebook's CTO has also reportedly said that content moderation is a key area for the company's engineers and technologists. To that end, many tools are being built to make curbing offensive content simpler and easier, including improvements to the near-duplicate detection tool.

Facebook recently started using improved tools that hide or blur content that is offensive in nature, and such measures may well have a positive outcome in the future. However, Zuckerberg has made clear that such content will never be entirely eliminated from the platform. As he put it, Facebook's first priority is protecting its content moderators; controlling offensive content comes second, as it is an ongoing effort.


Photo: Brendan Smialowski/AFP/Getty Images
