Facebook Reveals Some Information About Its Content Moderators

In a "Hard Questions" post titled "Who Reviews Objectionable Content on Facebook — And Is the Company Doing Enough to Support Them?", Facebook addressed what happens to a post once it is flagged for review and what the company does with content deemed inappropriate. The platform explained a little about the review process and the workforce behind it.

A team of 7,500 people handles posts flagged as potentially offensive or in violation of the platform's community standards. The team is kept this large so that shared or uploaded content can be reviewed in its native language. Photos and videos flagged for alleged nudity, however, can be reviewed regardless of location.

The company says its review team receives extensive training so that a certain level of consistency can be maintained, though how consistent the process actually is remains debatable. What Facebook did not reveal were details about the moderators themselves. It did say that they receive full health benefits and have access to mental health care, but nothing more was conveyed. Given that the company is already notorious for mishandling private data, this reticence is understandable to an extent. Facebook also declined to go into the details of its screening criteria. Still, it is good to know that the platform retains a human element in moderation and is not yet fully automated.

