Facebook once again reviews its Community Guidelines in response to a decision by the Oversight Board

Facebook is a large company that cannot make every content decision on its own, which is why it has an Oversight Board. The board takes on decisions the tech giant cannot make itself and also reviews potentially wrong calls Facebook has made when removing content. In some of these cases, a decision by the Oversight Board has forced the social giant to revisit not only the post in question but some of its community guidelines as well.

Something similar happened with a meme a user had posted, a satirical piece that mocked the Turkish government's efforts to deny the Armenian genocide. Facebook removed the content, claiming the meme violated platform policy and fell into the hate speech and violence category.

However, after a while the taken-down meme reappeared on the user's account once the Oversight Board reviewed and reversed the decision. The board said Facebook had previously told it that some types of satirical content are given an exception, and that the social giant was wrong to remove this post since it had never made an official statement in its guidelines about how satire is treated on the platform.

Following this decision, the tech giant has promised to update its community guidelines and clarify for users how it handles satirical content.

Facebook only promised to update its policies so users know what kinds of satirical content are acceptable and on which topics. The Oversight Board, however, went further and offered additional recommendations. It advised Facebook to let users cite exceptions when their content appears to break the app's policies, so that enough information is gathered to determine whether a post actually violated the tech giant's rules before a moderation review request is handled. Facebook said it will look into this and several other recommendations the board has made, though it may be a long time before the changes are permanently implemented on the platform.

This is not the first time that Facebook has had to review its moderation guidelines in response to a decision by the Oversight Board.



