Facebook’s Broken Content Moderation Systems Repeatedly Causing Widespread Bans of Harmless Content

It turns out that Facebook does listen to people, though the actions the social media platform takes in response to their comments are often not what users actually wanted. Facebook is dealing with criticism of its content moderation in a way that many are calling excessive, and what's worse, it isn't addressing the underlying problem that prompted the criticism in the first place.

Facebook users are complaining on several social networking forums that their posts have been taken down and that they have received warnings for posting inappropriate, suggestive, or otherwise adult content, even though their posts contained nothing of the kind. This points to a clear flaw in Facebook's moderation algorithm, and it is harming the experience of both users and businesses who depend on the platform. Some users have even had their accounts suspended over false allegations of posting inappropriate content, and when they complain to Facebook, they are often left to make do with responses that are vague at best and show little real understanding of the problem at hand.

Facebook can't afford to continue like this. If users start fearing for the safety of their accounts despite having done nothing against community standards, they may simply switch to other platforms. Facebook's slow response to complaints is only making matters worse; the algorithm needs to be fixed quickly, or Facebook's place as a hub of content and communication could be put at risk.

A big reason for these AI blunders may be mismanagement on Facebook's side. Or perhaps the social media giant is doing its best to irritate as many users, page admins, and business owners as possible by relying solely on their feedback, rather than on its human content moderators, to catch false flags and review sensitive cases.

Screenshot of a post that, according to Facebook, "goes against its community standards on adult sexual solicitation." As a result, the user/business who posted it was banned from all Facebook properties for 24 hours. When this happens repeatedly across multiple posts and affected users ask for a review, Facebook simply apologizes instead of actually improving its content moderation system to avoid such mishaps in the future.

Read next: Facebook Bans Ad Containing “Overtly Sexual” Cow Raising Uproar About Disproportional Regulation