Meta’s Automated Tools Are Unnecessarily Removing Hamas-Israel Content, Oversight Board Confirms

The ongoing conflict in Gaza has led to a surge in content related to Israel and Hamas across social platforms. A recent finding by Meta’s Oversight Board has brought some striking problems with how that content is moderated to light.

Meta has been faulted by its own Oversight Board for removing posts of this nature even when they did not violate the company’s rules.

The finding came in the Board’s first-ever expedited review. Such reviews were expected to take weeks; instead, this one was completed in just 12 days, surprising many observers.

The Board reversed Meta’s original decisions to remove two posts about the Gaza conflict, raising serious questions about the company’s moderation practices. Notably, the affected content came from both sides of the conflict.

Because the Board endorsed Meta’s subsequent decision to restore the content on its apps, no further corrective steps by the company are expected in these specific cases.

However, the review surfaced a broader concern: the tech giant’s growing reliance on automated tools to moderate content at scale. As this case demonstrates, those tools are producing significant errors along the way.

Experts added that the risk of automated systems deleting valuable posts that document human suffering on both sides of the Middle East conflict is a troubling surprise.

The two appeals the Board chose to investigate represent only a fraction of the appeals that users in the region have filed since the start of October. One of them concerns a video published on Facebook.

That video showed a woman taken hostage during the attack on Israel, begging her captors for release and pleading for her life. The other video documented the aftermath of the strike on Al-Shifa Hospital in Gaza during Israel’s ground offensive, showing dead and injured Palestinian civilians, including children.

Meta acknowledged that both posts were deleted erroneously after it adjusted its automated tools to moderate more aggressively in response to the October 7 attack on Israel.

In the Al-Shifa case, the video was removed and the user’s appeal for reinstatement was rejected. Both videos were ultimately restored with warning screens shown to viewers, with Meta explaining that such content is intended to raise awareness and report the news.

The Board was nevertheless unhappy with Meta’s approach. It argued that the company needs to respond more quickly and adapt its policies as situations on the ground evolve rapidly, noting that removing such content carries a high cost to freedom of expression and access to information.

The Board also raised concerns about the company’s rapidly shifting content-moderation practices, and as a result many of Meta’s policies in this area are now facing scrutiny.
