Trouble For Meta As New Court Ruling Allows Former Moderator To Sue The Company

Things aren’t looking too great for Facebook’s parent firm Meta after a recent court ruling.

A judge in Kenya has given the green light for a former moderator to pursue legal action against the tech giant he once worked for. The former employee accuses the company of causing him severe mental distress, and the Kenyan court has now ruled that his case can proceed.

The former moderator, Daniel Motaung, says he was paid just $2.20 an hour to review posts depicting child abuse, beheadings, and other graphic content. He is also suing Sama, the outsourcing firm that Meta contracted to carry out reviews of such posts and that employed him directly.

Meta had previously brushed off the allegations, going as far as arguing that the case could not be brought at all because the firm is not based in Kenya. The court rejected that argument, finding that both companies are proper parties to the case and can be held responsible for their actions there.

For now, Meta has yet to comment on the matter, and the company could still appeal the decision.

Nearly three years ago, Meta settled a similar case brought by US-based content moderators over the mental health toll they said the work had taken on them. That case was settled financially with a $52 million payout.

The recent case in Kenya is being supported by campaign group Foxglove, whose head says it matters because it shows the world that tech giants can be held to account in Kenya, and that cases involving the country will be taken seriously.

In the group's view, Kenyan justice is a match for any leading tech firm, and companies of this kind would do well to respect the people of the country and its laws.

For those who may not be aware, Facebook hires thousands of moderators from around the globe to review posts that get flagged by users or AI systems. The moderators determine whether the posts violate the platform's community standards and, if so, remove them.

In Motaung's case, one of the first posts he was made to review was a video of a beheading. He revealed in a recent interview with a media outlet that he still suffers flashbacks, and that the video left such a deep mark on him that at times he imagines he was the victim.

He was later diagnosed with post-traumatic stress disorder, a condition he says many of his colleagues also struggled with. People would walk off the production floor and start to cry, he added.
