Meta Publishes Its Community Standards Enforcement Report For Q2

Meta, Facebook’s parent company, has published its Community Standards Enforcement Report for the second quarter of this year.

The report outlines the company’s progress against targets it has set for itself, including goals to limit bullying and harassment content across apps such as Facebook and Instagram.

Meta shared statistics to back this up. On Facebook, roughly 8 out of every 10,000 content views involved bullying or harassment.

On Instagram, the figure was around 4 per 10,000 views.

Meta added via its Twitter account that it has given its Oversight Board the authority to issue recommendations, and that it has followed the board’s advice to improve its standards.

The company says it is committed to implementing around 75% of the recommendations made, in the interest of a better online experience for its users.

So what exactly does the report cover?

For starters, Meta claims to have responded to every recommendation made by the Oversight Board. Although the board is an independent body, Meta says it values its opinions, and the responses are offered as evidence of that.

The report also notes that the company has built what it describes as the largest fact-checking network of its kind, partnering with around 90 organizations to do so. Facebook has long faced criticism for not doing enough to curb the spread of misinformation, and this is presented as evidence to the contrary.

This past month, Meta also published its Adversarial Threat Report for the quarter, detailing its efforts to combat the various adversarial threats directed at its platforms.

Meta attributes its progress against hate speech and bullying to improvements in its AI technology, though it acknowledges there is still considerable room for improvement.

Hate speech prevalence on Facebook held steady at around 2 views per 10,000, with millions of pieces of content reviewed. For violent and graphic content, prevalence stood at around 0.03%, and the company took action against roughly 19.3 million posts on Facebook.

Prevalence was slightly lower on Instagram, which is not a major surprise given that the app sees less of this activity than its Facebook counterpart.

Meta also outlined improvements to its appeals methodology. Content that required additional review was sent for it, and when users disagreed with the company’s decisions, their posts were reviewed further until a justification could be provided.

Meta notes that increasingly stringent regulations continue to be introduced around the world, which is why the company is working both to comply with them and to minimize its own errors around the security and safety of users.

The report also gives a brief overview of highlights from Meta’s Oversight Board. Meta says the board will soon issue new judgments across several cases, including whether the company’s apps should add warning screens to posts it feels require them.

These decisions will give the board a greater role in determining which content stays up and which should be removed.

This is also why Meta has been sharing more insight into its content policies. Recently, the company explained how it determines whether a piece of content is newsworthy, and how many times it has made exceptions to its own policies to keep something published.

In the same vein, Meta has shared its Crisis Policy and how it uses it to govern the publishing of posts during crisis scenarios. This helps the company better evaluate the risks involved and decide which interventions to apply when publishing content online.

As a whole, Meta is looking to improve its image, and it wants its users to know that it is working hard both to empower their expression and to better protect them.

Read next: Meta Releases Inside Details Related To How The Company Enforces Content Rules