Meta Publishes New Transparency And Community Standards Report That Highlights Content Removal Requests

Meta is once again emphasizing its commitment to transparency about its operations, including how many content removal requests it receives from across the board.

The tech giant has published its latest Community Standards report, which details the various kinds of requests it handles so people have a clearer picture of what the company deals with.

The previous transparency report covered the latter part of 2022, while this new update covers the first quarter of 2023. It offers plenty of fresh insight into enforcement trends and also breaks down violations tied to legal requests made by government authorities.

Most of these trends remained fairly stable, but a few shifts stood out. Here they are in more detail.

For starters, the company says it is stepping up its fight against explicit content. It recorded a sharp rise in content removals, driven largely by spammers sharing large volumes of material that violated the firm's guidelines.

Most of this activity took place on Facebook, where it was detected by Meta's own systems, and there was also a small increase in the number of users reporting such explicit content, likely a result of the surge in spammers targeting this category.

Meta also reports a continued rise in proactive detection of bullying and harassment. Its apps see a steady stream of posts of this kind, and moderators remove them on an ongoing basis so that other users are not exposed to them.

This is a notoriously difficult area to police, and it matters for keeping users away from harmful material. It is also encouraging that Meta's systems continue to grow more advanced and can flag violating material earlier.

The same trend applies to Meta's Instagram app, which is getting more serious about combating depictions of drug use on the platform. Meanwhile, the company's removals of fake accounts declined at the start of 2023, a drop Meta describes as definite progress, and it plans to put more restrictions in place to keep the in-app experience safe, private, and enjoyable for everyone.

Meta estimates that fake accounts account for just 4% to 5% of its monthly users worldwide. The company is also offering more insight into the volume of government requests it receives and which topics those requests most often concern.
