Researchers point out the spread of misinformation on Facebook, raising questions about the company's omitted 'Widely Viewed Content' report

Facebook has been trying its best to control misinformation about the coronavirus and its vaccines on its platform. However, given the size of the tech giant's user base, that can be difficult, and the company is constantly caught in a whirlpool of accusations it is trying to overcome.

Something similar happened very recently, when Facebook was accused of letting the most hyped-up, widely reached, and widely viewed material on its platform come from pages known to spread misinformation. To counter those allegations, Facebook released a 'Widely Viewed Content Report,' which stated that, according to figures for the second quarter of 2021, the top engaging content on the platform came from YouTube, Amazon, TikTok, and a cat GIF from Tumblr.

However, The New York Times decided to take a look at the Q1 2021 figures.

What it found was quite different from what Facebook had portrayed in its Q2 report. The Times' reporting showed that the most viewed link on Facebook in the first quarter of 2021 was a news story about a doctor who had, unfortunately, died after receiving a Covid-19 vaccine.

At a time when the world is locked in a fight with a pandemic and vaccination should be encouraged, seeing such posts dominate on such a huge platform was very disappointing. When questioned about why it did not release its Q1 report to the public in the first place, and why such content was circulating on its platform, Facebook gave the following answers.

While many assumed that Facebook withheld its Q1 "Widely Viewed Content Report" because of the misinformation it revealed, the company said that was not the case: the report needed some key fixes, and now that those changes have been made, it has been released.

On why the misleading Covid-19 vaccine death link circulated on the platform, Facebook explained that when the doctor died, many news pages published reports about it; once the cause of death was confirmed to be unrelated to the vaccine, some of those pages updated their stories with the correct information, while others did not. In such circumstances it can be hard for Facebook to pinpoint where and when misinformation is spreading and which content to remove, given how thin the line is between true and false statements.

While controlling the spread of misinformation across so many users may be hard for the tech giant, at least it is trying its best.



Read next: Facebook attempts to increase awareness by enabling Covid-19 vaccine information in comments