Facebook Might Not Stick To Its Anti-Vaccine Misinformation Policies After The Pandemic

Facebook has been touting its policy updates on COVID-19 misinformation for quite a while now. But how effective do those updates ultimately prove to be? And will they have any lasting impact on the platform?

2020 and its long list of mishaps and disasters have left an indelible mark on the world. Some of the effects have been positive, with the tech industry booming as more and more people rely on gadgets for everyday tasks, but the overall experience has left people either deeply worried or jaded about their surroundings. With police brutality, racism, and the anti-vaccination movement taking root online, social media platforms have beefed up precautions to keep the peace, with variable results.

Facebook's COVID-related policies involved flagging or restricting accounts and posts that actively spread misinformation and conspiracy theories about the pandemic. An AI system formed the backbone of this enforcement, scanning posts to identify inaccurate information. While the motives behind these actions certainly seem noble, the actual effect ended up being a shade more disappointing.

As reported by Engadget, Facebook's strategy leaned less on removing uninformed or inflammatory content and more on simply restricting its reach through algorithmic changes. And while that approach might suggest the company wants to preserve online freedom of speech, a recent update to the policy could undercut this progressive course. Specifically, Facebook announced that it will be cracking down on anti-vaccination discourse on the platform only "for the duration of the pandemic", leaving itself the option to retract all of these changes afterward.

The social network's approach to handling anti-vaxxers online seems to revolve around pacifying them until COVID-19 rolls over dead. After that, anti-vaccination content could well go unchallenged, a dangerous prospect considering the number of anti-vax communities that have formed online. Facebook's own Help Center fans the flames of such accusations by reiterating that content removal will only be enforced during the COVID-19 health emergency.

Facebook wasn't exactly a role model for proactivity before 2020, having flagged only two major pieces of anti-vaccine content before then. This lax attitude, combined with the company's refusal or hesitation to remove controversial profiles such as Robert F. Kennedy Jr.'s from its platforms, only spells further inactivity once we step away from the pandemic. Which, considering the company's size and reach, is a shame.


Photo: NurPhoto via Getty Images