To Tackle Misleading Information, YouTube Will Now Add Information About Coronavirus Vaccines To Its Fact-Check Panels

Back in April of this year, Google-owned YouTube started adding fact-check information to counter misleading claims related to the coronavirus. The company introduced fact-checking information panels to tackle misinformation. These panels appear on YouTube searches for particular COVID-19 topics. Users see an information box from a third-party fact-checking group that indicates whether a given coronavirus claim is true, partially true, or false, along with a link explaining the rating.

These links direct users to authoritative sources providing correct information about the virus. Now, it has been reported that COVID-19 vaccines have begun to show early results. TechCrunch reported on November 16 that Moderna claims its COVID-19 vaccine is 94.5% effective in an initial analysis. This is slightly higher than the efficacy reported by Pfizer and BioNTech, which announced roughly 90% efficacy for their vaccine candidate.

Now, YouTube is tweaking its information panel to also link to information about COVID-19 vaccines. It is another small change to what is already a small intervention in the battle against misleading online information.

The panel will now prompt users to ‘learn about vaccine progress’ on relevant videos on the video-sharing platform. It will link viewers to sources such as the Centers for Disease Control and Prevention (CDC) and the World Health Organization (WHO), YouTube revealed in a statement. CNET reported on November 17 that the panels have already started appearing in searches and under videos in the United States. In the next couple of days, the update will start rolling out to users across the globe.

As work on various vaccines progresses, it is all but certain that anti-vaccine conspiracy theorists will try to discourage people from getting vaccinated. That would harm public health and prolong the damaging impacts of the pandemic. As one of the largest online video-sharing platforms in the world, with millions of people watching videos daily, YouTube will undoubtedly be a prime vector for this sort of misleading information.

However, YouTube has not inspired confidence in its ability to tackle misleading information. The platform has declined to remove videos spreading false claims about the outcome of the US election. And once the anti-vaccine movement latches onto any viable vaccine, tiny information boxes under conspiracy-theory clips may be too easy for viewers to ignore.

Here, it is worth mentioning that a study conducted in the US and Britain found that conspiracy theories and misleading information fuel mistrust in vaccines and could push vaccine uptake below the levels required to protect people against the coronavirus.

Photo: SOPA Images/LightRocket via Getty Images
