YouTube’s Sinister Side: Here's How The Platform Drives Users To Extremist And Alternative Content

It wouldn’t be wrong to say that YouTube’s popularity, spanning millions of users, keeps getting stronger. The much-loved video-sharing platform is adored by viewers and content creators alike and is home to videos galore.

But have you ever thought about how easy it is to fall down the platform's rabbit hole, where there seems to be no limit on the content being shared?

Remember, there is rarely any real restriction on the content people make, especially if it attracts the masses. But what if we told you that all that glitters is not gold?

Thanks to a new study from Northeastern University, we now know more about how the platform hosts plenty of extremist and alternative content that relies heavily on YouTube to get promoted.

We saw in 2016 how YouTube, the leading video platform, came under fire over recommendations that promoted extremist material, a result of major changes made to its recommendation algorithm.

And while extremist material remains on the app, it is now largely subscriptions that drive users to this material, rather than the usual recommendation algorithm most of us are accustomed to.

The researchers behind the study noted that the amount of problematic material on the app is significantly large, and that there is still a huge audience for that type of content today. The catch is that this audience does not appear to be getting radicalized on the app itself.

That makes the question of where the radicalization is actually occurring, if not within the app, worth considering.

The latest research on this front suggests extremists rely on the platform’s video hosting abilities while the dirty work happens off-site. So if you start out somewhere that exposes you to harmful content, you are likely to end up in places that expose you to even more of it.

Today, YouTube openly acknowledges that it was forced to change its algorithm for this very reason. That criticism brought about changes in 2019, after people highlighted how the platform was promoting conspiracy-related material at an alarming rate.

Despite the changes, the material never vanished as a whole; much of the content simply shifted. And the app is no longer confined to itself, since its videos can be embedded across virtually any website.

More than 1,000 American residents took part in the study and were divided into several cohorts. It is the first study of its kind to track what people were actually seeing on the app.

The researchers studied the content users came across most frequently, and some very interesting conclusions were reached in the end.

Participants saw a great deal of YouTube material being promoted on outside websites rather than on the app itself, and much of it was problematic. The content was promoted most heavily on right-wing political websites, some of which expressed hate openly. That was an eye-opening finding.

The study found that those exposed to videos on off-site destinations were more inclined to go looking for problematic material afterward. And it does not take long for off-platform exposure to turn into on-platform viewing.

So what’s the solution to the problem? According to experts, it comes down to stronger content moderation policies. The app can easily tell where its content is being embedded, and the fact that it continues to host such videos even as they appear across different websites is another problem worth mentioning.

Image: DIW-Aigen
