Recommendations For YouTube Videos Lead Users Down A Rabbit Hole Of Extremist Political Content, New Study Finds

If you’re an avid user of YouTube, we bet you’re well aware that you’re constantly bombarded with recommendations for videos you might like.

The algorithm keeps track of what you watch and engage with and serves up recommendations accordingly. But a new study is shedding light on how that system can push users down a rabbit hole of extremist political content.

The authors, who hail from the University of California, carried out a systematic audit of the platform’s recommendations over the period from 2021 to 2022, testing how an individual user’s ideological leaning affects what the algorithm recommends to them.

For users leaning to the right, that leaning clearly altered what kind of content was pushed in their direction: they were far more likely to be served videos from channels espousing politically extremist ideologies, along with a wide range of conspiracy theories and other problematic content.

For those leaning toward the left, such recommendations were noticeably less frequent, the study adds. The algorithm does not trap people within one part of the political spectrum, but it does keep serving recommendations that match the user’s existing outlook and ideology.

The study, published in December in the Proceedings of the National Academy of Sciences, documents how this behavior steers users toward problematic content and sheds new light on the app’s recommendation system for extreme material.

The app is one of the most popular around, used by nearly 81% of the American population, and with the user base still growing, that figure is only expected to rise further.

Nearly 70% of the content watched on YouTube is surfaced by this recommendation algorithm, the study added.

The researchers carried out the study by creating nearly 100k sock puppets on the app, designed to test the recommendations it serves. A sock puppet is an automated account that mimics a real person using the platform. Just like real YouTube users, the study’s sock puppets watched videos and collected the video recommendations the app produced in response.
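
To make the idea concrete, here is a minimal sketch of what one such sock puppet session might look like in Python, using the Selenium browser-automation library. The paper does not publish this code; the CSS selector, the watch time, and the helper name are illustrative assumptions rather than details taken from the study.

```python
# Minimal sketch of one sock puppet "session": open a fresh browser,
# watch a few assigned seed videos, then scrape whatever the homepage
# recommends afterwards. The CSS selector is a guess at YouTube's current
# markup, not a documented API, and may need updating if the layout changes.
import time

from selenium import webdriver
from selenium.webdriver.common.by import By


def run_sock_puppet(seed_video_urls, watch_seconds=30):
    driver = webdriver.Chrome()  # fresh profile, so no prior watch history
    try:
        # "Train" the puppet by briefly watching each assigned seed video.
        for url in seed_video_urls:
            driver.get(url)
            time.sleep(watch_seconds)

        # Return to the homepage and collect the recommended video links.
        driver.get("https://www.youtube.com/")
        time.sleep(5)  # give the recommendations time to render
        links = driver.find_elements(By.CSS_SELECTOR, "a#video-title-link")
        return [a.get_attribute("href") for a in links if a.get_attribute("href")]
    finally:
        driver.quit()


if __name__ == "__main__":
    recs = run_sock_puppet(["https://www.youtube.com/watch?v=dQw4w9WgXcQ"])
    print(f"Collected {len(recs)} homepage recommendations")
```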

In total, the sock puppets watched close to 10 million videos on the app from nearly 120k different channels, spanning all sorts of ideologies from the far left to the far right.

To identify each channel’s political slant, the team cross-referenced the channels against accounts on the app linked to political figures who followed them.
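
As a rough illustration of that kind of cross-referencing, the sketch below scores a channel by averaging the ideology scores of the political figures tied to it, on a scale from -1 (far left) to +1 (far right). The scale, the figure names, and the numbers are invented for the example; the study’s actual classification procedure is more involved.

```python
# Toy slant estimate: average the ideology scores of the political figures
# linked to a channel. The scores and figure names below are made up.
IDEOLOGY_SCORES = {
    "figure_far_left": -0.9,
    "figure_center_left": -0.3,
    "figure_center_right": 0.3,
    "figure_far_right": 0.9,
}


def channel_slant(linked_figures):
    """Return the mean ideology score of the figures tied to a channel."""
    scores = [IDEOLOGY_SCORES[f] for f in linked_figures if f in IDEOLOGY_SCORES]
    return sum(scores) / len(scores) if scores else 0.0


# A hypothetical channel associated mostly with right-leaning figures:
print(channel_slant(["figure_center_right", "figure_far_right"]))  # 0.6
```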

Each of the team’s sock puppets then watched close to 100 different videos from its assigned category, each around 30 seconds in duration. After that viewing, the recommendations shown on the sock puppet’s homepage were evaluated.

Similarly, the up-next recommendations displayed alongside a wide range of selected videos were gathered, capturing the kinds of videos real users would end up watching if they passively followed the recommendations on display.
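
One way to picture that passive-viewing scenario is a simple walk over up-next recommendations, sketched below. The UP_NEXT graph is toy data standing in for what the sock puppets actually logged, and the walk length is arbitrary; the study itself collected these lists from the live site.

```python
# Sketch of a passive-viewing simulation: start from one video and keep
# following a randomly chosen "up next" recommendation, recording the path.
# The UP_NEXT mapping is invented toy data, not the study's dataset.
import random

UP_NEXT = {
    "news_clip": ["talk_show", "debate_highlights"],
    "talk_show": ["debate_highlights", "fringe_commentary"],
    "debate_highlights": ["fringe_commentary"],
    "fringe_commentary": ["fringe_commentary"],  # a self-reinforcing dead end
}


def passive_walk(start, steps=10, rng=random.Random(0)):
    """Follow one randomly chosen up-next recommendation per step."""
    path = [start]
    current = start
    for _ in range(steps):
        candidates = UP_NEXT.get(current, [])
        if not candidates:
            break
        current = rng.choice(candidates)  # a passive viewer just clicks onward
        path.append(current)
    return path


print(passive_walk("news_clip"))
```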

So would it be wrong to say that exposure to extremist content is self-reinforcing? To answer that, the research team compiled a list of problematic channels whose output centers on extremist ideas, conspiracies, controversial terms, and infamous figures.

Around 36% of the sock puppets taking part in the experiments received video recommendations from extremist channels, which is a huge figure. For left-leaning sock puppets the share was 32%, again a staggering amount, while right-leaning ones received such extremist political content 40% of the time.

The authors therefore concluded that more research is needed into how problematic content on apps like YouTube ends up pulling people into radicalization bubbles. They also called for far greater transparency from social media platforms so that such content can be better filtered to prevent this from taking place.

YouTube algorithm favors extremist content, raising concerns; study emphasizes need for transparency.
