Research on YouTube’s recommendation algorithm reveals pathways to extremist content

YouTube is one of the largest online video streaming platforms, serving millions of users. Although the YouTube team attempts to maintain a safe environment for its users, the platform can sometimes become a network for extremist content as well.

Research on YouTube’s recommendation algorithm

Recently, a team of researchers at universities in the U.S., Switzerland, and Brazil revealed that YouTube’s recommendation algorithm is likely to surface racist channels with extreme content. According to a paper published on Cornell University’s arXiv, “alt-right” content can serve as a gateway for radicalization on YouTube. The researchers reached this conclusion by categorizing content into three groups:

· The Intellectual Dark Web (IDW)

· Alt-lite

· Alt-right

The researchers created these three categories to analyze the degree of racism and radicalization in each group. According to the audit, IDW channels host controversial discussions and have been accused of spreading offensive content such as homophobia and Islamophobia; Alt-lite content flirts with white supremacist ideas while stopping short of endorsing them openly; and Alt-right content is openly associated with advocating a white ethnostate.

The researchers used a set of keywords to search for content related to each of the three categories and then inspected the first 200 English-language results. Channels whose content was most closely related to each topic were added to the dataset, along with the channels of individuals associated with the alt-right.
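As a rough illustration of this collection step, the sketch below gathers the first 200 English-language search results for a seed keyword through the YouTube Data API v3. The API key and keyword list are placeholders, not the study’s actual seeds, and the researchers’ real pipeline may have differed:

# Minimal sketch of keyword-based discovery via the YouTube Data API v3.
from googleapiclient.discovery import build

API_KEY = "YOUR_API_KEY"             # hypothetical credential
KEYWORDS = ["example seed keyword"]  # placeholder keyword list

youtube = build("youtube", "v3", developerKey=API_KEY)

def first_n_results(query, n=200):
    """Collect up to n English-language search results for a query."""
    results, page_token = [], None
    while len(results) < n:
        response = youtube.search().list(
            q=query,
            part="snippet",
            type="video",
            relevanceLanguage="en",  # bias results toward English
            maxResults=50,           # the API's ceiling per page
            pageToken=page_token,
        ).execute()
        results.extend(response.get("items", []))
        page_token = response.get("nextPageToken")
        if not page_token:
            break
    return results[:n]

for keyword in KEYWORDS:
    for item in first_n_results(keyword):
        print(item["snippet"]["channelTitle"], item["id"]["videoId"])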

According to the researchers’ analysis, Alt-lite content can easily be discovered through channels related to the IDW, and Alt-right channels can be reached through the other categories as well. After more than 50 hours spent watching alt-right content, the researchers based their results on more than 330,000 videos, 360 channels, 79 million comments, 2 million video recommendations, and 10,000 channel recommendations. Of the 360 channels compiled, 90 were classified as IDW, 114 as Alt-lite, and 88 as Alt-right.

To check for potential location bias in video and channel recommendations, the researchers collected data through VPNs located in the United States, Canada, Brazil, and Switzerland.

Tracking radicalization

To detect radicalization driven by YouTube’s recommendation algorithm, the researchers sorted users into three groups: “lightly infected” for users who left one or two comments on Alt-right videos, “mildly infected” for users with three to five comments, and “severely infected” for users leaving six or more comments.
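For illustration, that grouping rule can be written as a short function. This is only a sketch based on the thresholds reported above; the function name and sample data are hypothetical:

# Sketch of the comment-count grouping described in the article.
from collections import Counter

def infection_level(alt_right_comments: int) -> str:
    """Bucket a user by the number of comments left on Alt-right videos."""
    if alt_right_comments >= 6:
        return "severely infected"
    if alt_right_comments >= 3:
        return "mildly infected"
    if alt_right_comments >= 1:
        return "lightly infected"
    return "not infected"

# Example: tally comments per user, then assign each user a group.
comment_authors = ["user_a", "user_b", "user_a", "user_c"]  # placeholder data
counts = Counter(comment_authors)
print({user: infection_level(n) for user, n in counts.items()})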

According to the analysis, about 10 percent of commenters were lightly infected, while roughly 4 percent, more than 9,000 people, were considered mildly or severely infected.


In 2018, the Alt-right video category received roughly one comment for every five video views. Comments were therefore treated as a signal of radicalization, since most of them were found to be in agreement with one another.

These findings reveal that YouTube has never been a platform free from radicalization; user radicalization has taken place there and will likely continue.

The reason behind this radicalization

Looking back at 2018, about 40 percent of the commenters on Alt-right videos could be traced to users who had previously commented on Alt-lite or IDW videos. Politics has meanwhile become a trending topic for gaining views on YouTube, and if you take a look at the channels, you might notice that many now promoting Alt-right content started out with topics like video games or working out.

According to the researchers’ analysis, the three categories of videos have shared almost the same commenting base since their activity began rising in 2015.

Bottom line

If you follow the recommendation system, chances are you will often spot Alt-right channels among the recommendations. The audit was performed by researchers from Harvard University, UFMG in Brazil, and EPFL in Switzerland. YouTube is growing more popular every day, especially among teenagers and children, so if the recommendation algorithm is promoting radicalization, it could bring fringe ideologies like white supremacy back into the mainstream.


Photo: pressureUA via Getty Images
