To test TikTok's algorithm, WSJ created hundreds of bot accounts with diverse profiles that watched numerous videos. Here's what it found

There is no doubt that TikTok has been the world's fastest-growing application over the past couple of years. The important question, however, is how the app identifies users' interests so it can surface matching content on their screens. Recently, a team at the Wall Street Journal created a large number of bot accounts that watched videos on the app. The team gave each bot a distinct profile, including a location, an age, and a set of interests in video content, to measure how long the app's algorithm would take to work out those interests. The app was told nothing about the bots' assigned interests at the outset.

Photo: TikTok

The Wall Street Journal (WSJ) team evaluated the results with Guillaume Chaslot, an algorithm expert who previously worked at the giant video platform YouTube. TikTok has previously said that its "For You" page is personalized based on the kinds of videos users interact with, information about the videos themselves, and profile settings such as location and language. The algorithm, however, cannot distinguish between content a user hesitates over and content a user genuinely loves and wants more of. That is why some users end up with a set of For You recommendations that do not appear to reflect their interests.
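The signals described above (interactions, video information, profile settings) can be illustrated with a toy scoring sketch. This is a hypothetical simplification for illustration only, not TikTok's actual system; the function names and tag-based scoring are invented. It shows how a ranker that credits raw watch time cannot tell hesitation apart from genuine interest:

```python
# Toy sketch of watch-time-based video ranking (hypothetical; not TikTok's real system).
# Each video carries topic tags; a viewer's interest profile accumulates seconds watched
# per tag, so lingering on a video counts the same as enjoying it.

from collections import defaultdict

def update_interests(interests, video_tags, seconds_watched):
    """Credit every tag on a watched video with the time spent on it."""
    for tag in video_tags:
        interests[tag] += seconds_watched
    return interests

def score(video_tags, interests):
    """Rank a candidate video by the viewer's accumulated interest in its tags."""
    return sum(interests.get(tag, 0) for tag in video_tags)

interests = defaultdict(float)
update_interests(interests, ["cooking", "travel"], 30)  # watched for 30 seconds
update_interests(interests, ["sadness"], 120)           # lingered for 120 seconds

candidates = {"recipe clip": ["cooking"], "sad clip": ["sadness"]}
ranked = sorted(candidates, key=lambda v: score(candidates[v], interests), reverse=True)
print(ranked)
```

Because the lingered-on topic accumulates the most watch time, it ranks first for the next recommendation, which is one plausible mechanism behind the "rabbit holes" the WSJ bots fell into.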

The team also found that some bot accounts got trapped in "rabbit holes" of the same type of content. Sometimes this works out well, when users keep seeing videos that match their tastes. But often they do not actually like that content, and these rabbit holes serve users poorly. For instance, one bot account was shown a stream of videos about depression, while other accounts were served videos about compulsive eating and discussions of suicide. The team designed the bots to express interest purely by watching: replaying videos repeatedly and pausing on content with related hashtags or images.

According to the WSJ experts, the platform worked out the bot accounts' behavior in forty minutes. After being notified of the test and how the app figures out users' interests, a spokesperson for the platform remarked that the activity generated by the test does not represent real user behavior, because people have more varied likes and dislikes than bots. The spokesperson added that users are given a "Not Interested" option, so they can avoid posts from unwanted categories by tapping it.

Read next: The Real Cause for the Hostile Discourse on Social Media