New Research Highlights Instagram's Failed Policies To Restrict Drug-Related Content From Young Teens

Drugs and teens do not mix well, yet social media gives young users easy access to drug-related content. Scrutiny from parents of young teens has put pressure on these platforms to act. However, despite the safety features Instagram has introduced, it seems teens can still easily find drug-related content on the platform.

The Tech Transparency Project carried out independent research on Instagram revealing that accounts dedicated to drug-related content remain available on the platform. Their main motive is to sell drugs; some were even found selling MDMA, a party drug commonly known as ecstasy. The platform did introduce safety features to limit access to drug-related content; however, according to the research, finding such content is just a hashtag search away. Instagram's algorithm relies heavily on hashtags: searching for drugs for sale with the hashtag symbol in front of the query surfaces a number of accounts selling them, and replacing the generic term with a specific drug name turns up similar results.

Even though the platform prohibits the sale of drugs, young teens can still come into contact with sellers and purchase drugs through Instagram. The platform tried to counter this by showing a warning before a user searches for drug-related hashtags; if the user interacts with the warning, they are directed to a website specializing in substance abuse support. The research group stated that these efforts are not enough. The group also criticized Instagram for not taking more active measures to solve the problem, arguing that stronger restrictions might reduce the average time users spend on the platform, an outcome Instagram does not want.

An Instagram spokesperson responded with a statement claiming that the platform removes most drug-related activity or content before anyone reports it; according to the company, 96% of content that violates its drug-related policies is removed automatically. The group's research suggests, however, that while the platform may have managed to clean drug content from its feeds, young users can still gain access to illicit content. Instagram also received harsh criticism recently after the leak of the Facebook Papers, which indicated that the platform's effects on young people's mental health are disastrous.

To carry out the project, the team registered fake accounts on the platform, typically known as dummy accounts. These were registered as teen users so the protection features could be tested. The research turned up striking findings about the platform's restrictions on drug-related content. If a user searches for fentanyl, the platform produces no results; however, if the user puts a city's name ahead of the drug, the search yields many results, and some of the accounts may actually be selling the opioid. Moreover, the app's own recommendation features seemed to run against its terms and conditions: when the dummy accounts visited an account that sold drugs, the platform would recommend similar accounts, and the suggested accounts often sold the same drugs, if not a wider variety.

Instagram's policies regarding drug content on its platform remain questionable. While its spokespeople maintain that the platform automatically rids itself of drug-related content, the research says otherwise.
