Facebook: Hundreds of thousands of children flagged as interested in gambling and alcohol advertisements

Facebook has marked hundreds of thousands of children as “interested in” adverts related to gambling and alcohol, according to a joint investigation by the Guardian and the Danish Broadcasting Corporation.

Facebook’s advertising tools show that more than 740,000 children under the age of 18 are flagged as interested in gambling, around 130,000 of them in the UK. A further 940,000 minors, roughly 150,000 of them British, are flagged as interested in alcohol.

The “interested in” categories are generated automatically by Facebook based on each user’s activity on the social network. Advertisers use these interest categories to target messages specifically at subgroups flagged as interested in the relevant topic.

Response from Facebook

In response, Facebook said it prohibits alcohol- and gambling-related companies from showing ads to minors on the platform and enforces against violations whenever they are detected. The social network says it is working closely with regulators and providing guidance to marketers to help them reach their potential customers more effectively.

Facebook does, however, allow advertisers to target messages at minors based on their flagged interest in alcohol or gambling. In principle, this could help organisations such as anti-gambling services reach children who may have problems with gambling or alcohol and offer them help and support. The flaw is that advertisers can use the same interest categories for other ends: the developer of a video game with profitable “loot box” mechanics, for example, could target children flagged as interested in gambling without violating any of Facebook’s policies.

The availability of these automated interest categories could also let alcohol and gambling advertisers reach children without breaking any of Facebook’s rules, since the platform has effectively assembled the audience for them. Facebook relies on automated review to flag adverts that violate its policies, and that review is not guaranteed to catch breaches before adverts start running. The company recently settled a lawsuit with Martin Lewis over its long-running failure to keep the financial expert’s image out of scam advertisements.


This is not the first time Facebook’s automatic categorization has been criticized. In 2018, the social media giant was found to be letting advertisers target users it thought were interested in topics such as Islam or liberalism, categories touching on religion, sexuality and political beliefs that are explicitly classed as sensitive information under the EU’s GDPR data protection rules. A month later, it was discovered that Facebook’s algorithm had labeled more than 65,000 Russians as “interested in treason”; the label was removed after inquiries by the Guardian. In March, the US Department of Housing and Urban Development charged Facebook with violating the Fair Housing Act for allowing advertisers to restrict housing adverts on the basis of race, color, national origin, sex, and disability.

Bottom Line

Facebook’s aim is to give advertisers as much information as possible to help them target potential customers, but this keeps leading to cases where advertisers misuse that targeting data in ways that are illegal. To keep advertisers engaged on the platform, Facebook needs to find approaches other than offering ever more detailed information about potential consumers.


Photo: Beck Diefenbach/Reuters
