Meta Is Now Preventing Targeted Ads From Covering Sensitive Topics Of Discussion On Facebook

Meta has recently stated that it will no longer provide Facebook advertisers with sensitive ad targeting options for advertisements that cover issues such as race, ethnicity, sexuality, religion, or political affiliation.

Facebook is still in the thick of it, no matter how much corporate restructuring the platform goes through or whether it renames itself to Meta. The company may have changed its outward appearance to match the first tech buzzword it could find, but that does nothing to help users forget the US Congressional hearings, whistleblower Frances Haugen, and all of the social media platform's previous controversies. The point is that Meta's decision to make certain ads un-targetable is, of course, another way for the company to keep trouble at bay.

The fact of the matter is that Facebook is currently under a microscope, and its targeted ads are very much part of that scrutiny. With Ms. Haugen's leaked documents and allegations painting a (probably accurate) picture of Facebook as an unhealthy and mentally damaging platform for its users, sensitive advertisements only worsen matters. Of course, targeted ads remain harmful in themselves, since they still draw on user data such as browsing history and location, but Facebook doesn't strike me as a company that makes more than one change at a time where user safety is involved. It's evidently capable of multitasking when it comes to rebranding, at least.

At any rate, targeted advertisements will no longer cover content that could be perceived as triggering, such as religion, sexuality, race, or political affiliation. While this could prove useful to users around the world, especially individuals struggling with their identity, it's a half-hearted attempt at self-correction at best. The problem is that the move silences conversations on these intense matters across the board, instead of having Facebook develop a better system for moderating and rooting out hateful discourse and rhetoric.

This is a move that could end up silencing individuals who are actually attempting to do some good on the platform. Ultimately, however, what concerns Meta is the microscope that it's under. Its decisions, therefore, have to publicly appear as positive changes, even if their overall effect is an overwhelmingly negative one.

Photo: Rafael Henrique/SOPA Images/LightRocket / Getty
