Twitter’s Attempt To Make More Revenue From Adult Content Put On Hold Due To Major Flaws

Contrary to what many people assume, Twitter is certainly not generating revenue the way it would like to.

Yes, profits are dwindling, which is why its teams are trying to think outside the box and come up with new ways to make money.

After all, investors are concerned and they want the app to do well under all circumstances. This comes after internal shakeups, which have put even more pressure on the platform than before. On that note, Twitter turned to adult content as a potential source of revenue.

As strange as that may sound, a recent report from The Verge details how the app considered becoming a direct rival to OnlyFans by enabling adult creators to sell subscriptions through its platform.

Yes, we know what you’re thinking. The idea sounds strange, but it isn’t actually that outlandish. For those who might not be aware, the app already helps adult creators market their OnlyFans accounts.

The reason is simple: Twitter does not consider adult content on its platform a violation of its guidelines. But wait, we may have spoken a little too soon.

Twitter has put the venture on hold for now after an 84-member internal “red team” found the offering to have some major safety flaws.

According to that investigation, the red team found that the app fails to reliably detect child sexual abuse material. Similarly, it struggles to remove non-consensual nudity. And if that wasn’t enough, the app also lacks tools to verify that all of the content comes from creators over the age of 18.

The Verge report also revealed that Twitter’s own teams have been warning senior leadership about these issues since February of last year.

To detect child sexual abuse material, the platform says it relies on Microsoft’s PhotoDNA, a system that matches uploads against a database of known abusive images. It is commonly used by platforms to identify and remove such content.

However, the biggest flaw in this kind of matching is that newly created images that are not yet part of the database go undetected.
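To make that limitation concrete, here is a minimal, hypothetical sketch of a hash-lookup check in Python. It is not Twitter’s or Microsoft’s actual code: PhotoDNA uses a robust perceptual hash rather than the ordinary cryptographic digest used here, and the database entries below are placeholders. The point is simply that a lookup against known hashes can only catch content that has already been catalogued.

```python
import hashlib

# Illustrative stand-in for a hash-matching pipeline like PhotoDNA.
# Real systems use robust *perceptual* hashes that survive resizing and
# re-encoding; a plain SHA-256 digest is used here only to keep the
# example self-contained.

# Database of hashes for previously identified abusive images
# (placeholder values, not real data).
known_hashes = {
    "3f2a-example-digest-1",
    "9b7c-example-digest-2",
}

def image_hash(image_bytes: bytes) -> str:
    """Compute a digest for an image (stand-in for a perceptual hash)."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_abusive(image_bytes: bytes) -> bool:
    """Flag an image only if its hash already exists in the database."""
    return image_hash(image_bytes) in known_hashes

# The limitation described above: a newly created image has no entry in
# the database, so a pure lookup-based check cannot flag it.
new_upload = b"bytes of a never-before-seen image"
print(is_known_abusive(new_upload))  # False -- unknown content passes through
```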

Matthew Green, a researcher at Johns Hopkins University, pointed out that Twitter is effectively being blamed for the shortcomings of software that almost every other platform uses as well.

In 2021, Twitter reported annual revenue of around $5 billion. The amount is certainly small when compared to tech giants such as Google. These huge firms have the means to build the best available technology for detecting child sexual abuse. But even their tools are not foolproof and can’t be treated as an industry standard.

Speaking to media outlets recently, Green added that today’s tools may appear sophisticated but they are not free from errors. The best-known example is the case of a father who was reported to the police after he took an image of his son’s swollen genitals when asked to do so for an online medical consultation.

More advanced technology is available today that could protect kids better, but critics worry that it comes at a high cost: mass surveillance through in-depth scanning of users’ personal data.

Apple was also set to roll out its own system for detecting child sexual abuse material. But it was forced to take a step back after many argued that it could be abused by government officials.

Hence, Twitter is in a sticky spot. It’s so big that detecting all abusive adult content is near impossible, yet it isn’t generating enough revenue to invest in better safeguards.


Read next: Twitter’s Leaked Internal Memo Says Its Shopping Features Pose Content Moderation Risks