Amid the ongoing criticism of Apple's plan to scan iPhones for CSAM, the company has revealed that it has been running the same kind of scan on iCloud Mail for the past two years.

The internet and modern technology surely make life easier; however, their widespread use has also brought a number of negative things into the world. One such aspect is the use of technology to sell or spread illegal child exploitation content. Much of the illegal pornography that spreads online consists of videos and pictures of child abuse, formally known as CSAM (Child Sexual Abuse Material).

Sexual predators who film and sell CSAM kidnap and abuse children, which is why the increasing spread of such content is a threat to all children. While governments and authorities actively work towards catching abusers and controlling the spread of CSAM, tech giant Apple Inc. has also joined the effort, working to catch people who create, share, or store CSAM on their mobile devices.

Apple announced its involvement in recent months, saying that it is developing a system for US iPhones that will scan photos for CSAM before they are uploaded to iCloud. Moreover, Apple said that if the system flags any photo as containing CSAM, the company will first examine it manually and, if the material is confirmed, report it to the authorities.

Apple's idea of keeping a check on devices is arguably a good one, since it will likely limit the spread of child abuse content to some extent. Many people, however, saw it as just another trick by Apple to collect user data and information, and the company received a great deal of criticism because of it.

Amid the ongoing criticism, Apple revealed that it has been using a CSAM detection system on users' iCloud Mail for two years now. This means that for the past two years Apple has checked all email attachments for child abuse material. Apple also revealed that it carries out limited scanning of other data, though this does not involve scanning iCloud backups.

This news has only increased the fear and criticism surrounding Apple's CSAM system. Apple, however, has responded by saying that such fears are 'overblown'.


Image: SOPA Images via Getty Images

Read next: Researchers from Princeton have added to the criticism of Apple's CSAM system, claiming that it could be repurposed for other motives.
