Apple's CSAM detection proves its levels of auditability through a senior executive's interview

CSAM (child sexual abuse material) detection has been a contentious privacy topic for a while now, with clear arguments on both sides. While Apple has committed to creating a safer space for children by introducing its Child Sexual Abuse Material detection program, it is no wonder that people are skeptical about how the process is implemented, as well as the security of the content being scanned.

One of Apple's executives took it upon himself to clear up the questions surrounding CSAM detection. Apple had already published an FAQ covering the most frequently asked questions; however, it seems that just wasn't doing the job. Users needed more assurance about the policies, the process, and the security.

The person in question here, Craig Federighi, Apple's senior vice president of software engineering, emphasized how Apple's CSAM detection has multiple levels of auditability. He stated in an interview with The Wall Street Journal that the feature will not be subject to change under any influence, political or otherwise, since it is sensitive and is going to be rolled out in quite a lot of countries. For instance, China imposes a pretty strict box to which content needs to adhere in order for it to be viewable. Users are therefore concerned that different governments will try to impose their own policies, which would mean the CSAM database of flagged material becomes subject to preferential change.

Federighi was also firm in his claim that the iPhone and iPad are going to keep users' information completely safe and secure. The promise holds even against Apple itself, which means information leaking to governments or similar institutions is out of the question.

To further elaborate on the process, he described the steps taken before Apple identifies potential victims. Pictures are matched against a list of known material from NCMEC, i.e. the National Center for Missing and Exploited Children. He confirmed that the scanning has a specific aim rather than searching indiscriminately: hashes of the known images are shipped to devices, where they are used to find potential matches. This is important because it assures users that nothing of lesser importance will be included in the CSAM list, only material that is clearly illegal.
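To make the matching idea concrete, here is a minimal Swift sketch of comparing image fingerprints against a fixed list of known hashes. It is only an illustration: Apple's actual system uses a perceptual hash (NeuralHash) plus cryptographic protections, not the plain SHA-256 comparison below, and the function names here are hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch only. Apple's real system uses a perceptual hash
// (NeuralHash) and a private matching protocol, neither of which is
// reproduced here; this just illustrates comparing image fingerprints
// against a fixed list of known-CSAM hashes derived from NCMEC data.

/// Compute a fingerprint for an image's raw bytes.
/// A perceptual hash would also catch re-encoded or resized copies;
/// SHA-256 is used here purely to keep the example self-contained.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Return the fingerprints of a user's uploads that appear in the known list.
func matches(in uploads: [Data], against knownHashes: Set<String>) -> [String] {
    uploads.map { fingerprint(of: $0) }.filter { knownHashes.contains($0) }
}
```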

As for auditability, Apple made it pretty clear that the same database is shipped to every country without any change in its contents. It's the same package that goes to the East and the West. This means users aren't left having to trust a particular region or operator; instead, the single, unchanging package is promised to contain only material that is known to be illegal.
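As a rough illustration of why shipping one unchanged database everywhere helps with auditability, an auditor could in principle compute a digest of the database found on any device and compare it with a published reference value. The sketch below assumes a hypothetical on-device file and reference digest; it is not Apple's specified audit procedure.

```swift
import Foundation
import CryptoKit

// Hypothetical illustration of the auditability claim: if one and the same
// hash database ships everywhere, anyone can digest the local copy and
// compare it to a published reference. The path and reference are made up.

func databaseDigest(at url: URL) throws -> String {
    let blob = try Data(contentsOf: url)
    return SHA256.hash(data: blob)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Example usage with placeholder values:
// let digest = try databaseDigest(at: URL(fileURLWithPath: "/path/to/csam.db"))
// print(digest == publishedReferenceDigest ? "matches audited copy" : "differs")
```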

While the CSAM database is shipped globally, the tech giant confirms it will be used only in the US for now. The decision to have an independent auditor verify the database has indeed played a huge role in securing user trust.

Lastly, Federighi made sure to explain when someone will actually be flagged. It won't happen on a first match, as Apple states a threshold has to be crossed first. Although Apple did not disclose the exact number of matches allowed before authorities get involved, Federighi hinted at something on the order of 30 images of known child sexual abuse material. We believe the secrecy is meant to keep offenders from gaming the system to avoid punishment.
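Building on the matching sketch above, the flagging step can be thought of as a simple threshold check. The sketch below hard-codes roughly 30 matches only because that is the figure Federighi hinted at; the real mechanism is more involved and relies on cryptographic safety vouchers rather than a plain counter.

```swift
// Hypothetical sketch of the threshold idea: no single match triggers a
// report; an account is only surfaced for human review once the number of
// matched images crosses a limit. The exact limit is undisclosed.

/// Returns true only once the running count of matched images reaches the threshold.
func shouldFlagForReview(matchedImageCount: Int, threshold: Int = 30) -> Bool {
    matchedImageCount >= threshold
}
```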

Now that we are better informed, we don't think users will have much of an objection, seeing as Apple really does seem to want to make the world a better place without any personal gain.


Image: Drew Angerer / Getty Images
