Recent Case Reveals Google Scans User Accounts for Animated Child Abuse Content

The internet has changed the world, largely for the better. Unfortunately, it has also made it easy for certain people to spread images of child sexual abuse online, and many companies are still struggling to figure out how to deal with that.

Apple recently pledged to scan iCloud databases for child sexual abuse imagery, but that led to criticism because the approach could compromise people’s privacy. After all, most users have done nothing of the sort, yet their data would still be scanned, which suggested that a more targeted approach might be required, at least in the short term.

However, while Apple backed down in response to its users’ objections, Google appears to be taking a harder line, at least where animated child pornography is concerned, as reported by Forbes. In a recent court case, a Midwestern artist was found to possess an image that appeared to depict two underage boys having sex, but the man was not convicted, since such images may be legally protected if they have genuine artistic or scientific value rather than being purely obscene.

Google has not revealed how it came across these images, but there is a good chance that it scans people’s Google Drive and Gmail accounts for them. Neither platform is end-to-end encrypted, so it would be relatively easy for Google to do this, although that means the privacy concerns raised above also apply to people using Google’s various products and services.
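To make the mechanism concrete: since Google has not disclosed its method, the sketch below is purely illustrative. Industry scanners typically compare uploads against hash lists of known flagged material; real systems use perceptual hashes (PhotoDNA-style) from clearinghouses such as NCMEC so that re-encoded copies still match, whereas this simplified example uses an ordinary SHA-256 digest, and all names and the sample hash are hypothetical.

```python
import hashlib

# Hypothetical list of known flagged-image digests. Real systems use
# perceptual hashes supplied by clearinghouses, not cryptographic ones.
KNOWN_FLAGGED_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def scan_upload(data: bytes) -> bool:
    """Flag an upload whose digest matches a known-bad hash.

    An exact-match digest only catches byte-identical copies; production
    scanners use perceptual hashing so resized or re-encoded variants
    still match.
    """
    return file_digest(data) in KNOWN_FLAGGED_HASHES

if __name__ == "__main__":
    sample = b"example uploaded file bytes"
    print("flagged" if scan_upload(sample) else "clean")
```

This kind of server-side check is only possible because the provider can read the stored bytes; with end-to-end encryption, the server sees only ciphertext and a scan like this cannot run at all, which is the trade-off at the heart of the debate.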

These events have sparked an interesting debate. On the one hand, people want privacy when sending emails and storing files on Google Drive. On the other hand, cartoons that depict CSAM, or Child Sexual Abuse Material, need to be cracked down on, especially since some malicious actors might be using them to subvert the algorithms designed to detect real CSAM. Law enforcement agencies will likely continue to work with Google in this capacity so that some kind of solution can be reached sooner rather than later.
