
Apple to Scan iPhones for Child Abuse Images


Apple has announced a new system for iPhones that will automatically scan the devices to identify whether they contain media featuring child sexual abuse.

It is part of a range of child protection features launching later this year in the United States via updates to iOS 15 and iPadOS 15, and will compare images on users’ devices against a database of known abuse imagery.

If a match is found, Apple says it will report the incident to the U.S. National Center for Missing and Exploited Children (NCMEC). It is not clear which other national authorities the company would contact outside the United States, or whether the features will be available elsewhere.

Other features Apple announced included scanning end-to-end encrypted messages on behalf of parents to identify when a child receives or sends a sexually explicit photo, providing parents with “useful resources” and reassuring children that “it’s okay if they don’t want to see this photo”.

Picture: One of the new features will notify parents when children receive or send sexually explicit messages

The announcement sparked immediate concern from respected computer scientists, including Ross Anderson, professor of engineering at the University of Cambridge, and Matthew Green, associate professor of cryptography at Johns Hopkins University.

Professor Anderson described the idea as “absolutely appalling” to the Financial Times, warning that “it will lead to mass distributed surveillance of … our phones and laptops.”

Dr Green – who revealed the existence of the program before Apple made a statement on it – warned on Twitter: “Whatever Apple’s long-term plans are, they’ve sent a very clear signal.

“In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content. Whether they are right or wrong on this point does not matter.

“It will break the dam – governments will demand it of everyone. And by the time we find out it was a mistake, it will be far too late,” added Dr Green.

The criticism has not been universal. John Clark, president and CEO of NCMEC, said: “We know this crime can only be combated if we are steadfast in our dedication to protecting children. We can only do this because technology partners, like Apple, step up and make their dedication known.”

Others who praised the move included Professor Mihir Bellare, a computer scientist at the University of California, San Diego; Stephen Balkam, CEO of the Family Online Safety Institute; and former US Attorney General Eric Holder.

Picture: Apple says it will protect user privacy while comparing images

Apple told Sky News that the detection system is designed with user privacy in mind, can only work to identify images of abuse such as those collected by NCMEC, and will do so on the user’s device – before the image is uploaded to iCloud.

However, Dr Green warned that the way the system works – loading onto the phone a list of fingerprints matching NCMEC’s database of abuse images – introduces new security risks for users.

“Whoever controls this list can search your phone for whatever content they want, and you really have no way of knowing what’s on that list, because it’s invisible to you (and just a bunch of opaque numbers, even if you hack your phone to get the list).

“The theory is that you will trust Apple to include only really bad images. Say, images curated by NCMEC. You’d better trust them, because trust is all you have,” added Dr Green.

Explaining the technology, Apple said: “Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations.”
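As a rough illustration – not Apple’s actual NeuralHash pipeline, which adds cryptographic protections such as blinded hashes and threshold secret sharing so the device never learns which images matched – on-device matching against a fingerprint database might look like the Python sketch below. It uses the open-source pHash from the ImageHash package rather than NeuralHash, and the file name and hash values are hypothetical.

```python
import imagehash          # third-party: pip install ImageHash
from PIL import Image     # third-party: pip install Pillow

# Fingerprints of known images, as they might be shipped to a device.
# These hex values are made up purely for illustration.
KNOWN_HASHES = [
    imagehash.hex_to_hash("d1c2b3a4f5e60718"),
    imagehash.hex_to_hash("0f1e2d3c4b5a6978"),
]

def matches_database(path: str, max_distance: int = 4) -> bool:
    """Return True if the image's perceptual hash is within
    max_distance bits (Hamming distance) of any known fingerprint."""
    fingerprint = imagehash.phash(Image.open(path))
    return any(fingerprint - known <= max_distance for known in KNOWN_HASHES)

# Hypothetical check run before an image leaves the device for iCloud.
if matches_database("upload_candidate.jpg"):
    print("Match found: flag for reporting before upload")
```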

The fingerprint system used by Apple is called NeuralHash. This is a perceptual hash function that creates fingerprints for images in a different way from traditional cryptographic hash functions.

Perceptual hashing is reportedly also used by Facebook in its program that fingerprints users’ intimate photos to prevent this material from being shared with strangers.

Picture: Similarities in images are not recognized by the MD5 algorithm

Perceptual hashing algorithms are designed to identify the same image even after it has been modified or altered – something cryptographic hashes cannot do, since changing even a single pixel produces a completely different cryptographic hash.

Picture: The pHash algorithm recognizes similarities in images

The images above, fingerprinted with both the MD5 cryptographic hash algorithm and the pHash perceptual hash algorithm, demonstrate this.
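That contrast can be reproduced in a few lines of Python using the third-party Pillow and ImageHash packages; the file names below are hypothetical stand-ins for an original photo and a slightly altered copy.

```python
import hashlib

import imagehash          # third-party: pip install ImageHash
from PIL import Image     # third-party: pip install Pillow

def md5_digest(path: str) -> str:
    """Cryptographic hash: any change to the file's bytes yields a
    completely different digest."""
    with open(path, "rb") as f:
        return hashlib.md5(f.read()).hexdigest()

def perceptual_digest(path: str) -> imagehash.ImageHash:
    """Perceptual hash: visually similar images yield similar digests."""
    return imagehash.phash(Image.open(path))

# Hypothetical file names: an original photo and a resized copy of it.
original, altered = "photo.jpg", "photo_resized.jpg"

# The two MD5 digests will have nothing in common, even though the
# pictures look identical to a person.
print(md5_digest(original), md5_digest(altered))

# pHash values are compared by Hamming distance: a difference of only a
# few bits out of 64 indicates the images are visually the same.
print("pHash distance:", perceptual_digest(original) - perceptual_digest(altered))
```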

In a statement, Apple said, “At Apple, our goal is to create technology that empowers people and enriches their lives, while helping them stay safe.

“We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material.”

“This program is ambitious, and protecting children is an important responsibility. These efforts will evolve and expand over time,” Apple added.

