Apple to start checking iPhone and iCloud photos for child abuse imagery

Ming Yeung/Getty Images: Apple said it will begin testing a new system to automatically match photos on iPhones

By Samantha Murphy Kelly, CNN Business

Apple on Thursday said it will begin testing a new system to automatically match photos on iPhones, and those uploaded to iCloud accounts, against a database of known child sexual abuse images and alert authorities as needed.

The new service will turn photos on devices into an unreadable set of hashes — unique numeric fingerprints — stored on user devices, the company explained at a press conference. Those numbers will be matched against a database of hashes provided by the National Center for Missing and Exploited Children.
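To make that matching step concrete, here is a minimal Python sketch of the idea. It is an illustration, not Apple's implementation: Apple describes a perceptual hash it calls NeuralHash, combined with cryptographic protocols so the device never sees the raw database, whereas this sketch uses an ordinary cryptographic hash and an in-memory set, with invented example bytes.

```python
import hashlib

def photo_hash(data: bytes) -> str:
    """Reduce a photo's bytes to a fixed-length fingerprint.
    (Apple describes a perceptual hash, NeuralHash; a plain
    cryptographic hash stands in here for simplicity.)"""
    return hashlib.sha256(data).hexdigest()

# Stand-in for the database of known-image hashes; in Apple's
# design the list comes from the National Center for Missing
# and Exploited Children, and matching happens on the device.
known_hashes = {photo_hash(b"bytes-of-a-known-image")}

def matches_database(photo_bytes: bytes) -> bool:
    """On-device check: does this photo's hash appear in the list?"""
    return photo_hash(photo_bytes) in known_hashes

print(matches_database(b"bytes-of-a-known-image"))       # True
print(matches_database(b"bytes-of-an-unrelated-photo"))  # False
```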

In taking this step, Apple is following some other big tech companies such as Google and Facebook. But it’s also trying to strike a balance between safety and privacy, the latter of which Apple has stressed as a central selling point for its devices.

Some privacy advocates were quick to raise concerns about the effort.

“Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the US, but around the world,” said Greg Nojeim, co-director of the Security & Surveillance Project at the Center for Democracy & Technology. “Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services.”

In a post on its website outlining the updates, the company said: “Apple’s method … is designed with user privacy in mind.” Apple emphasized that the tool does not “scan” user photos and that only images whose hashes match entries in the database will be flagged. (This should mean a user’s harmless picture of their child in the bathtub will not be flagged.)

Apple also said each device will create a doubly encrypted “safety voucher” — a packet of information about a photo — that is uploaded to its servers along with the image. Once an account accumulates a certain number of flagged safety vouchers, Apple’s review team will be alerted. It will then decrypt the vouchers, disable the user’s account and alert the National Center for Missing and Exploited Children, which can inform law enforcement. Users who believe their accounts have been mistakenly flagged can file an appeal to have them reinstated.
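That “certain number” is the safeguard Apple describes: individual vouchers reveal nothing on their own, and review becomes possible only once enough matches accumulate. The sketch below shows only the counting logic; Apple’s actual design uses threshold secret sharing, so vouchers are cryptographically undecryptable below the threshold, and the threshold value used here is invented for illustration.

```python
from collections import defaultdict

# Invented threshold for illustration; Apple has not said how
# many matches trigger a review.
REVIEW_THRESHOLD = 10

# account id -> flagged safety vouchers received so far
vouchers: dict[str, list[bytes]] = defaultdict(list)

def receive_voucher(account: str, voucher: bytes) -> None:
    """Server side: store a flagged voucher and check the threshold."""
    vouchers[account].append(voucher)
    if len(vouchers[account]) >= REVIEW_THRESHOLD:
        # Only at this point, per Apple's description, can reviewers
        # decrypt the vouchers, disable the account and notify NCMEC.
        print(f"human review triggered for account {account!r}")

for i in range(REVIEW_THRESHOLD):
    receive_voucher("example-account", f"voucher-{i}".encode())
```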

Apple’s goal is to ensure that identical and visually similar images produce the same hash, even if an image has been slightly cropped, resized or converted from color to black and white.
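A toy “average hash” shows how a perceptual hash can survive such edits where a cryptographic hash would not: shrink the image to a tiny grayscale grid and record which pixels are brighter than the average. Minor cropping or recoloring flips only a few bits, so near-duplicates stay within a small Hamming distance. This is a classic textbook technique, not Apple’s NeuralHash, which uses a neural network to the same end; the file names in the usage comment are hypothetical.

```python
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to a size x size grayscale grid, then set one bit
    per pixel that is brighter than the grid's mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p > mean else "0" for p in pixels)
    return int(bits, 2)

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# A resized black-and-white copy should land within a few bits of
# the original, while an unrelated photo typically differs in dozens.
# print(hamming_distance(average_hash("photo.jpg"),
#                        average_hash("photo_small_bw.jpg")))
```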

“The reality is that privacy and child protection can co-exist,” John Clark, president and CEO of the National Center for Missing & Exploited Children, said in a statement. “We applaud Apple and look forward to working together to make this world a safer place for children.”

The announcement is part of a broader push around child safety from the company. Apple also said Thursday that a new communication tool will warn users under age 18 when they’re about to send or receive a message with an explicit image. The tool, which must be turned on through Family Sharing, uses on-device machine learning to analyze image attachments and determine whether a photo is sexually explicit. Parents of children under 13 can additionally turn on a notification feature for when a child is about to send or receive a nude image. Apple said it will not get access to the messages.
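A rough sketch of the decision flow Apple describes for that Messages feature follows, with an invented stub where the on-device classifier would sit; Apple has not published the model, its score cutoff, or these function names, so all of them are hypothetical.

```python
def explicit_image_score(image_bytes: bytes) -> float:
    """Placeholder for Apple's unpublished on-device ML model;
    always returns 0.0 in this sketch."""
    return 0.0

def should_warn(image_bytes: bytes, user_age: int,
                feature_enabled: bool) -> bool:
    """Warn a minor before an explicit image is sent or received.
    Everything runs on the device; Apple says it does not get
    access to the messages themselves."""
    if not feature_enabled or user_age >= 18:
        return False
    return explicit_image_score(image_bytes) > 0.5  # invented cutoff

def should_notify_parent(user_age: int, warned: bool,
                         parent_opted_in: bool) -> bool:
    """Parents of children under 13 can additionally be notified."""
    return warned and user_age < 13 and parent_opted_in
```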

That tool will be available as a future software update, according to the company.

The-CNN-Wire™ & © 2021 Cable News Network, Inc., a WarnerMedia Company. All rights reserved.
