
Apple abandons controversial plan to check iOS devices and iCloud photos for child abuse imagery

By Samantha Murphy Kelly, CNN Business

Apple is abandoning its plans to launch a controversial tool that would check iPhones, iPads and iCloud photos for child sexual abuse material (CSAM) following backlash from critics who decried the feature’s potential privacy implications.

Apple first announced the feature in 2021, with the goal of helping combat child exploitation and promoting safety, issues the tech community has increasingly embraced. But it soon put the brakes on implementing the feature amid a wave of criticism, noting it would “take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

In a public statement Wednesday, Apple said it had “decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos.”

“Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all,” the company said in a statement provided to Wired. (Apple did not respond to CNN’s request for comment.)

Instead, the company is refocusing its efforts on growing its Communication Safety feature, which it first made available in December 2021, after consulting experts for feedback on its child protection initiatives. The Communication Safety tool is an opt-in parental control feature that warns minors and their parents when incoming or sent image attachments in iMessage are sexually explicit and blurs those images.

Apple was criticized in 2021 for its plan to offer a different tool that would scan iOS devices and iCloud photos for child abuse imagery. At the time, the company said the tool would convert photos on iPhones and iPads into unreadable hashes, or unique strings of numbers, stored on user devices. Those hashes would be matched against a database of hashes provided by the National Center for Missing and Exploited Children (NCMEC) once the pictures were uploaded to Apple's iCloud storage service.
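For readers unfamiliar with hash matching, the sketch below illustrates the general idea in Python: compute a fingerprint of each photo and check it against a set of known fingerprints. It uses an ordinary cryptographic hash (SHA-256) rather than Apple's NeuralHash, which is a perceptual hash designed to match visually similar images even after resizing or recompression; the folder name and the contents of the known-hash set here are hypothetical placeholders, not anything from Apple's system.

    # Illustrative sketch only: matches file hashes against a known-hash set.
    # Uses SHA-256, NOT Apple's NeuralHash perceptual hash. The folder name
    # and the contents of "known_hashes" are hypothetical.
    import hashlib
    from pathlib import Path

    def sha256_of_file(path: Path) -> str:
        """Return the SHA-256 hex digest of a file's bytes."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # Hypothetical stand-in for a database of known hashes.
    known_hashes = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def check_photo(path: Path) -> bool:
        """True if the photo's hash appears in the known-hash set."""
        return sha256_of_file(path) in known_hashes

    if __name__ == "__main__":
        for photo in Path("photos").glob("*.jpg"):  # hypothetical upload folder
            if check_photo(photo):
                print(f"Match found for {photo.name}")

Because a cryptographic hash only matches byte-identical files, Apple's actual design relied on a perceptual hash so that minor edits to an image would not defeat the match; the sketch above is meant only to convey the matching step, not that design.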

Many child safety and security experts praised the attempt, recognizing the ethical responsibilities and obligations a company has over the products and services it creates. But many also called the effort "deeply concerning," largely because part of Apple's scanning process for child abuse images would run directly on user devices.

In a PDF published to its website outlining the technology, which it called NeuralHash, Apple attempted to address fears that governments could also force Apple to add non-child abuse images to the hash list. “Apple will refuse any such demands,” it stated. “We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future.”

Apple's decision to kill its plans for the tool came around the same time the company announced a handful of new security features.

Apple plans to expand end-to-end encryption of iCloud data to include backups, photos, notes, chat histories and other services, a move that could further protect user data but also add to tensions with law enforcement officials around the world. The tool, called Advanced Data Protection, will allow users to keep certain data more secure from hackers, governments and spies, even in the case of an Apple data breach, the company said.
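As a rough illustration of the end-to-end principle, the Python sketch below (using the third-party cryptography package) encrypts data with a key that never leaves the user's device, so the stored ciphertext is useless on its own. This is not Apple's Advanced Data Protection implementation, and every name in it is hypothetical.

    # Illustrative sketch only: end-to-end encryption in the abstract, not
    # Apple's Advanced Data Protection. Requires the "cryptography" package.
    from cryptography.fernet import Fernet

    # Key generated and kept on the user's device; never uploaded to the server.
    user_key = Fernet.generate_key()
    cipher = Fernet(user_key)

    backup_data = b"contacts, notes, chat history ..."  # hypothetical backup payload
    ciphertext = cipher.encrypt(backup_data)            # only this is stored in the cloud

    # Even if the stored ciphertext leaks, it is unreadable without user_key.
    restored = cipher.decrypt(ciphertext)
    assert restored == backup_data

The practical consequence is the one the article describes: because the provider holds only ciphertext, neither a breach of its servers nor a legal demand directed at the provider can yield readable data without the user's key.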

The-CNN-Wire™ & © 2022 Cable News Network, Inc., a Warner Bros. Discovery Company. All rights reserved.
