Apple Delays Launch of Feature to Scan iPhones for Child Abuse

(Bloomberg) — Apple Inc. is delaying the rollout of a system to scan iPhones for images of child sexual abuse after the plan drew fierce criticism from privacy advocates who warned it could open the door to other forms of tracking.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material,” Apple said in a statement. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Apple had planned a trio of new tools designed to help root out and stop the spread of child sexual abuse material. They included using the Siri digital assistant to report child abuse and access resources for fighting CSAM; a feature in Messages that would scan devices operated by children for incoming or outgoing explicit images; and a new capability for iCloud Photos that would analyze a user’s library for explicit images of children. If such pictures were found in a user’s library, Apple would be alerted, conduct a human review to verify the contents, and then report the user to law enforcement.

Privacy advocates such as the Electronic Frontier Foundation warned that the technology could be used to track things other than child pornography, opening the door to “broader abuses.” They weren’t assuaged by Apple’s plan to bring in an auditor and fine-tune the system, saying the approach itself would inevitably undermine the encryption that protects users’ privacy.



©2021 Bloomberg L.P.
