Apple Delays Plan To Scan Users' Photos For Child Sexual Abuse Images
By Bill Galluccio
September 3, 2021
Apple announced it will delay the launch of a controversial new feature aimed at protecting children from sexual exploitation. In August, Apple said it would begin scanning photos uploaded to its iCloud servers and comparing them against a database of known child sexual abuse images.
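To make the basic idea concrete, the sketch below shows the simplest form of matching files against a database of known image hashes. This is an illustration only, not Apple's system: Apple's announced design relied on a perceptual hash (NeuralHash) computed on-device and privacy-preserving matching, which is considerably more involved. The database contents and directory path here are hypothetical placeholders.

```python
# Illustrative sketch only: naive lookup of file hashes against a set of
# known hashes. Apple's proposal used a perceptual hash and cryptographic
# matching protocols rather than plain SHA-256 comparisons like these.
import hashlib
from pathlib import Path

KNOWN_HASHES = {
    # Placeholder value; a real database would contain hashes of known
    # abuse imagery supplied by child-safety organizations.
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def count_matches(photo_dir: Path) -> int:
    """Count how many .jpg files in photo_dir hash to a known entry."""
    return sum(
        1 for p in photo_dir.glob("*.jpg") if file_hash(p) in KNOWN_HASHES
    )
```

One reason real systems do not use exact cryptographic hashes like this is that any re-encoding or resizing of an image changes the digest entirely; perceptual hashes are designed to stay stable under such edits, which is also part of what worried critics about how broadly the matching could be repurposed.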
The new program drew sharp criticism from privacy advocates who were concerned it could be expanded and used by totalitarian regimes to target marginalized groups.
"Once this capability is built into Apple products, the company and its competitors will face enormous pressure—and potentially legal requirements—from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable," 90 policy groups wrote in an open letter to Apple. "Those images may be of human rights abuses, political protests, images companies have tagged as 'terrorist' or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them."
Apple said that it is taking those concerns seriously and will spend the next few months making changes and improvements to the feature.
"Previously, we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material," Apple said in a note on its website. "Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
Apple did not provide additional details about those changes, nor did it say when it expects the feature to be released.