Apple's
promise last month to check US customer phones and computers for child sex
abuse images sparked a global backlash from a wide range of rights groups, with
employees also criticising the plan internally.
Critics
argued the feature could be exploited by repressive governments looking to find
other material for censorship or arrests, and that it would be impossible for
outside researchers to verify whether Apple was checking only a small set of
on-device content.
Apple countered that it would allow security researchers to
verify its claims, but the company on Friday said it would take more time to
make changes to the system.
"Based on feedback from customers, advocacy groups,
researchers and others, we have decided to take additional time over the coming
months to collect input and make improvements before releasing these critically
important child safety features," the company said in a statement on
Friday.
Matthew Green, a cybersecurity researcher at Johns Hopkins
University who had criticised Apple's plan, said the decision to delay was
"promising."
Green said on Twitter that Apple should "be clear about
why you're scanning and what you're scanning. Going from scanning nothing (but
email attachments) to scanning everyone's private photo library was an enormous
delta. You need to justify escalations like this."
Apple had been playing defence on the plan for weeks, and
had already offered a series of explanations and documents to show that the
risks of false detections were low.
It had planned to roll out the feature for iPhones, iPads,
and Macs with software updates later this year in the United States.