Last month, Apple announced a handful of new child safety features that proved controversial, including CSAM detection for iCloud Photos. Now, Apple says it will “take additional time” to refine the features before launching them to the public.
In a statement to 9to5Mac, Apple said:
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”