Latest News

09-09 Inside Apple’s Impossible War On Child Exploitation WARNING: Could Trigger Emotions

Apple spent years trying to design a system that would stop the spread of child sexual abuse material on iPhones and iCloud. When the company scrapped it, key members of the team behind the project left and child protection investigators were frustrated. But Apple’s privacy-first approach gained plenty of fans in the process.

Joe Mollick had spent much of his life dreaming of becoming a rich and famous oncologist, a luminary of cancer treatment, the man who would cure the disease for good. It was a quixotic quest for the 60-year-old, one that left him defeated and hopelessly isolated. He turned to pornography to ease his feelings of loneliness. When those feelings became more severe, so did his taste for porn; he began seeking out child sexual abuse material (CSAM).

When the cops first caught Mollick uploading CSAM on a messaging application called Kik in 2019, they searched his electronics and discovered a stash of 2,000 illegal images and videos of children, plus some 800 files of what a search warrant reviewed by Forbes described as “child erotica.” Investigators found that the material had been uploaded and stored in his iCloud account, though it wasn’t Apple that notified the police. It was Kik that provided the tip leading to Mollick’s capture and two-year prison sentence. Kik had been alerted to the images by a Microsoft tool called PhotoDNA, which uses digital fingerprints to identify known CSAM.
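At a high level, tools like PhotoDNA work by computing a fingerprint of each uploaded image and checking it against a database of fingerprints of previously identified CSAM supplied by clearinghouses such as NCMEC. The sketch below is only an illustration of that matching flow under simplifying assumptions: the function names are hypothetical, and an exact SHA-256 hash stands in for PhotoDNA's proprietary perceptual hash, which, unlike an exact hash, still matches after resizing or re-encoding.

```python
import hashlib
from pathlib import Path

# Hypothetical set of fingerprints of previously identified illegal images,
# as distributed to platforms by clearinghouses such as NCMEC. Real systems
# like PhotoDNA use a proprietary perceptual hash; a plain SHA-256 is used
# here only as a stand-in to show the matching flow.
KNOWN_FINGERPRINTS: set[str] = {
    # "e3b0c442...",  # placeholder entries
}


def fingerprint(image_path: Path) -> str:
    """Compute a fingerprint for an uploaded file (exact hash as a stand-in)."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()


def should_report(image_path: Path) -> bool:
    """Return True if the upload matches a known fingerprint."""
    return fingerprint(image_path) in KNOWN_FINGERPRINTS
```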

That Apple didn’t flag the illegal material isn’t surprising: Other than its standard scan of outgoing email attachments, the company has long chosen not to screen unencrypted iCloud data for known CSAM. And though it developed a cryptographic system to do just that, Apple abandoned the effort at the end of 2022, a polarizing move that drew praise from privacy hawks and outrage from child safety advocates. The company framed the controversial decision as reflective of its commitment to privacy, a stance that has earned the world’s richest company plenty of plaudits.

Still, critics say Apple has failed to develop a sustained, successful response to child exploitation material on its services and has fallen far behind competitors in helping police catch the criminals who spread it. A Forbes review of some 100 federal cases in which investigators searched Apple technologies, believing they were used in furtherance of child exploitation, found that the company’s systems were used to store and transmit thousands of items of CSAM between 2014 and 2023. But unlike peers such as Google and Meta, which proactively scan their services for such material and provide millions of leads every year to the nonprofit National Center for Missing & Exploited Children (NCMEC) and to law enforcement, Apple reports just hundreds despite having hundreds of millions of iCloud users.

Read more here. 