Inside Apple’s Impossible War On Child Exploitation

By Forbes | Thomas Fox-Brewster and Alexandra S. Levine
Apple Store in Shanghai, China (Photo by Costfoto/NurPhoto via Getty Images)

Apple spent years trying to design a system that would stop the spread of child sexual abuse material on iPhones and iCloud. When the company scrapped it, key members of the team behind the project left and child protection investigators were frustrated. But Apple’s privacy-first approach gained plenty of fans in the process.

Joe Mollick had spent much of his life dreaming of becoming a rich and famous oncologist, a luminary of cancer treatment, the man who would cure the disease for good. It was a quixotic quest for the 60-year-old, one that left him defeated and hopelessly isolated. He turned to pornography to ease his feelings of loneliness. When those feelings became more severe, so did his taste for porn; he began seeking out child sexual abuse material (CSAM).

When the cops first caught Mollick uploading CSAM on a messaging application called Kik in 2019, they searched his electronics and discovered a stash of 2,000 illegal images and videos of children and some 800 files of what a search warrant reviewed by Forbes described as “child erotica.” Investigators found that material had been uploaded and stored in his iCloud account, though Apple hadn’t been the one to notify the police. It was Kik that provided the tip that led to Mollick’s capture and two-year prison sentence. The company was alerted to the images by a Microsoft tool called PhotoDNA, which uses a digital fingerprint to identify known CSAM.


That Apple didn’t flag the illegal material isn’t surprising: Other than its standard scan of outgoing email attachments, the company has long chosen not to screen unencrypted iCloud data for known CSAM. And while it developed a cryptographic system to do just that, Apple abandoned it at the end of 2022, a polarizing move that drew praise from privacy hawks and outrage from child safety advocates. The company framed the controversial decision as reflective of its commitment to privacy—a stance that has earned the world’s richest company plenty of plaudits.

Still, critics say Apple has failed to develop a sustained, successful response to child exploitation materials on its services and has fallen far behind competitors in helping police catch the criminals who proliferate it. A Forbes review of some 100 federal cases in which investigators searched Apple technologies, believing they were used in furtherance of child exploitation, found that the company’s systems were used to store and transmit thousands of items of CSAM between 2014 and 2023. But unlike peers like Google and Meta, which proactively scan their services for such material and provide millions of leads every year to the nonprofit National Center for Missing and Exploited Children (NCMEC) and law enforcement, Apple reports just hundreds, despite having hundreds of millions of iCloud users.

Inside Apple, the trust and safety function was disrupted in the 18 months between the announcement of the anti-CSAM tech and its cancellation. Two child safety executives, Melissa Marrus Polinsky, who spent 10 years as Apple’s director of investigations and child safety, and its trust and safety director Margaret Richardson, both left. Two other executives who were key stakeholders in the company’s anti-CSAM work departed as well: chief privacy officer Jane Horvath and iCloud lead Michael Abbott. (None responded to requests for comment. Apple declined.) Abhishek Bhowmick, described by two former Apple employees as the lead engineer on the CSAM detection software, left to start his own company. Reached for comment, he told Forbes his work on privacy tech at Apple was “one of the greatest honors of my career.”

“How can users be assured that a tool for one type of surveillance has not been reconfigured to surveil for other content such as political activity or religious persecution?”
Erik Neuenschwander, director of privacy and child safety, Apple


Those departures were significant. When Polinsky and Richardson moved on, they left “fairly large holes in the organization,” one former Apple employee told Forbes. Another said the group focused on child protection had become scattered, with roles reorganized and assigned to other teams. Much of the function has been moved to privacy engineering, according to one source in a position to know.

That department is now headed up by Erik Neuenschwander, a long-time Apple technical executive. Recently appointed director of privacy and child safety, he previously held roles focused largely on software engineering. Working alongside him are Chuck Gillingham, a former deputy district attorney with Santa Clara County, and Ruben van der Dussen, a privacy engineer previously of Thorn, the anti-CSAM nonprofit founded by Ashton Kutcher. Trust and safety staff ultimately report to Mike Thornberry, who has spent much of his 23 years at Apple overseeing sales and Apple Pay.


Apple has long maintained that privacy is a fundamental human right. The company famously refused to build for the FBI a backdoor into iOS that would circumvent several important security features and allow law enforcement to access personal data stored on iPhones. Apple argued that doing so would give the government and others the power to reach into anyone’s device to snoop around and collect sensitive data. Since then, the company has moved towards even broader encryption of its hardware and software, much to the chagrin of the FBI and other law enforcement agencies. Although iCloud encryption is turned on by default, Apple could scan the files of any users who ask the company to recover their data—such as when they sign in on a new device, restore files from a backup, or ask for help if they lose access to their account.

Child safety advocates say Apple’s unwavering commitment to privacy comes at a cost: Apple is essentially blind to harmful, illegal material stored on its services. As one former employee told Forbes, this stance “obscures some other very serious societal harms.” Technologists and law enforcement veterans say that competitors across the board, from Google, Meta and Microsoft to Dropbox and other personal cloud providers, are far more proactive in rooting out the sexual predators that abuse their platforms. Every year, those companies provide millions—sometimes tens of millions—more cybertips than Apple to NCMEC, which reviews and catalogs tips before handing them off to child abuse investigators.


“They’ve never really been interested in contributing to the protection of children on their platform,” Jim Cole, who spent two decades at the DHS Homeland Security Investigations unit, told Forbes. “They’ve taken the privacy road pretty hard and the law doesn’t require them to look for CSAM.”

“Apple doesn’t do any proactive scanning of their technologies or their storages, or any transmissions to assist law enforcement to stop child exploitation, zero.”
Jon Rouse, former child abuse investigator in Australia

Apple declined to comment on a detailed list of questions, instead providing Forbes with a letter from Neuenschwander, Apple’s director of user privacy and child safety, that addressed recent demands from the nonprofit Heat Initiative for Apple to reinstate plans to scan its systems for CSAM. Neuenschwander said Apple had abandoned the project because “it was not practically possible to implement without ultimately imperiling the security and privacy of our users” but emphasized that the company offers other safety features to combat child exploitation.

Mass-scanning of Apple users would “[open] the door for bulk surveillance,” Neuenschwander warned in the letter, first reported by Wired. “How can users be assured that a tool for one type of surveillance has not been reconfigured to surveil for other content such as political activity or religious persecution? Tools of mass surveillance have widespread negative implications for freedom of speech and, by extension, democracy as a whole.” He added that flaws with this type of scanning could spell trouble for parents “when they have done nothing more than share perfectly normal and appropriate pictures of their babies.”


The iCloud Conundrum

Since December 2022, Apple has provided two levels of encryption for iCloud. “Standard data protection” encrypts data on the server and in transit, but Apple retains the keys needed to decrypt it. “Advanced data protection” does essentially the same thing, except that Apple holds no keys and cannot recover the encrypted data. Because users must explicitly opt in to the latter, Apple likely retains the keys to a considerable amount of iCloud data.
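To illustrate the distinction, here is a minimal sketch assuming a generic cloud provider rather than Apple’s actual key-management design: in both tiers the file is encrypted, and what differs is whether the provider escrows a copy of the key.

```python
# Minimal sketch (generic provider, not Apple's key-management design) of the
# key-custody difference between the two iCloud protection tiers: files are
# encrypted either way; what differs is who can decrypt them.
from cryptography.fernet import Fernet  # pip install cryptography


def store_file(data: bytes, escrow_key_with_provider: bool):
    key = Fernet.generate_key()               # content key held by the user
    ciphertext = Fernet(key).encrypt(data)
    provider_key = key if escrow_key_with_provider else None
    return ciphertext, key, provider_key


# "Standard data protection": the provider escrows a key copy, so it can
# decrypt, e.g. to help with account recovery or to answer a search warrant.
blob, user_key, provider_key = store_file(b"photo bytes", escrow_key_with_provider=True)
assert Fernet(provider_key).decrypt(blob) == b"photo bytes"

# "Advanced data protection": no escrowed key, so only the user-held key can
# ever decrypt the data; the provider cannot recover it.
blob2, user_key2, provider_key2 = store_file(b"photo bytes", escrow_key_with_provider=False)
assert provider_key2 is None and Fernet(user_key2).decrypt(blob2) == b"photo bytes"
```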

It also handles information not just from its own apps, but from third-party email, social and messaging platforms where users have device backups turned on—something that law enforcement often notes when filing warrant applications to search iCloud accounts. That includes data from Meta’s WhatsApp and other privacy-focused messengers like Threema, as well as apps like Kik that are often used by predators.

Apple presumably could scan iCloud with PhotoDNA, one of the most widely adopted tools in fighting child exploitation material online, said Jon Rouse, a recently retired, 25-year veteran of child abuse investigations in Australia and former chair of the Interpol Covert Internet Investigations Working Group. Created in 2009 by Microsoft and Dartmouth College, the software converts images into unique numerical values called hashes. Those can be compared against a database of known CSAM hashes. A match initiates further investigation.
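As a rough illustration of that workflow, here is a minimal sketch. PhotoDNA itself is proprietary, so a simple “average hash” stands in for it and the hash database is hypothetical; the point is only to show the reduce-to-fingerprint-and-compare idea.

```python
# Minimal sketch of a hash-and-match pipeline (not PhotoDNA itself, which is
# proprietary): reduce an image to a compact fingerprint, then compare it
# against fingerprints of previously identified illegal images.
from PIL import Image  # pip install Pillow


def average_hash(path: str, size: int = 8) -> int:
    """Downscale to a size x size grayscale grid and set one bit per pixel
    brighter than the mean; visually similar images yield similar bits."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    return bin(a ^ b).count("1")


# Hypothetical database of fingerprints of known, previously identified material.
KNOWN_HASHES = {0x8F3C00FF1A2B3C4D}


def matches_known(path: str, max_distance: int = 10) -> bool:
    """A near-match would trigger further (human) review, as the article notes."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= max_distance for k in KNOWN_HASHES)
```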

PhotoDNA is so widely used — Facebook and Google have been using it for over a decade, Snapchat since at least 2020 — it is effectively an industry standard. But it’s one Apple hasn’t embraced. The company doesn’t do “any proactive scanning of their technologies or their storages, or any transmissions to assist law enforcement to stop child exploitation, zero,” said Rouse. “They do not contribute.”


Like all tech companies, Apple is bound by law to respond to search warrants, and it must also report child sexual abuse material it finds on its platforms to federal authorities through the “CyberTipline” run by NCMEC.

Last year, while Meta’s Facebook and Instagram submitted a combined 26 million reports, Google 2 million, and TikTok nearly 300,000, Apple submitted 234. The year before that: 160.

Apple isn’t a social media company, so this is hardly a direct comparison. That said, WhatsApp, which is end-to-end encrypted, scans unencrypted content such as profile and group photos for CSAM. It provided over 1 million cybertips to NCMEC in 2022; Microsoft sent in 110,000; Dropbox, nearly 46,000; and Synchronoss, the cloud storage provider for Verizon, over 30,000.

Apple’s original plans to carry out scans on iCloud-bound photos without invading user privacy were like “proposing magic in place of reality.”
Meredith Whittaker, president of Signal


Apple declined comment on its paucity of NCMEC reports relative to its peers. While it’s likely that the discrepancy is due in some part to duplicates (a single offending image shared in multiple posts), critics say it’s more a reflection of Apple doing no more than it is legally required to do to police the illegal material.

NCMEC spokesperson Gavin Portnoy said the disparity had “nothing to do with potential duplication” and was the result of Apple’s unwillingness to “make reasonable efforts to detect and report CSAM on their platform.” Even the fringe site 4chan, which is hardly a pinnacle of moderation, reports multiples of what Apple does each year.

And court documents do show Apple’s services hosting CSAM. Mollick’s case was one of at least 100 federal investigations from between 2014 and 2023 in which the suspect was alleged or believed to have used Apple products to store sexually exploitative images of children. (That count, which does not include state, local or international records, represents only a fraction of the total child exploitation cases involving Apple.)

Two examples: In 2022, 42-year-old Texan Michael Galvan was sentenced to more than 17 years in prison for distributing CSAM after nearly 600 illegal videos and images were found on his iCloud account. In May this year, former Orange County elementary school teacher Richard O’Connor pleaded guilty to possession of CSAM after a search of his iCloud account identified at least 81 videos and 158 images of child abuse, including “violent, sadistic, or masochistic conduct.” In both cases, Apple provided data in response to court-approved search warrants. In the Galvan case, however, the original tip that launched the investigation again came from Kik. In only two of the cases reviewed by Forbes did Apple proactively detect and report CSAM it caught in outgoing email messages.

Apple declined comment on these cases and the differences between scanning outgoing emails for sexually exploitative imagery of children and scanning iCloud. But it is well aware of CSAM issues on its services. In text messages unearthed during discovery in Epic Games v. Apple, Apple’s anti-fraud chief Eric Friedman told his colleague in 2020 that Facebook and other tech companies had focused on trust and safety but “sucked” at privacy, adding that Apple’s privacy-is-a-fundamental-human-right approach meant it had become “the greatest platform for distributing child porn.”


An Inflection Point

In an attempt to detect illegal material hosted by iCloud, in the late 2010s Apple began work on a tool that would scour its servers for CSAM while maintaining customer privacy. Announced in August 2021 and dubbed NeuralHash, the software, Apple said, could detect previously identified CSAM on-device, before it was uploaded and without looking at or capturing any information about non-CSAM photos. The child safety community applauded the initiative. But an even louder chorus of voices attacked it, decrying NeuralHash as a grave threat to privacy.
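The sketch below is a heavy simplification of the flow Apple described publicly, with assumed names and an illustrative threshold; the real design used a neural perceptual hash plus cryptographic machinery (encrypted “safety vouchers” and private set intersection) so that individual match results stayed hidden until a threshold was crossed.

```python
# Simplified sketch of the announced on-device flow (assumed names, not Apple's
# code): hash each photo before upload, compare against hashes of previously
# identified CSAM, and flag an account for human review only after a threshold
# of matches. The real design hid individual match results cryptographically.
from dataclasses import dataclass, field
from typing import Callable, List, Set

REVIEW_THRESHOLD = 30  # illustrative; Apple's published design described ~30 matches


@dataclass
class OnDeviceMatcher:
    perceptual_hash: Callable[[bytes], int]  # stand-in for the NeuralHash model
    known_hashes: Set[int]                   # hashes of previously identified CSAM
    vouchers: List[int] = field(default_factory=list)

    def check_before_upload(self, photo_bytes: bytes) -> None:
        h = self.perceptual_hash(photo_bytes)
        if h in self.known_hashes:
            # In Apple's design the match was sealed in an encrypted "safety
            # voucher" and unreadable until the threshold was crossed.
            self.vouchers.append(h)

    def should_flag_for_review(self) -> bool:
        return len(self.vouchers) >= REVIEW_THRESHOLD
```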

Leading that charge was the Electronic Frontier Foundation, which described Apple’s tool as “a backdoor to increased surveillance and censorship around the world.” Joe Mullin, the lead activist on the nonprofit’s campaign against Apple, argued that the proposed techniques to find abusive images would’ve led to pervasive surveillance of Apple users. “They’re trying to locate specific images that are criminal—there’s no question about that—but the techniques they were proposing to use were like… let’s make the whole world a haystack and have an algorithm look through it,” Mullin told Forbes. He also claimed that scanning tools are often inaccurate and unauditable.

Meredith Whittaker, president of Signal, has been heavily critical of surveillance capitalism and Big Tech, but praised Apple for cancelling its CSAM scanning project. (Photo by Horacio Villalobos/Corbis via Getty Images)

Meredith Whittaker, president of Signal, told Forbes that Apple’s original plans to carry out scans on iCloud-bound photos without invading user privacy were like “proposing magic in place of reality.” Apple said as much in the recent letter from Neuenschwander, its director of user privacy and child safety, responding to fresh demands to scan iCloud photos for CSAM. It was the strongest statement Whittaker had seen from the tech giant on the issue in 20 years of debate. “It was unequivocal that it will never be possible,” Whittaker said. She pointed to previous research that found vulnerabilities in NeuralHash, claiming the technology could potentially leak information about non-CSAM imagery or be exploited to frame someone.

“Apple’s statement is the death knell for the idea that it’s possible to scan everyone’s comms AND preserve privacy,” she wrote on Twitter. “Apple has many of the best cryptographers + software eng on earth + infinite $. If they can’t, no one can. (They can’t. No one can.)”

“Every single technology company should be balancing privacy and safety… When you don’t do one, and you only do the other, you can have abuse of your technology.”
Julie Cordua, president of Thorn

The public backlash to NeuralHash was so bad that Apple paused its fall 2021 deployment and then killed it in December 2022. The move left some who had worked on the software dismayed and frustrated. The tool was “amazing,” said one former employee. “It was revolutionary. And it took a tremendous amount of work.” Said another, “They didn’t do a great job of describing it. They didn’t because Apple doesn’t bring people along. They like to surprise and delight but they don’t understand things like this actually require you to bring people along at every step.”

Within a year of the announcement, directors Polinsky and Richardson both left, the former going to work for the Department of Justice, the latter jumping to GoFundMe. Bhowmick, the lead engineer on CSAM detection, went on to found his own security startup, Samooha. Chief privacy officer Horvath moved on to law firm Gibson, Dunn & Crutcher at the start of this year, while cloud lead Abbott departed in April of this year for GM.

Also angered by Apple’s reversal were the advocates and law enforcement officials who viewed the rollout of NeuralHash and the company’s commitment to it as a massive step forward in the battle against child exploitation. Thorn, the Kutcher-backed anti-CSAM group, is among those disappointed that Apple dropped its plans. “Every technology can be used for good and it can be used for evil—which is exactly why every single technology company should be balancing privacy and safety,” Thorn president Julie Cordua told Forbes. “When you don’t do one, and you only do the other, you can have abuse of your technology.”

It also doesn’t help that Apple’s abandonment of NeuralHash seems to have exacerbated the already-fraught relationship between the company and child exploitation investigators. Rouse said Apple had become increasingly “withdrawn.” “They do not communicate… they have no dialogue with us,” he said. “And that’s really disappointing when they’re one of the biggest tech providers in the world.”

But for privacy advocates, Apple’s decision was the only one it could have made, and NeuralHash an idea it should have abandoned sooner. Even Epic Games founder and CEO Tim Sweeney, who has publicly lambasted Apple as “a menace to freedom worldwide” among other things, praised the company for standing its ground in the face of recent criticism. (He previously called Apple’s now-defunct plans for CSAM detection “government spyware installed by Apple based on a presumption of guilt” and “off the charts insane.”)

“A sound move by Apple to resist pressure campaigns to surveil users’ private documents and communications on behalf of governments,” Sweeney wrote on Twitter last week. “The scanning proposal was always an abhorrent violation of customer rights.”

UPDATE: This story has been updated to note that most iCloud content, including photos, is encrypted on the server and in transit with Standard Data Protection, though Apple maintains access to those encryption keys. An earlier version said that iCloud content is unencrypted unless someone opts in to Apple’s Advanced Data Protection feature.