Apple postpones launch of controversial CSAM-detection software

Tom Baker 


     In the wake of intense backlash over privacy concerns, Apple has decided to postpone the release of its controversial Child Sexual Abuse Material (CSAM) detection software. The concerns centered on the possibility of the system flagging non-CSAM images and the potential for abuse by government agencies. 

     Apple worked with the National Center for Missing and Exploited Children to create a program that would detect the presence of known child pornographic images uploaded to iCloud. 

     Terrance Boult, professor of innovation and security at UCCS, said, “Neither party can find out what the other party has, and this is actually used in a bunch of things. It’s a decent privacy technology.” 

     He continued, “There are really two kinds of exploited children imagery; the original photo, and then the ones that are shared. They can only detect the ones that are shared because until they are shared, the center never knows about them. So, in that sense of privacy, students shouldn’t have to worry.” 

Terrance Boult, photo courtesy of the UCCS website.

     According to Apple’s CSAM detection technical summary, the software would convert images uploaded to iCloud into a “hash,” which is a long set of numbers that acts as a virtual fingerprint based on the contents of the image. 
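     To picture the hashing step, the sketch below reduces an image file to a fixed-length fingerprint. It is only a rough illustration: the function name is hypothetical, and it uses an ordinary byte-level cryptographic hash rather than Apple's perceptual "NeuralHash," which is designed so that visually identical images produce matching values even after resizing or recompression.

```python
import hashlib

def image_fingerprint(path: str) -> str:
    """Reduce an image file to a fixed-length hexadecimal fingerprint.

    Illustration only: this uses a byte-level cryptographic hash.
    Apple's described system uses a perceptual hash ("NeuralHash") so that
    visually identical images map to the same value even after resizing or
    recompression, which a byte-level hash like this cannot guarantee.
    """
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()
```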

     Apple would compare that hash to a database of known CSAM image hashes provided by the National Center for Missing and Exploited Children. If the number of matching images meets a set threshold, Apple would notify the center, which would conduct a manual review of the images. If the center confirms the images are CSAM, Apple would lock the user’s account and notify authorities. 
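     The comparison step can be thought of as a membership count against the known-hash database, with review triggered only once a threshold of matches is crossed. The sketch below is a simplified stand-in, not Apple's implementation, which performs the matching with privacy-preserving cryptography on the device; the function names and the threshold value here are hypothetical.

```python
# Hypothetical illustration of threshold-based matching; not Apple's code.

def count_matches(upload_hashes: list[str], known_hashes: set[str]) -> int:
    """Count how many uploaded-image fingerprints appear in the known database."""
    return sum(1 for h in upload_hashes if h in known_hashes)

def should_flag_for_review(upload_hashes: list[str],
                           known_hashes: set[str],
                           threshold: int) -> bool:
    """Trigger human review only after the match count crosses the threshold;
    below the threshold, no individual match is reported."""
    return count_matches(upload_hashes, known_hashes) >= threshold

# Example: three known fingerprints, with two matches required to flag.
known = {"a1b2", "c3d4", "e5f6"}
uploads = ["a1b2", "ffff", "c3d4"]
print(should_flag_for_review(uploads, known, threshold=2))  # True
```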

     Boult said, “What they’re really saying is this set of hash patterns showed up on this image, and therefore we’re going to flag it. And then it goes to a human processor to answer, ‘Is it really an exploitive image or not?’” 

     Boult continued, “The people reviewing it don’t know where it came from. They don’t know anything about the phone or the person, there’s no way to go back. It’s a process that has effectively redacted the information, so they just get an image and say, ‘Is it or is it not [exploitative material]?’” 

     Regarding the societal benefit despite privacy concerns, Boult said, “The trade-off there in my view is actually good for society in terms of the overall privacy-security trade-off.” 

     He continued, “End users have very little to fear. They can’t tell if you’re naked in your photos; that’s not the function.” 

     Joshua Dunn, chair of the Department of Political Science at UCCS, has concerns about possible abuse of the program. 

     “One of the risks that people see is that governments could say, ‘Give us this program,’ and then use it for other purposes,” he said. “That kind of program could then be harnessed by authoritarian regimes, not going after CSAM but after dissidents, as a way of identifying who they are and what they’re doing.” 

Joshua Dunn, photo courtesy of the UCCS website.

     Regarding the legality of Apple’s intentions, Dunn said, “On the legal side, I don’t think there’s any constitutional issue. Since Apple is a private company, they can set the conditions of using their products and their software.” 

     Apple released an update on Sept. 3, saying, “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.” 

     In an interview with the Wall Street Journal, Apple software chief Craig Federighi explained that the public relations backlash was due to poor communication about the technology. 

     In an op-ed for the Washington Post, Jonathan Mayer, an assistant professor of computer science and public affairs at Princeton University, described a similar system he had built and its inherent weaknesses. 

     “Our system could be easily repurposed for surveillance and censorship,” Mayer said. “The design wasn’t restricted to a specific category of content; a service could simply swap in any content-matching database, and the person using that service would be none the wiser.” 

     Mayer continued, “We were so disturbed that we took a step we hadn’t seen before in computer science literature: We warned against our own system design, urging further research on how to mitigate the serious downsides.” 

     Apple had planned to introduce the detection technology with the upcoming iOS 15 update, scheduled for Fall 2021 around the release of the newest iPhone.  

Image courtesy of Unsplash.com