Recently, Apple revealed plans to tackle the issue of child abuse on its operating systems in the United States via updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. Apple intends to scan photos stored on users' iPhones and in iCloud accounts, looking for known child abuse imagery. The new system could help law enforcement in criminal investigations, but it may also open the door to increased legal and government demands for user data.
The system, called neuralMatch, will proactively alert a team of human reviewers if it believes illegal imagery has been detected; the reviewers would then contact law enforcement if the material can be verified. neuralMatch, which was trained using 200,000 images from the National Center for Missing & Exploited Children, will roll out first in the US. Photos will be hashed and compared against a database of known images of child sexual abuse.
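To make the "hashed and compared" step concrete, here is a minimal sketch of hash-set matching in Swift. It is an illustration only: the database name and function are hypothetical, and the exact SHA-256 match below stands in for Apple's reported perceptual hashing, which is designed so that resized or re-encoded copies of a known image still match.

```swift
import Foundation
import CryptoKit

// Hypothetical placeholder for the NCMEC-derived hash database; in the real system
// the hash list ships in a form that users cannot inspect.
let knownAbuseHashes: Set<String> = []

// Simplified stand-in: exact SHA-256 matching. Apple's system reportedly uses a
// perceptual hash so that altered copies of a known image still match; a
// cryptographic hash like this one would not.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownAbuseHashes.contains(hex)
}
```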
“According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a ‘safety voucher,’ saying whether it is suspect or not,” the Financial Times said. “Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
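The Financial Times description boils down to a threshold rule: flag photos individually, escalate only once enough are flagged. The sketch below illustrates that flow under stated assumptions; the type, property names, and the threshold value are invented for illustration, and Apple's actual design reportedly relies on threshold cryptography, so the vouchers cannot even be decrypted until the match count is exceeded, rather than on a simple counter like this one.

```swift
// Illustrative model of a "safety voucher" attached to each uploaded photo.
struct SafetyVoucher {
    let photoID: String
    let isSuspect: Bool
}

// Assumed threshold for illustration; Apple has not published the actual number.
let reviewThreshold = 30

// Escalate to human review only once enough photos have been marked suspect.
func shouldEscalateForHumanReview(_ vouchers: [SafetyVoucher]) -> Bool {
    let suspectCount = vouchers.filter { $0.isSuspect }.count
    return suspectCount >= reviewThreshold
}
```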
Johns Hopkins University professor and cryptographer Matthew Green raised concerns about the system on Twitter Wednesday night. “This sort of tool can be a boon for finding child pornography in people’s phones,” Green said. “But imagine what it could do in the hands of an authoritarian government?” The system could also be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography, fooling Apple’s algorithm and alerting law enforcement. “Researchers have been able to do this pretty easily,” he said of the ability to trick such systems.
Other abuses could include government surveillance of dissidents or protesters. “What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for,'” Green asked. “Does Apple say no? I hope they say no, but their technology won’t say no.”
Even if you believe Apple won’t allow these tools to be misused, there’s still a lot to be concerned about. These systems rely on a database of ‘problematic media hashes’ that we, as consumers, can’t review. Exactly what makes Apple the authority on this subject?
Apple already checks iCloud files against known child abuse imagery, as does every other major cloud provider. But the system described here would go further, extending centralized scanning to photos stored locally on users' devices. It would also be very easy to extend the system to crimes other than child abuse, a particular concern given Apple's extensive business in China. Here in the US, law enforcement is required to get a search warrant in almost every case; the neuralMatch system seems to circumvent that legal process.
The company informed some US academics about the plans last week, and Apple may share more about the system “as soon as this week,” according to two security researchers who were briefed on that earlier meeting.
Apple has previously touted the privacy protections built into its devices and famously stood up to the FBI when the agency wanted Apple to build a backdoor into iOS to access an iPhone used by one of the shooters in the 2015 attack in San Bernardino.
Additional information is available here:
The Verge: https://www.theverge.com/2021/8/5/22611305/apple-scan-photos-iphones-icloud-child-abuse-imagery-neuralmatch
ZDNet: https://www.zdnet.com/article/apple-child-abuse-material-scanning-in-ios-15-draws-fire/
NPR: https://www.npr.org/2021/08/06/1025402725/apple-iphone-for-child-sexual-abuse-privacy
Apple: https://www.apple.com/child-safety/