NeuralHash


NeuralHash is a perceptual hashing system developed by Apple Inc., which announced in August 2021 that it would use the hash in its Child Sexual Abuse Material (CSAM) detection system. A technical summary document, which explains the system with copious diagrams and example photographs, states that "Instead of scanning images [on corporate] iCloud [servers], the system performs on-device matching using a database of known CSAM image hashes provided by [the National Center for Missing and Exploited Children] (NCMEC) and other child-safety organizations. Apple further transforms this database into an unreadable set of hashes, which is securely stored on users’ devices."[1]
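
In outline, the matching step amounts to computing a perceptual hash of each photo and testing it for membership in the database of known hashes. The sketch below is a deliberate simplification for illustration only: Apple's actual pipeline blinds the database and uses private set intersection so the device never learns the outcome of any single comparison, and the hash value shown is a placeholder.

```python
# Conceptual sketch of the on-device matching step (not Apple's code).
# The real system blinds the hash database and uses private set
# intersection; a plain set lookup is shown purely for illustration.

# Placeholder database: 96-bit NeuralHash values as 24 hex digits.
known_hashes = {"c78fc26ec100f8508d4e60a3"}  # illustrative value only

def is_flagged(neural_hash: str) -> bool:
    # A perceptual hash maps visually similar images to the same value,
    # so matching reduces to an exact membership test on the hash.
    return neural_hash.lower() in known_hashes

print(is_flagged("C78FC26EC100F8508D4E60A3"))  # True
```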

History

The technical summary document was released in early August 2021, to the shock of the information security community.[2] Greg Nojeim of the Center for Democracy and Technology said that "Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship."[3]

NeuralHash was to roll out in iOS 15,[4][5][6][7] macOS Monterey,[4][6][7] and on Apple Watch.[7]

In the same month as its introduction, a researcher using the username "Asuhariet Ygvar" posted code on GitHub for a "reconstructed Python version of NeuralHash" named AppleNeuralHash2ONNX (ANH2O), which Ygvar "claimed to have reverse-engineered from previous versions of iOS." Ygvar credited "nhcalc", a utility for computing the NeuralHash of a given image, contributed by the username "Khaos Tian", who works in Menlo Park.[8]
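
According to the repository, computing the hash reduces to a short pipeline: resize the image to 360x360 RGB, scale pixel values to [-1, 1], run the extracted convolutional network to obtain a 128-dimensional embedding, project it through a 96x128 seed matrix, and read off the sign bits as a 96-bit hash rendered as 24 hex digits. A minimal sketch along the lines of the repository's nnhash.py script, assuming model.onnx and neuralhash_128x96_seed1.dat have been extracted from an iOS or macOS installation as its instructions describe:

```python
# Sketch of NeuralHash computation via the reverse-engineered ONNX model,
# assuming the model and seed files were extracted per the
# AppleNeuralHash2ONNX instructions. Not Apple's implementation.
import numpy as np
import onnxruntime
from PIL import Image

def neuralhash(image_path, model_path="model.onnx",
               seed_path="neuralhash_128x96_seed1.dat"):
    # The seed file carries a 128-byte header followed by a 96x128
    # float32 projection matrix.
    seed = np.frombuffer(open(seed_path, "rb").read()[128:],
                         dtype=np.float32).reshape([96, 128])

    # Preprocess: 360x360 RGB, pixels scaled to [-1, 1], NCHW layout.
    img = Image.open(image_path).convert("RGB").resize((360, 360))
    arr = np.asarray(img, dtype=np.float32) / 255.0 * 2.0 - 1.0
    arr = arr.transpose(2, 0, 1).reshape([1, 3, 360, 360])

    # Run the embedding network and project its 128-dim output to 96 dims.
    session = onnxruntime.InferenceSession(model_path)
    out = session.run(None, {session.get_inputs()[0].name: arr})[0]

    # The sign bit of each projected component yields the 96-bit hash,
    # conventionally rendered as 24 hexadecimal digits.
    bits = "".join("1" if v >= 0 else "0" for v in seed.dot(out.flatten()))
    return "{:0{}x}".format(int(bits, 2), len(bits) // 4)
```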

Later the same week, Cory Cornelius, a researcher at Intel Labs, published a working collision against ANH2O: two distinct images that produce the same hash. Perceptual hashes are, by design, far more collision-prone than cryptographic hashes.[8] Matthew Green of Johns Hopkins University warned that "the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography. Researchers have been able to do this pretty easily."[3] Green said he believes Apple has "sent a very clear signal" that in its view, "it is safe to build systems that scan users' phones for prohibited content."[9]

Ross Anderson, a professor at Cambridge University, said "It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of our phones and laptops."[3]

Edward Snowden said: "No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs--without asking."[10]

On 13 August 2021 a Wall Street Journal reporter interviewed Apple software chief Craig Federighi for more than ten minutes, giving him an opportunity to defend the technology. Federighi was at pains to point out that companies such as Google, Facebook, and Microsoft use similar technology to scan their users' files on their own servers; what Apple sought to do was to perform the matching on its users' devices.[11]

In response, the Electronic Frontier Foundation (EFF) launched a petition calling on Apple to abandon the system, under the headline “Tell Apple: Don’t Scan Our Phones.”[8]

In an essay entitled "The Problem With Perceptual Hashes", software engineer Oliver Kuederle presents a startling collision generated by a commercial neural-net hashing product of the same general kind as NeuralHash. A photographic portrait of a real woman (Adobe Stock #221271979) reduces under the test algorithm to the same hash as a photograph of a piece of abstract art (from the Depositphotos database); both sample images are in commercial databases. Kuederle is concerned with collisions like this: "These cases will be manually reviewed. That is, according to Apple, an Apple employee will then look at your (flagged) pictures... Perceptual hashes are messy. When such algorithms are used to detect criminal activities, especially at Apple scale, many innocent people can potentially face serious problems... Needless to say, I’m quite worried about this."[12]
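
Kuederle's concern is not specific to any one vendor: perceptual hashes deliberately discard almost all image detail so that near-duplicates map to the same value, and that information loss is exactly what makes collisions between unrelated images possible. A minimal "average hash" sketch, a far cruder algorithm than NeuralHash or the commercial system Kuederle tested, shown only to make the information loss concrete:

```python
# Minimal 64-bit "average hash" (aHash), a crude perceptual hash used
# here only to illustrate how much detail such hashes throw away.
from PIL import Image

def average_hash(image_path: str) -> int:
    # Collapse the image to an 8x8 grayscale thumbnail: the entire
    # picture becomes 64 brightness samples.
    pixels = list(Image.open(image_path).convert("L")
                  .resize((8, 8), Image.LANCZOS).getdata())
    avg = sum(pixels) / len(pixels)
    # One bit per sample: brighter than the mean or not. Any two images
    # whose coarse light/dark layout agrees will collide exactly.
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > avg)
    return bits

# Usage: compare two unrelated images; a distance of 0 is a collision.
# h1 = average_hash("portrait.jpg")
# h2 = average_hash("abstract_art.jpg")
# print(bin(h1 ^ h2).count("1"))  # Hamming distance in bits
```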

One wag generated a pair of colliding cartoon images of a courtroom scene, each with their shared NeuralHash, c78fc26ec100f8508d4e60a3, written into the image itself: in the first, the defense attorney declares "INNOCENT!"; in the second, the prosecutor says "GUILTY!"[13]

Maurice Turner, a cybersecurity fellow at the German Marshall Fund of the United States, wrote that "It would be trivial for the German government to request that Apple configure the CSAM detection system to flag hate symbols such as swastikas in order to prevent any flagged images or videos from being exported from the device. Every country could develop specific laws related to illicit materials that are impossible to enforce without technology companies like Apple building on-device detection systems."[14]

On 3 September 2021 Apple announced it had shelved its plans to implement NeuralHash amid concerns that the system could be abused by authoritarian states.[15][5] The EFF petition had taken only two weeks to reach 25,000 signatures.[15]

On 15 October 2021 Forbes reported that 15 experts had written a joint paper entitled "Bugs in our Pockets: The Risks of Client-Side Scanning", rebutting the rosy picture painted by Apple. The authors included prominent cryptographers such as Ronald Rivest and Bruce Schneier.[16]

The Rivest et al. paper was covered the same day by The Guardian, which observed that when asked for comment, Apple referred it to a press release stating: "Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."[17]

On 16 December 2021 a Swiss researcher noticed that Apple had removed most of the material about CSAM detection from its website.[18]

References

  1. "CSAM Detection - Technical Summary". Apple Inc. August 2021. https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf. 
  2. Claburn, Thomas (18 August 2021). "Apple didn't engage with the infosec world on CSAM scanning – so get used to a slow drip feed of revelations". The Register. Situation Publishing. https://www.theregister.com/2021/08/18/apples_csam_hashing/. 
  3. Ciacci, Chris (5 August 2021). "Fury at Apple's plan to scan iPhones for child abuse images and report 'flagged' owners to the police after a company employee has looked at their photos". Daily Mail. Associated Newspapers Ltd. https://www.daily-mail.co.uk/sciencetech/article-9866193/Apple-scan-U-S-phones-images-child-abuse.html. 
  4. Mukherji, Sushmit (23 August 2021). "Apple's new NeuralHash technology will scan iCloud photos for child abuse content". GizMeek. TheBrinkwire. https://gizmeek.com/apple-new-neuralhash-technology-will-scan-icloud-photos-for-child-abuse-content. 
  5. Scamell, Robert (3 September 2021). "Apple halts user content scan plan following privacy outrage". Verdict. https://www.verdict.co.uk/apple-scanning-tool/. 
  6. Paul, Andrew (6 August 2021). "Apple's impending photo hashing update sounds like a privacy horror story". Input Magazine. https://www.inputmag.com/tech/snowden-others-respond-to-yesterdays-bombshell-apple-privacy-announcement. 
  7. Ortutay, Barbara; Bajak, Frank (5 August 2021). "Apple to scan U.S. phones for images of child abuse". Associated Press. CP24, a unit of Bell Media. https://www.cp24.com/lifestyle/apple-to-scan-u-s-phones-for-images-of-child-abuse-1.5535981. 
  8. Brandom, Russell (18 August 2021). "Apple says collision in child-abuse hashing system is not a concern". The Verge. Vox Media, LLC. https://www.theverge.com/2021/8/18/22630439/apple-csam-neuralhash-collision-vulnerability-flaw-cryptography. 
  9. "Apple to combat child sex abuse images on iPhones in US". Deutsche Welle. 5 August 2021. https://www.dw.com/en/apple-to-combat-child-sex-abuse-images-on-iphones-in-us/a-58775442. 
  10. Parrott, Jeff (6 August 2021). "Apple says it will scan iPhones for images of child abuse, but is that a breach of privacy?". Deseret News Publishing Company. https://www.deseret.com/u-s-world/2021/8/6/22613067/apple-scan-iphones-image-child-abuse-privacy. 
  11. "Apple’s Software Chief Explains ‘Misunderstood’ iPhone Child-Protection Features". YouTube. Wall Street Journal. 13 August 2021. https://www.youtube.com/watch?v=OQUO1DSwYN0. 
  12. Kuederle, Oliver (n.d.). "The Problem With Perceptual Hashes". rentafounder.com. https://rentafounder.com/the-problem-with-perceptual-hashes/. 
  13. Pluimers, Jeroen Wiert (24 August 2021). "Some links related to Apple’s NeuralHash algorithm, as it was reverse engineered and collisions can be generated so abuse with pictures matching sensitive hashes can be performed". The Wiert Corner. https://wiert.me/2021/08/24/some-links-related-to-apples-neuralhash-algorithm-as-it-was-reverse-engineered-and-collisions-can-be-generated-so-abuse-with-pictures-matching-sensitive-hashes-can-be-performed/. 
  14. Turner, Maurice (24 August 2021). "How Authoritarians Can Exploit Apple’s Child Safety Initiatives". Alliance for Securing Democracy. German Marshall Fund of the United States. https://securingdemocracy.gmfus.org/how-authoritarians-can-exploit-apples-child-safety-initiatives/. 
  15. Wakefield, Jane (3 September 2021). "Apple delays plan to scan iPhones for child abuse". BBC. https://www.bbc.com/news/technology-58433647. 
  16. Woollacott, Emma (15 October 2021). "Experts Slam Apple's Child Protection Phone-Scanning Technology". Forbes. https://www.forbes.com/sites/emmawoollacott/2021/10/15/experts-slam-apples-child-protection-phone-scanning-technology/?sh=25e07db43f7a. 
  17. Siddique, Haroon (15 October 2021). "Apple’s plan to scan for child abuse images ‘tears at heart of privacy’". Guardian News & Media Limited. https://www.theguardian.com/world/2021/oct/15/apple-plan-scan-child-abuse-images-tears-heart-of-privacy. 
  18. Bärlocher, Dominik (16 December 2021). "NeuralHash: Apple removes all mentions of CSAM detection from website". digitec.ch. https://www.digitec.ch/en/page/neuralhash-apple-removes-all-mentions-of-csam-detection-from-website-22203.