
NeuralHash, machine learning, pattern matching, perception, Automated Puritanism, Apple, 2021

These two images of a hatchet and a nematode are different but share the same NeuralHash!

NeuralHash is the perceptual hashing model that backs Apple’s new CSAM (child sexual abuse material) reporting mechanism. It’s an algorithm that takes an image as input and returns a 96-bit identifier (a hash) that should match for two images that are “the same,” even under minor perturbations like JPEG artifacts, resizing, or cropping.
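To make the idea of a 96-bit perceptual hash concrete, here is a minimal sketch using a simple “difference hash” (dHash) rather than Apple’s actual model; NeuralHash itself is a convolutional network followed by locality-sensitive hashing, which this does not reproduce. The file names are hypothetical, and only Pillow and NumPy are assumed.

```python
# Illustrative perceptual hash -- NOT Apple's NeuralHash.
# A 12x8 "difference hash": compare neighboring pixels of a tiny grayscale
# thumbnail to get 96 bits that survive small perturbations (resizing,
# mild compression) while differing for unrelated images.
from PIL import Image
import numpy as np

def dhash_96(path: str) -> int:
    """Return a 96-bit perceptual hash from the signs of horizontal gradients."""
    img = Image.open(path).convert("L").resize((13, 8), Image.LANCZOS)
    px = np.asarray(img, dtype=np.int16)
    bits = (px[:, 1:] > px[:, :-1]).flatten()  # 8 rows x 12 comparisons = 96 bits
    return int("".join("1" if b else "0" for b in bits), 2)

def hamming(a: int, b: int) -> int:
    """Number of differing bits; a small distance means 'probably the same image'."""
    return bin(a ^ b).count("1")

if __name__ == "__main__":
    h1 = dhash_96("hatchet.jpg")   # hypothetical input files
    h2 = dhash_96("nematode.jpg")
    print(f"{h1:024x}\n{h2:024x}\nhamming distance: {hamming(h1, h2)}")
```

The collision in the images above is exactly the failure mode this kind of scheme invites: two visually unrelated inputs that an attacker has nudged until their hashes coincide.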

Yesterday, news broke that researchers had extracted the neural network Apple uses to hash images from the latest operating system and made it available for testing. Adversarial images with matching NeuralHashes were quickly constructed. Apple clarified that a second, independent server-side network verifies all matches before images are flagged for human review (and then escalation to law enforcement).

(via https://blog.roboflow.com/nerualhash-collision/ )