There’s been a lot of talk regarding Apple’s CSAM (Child Sexual Abuse Material) scanner. The scanner is back in the news, as it appears that hackers could be one step closer to tricking it into producing false positives.
The Issue With Apple’s CSAM Scanner
A Reddit user reverse engineered Apple’s NeuralHash algorithm for on-device CSAM detection and, in doing so, discovered a possible collision in the hash that could create false positives. A collision occurs when two different pieces of data produce the same hash value, checksum, fingerprint, or cryptographic digest.
A coder named Cory Cornelius produced a collision in the algorithm, meaning they found two distinct images that generate the same hash. This could be used to create false positives: images flagged to Apple as containing child abuse even though they’re entirely innocuous.
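To make the idea concrete, here is a minimal sketch of a hash collision using a deliberately weak toy hash (sum of bytes mod 256), not Apple’s NeuralHash. With such a tiny output space, a different input with the same digest can be found by brute force; the function names and inputs are purely illustrative.

```python
def toy_hash(data: bytes) -> int:
    """Toy 8-bit hash: sum of bytes mod 256. Illustrative only, NOT NeuralHash."""
    return sum(data) % 256

def find_collision(target: bytes) -> bytes:
    """Brute-force a different input that collides with `target` under toy_hash."""
    goal = toy_hash(target)
    i = 0
    while True:
        candidate = f"decoy-{i}".encode()
        if candidate != target and toy_hash(candidate) == goal:
            return candidate
        i += 1

original = b"harmless vacation photo"
decoy = find_collision(original)

# Two different inputs, one hash value: a collision.
assert decoy != original
assert toy_hash(decoy) == toy_hash(original)
```

NeuralHash is a perceptual hash over image features rather than a byte sum, so real collisions are far harder to construct, but the failure mode is the same: two unrelated inputs mapping to one fingerprint.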
While it certainly wouldn’t be easy, there’s the possibility that a hacker could generate an image that sets off the CSAM alerts even though it is not a CSAM image.
But there’s a difference between saying “yeah that’s almost certain to happen, in theory” and seeing it happen in real life. Apple went out of their way to keep the hash function secret — because they knew the risks.
— Matthew Green (@matthew_d_green) August 18, 2021
Apple does have safeguards designed to ensure a false positive doesn’t cause an issue. For example, when an image is flagged, it must be reviewed by an actual person before anything is sent to law enforcement. Before it even gets to that point, a hacker would need to gain access to the NCMEC hash database, create 30 colliding images, and then get all of them onto the target’s phone.
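The threshold described above can be sketched as a simple gate; the constant of 30 comes from Apple’s public description of the system, while the function name is a hypothetical stand-in for whatever check Apple actually runs server-side.

```python
# Per Apple's public description, an account is only escalated to human
# review after a threshold of matches (reportedly 30) accumulates.
MATCH_THRESHOLD = 30

def needs_human_review(match_count: int, threshold: int = MATCH_THRESHOLD) -> bool:
    """A handful of colliding images stays below the review threshold."""
    return match_count >= threshold

# One or two false-positive collisions alone trigger nothing.
assert not needs_human_review(1)
assert not needs_human_review(29)
assert needs_human_review(30)
```

This is why a single crafted collision, while worrying, is several steps removed from an actual report to law enforcement.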
That said, this is just one more problem for Apple’s CSAM scanner. Opposition has already been fierce, and the speed of the reverse engineering makes matters worse: instead of a collision taking months to surface, one was discovered within hours of the code going public. That’s concerning.
Will Apple Do Anything?
Only time will tell how Apple addresses this situation. The company might backtrack on its plan to use the NeuralHash algorithm. At the very least, the company needs to address the situation, as confidence in Apple’s photo-scanning plan is already low.