I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.
I read the information Apple put out yesterday and I'm concerned. I think this is the wrong approach and a setback for people's privacy all over the world. People have asked if we'll adopt this system for WhatsApp. The answer is no.
Apple's filtering of iMessage and iCloud is not a slippery slope to backdoors that suppress speech and make our communications less secure. We’re already there: this is a fully-built system just waiting for external pressure to make the slightest change. eff.org/deeplinks/2021/08/apples-plan-think-different-about-encryption-opens-backdoor-your-private-life
Apple is planning to scan U.S. iPhones for images of child sexual abuse www.npr.org/2021/08/06/1025402725/apple-iphone-for-child-sexual-abuse-privacy
Is Apple going to scan your photos?
Apple will scan your iCloud photos and match them against the NCMEC database to detect if there are any CSAM images. ... When a photo is uploaded to iCloud, Apple creates a cryptographic safety voucher that's stored with it. The voucher contains the details needed to determine whether the image matches known CSAM hashes. TNW: Why experts are worried about Apple's plan to scan every picture on your iPhone
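Stripped of the cryptography, the mechanism the TNW excerpt describes is: hash each photo on the device, check the hash against a database of known-image fingerprints, and attach the result to the upload. The sketch below shows only that general shape and nothing more; it is not Apple's actual NeuralHash / private-set-intersection design, every name in it (KNOWN_HASHES, SafetyVoucher, make_voucher) is hypothetical, and a plain SHA-256 stands in for the perceptual hash a real system would use.

```python
# Minimal illustrative sketch of client-side hash matching, under the
# assumptions stated above. NOT Apple's protocol: in the real system the
# match result is hidden inside an encrypted voucher, and a perceptual
# hash (robust to resizing/re-encoding) is used instead of SHA-256.
import hashlib
from dataclasses import dataclass

# Hypothetical stand-in for a database of known-image hashes (e.g. NCMEC's).
KNOWN_HASHES: set[str] = {
    "placeholder-hash-1",
    "placeholder-hash-2",
}

@dataclass
class SafetyVoucher:
    """Hypothetical stand-in for the 'cryptographic safety voucher'
    described in the excerpt: metadata stored alongside the uploaded photo."""
    photo_id: str
    image_hash: str
    matched: bool

def hash_image(image_bytes: bytes) -> str:
    # SHA-256 keeps the sketch self-contained; it only matches exact copies.
    return hashlib.sha256(image_bytes).hexdigest()

def make_voucher(photo_id: str, image_bytes: bytes) -> SafetyVoucher:
    # Compute the hash on-device and record whether it appears in the database.
    h = hash_image(image_bytes)
    return SafetyVoucher(photo_id=photo_id, image_hash=h, matched=h in KNOWN_HASHES)

if __name__ == "__main__":
    voucher = make_voucher("IMG_0001.HEIC", b"raw image bytes would go here")
    print(voucher)
```

The point the critics above are making is about where this check runs, not how it is coded: once the matching logic lives on the device, the only thing standing between "scan for CSAM hashes" and "scan for any hash list a government supplies" is the contents of the database.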