
This is pretty big news. I wonder: will there be an immediate pushback from law enforcement and governments?


Remember the CSAM-scanning debacle almost a year ago? I and others speculated that the reason Apple was trying to make the CSAM scanning and Safety Vouchers client-side was so that they could enable E2E encryption while neutralizing law enforcement's biggest argument against E2E.
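(For context: the proposed client-side step boiled down to matching a perceptual hash of each photo queued for iCloud upload against a vendor-shipped hash database, and attaching an encrypted "safety voucher" on a match. A minimal sketch of that shape — the real design used Apple's NeuralHash and private set intersection, so the hash function, database, and voucher layout below are all stand-ins:)

    import hashlib

    # Stand-in for Apple's proprietary NeuralHash; a real perceptual hash
    # survives resizing/re-encoding, unlike this placeholder.
    def perceptual_hash(image_bytes: bytes) -> bytes:
        return hashlib.sha256(image_bytes).digest()

    # Opaque hash database shipped to the device; the user can't inspect
    # what the entries correspond to.
    known_hashes: set[bytes] = set()

    def safety_voucher(image_bytes: bytes) -> dict:
        # Runs on-device just before upload. In the real protocol the match
        # result is hidden from the device itself via PSI; here it's a plain
        # boolean for illustration.
        return {"matched": perceptual_hash(image_bytes) in known_hashes}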


It wouldn't stop at CSAM. Alongside it in urgency of appeal to fear is counter-terrorism.* Next would come drug dealing and threats of violence, then copyright infringement, and finally Amber Alerts and Silver Alerts. A backdoor or warrantless search for one category is a backdoor for all. The point is for government power to trump privacy.

*The definition of terrorism depends on your jurisdiction.


While the on-device CSAM scanning was a huge overreach, I'm not sure how you could leverage that system for things like Amber/Silver Alerts or threats of violence. It's not really a backdoor, more of a snitch system.


That's a very optimistic point of view. On the other hand, I and others speculated that the reason Apple wanted to introduce code that scans local content on your device against a government-mandated database of "wrong content" was to appease law enforcement's desire for more control.


I don't understand how your other-hand argument is more pessimistic. Isn't your phone scanning locally for checksums better than requiring the data to be unencrypted and scannable server-side? Surely they couldn't just do _nothing_.

edit: I take this back—"nothing" should be the right answer.


_nothing_ is exactly what I expect them to do when it comes to my local files.

We all like to vilify Microsoft (rightfully so, for all the telemetry crap they pull), but imagine if Windows started scanning all your local disks for files matching certain checksums, then notifying authorities when matches occur (thumbnails and other metadata uploaded with the reports), like Apple was planning. Sure, it'll be CSAM first. Then domestic terrorism; then the RIAA/MPAA would jump in on the action... and finally, opaque checksum databases from local governments ("wrong think", Winnie the Pooh memes, pictures from protests, etc.). If we don't stop this in its infancy, we'll quickly tumble down the slippery slope.
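To make that concrete: the feared end state is a background scanner with free rein over every local file, reporting matches against a database nobody outside the vendor can audit. A purely hypothetical sketch (nothing either company has shipped):

    import hashlib
    from pathlib import Path

    # Hypothetical vendor-pushed database; because it's opaque, "CSAM today"
    # and "protest photos tomorrow" look identical to the user.
    flagged_hashes: set[str] = set()

    def scan_disk(root: str) -> list[Path]:
        hits = []
        for path in Path(root).rglob("*"):
            try:
                if path.is_file() and \
                        hashlib.sha256(path.read_bytes()).hexdigest() in flagged_hashes:
                    hits.append(path)  # a real snitch system would phone home here
            except OSError:
                continue  # unreadable file; skip it
        return hits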


Thanks, you've changed my mind and I totally agree. (Sincerely, in case it smelled of sarcasm.)


The CSAM scanning was only enabled if you had iCloud uploads enabled.

They would've only scanned the files that would end up in the cloud anyway.

But people went "omg my files", stuck their fingers in their ears and refused to read the damn spec.


The "damn spec" clearly stated that they would be introducing functionality on your device that is capable of scanning content on your device and matching that against a database of opaque hashes downloaded from a 3rd party. That's functionality I don't want on my device.

FWIW, I don't use iCloud and never have used it; I don't care if they scan content once uploaded (it's their servers and I'm confident they'll continue scanning content there no matter how "E2EE" it is - see China and key sharing). As long as they keep their scanning on their devices and off of my device it's all good.


> The CSAM scanning was only enabled if you had iCloud uploads enabled.

This is nonsensical. iCloud Photos is not e2ee and Apple already scans everything serverside. There is no need for redundant clientside scanning of iCloud Photos.

The clientside scanning is only needed in the cases where:

1) iCloud Photos is turned off

or

2) iCloud Photos is e2ee


They wanted to enable #2 with the local CSAM scan. That way the authorities wouldn't have a reason to ask for cloud data to be decrypted. And Apple could lock it down so that they couldn't decrypt it even if they wanted to.
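If that theory holds, the intended upload flow looks roughly like this: photos encrypted with a key only the device holds, with the voucher as the single escape hatch. A simplified sketch — Fernet stands in for Apple's actual key hierarchy, and the real design revealed nothing until a threshold of matches was crossed:

    import hashlib
    from cryptography.fernet import Fernet  # pip install cryptography

    known_hashes: set[bytes] = set()            # vendor-shipped, opaque
    device_key = Fernet(Fernet.generate_key())  # never leaves the device

    def upload_photo(photo: bytes) -> dict:
        # The server stores only ciphertext it cannot decrypt, so a demand
        # for cloud data yields nothing readable; the voucher is the sole
        # channel through which flagged content could ever surface.
        return {
            "ciphertext": device_key.encrypt(photo),
            "voucher": {"matched": hashlib.sha256(photo).digest() in known_hashes},
        }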

Apple actively doesn't want to know your shit or analyse it on their servers. That's why they constantly do things on-device even if it's of worse quality than Google's approach of doing everything in the cloud.


Apple is a business and does indeed "want to know your shit" for many legitimate revenue-generating activities, such as growing their services business, a top priority for the company.

It seems a little suspect to me that they wanted to do clientside scanning as a prerequisite for e2ee, as if they simply would not be allowed to publish society-wide e2ee privacy software (without government/regulatory retaliation) unless it shipped with such a law enforcement backdoor. This screams of prior restraint, and we should be loudly asking our legislators why the fuck the FBI is pressuring Apple about what software they do or do not publish.

https://www.reuters.com/article/us-apple-fbi-icloud-exclusiv...

There is a major 1A violation happening here.

Every day the US government takes more steps to erode our civil rights, even against the largest companies in the world. Someone needs to rein them in.


People don't care about them scanning the files, they care about them doing it on their own device. People read the damn spec, and that's why they disagreed with it.


I could have sworn Apple even straight up said that was their goal?

Maybe I'm just misremembering, since like you I figured that was the reason they were doing it; there's no other reason to do something like that if it was all going to sit there unencrypted.


No, they didn't say anything like that at the time, so I was even downvoted on HN and argued with for making the suggestion. Because Apple was definitely just being evil and had no bigger picture.



