Child safety group Heat Initiative plans to launch a campaign pressing Apple on child sexual abuse material scanning and user reporting. The company issued a rare, detailed response on Thursday.
But then they’ve built this entire system to spy on people’s photos, and there’s only one barrier in place keeping the rest of the public safe. It would only be a matter of time before it was enabled for everyone.
You’re thinking of Google, where data mining you is the primary business model. Google Photos scans your photos for object recognition — what do you think that is? There’s no E2E encryption there at all. Apple’s object detection is done on device. It amazes me that Apple got attacked over this when literally everyone else is doing it without telling you and without offering encryption.
Fair, except that unlike Apple’s plan, Google Photos doesn’t automatically flag photos for review and then report people to the authorities. I guess it’s very close to having that capability, though, so I agree.
@player2
Google already did that: https://www.theverge.com/2022/8/21/23315513/google-photos-csam-scanning-account-deletion-investigation
@phillaholic
Wow, TIL