https://techcrunch.com/2021/08/05/apple-icloud-photos-scanning/ Scanning users' photos and reporting the results to the government. Isn't this government surveillance? What happened to the Apple that refused to help the FBI? This time it's about child abuse images, which is a good thing. However, this opens the door to endless possibilities. Ten years down the road, what other categories of images or content will Apple scan and report to the government? Are you worried? What are your thoughts?
Time to walk away from the Apple platform. Get a Google Pixel and flash it with a privacy-focused open-source Android build like LineageOS. Don't use the public cloud to store media. Who knows what else ends up in image hash databases; some day political memes and posters could be used to profile people, etc. They start with something that makes sense and has broad public support, then slowly creep on to other things. Always happens. I remember a few years ago Amazon offered unlimited photo storage in the cloud. Guess what they used it for? To train ML models for face recognition, then sold the software to law enforcement.
google is worse. they both serve the same masters. corps have you by the balls now that almost everyone is addicted to their toys. that was the plan all along
Also Android is legit more secure than iOS now, which is why Android exploits cost much more on the zero-day market. Get a Pixel or a Samsung Galaxy if you want the most secure phone as proven by independent research.
A Googler posting clickbait about Apple's privacy is rich. I have enormous reservations about this system's design, probable vulnerabilities, and lack of accountability. Yet I still respect that Apple is clearly trying to help on a disgusting issue. The intent is good, but intentions are worthless when they drag everyone down through systemic failures.
It’s even richer that you think Apple is trying to help on privacy lol. They couldn’t care less and are just going where the money is by virtue signaling. They arguably have some of the worst privacy among tech companies (handing iCloud keys to the CCP). FB and Google are terrible at privacy too, but at least we don’t claim to be the champions of privacy while being hypocrites.
100% agreed re China. Every company sells out to China on human rights, privacy, etc. Virtue signaling is the name of the game: criticize US policy and differentiate with a security/privacy focus in the US, but do the opposite and keep quiet for China. G and FB are the opposite of privacy champions, but agreed, at least they're not boasting about it. Apple gets a C for China in privacy/security to G and FB's F.
Apple…. I am watching you 👀
me too 👀
It’s not just Apple!
Boycott iCloud and iMessage
Yes, let’s go to Google instead. What will you buy? Google? Microsoft? From the article above: “Most cloud services — Dropbox, Google, and Microsoft to name a few — already scan user files for content that might violate their terms of service or be potentially illegal, like CSAM.”
Just store your files locally (and in cold storage for backups). Use services like Signal to bypass iMessage. It’s not that hard.
So Apple says they have a database of known CSAM content, and they will just match the hashes of those images against the hashes of images in the user's library. But then they go on to say the system can also detect minor modifications made to an image. Honestly this doesn't properly solve the problem and just raises privacy concerns.
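That "minor modifications" part is the key detail: a cryptographic hash (like SHA-256) changes completely if even one pixel changes, so systems like this use a *perceptual* hash, where similar images produce similar hashes. Here's a toy sketch of the difference, assuming a fake 8-pixel grayscale "image"; this is a simple average-hash, not Apple's actual NeuralHash, and all the names and values here are illustrative:

```python
import hashlib

def crypto_hash(pixels):
    # Cryptographic hash: any change flips the digest entirely.
    return hashlib.sha256(bytes(pixels)).hexdigest()

def average_hash(pixels):
    # Toy perceptual "average hash": one bit per pixel,
    # set if that pixel is brighter than the image's mean.
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    # Number of bit positions where the two hashes differ.
    return sum(x != y for x, y in zip(a, b))

original = [10, 200, 30, 220, 15, 240, 25, 210]
tweaked  = [12, 198, 30, 220, 15, 240, 25, 210]  # slightly re-encoded copy

# The SHA-256 digests differ completely...
assert crypto_hash(original) != crypto_hash(tweaked)
# ...but the perceptual hashes are (nearly) identical.
assert hamming(average_hash(original), average_hash(tweaked)) <= 1
```

So resilience to crops and re-encodes isn't magic, it's just fuzzy matching, which is exactly why people worry about what else a fuzzy matcher could be pointed at.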
I don’t care if they really are just searching for specific images because child trafficking is more important than me feeling nervous about the client side scanner. If they start doing more than that, then that is messed up. Either way I blame sickos for this and not so much Apple
Problem is, they will always start with something that makes sense and has wide public support. Then they will start creeping into everything. This is just the nature of power: give them an inch and they will try to squeeze in a foot. We need to have a clear line set up right now, instead of ten years later when we've lost all our privacy and become China.
Yeah you are right. Usually I’m very pro privacy at all costs but I’m a sucker for this. I guess at this point I just already assume there isn’t any privacy anyways. Does Google really wipe search history if I delete it?
Apple is a private company. If you don't like it, opt out and don't buy their products...
I won’t, and I’ll take my family off their stuff with me, and trash them at every chance I get. This will catch a few perps, and then they will get wise and fall back on whatever they did to circulate a few images pre-smartphones. Meanwhile the rest of us pay in our privacy permanently, and likely in more and more invasive ways in the future. What if China asked Apple to report users who circulated Pooh Bear memes of Xi Jinping? Apple needs that market, and now they can't deny that they have the technical ability to do this. It's a well-intentioned dumb idea with second-order effects that the public is too misinformed to realize.
What will you buy? Google? Microsoft? From the article above: “Most cloud services — Dropbox, Google, and Microsoft to name a few — already scan user files for content that might violate their terms of service or be potentially illegal, like CSAM.”
Clearly the people who are voting that they are worried and think this is dangerous are falling prey to the loads of misinformation that is out there. I encourage everyone to read this great article on iMore: https://www.imore.com/apple-child-safety After reading what Apple is ACTUALLY doing, I'm all for it. For those that won't read the article, here are the important takeaways directly from it:

Apple's new measures will scan users' photos that are to be uploaded to iCloud Photos against a database of images known to contain CSAM. These images come from the National Center for Missing and Exploited Children and other organizations in the sector. The system can only detect illegal and already documented photos containing CSAM without ever seeing the photos themselves or scanning your photos once they're in the cloud.

None of the contents of the safety vouchers can be interpreted by Apple unless a threshold of known CSAM content is met, and Apple says the chance of incorrectly flagging someone's account is one in one trillion per year. Only when the threshold is exceeded is Apple notified so it can manually review the hash report to confirm there is a match. If Apple confirms this, it disables the user's account and sends a report to the NCMEC.

So is Apple going to scan all my photos? Apple isn't scanning your photos. It is checking the numerical value assigned to each photo against a database of known illegal content to see if they match. The system doesn't see the image, only the NeuralHash. It is also only checking images uploaded to iCloud, and the system cannot detect images with hashes that aren't in the database.
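The threshold part is the piece most people miss: nothing is even readable by Apple until a user crosses some number of matches against the known-hash database. A minimal sketch of that gating logic, with made-up hash strings and a made-up threshold (the real system uses cryptographic safety vouchers, not a plain counter):

```python
# Hypothetical stand-ins for database hashes; not real values.
KNOWN_CSAM_HASHES = {"a1b2", "c3d4", "e5f6"}
THRESHOLD = 2  # illustrative; the real threshold is not public in this form

def flagged_for_review(photo_hashes):
    """Count matches against the known-hash set; only a count at or
    above the threshold triggers any human review at all."""
    matches = sum(1 for h in photo_hashes if h in KNOWN_CSAM_HASHES)
    return matches >= THRESHOLD

# One stray match does nothing; multiple known matches trip the gate.
assert not flagged_for_review(["zzzz", "a1b2"])
assert flagged_for_review(["a1b2", "c3d4", "zzzz"])
```

The design point is that a single false positive (or a single planted image) can't expose an account; only an accumulation of known matches can.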
This is just the beginning. Do you think this is the stopping point? You just opened a gate to endless possibilities. You can't limit yourself to what they are describing now; they will always start with something that makes sense and has wide public support. Then a few years later they will silently let out the dog shit.
in order to acquire a hash, pictures must be scanned. today it's child porn. tomorrow it's anything that goes against the official narrative. history has proved this true too many times, but thanks to tech it has never been easier for the elites to control you. Goebbels' wet dream come true
So let’s take this scenario: I have a kid, my son, and I take a pic while giving him a bath. Apple reports it to the government as pedophile content. That's fucked up. Apple needs to be careful; I would consider this a big privacy issue.
It would not, because Apple is only scanning for specific, known images. There are many arguments to be had here - broadly of the slippery slope variety - but as currently designed you actually need to be in possession of multiple, known, circulating images.
This time they are hashing the images and comparing the hashes with known images. But my point is, this is a slippery slope that would lead to more things. Almost every bad thing that happened in history started with good intentions. We can't just limit ourselves to the right-now use cases; we have to look into the future. We need to draw a clear line of what companies can and cannot do with our digital property. iCloud can be viewed as a public storage unit. Does the storage unit owner have the right to go through every unit and examine every single item to make sure it's lawful, and report to the police if they find anything suspicious? Same thing for any rental property.