Apple to Police User Photos
The Big Tech company announced a new update to its iPhones that will scan for child pornography and alert authorities.
As the adage goes, the road to hell is paved with good intentions. But with the latest news out of Apple Inc., even those “good intentions” seem dubious. Apple has announced a plan to update all iPhones and iPads with new device-scanning technology in an effort to find devices storing child pornography. We can all agree we hate such images and abuse, but is this the right step for Apple?
Apple explained, “This innovative new technology allows Apple to provide valuable and actionable information to [the National Center for Missing and Exploited Children] and law enforcement regarding the proliferation of [child sexual abuse material].” Apple justified this unprecedented violation of customers’ privacy rights by claiming that its new technology “does so while providing significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account.” Moreover, the Silicon Valley giant insists, “Even in these cases, Apple only learns about images that match known CSAM.” In other words, it’s okay because Apple will only scan bad stuff.
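The matching scheme Apple describes — comparing photos against a database of hashes of known CSAM, and flagging an account only once a “collection” of matches accumulates — can be sketched roughly as follows. This is only an illustration of the matching logic: Apple’s actual system uses a proprietary perceptual hash (“NeuralHash”), not SHA-256, and every name and threshold here is an assumption for the sketch.

```python
import hashlib

# Hypothetical stand-in for a database of hashes of known illegal images.
# Apple's real system uses a perceptual hash; SHA-256 is used here only
# to make the matching logic concrete.
MATCH_THRESHOLD = 3  # hypothetical: flag only after several matches


def hash_image(data: bytes) -> str:
    """Return a hex digest standing in for the image's fingerprint."""
    return hashlib.sha256(data).hexdigest()


def count_matches(photos: list[bytes], known_hashes: set[str]) -> int:
    """Count how many photos match an entry in the known-hash database."""
    return sum(1 for photo in photos if hash_image(photo) in known_hashes)


def should_flag(photos: list[bytes], known_hashes: set[str],
                threshold: int = MATCH_THRESHOLD) -> bool:
    """Flag the account only when matches reach the threshold,
    mirroring the 'collection of known CSAM' condition Apple describes."""
    return count_matches(photos, known_hashes) >= threshold
```

The threshold is the detail Apple leans on rhetorically: a single chance match reveals nothing, and only an account crossing the threshold is reported.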
Yet this is the same company that refused back in 2015 to unlock the iPhones of the San Bernardino terrorists for the FBI. Apple’s stated aim at the time was to prevent the federal government from gaining “backdoor” technology for all iPhones, arguing, “We can find no precedent for an American company being forced to expose its customers to a greater risk of attack.”
But Apple will now be doing the feds’ work for them, and evidently gone are any concerns over potential government abuse. Making this even more troubling is the fact that Apple is proactively scanning all of its customers’ photos without a warrant or even suspicion of criminal activity. In spirit if not in law, this represents a serious violation of Fourth Amendment protections. Apple may dodge any such challenge by hiding behind its status as a private company.
What’s to stop Apple from expanding this device surveillance technology to include other “objectionable” material such as “hate speech” or “misinformation”? As India McKinney and Erica Portnoy of the Electronic Frontier Foundation pointedly observed: “All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan. … That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.”
To put it bluntly, Apple has taken the next big step in building the infrastructure necessary for the establishment of a surveillance state. Johns Hopkins University cryptography professor Matthew Green warns: “This sort of tool can be a boon for finding child pornography in people’s phones. But imagine what it could do in the hands of an authoritarian government?”
Good thing we don’t have one of those. Oh, wait…