…but don’t ever think anything online is private. The good and the bad, detailed below.
Apple reportedly plans to make iOS detect child abuse photos
By William Gallagher | Aug 05, 2021
A security expert claims that Apple is about to announce photo identification tools that would identify child abuse images in iOS photo libraries.
Apple has previously removed individual apps from the App Store over child pornography concerns, but now it’s said to be about to introduce such detection system-wide. Using photo hashing, iPhones could identify Child Sexual Abuse Material (CSAM) on the device itself.
Apple has not confirmed this and so far the sole source is Matthew Green, a cryptographer and associate professor at Johns Hopkins Information Security Institute.
According to Green, the plan is initially to be client-side — that is, to have all of the detection done on a user’s iPhone. He argues, however, that this could be the start of a process that leads to surveillance of data traffic sent to and from the phone.
“Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems,” continues Green. “The ability to add scanning systems like this to E2E [end to end encryption] messaging systems has been a major ‘ask’ by law enforcement the world over.”
“This sort of tool can be a boon for finding child pornography in people’s phones,” he said. “But imagine what it could do in the hands of an authoritarian government?”
Green and his cryptography students have previously reported on how law enforcement may be able to break into iPhones. He and Johns Hopkins University have also worked with Apple to fix a security bug in Messages.
Apple Is Trying To Stop Child Abuse On iPhones—So Why Do So Many Privacy Experts Hate It?
Apple has made a lot of pro-privacy people mad this week with the announcement of plans to start scanning everyone’s iPhone photos for child sexual abuse material (CSAM).
To do that, it’ll compare all photos on an iCloud-enabled device (including iPads and Macs) to databases of known CSAM images and it’ll be done by checking “hashes.” Think of a hash as a number that represents an image. That number, created when the photo’s data is run through a one-way cryptographic algorithm, is supposed to be unique to the image, so it should be quick and easy for a match to be found. Then Apple employees will review the image and share the match with the National Center for Missing and Exploited Children (NCMEC).
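The hash-matching idea described above can be illustrated with a minimal sketch. Note the simplifications: this uses SHA-256 as a stand-in one-way hash, whereas Apple’s actual system reportedly uses NeuralHash, a perceptual hash designed to survive resizing and recompression, and matching is done with cryptographic protocols rather than a plain set lookup. The database entries here are illustrative placeholders, not real values.

```python
import hashlib

# Hypothetical database of hashes of known flagged images.
# (Illustrative value only: this is the SHA-256 digest of the bytes b"test".
# In reality the hashes would come from NCMEC's database of known CSAM.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def hash_image(data: bytes) -> str:
    """One-way hash: the same bytes always yield the same digest,
    but the digest cannot be reversed back into the image."""
    return hashlib.sha256(data).hexdigest()

def matches_known_material(data: bytes) -> bool:
    """Check whether the photo's hash appears in the known-hash database."""
    return hash_image(data) in KNOWN_HASHES

print(matches_known_material(b"test"))   # True: digest is in the database
print(matches_known_material(b"other"))  # False: no match
```

The key property the article relies on is that matching happens on digests, not images: the device never needs to hold the flagged images themselves, only their hashes, and a match can be confirmed without transmitting the user’s photo in the clear.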
By law, American companies have to report child abuse and exploitation imagery on their servers to NCMEC, which then works with law enforcement on an investigation. Other tech giants do the same when emails or messages are sent over their platforms. That includes Google, Microsoft and Facebook. So why are so many privacy advocates up in arms about Apple’s announcement?
It’s because Apple is checking photos on your iPhone, not just on its own iCloud servers. It’s going one step beyond what its rivals have done, checking every photo on a device rather than just on a company server. (It’s also scanning images to check whether they’re of nude children, using a different technology, but that’s all done on the device and nothing goes to Apple. A simple warning comes up, suggesting iPhone users may not want to send or view nude images.)
Alec Muffett, a noted encryption expert and former Facebook security staffer, explained on Twitter that when someone buys a phone, they expect to have control over what’s happening on their property. But Apple is denying that right and “although it ostensibly exists to prevent upload of CSAM to their iCloud platform, they are using the user’s device to do it and making the tectonic-shift statement that ‘it’s ok by us to do this sort of thing to user devices.’”
Muffett and other encryption experts like Johns Hopkins professor Matt Green and NSA leaker Edward Snowden have also raised the alarm that Apple could now be pressured into looking for other material on people’s devices, if a government demands it.
“How such a feature might be repurposed in an illiberal state is fairly easy to visualize. Apple is performing proactive surveillance on client-purchased devices in order to defend its own interests, but in the name of child protection,” Muffett added. “What will China want them to block?
“It is already a moral earthquake.”
The Electronic Frontier Foundation (EFF) said that the changes effectively meant Apple was introducing a “backdoor” onto user devices. “Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out and narrowly scoped backdoor is still a backdoor,” the EFF wrote.
“Apple’s compromise on end-to-end encryption may appease government agencies in the U.S. and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security.”
Some people like it
Not that everyone is upset by the move. Nicholas Weaver, a computer security expert and lecturer at the University of California, Berkeley, said on Twitter that he didn’t blame Apple for choosing to risk a fight with oppressive regimes and take a tougher stance on child sexual abuse.
And David Thiel from the Stanford Internet Observatory noted that most people’s images on any internet-connected device are scanned for CSAM imagery.
“This unyielding hostility to reasonable and limited child safety measures drives me up the wall. Even if this compromised privacy—which, as documented, it does not—there are other harms in the world to be balanced with,” he added.