
Apple’s Plan to Scan iPhones for Child Sexual Abuse Material Attracts 5,000 Signatures Against It

By Aswin Kumar

Apple has proposed a new approach that uses cryptography to “help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM)”.

Apple is planning to implement this approach as a feature on its devices via a software update later this year. The feature is expected to go live in the United States initially.

The new feature aims to decrease the spread of Child Sexual Abuse Material (CSAM) on the internet, especially via Apple platforms.

The anti-CSAM feature will work as an on-device filter that looks for sensitive content on your device. Both sent and received content will be monitored by the program, and for users below the age of 13, their parents will be alerted if any such material is found.

The feature will also intervene in your Siri and search activities if either is used to look up CSAM-related topics.

In a blog post, Apple said that it will use cryptographic techniques to match known CSAM images stored in iCloud Photos. The technology will match images in a user’s iCloud library against known images provided by child safety organizations.

The collected database is transformed into “an unreadable set of hashes”, and if a match is found, Apple will “report these instances to the National Center for Missing and Exploited Children (NCMEC)”.
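To make the hash-matching idea concrete, here is a deliberately simplified Python sketch. It uses an ordinary SHA-256 file hash and a made-up placeholder hash set; Apple has described its actual system as using a perceptual image hash and an on-device private set intersection protocol, neither of which is shown here, so treat this only as an illustration of comparing hashes against a known database.

```python
import hashlib

# Hypothetical database of known image hashes (placeholder value, not real data).
# In Apple's description, the real database comes from child safety organizations
# such as NCMEC and is stored only as "an unreadable set of hashes".
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def image_hash(path: str) -> str:
    """Return a SHA-256 hash of the raw bytes of a file."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


def matches_known_database(path: str) -> bool:
    """Check whether a file's hash appears in the known-hash set."""
    return image_hash(path) in KNOWN_HASHES
```

Note that a plain byte-level hash like this only matches exact copies of a file; a perceptual hash, as reportedly used by Apple, is designed to also match resized or slightly edited versions of the same image.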

An open letter asking Apple to step back from introducing the child sexual abuse material identifier feature has been signed by about 5,000 organizations and individuals.

“While child exploitation is a serious problem, and while efforts to combat it are almost unquestionably well-intentioned, Apple’s proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products,” the letter said.

The Electronic Frontier Foundation said on Thursday: “It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses […] That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.”

The foundation believes the feature amounts to a “backdoor” that allows Apple to spy on its users.

The scanning “threatens to undermine fundamental privacy protections,” the letter said. WhatsApp chief Will Cathcart and NSA whistleblower Edward Snowden also voiced concern.

Tim Sweeney, CEO of Epic Games and a long-time Apple rival, said: “I’ve tried hard to see this from Apple’s point of view. But inescapably, this is government spyware installed by Apple based on a presumption of guilt. Though Apple wrote the code, its function is to scan personal data and report it to government.”

If I put my knowledge into play: cryptographically hashed data is not reversible due to “bit dependency” (every bit of the output depends on the input, so the original data cannot be reconstructed from the hash). However, Apple’s claim of hashing all data cryptographically has not yet been verified, and the full algorithm behind the new feature is neither public nor available for beta testing, which is a serious concern for every Apple user and puts their privacy at stake.
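To illustrate that irreversibility, the short sketch below hashes two inputs that differ by a single bit and prints two unrelated digests; SHA-256 is used here purely as a stand-in for whatever hash function Apple actually employs.

```python
import hashlib

original = b"example image bytes"
modified = b"example image byteS"  # differs from the original by a single bit

# A one-bit change in the input produces a completely different digest
# (the avalanche effect), which is why the original data cannot be
# reconstructed from its hash.
print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(modified).hexdigest())
```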

Comment below with any ways you can think of that might still allow Apple to screen the data even after it has been hashed!


Aswin Kumar

A creative science nerd! Buy me a coffee: buymeacoffee.com/aswinkumar
