Apple scans US iPhones for child sexual abuse images

Apple has unveiled plans to scan US iPhones for images of child sexual abuse, drawing praise from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.

The tool, called “NeuralMatch,” is designed to detect known images of child sexual abuse and will scan photos before they are uploaded to iCloud.

If a match is found, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children will be notified.

Separately, Apple plans to scan users’ encrypted messages for sexually explicit content as a child safety measure, which has also worried privacy advocates.

The detection system will only flag images that are already in the center’s database of known child pornography; parents who take innocent photos of a child in the bath presumably need not worry.

But researchers say the matching tool, which doesn’t “see” such images but only the mathematical “fingerprints” that represent them, could be put to nefarious purposes.
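
Apple has not published NeuralMatch’s internals, but the general idea of fingerprint matching can be illustrated with a short, self-contained Python sketch: compute a crude perceptual “average hash” of a photo and compare its Hamming distance against a set of known fingerprints. Everything here is an assumption for illustration only; the hash function, the threshold, and the placeholder `KNOWN_HASHES` values are not Apple’s.

```python
# Illustrative sketch of perceptual-hash matching; NOT Apple's NeuralMatch.
# Assumes Pillow is installed (pip install Pillow).
from PIL import Image

HASH_SIZE = 8          # 8x8 grid -> 64-bit fingerprint
MATCH_THRESHOLD = 5    # max Hamming distance treated as a "match" (assumed value)

def average_hash(path: str) -> int:
    """Compute a simple 64-bit average hash: shrink, grayscale, threshold on the mean."""
    img = Image.open(path).convert("L").resize((HASH_SIZE, HASH_SIZE))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Hypothetical database of fingerprints of known images (placeholder values).
KNOWN_HASHES = {0x89F3A1C2D4E5B607, 0x0123456789ABCDEF}

def is_flagged(path: str) -> bool:
    """Flag a photo if its fingerprint is close to any known fingerprint."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= MATCH_THRESHOLD for known in KNOWN_HASHES)

if __name__ == "__main__":
    print(is_flagged("example.jpg"))  # requires an actual image file to run
```

The reason such systems use perceptual rather than exact hashes is so a match survives resizing or recompression, which is also what makes researchers worry that different-looking images could be engineered to land near the same fingerprint.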

Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly harmless images designed to trigger matches for child pornography, fooling Apple’s algorithm and alerting law enforcement.

“The researchers were able to do it quite easily,” he said of the ability to deceive such systems.

Potential for abuse

Other abuses could include government monitoring of dissidents or protesters. “What happens when the Chinese government says, ‘This is a list of files we want you to check,'” Green asked.

“Is Apple saying no? I hope they’ll say no, but their technology won’t say no.”

Tech companies including Microsoft, Google, Facebook and others have for years been sharing digital fingerprints of known child sexual abuse images. Apple has used those fingerprints to scan for child pornography in user files stored in its iCloud service, which is not as securely encrypted as on-device data.

Apple has been under government pressure for years to allow increased monitoring of encrypted data.

In introducing the new measures, Apple must strike a delicate balance between cracking down on child exploitation and maintaining its high-profile commitment to protecting the privacy of its users.

But the Electronic Frontier Foundation, the online civil liberties pioneer, called Apple’s compromise on privacy protections “a shocking about-face for users who have relied on the company’s leadership in privacy and security.”

Meanwhile, the computer scientist who more than a decade ago invented PhotoDNA, the technology used by law enforcement to identify child pornography online, acknowledged the potential for abuse in Apple’s system but said it was far outweighed by the imperative of combating child sexual abuse.

Plenty of other programs designed to secure devices from various threats have not seen “this type of mission creep,” said Hany Farid, a researcher at the University of California, Berkeley.

For example, WhatsApp provides users with end-to-end encryption to protect their privacy, but it also uses a system to detect malware and warn users against clicking on malicious links.

‘Game changer’

Apple was one of the first major companies to adopt “end-to-end encryption,” in which messages are scrambled so that only their senders and recipients can read them. However, law enforcement has long pressured the company for access to that information in order to investigate crimes such as terrorism or child sexual exploitation.

Apple said the latest changes will roll out this year as part of updates to its operating software for the iPhone, Mac and Apple Watch.

“Apple’s expanded protections for children are a game changer,” John Clark, president and CEO of the National Center for Missing and Exploited Children, said in a statement. “Because there are so many people using Apple products, these new safety measures have the potential to save children’s lives.”

Julia Cordua, CEO of Thorn, said Apple’s technology balances “the need for privacy and digital security for children.” Thorn, a nonprofit founded by Demi Moore and Ashton Kutcher, uses technology to help protect children from sexual abuse by identifying victims and working with tech platforms.

Breaking security

But in stinging criticism, the Washington-based nonprofit Center for Democracy and Technology called on Apple to abandon changes it said effectively destroy the company’s guarantee of “end-to-end encryption.”

The group added that scanning messages for sexually explicit content on phones or computers effectively breaks security.

The organization also questioned whether Apple’s technology can reliably distinguish dangerous content from something as tame as art or a meme; such technologies are notoriously error-prone, CDT said in an emailed statement.

Apple denies that the changes amount to a backdoor into its encryption, saying they are carefully considered innovations that do not disturb user privacy but rather strongly protect it.

Separately, Apple said its messaging app will use on-device machine learning to identify and blur sexually explicit images on children’s phones, and can also warn the parents of young children via text message. It said its software would also “intervene” when users try to search for topics related to child sexual abuse.

To receive warnings about sexually explicit images on their children’s devices, parents will have to enroll their child’s phone. Children over the age of 13 can opt out, which means parents of teenagers will not receive notifications.
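
The Messages feature as described (on-device classification, blurring, and a parental warning for children under 13) can be sketched roughly as below. This is not Apple’s implementation: the classifier is a stand-in stub, and the threshold and function names are invented for illustration.

```python
# Rough illustration of an on-device "detect and blur" flow; not Apple's implementation.
# Assumes Pillow is installed (pip install Pillow).
from PIL import Image, ImageFilter

EXPLICIT_THRESHOLD = 0.9  # assumed confidence cutoff

def explicit_score(img: Image.Image) -> float:
    """Stand-in for an on-device ML classifier returning P(image is explicit).
    A real system would run a trained model here; this stub always returns 0.0."""
    return 0.0

def process_incoming_image(path: str, child_is_under_13: bool) -> Image.Image:
    """Blur an image judged explicit and decide whether a parent should be warned."""
    img = Image.open(path)
    if explicit_score(img) >= EXPLICIT_THRESHOLD:
        # Heavily blur the image before it is shown to the child.
        img = img.filter(ImageFilter.GaussianBlur(radius=25))
        if child_is_under_13:
            # Per the described opt-in, only parents of younger children are notified.
            print("Parent notification would be queued here.")
    return img
```

In this sketch the decision stays entirely on the device, which is consistent with Apple’s claim that the feature neither uploads message content nor alerts police.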

Apple said neither feature would compromise the security of private communications or notify police.
