Apple delays plans to search devices for child abuse imagery

After the uproar from users and privacy advocates over Apple’s controversial plans to scan users’ devices for photos and messages containing child abuse and exploitation material, the company has decided to put the brakes on the plan.

As you may recall, Apple announced in early August that it would introduce the new capability in iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. These features, per Apple, are “intended to help protect children from predators who use communication tools to recruit and exploit them and help limit the spread of Child Sexual Abuse Material (CSAM)”.

These child safety features, which the company claims were developed with the help of child safety experts, come in three parts. The first is an updated iMessage app that will alert parents and their children when sexually explicit images are sent from or received by their devices. If, for example, a child receives such an image, they will be presented with the option to view it or not, and if they do, their parents would be notified that they have viewed it. Something similar happens when a child sends sexually explicit photos.
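To make that flow concrete, here is a minimal sketch in Swift of the decision logic as described above. The type and function names are our own illustrative assumptions, not Apple’s API; the real check runs on-device inside Messages.

```swift
// Illustrative sketch only: these types are assumptions, not Apple's API.
enum ScreeningOutcome {
    case delivered               // nothing sensitive detected
    case blurredAwaitingChoice   // image blurred; child warned before viewing
    case viewedParentNotified    // child chose to view; parent can be told
}

struct IncomingImageScreener {
    /// Decide what happens when a child's device receives an image.
    func screen(imageIsExplicit: Bool, childChoosesToView: Bool) -> ScreeningOutcome {
        guard imageIsExplicit else { return .delivered }
        guard childChoosesToView else { return .blurredAwaitingChoice }
        return .viewedParentNotified
    }
}

// Example: an explicit image the child decides to open anyway.
let outcome = IncomingImageScreener().screen(imageIsExplicit: true, childChoosesToView: true)
print(outcome) // viewedParentNotified
```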

Secondly, iPhones and iPads would allow Apple to detect CSAM in photos that are being uploaded to iCloud. If an i-device finds photos that match, or closely resemble, photos in a database of known CSAM, that material is flagged as such. To reduce the chance of false positive matches (where a user is wrongfully accused), an account has to exceed a threshold number of flags before Apple is actually alerted.
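The threshold is the key safeguard in that design. The Swift sketch below shows the general idea under heavy simplification: it compares plain hash strings against a set of known hashes and counts matches, whereas Apple’s documented system uses a perceptual hash (NeuralHash) and cryptographic techniques so that individual matches are not visible to anyone until the threshold is crossed. All names here are hypothetical.

```swift
/// Hypothetical, simplified model of threshold-based flagging; not Apple's
/// implementation, which uses perceptual hashing (NeuralHash) plus
/// cryptographic safeguards rather than plain string comparison.
struct UploadScanner {
    let knownCSAMHashes: Set<String>   // reference database of known-image hashes
    let reviewThreshold: Int           // matches required before anyone is alerted
    var matchCount = 0

    /// Called with each photo's hash before upload; returns true only once
    /// the account has accumulated enough matches to cross the threshold.
    mutating func process(photoHash: String) -> Bool {
        if knownCSAMHashes.contains(photoHash) {
            matchCount += 1
        }
        return matchCount >= reviewThreshold
    }
}

// A single match (possibly a false positive) stays below the threshold and
// triggers nothing; only repeated matches would raise an alert.
var scanner = UploadScanner(knownCSAMHashes: ["a1b2", "c3d4"], reviewThreshold: 30)
for hash in ["ffff", "a1b2"] {
    if scanner.process(photoHash: hash) {
        print("Threshold exceeded: escalate for human review")
    }
}
```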

Thirdly, Siri and Search will be updated to provide additional resources to help children and parents stay safe online. Both will also intervene when a user searches for CSAM-related material.

We don’t doubt Apple’s good intentions, nor the seriousness of the child abuse problem it is attempting to tackle. And there is no question that it has gone to great lengths to engineer a solution that attempts to preserve users’ privacy without creating a haven for CSAM distribution.

The problem is that the technology also opens the door to some serious issues.

Many have expressed concern that Apple could be coerced into using this on-device scanning infrastructure to scan for other things, and doubts have been raised about Apple’s assessment of the false positive rate.

There are other concerns, too: this one-size-fits-all technology could put some vulnerable users in danger. “This can be a serious violation of a child’s privacy, and the behavior of this feature is predicated on an assumption that may not be true: That a child’s relationship with their parents is a healthy one. This is not always the case,” writes Thomas Reed, Malwarebytes’ Director of Mac & Mobile, in a thoughtful blog post on the matter.

Reed’s article is well worth a read: It delves into other potential problems with these new changes, and covers how and why the technology works the way it does.

Since the plans were announced, organizations like the Electronic Frontier Foundation (EFF), Fight for the Future, and OpenMedia have all launched petitions to pressure Apple into backing away from implementing them.

Apple listened:

Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

For the EFF, delaying the plans is not good enough, though. It insists that Apple must “drop its plans to put a backdoor into its encryption entirely.”
