
Apple’s controversial plan to scan devices for child sexual abuse photos



Apple’s plan to use image-matching technology on users’ devices and iCloud accounts to spot child sexual abuse photos has understandably earned praise from child protection groups. However, privacy and security researchers are uncomfortable with the plan.

“Apple’s expanded protection for children is a game-changer,” the Associated Press quoted John Clark, CEO of the U.S. National Center for Missing and Exploited Children (NCMEC), as saying. “With so many people using Apple products, these new safety measures have lifesaving potential for children.”

However, the New York Times quoted Matthew D. Green, a cryptography professor at Johns Hopkins University, as being totally opposed. “They [Apple] have been selling privacy to the world and making people trust their devices,” he said. “But now they’re basically capitulating to the worst possible demands of every government. I don’t see how they’re going to say no from here on out.”

Canadian privacy expert Ann Cavoukian looks at similar comments from technology experts like Green and finds it hard not to agree.

Apple’s intention is “very honorable,” she said in an interview. “They want to help detect child sexual abuse.”

But she added, “Here’s the disheartening thing: for years Apple has resisted pressure – especially from the U.S. government – to install some sort of backdoor or enable access to end-to-end encrypted messages” on Apple devices. “When I look at what tech people who have been talking about this say, it is hard to resist their analysis: To suggest there is no way of breaking into this is a stretch.”

“I have applauded Apple extensively for their encryption [of personal data] stance. But this may represent a weakening of that, no matter how honorable the reasons.”

The ability to do what Apple calls on-device hash matching will start later this year with iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.

How it will work

Apple said Thursday the technology — to first be used only in the U.S. — will allow it to detect known child sexual abuse images stored in iCloud Photos. Preferring not to use the word “scanning,” Apple says its system performs on-device matching of image hashes against a database of known child sexual abuse image hashes provided by the NCMEC and other child safety organizations. Apple said it converts this database into an unreadable set of hashes that is securely stored on users’ devices.
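To make the matching step concrete, here is a deliberately simplified sketch in Python. It shows only the basic idea of hashing an image and checking it against a set of known hashes; it does not reproduce Apple's actual NeuralHash perceptual hash, the blinded on-device copy of the database, or the private set intersection step that hides the plain match result from the device, and the example images and hashes below are invented.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash. SHA-256 only matches exact byte copies;
    # a perceptual hash like NeuralHash also matches visually similar images.
    return hashlib.sha256(image_bytes).hexdigest()

# Hashes of known abuse images, as would be supplied (in unreadable, blinded
# form in the real system) by NCMEC and other child safety organizations.
known_hashes = {
    image_hash(b"example-known-image-1"),
    image_hash(b"example-known-image-2"),
}

def matches_known_database(image_bytes: bytes) -> bool:
    # On-device check of a photo about to be uploaded to iCloud Photos.
    return image_hash(image_bytes) in known_hashes

print(matches_known_database(b"example-known-image-1"))   # True
print(matches_known_database(b"ordinary-holiday-photo"))  # False
```

In Apple's actual design, the device never learns the plain result of this check; it is encoded into the encrypted safety voucher described below.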

Before an image taken on an Apple device is stored in iCloud Photos, an on-device matching process is performed for that image against the known child abuse image hashes. “This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result,” says Apple. “The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.

“Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM (child sexual abuse material) content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.

“Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated.”
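Threshold secret sharing is a standard cryptographic building block, and the sketch below uses the classic Shamir scheme in Python to illustrate the principle Apple describes: a secret (standing in here for whatever unlocks the safety vouchers) can only be reconstructed once at least a threshold number of shares exist. The threshold of 10, the number of shares, and the sample key are arbitrary illustrations, not Apple's parameters or implementation.

```python
import random

PRIME = 2**127 - 1  # prime field large enough for a short secret

def make_shares(secret: int, threshold: int, num_shares: int):
    """Split `secret` so that any `threshold` of the shares recover it."""
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def eval_poly(x):
        y = 0
        for c in reversed(coeffs):
            y = (y * x + c) % PRIME
        return y
    return [(x, eval_poly(x)) for x in range(1, num_shares + 1)]

def recover_secret(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = 123456789  # stand-in for the secret protecting the vouchers
shares = make_shares(key, threshold=10, num_shares=30)
print(recover_secret(shares[:10]) == key)  # True: 10 shares reach the threshold
print(recover_secret(shares[:9]) == key)   # False (overwhelmingly likely): 9 are not enough
```

Below the threshold, any subset of shares is consistent with every possible secret, which is the property that keeps individual vouchers uninterpretable.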

Apple says the plan allows it to provide actionable information to NCMEC and law enforcement about known exploitive child images while still protecting user privacy. “Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM,” it says.

Apple has published a technical description of how the system works on its website.

New tools in Messages

In addition, Apple said its Messages app will add new tools to warn children and their parents when the children receive or send sexually explicit photos.

When a child receives this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured that it is okay if they do not want to view the photo, Apple said. The child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.

Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. Apple said the company will not get access to the messages.
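The flow Apple describes can be pictured roughly as follows. This Python sketch is purely illustrative: the classifier, helper names, and account flags are invented stand-ins, not Apple's Messages implementation or API.

```python
def looks_sexually_explicit(image_bytes: bytes) -> bool:
    # Placeholder for the on-device machine learning classifier.
    return b"explicit" in image_bytes

def handle_incoming_photo(image_bytes: bytes, is_child_account: bool,
                          parent_alerts_enabled: bool, child_views_anyway: bool):
    if is_child_account and looks_sexually_explicit(image_bytes):
        print("Photo blurred; child warned and shown helpful resources.")
        if parent_alerts_enabled and child_views_anyway:
            print("Parents notified that the photo was viewed.")
    else:
        print("Photo displayed normally.")

handle_incoming_photo(b"explicit-sample", is_child_account=True,
                      parent_alerts_enabled=True, child_views_anyway=True)
handle_incoming_photo(b"family-photo", is_child_account=True,
                      parent_alerts_enabled=True, child_views_anyway=False)
```

Because the analysis runs on the device, the decision stays local, consistent with Apple's statement that it does not get access to the messages.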

TechCrunch noted that most cloud services — including Dropbox, Google, and Microsoft — already scan user files for content that might violate their terms of service or be potentially illegal. But Apple, it said, has long resisted scanning users’ files in the cloud by giving them the option to encrypt their data before it ever reaches Apple’s iCloud servers.
