Apple says safety feature to find child images doesn’t create backdoor that reduces privacy

  • Apple Inc defended its upcoming child safety features against criticism, saying it isn’t breaking end-to-end encryption with a new feature in the Messages app that analyzes photos sent to or from a child’s iPhone for explicit material, nor will the company gain access to user messages

By: Bloomberg | August 7, 2021

Apple Inc defended its upcoming child safety features against concerns, saying it doesn’t believe its tool for locating child pornographic images on a user’s device creates a backdoor that reduces privacy.

The Cupertino, California-based technology giant made the comments in a briefing Friday, a day after revealing new features for iCloud, Messages and Siri to combat the spread of sexually explicit images of children.

The company reiterated that it doesn’t scan a device owner’s entire photo library to look for abusive images, but instead uses cryptography to compare images with a known database provided by the National Center for Missing and Exploited Children.
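To make the distinction concrete, the approach Apple describes amounts to testing each photo’s fingerprint for membership in a fixed list of known-image hashes, rather than analyzing what a photo depicts. The following is a loose illustrative sketch in Python only: the helper names are hypothetical, an ordinary cryptographic hash stands in for Apple’s proprietary perceptual hash (NeuralHash), and a plain set lookup stands in for the blinded, encrypted matching protocol the company actually describes.

```python
# Illustrative sketch of on-device matching against a fixed database of
# known-image hashes. This is NOT Apple's implementation: the real system
# uses a proprietary perceptual hash (NeuralHash) and a blinded,
# cryptographic matching protocol, not a plain set lookup.

import hashlib
from pathlib import Path

# Stand-in for the database of hashes derived from known abusive images
# (provided, in Apple's system, by NCMEC). In the real design the device
# holds only a blinded form of this list.
KNOWN_IMAGE_HASHES: set[str] = {
    # hex digests distributed with the operating system would go here
}

def image_hash(path: Path) -> str:
    """Hypothetical stand-in for a perceptual hash of the image content."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_database(path: Path) -> bool:
    """True if this image's hash appears in the known-image database.

    Only membership in the fixed list is tested; the photo library is
    never analyzed for what its images depict.
    """
    return image_hash(path) in KNOWN_IMAGE_HASHES
```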

Some privacy advocates and security researchers were concerned after Apple’s announcement that the company would be scanning users’ complete photo collections. Instead, the company says, it uses an on-device algorithm to detect the known sexually explicit images.

Apple said it would manually review abusive photos from a user’s device only if the algorithm found a certain number of them. The company also said it can adjust the algorithm over time.
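That threshold behavior can be pictured as a simple counter gating any human review. Below is a minimal sketch continuing the hypothetical helpers above; the threshold value is an assumed placeholder, and in Apple’s actual design matched photos produce encrypted “safety vouchers” that the company says it cannot open until the threshold is exceeded.

```python
# Sketch of the threshold gate: nothing is surfaced for manual review
# until the number of matches crosses a tunable threshold. Reuses
# matches_known_database() from the previous sketch.

from pathlib import Path

MATCH_THRESHOLD = 30  # arbitrary placeholder; Apple said it can adjust this over time

def flag_for_review(photo_paths: list[Path]) -> bool:
    """Return True only if enough photos match the known-image database.

    In the real system, matched photos yield encrypted vouchers that
    Apple can decrypt only after the threshold is exceeded; this boolean
    gate is a loose analogy for that property.
    """
    match_count = sum(matches_known_database(p) for p in photo_paths)
    return match_count >= MATCH_THRESHOLD
```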

Apple said it isn’t breaking end-to-end encryption with a new feature in the Messages app that analyzes photos sent to or from a child’s iPhone for explicit material, nor will the company gain access to user messages. Asked in the briefing whether the new tools mean the company will add end-to-end encryption to iCloud storage backups, Apple said it wouldn’t comment on future plans. End-to-end encryption, the most stringent form of privacy, lets only the sender and receiver see a message sent between them.

On Thursday, the Electronic Frontier Foundation said that with the new tools, Apple is opening a backdoor to its highly touted privacy protections for users. “It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children,” the EFF said in a post on its website. “As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.”

Apple said the system had been in development for years and wasn’t built for governments to monitor citizens. The system is available only in the US, Apple said, and only works if a user has iCloud Photos enabled.

Dan Boneh, a cryptography researcher tapped by Apple to support the project, defended the new tools. “This issue affects many cloud providers,” he said. “Some cloud providers address this problem by scanning photos uploaded to the cloud. Apple chose to invest in a more complex system that provides the same functionality, but does so without having its servers look at every photo.”
