Apple has said its announcement of automated tools to detect child sexual abuse material on iPhone and iPad caused widespread confusion.
On August 5, the company announced new image-detection software that can alert Apple when known illegal images are uploaded to its iCloud storage. Privacy groups criticized the news, and some claimed that Apple had created a security backdoor in its software.
The company said its announcement had been widely “misunderstood” and that it wished the plans had come out more clearly for everyone.
Speaking to The Wall Street Journal afterward, Apple software chief Craig Federighi said that introducing two features at the same time was “a recipe for this kind of confusion.”
Image detection and message filtering
When users upload photos to iCloud storage, the first tool can identify known child sexual abuse material (CSAM). The National Center for Missing and Exploited Children (NCMEC) maintains a database of known illegal child abuse images, stored in the form of hashes: digital “fingerprints” of the illegal material.
Cloud service providers such as Facebook, Google, and Microsoft already check images against these hash values to ensure that people are not sharing CSAM. Apple decided to implement a similar process, but said the matching would happen on the user’s iPhone or iPad itself, before an image is uploaded to iCloud.
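As a rough illustration of how this kind of hash matching works, the Python sketch below compares an image’s digest against a set of known fingerprints before upload. Apple’s actual system uses a perceptual hash (NeuralHash) so that resized or re-encoded copies still match; the SHA-256 digest, the KNOWN_HASHES set, and the function names here are simplified stand-ins, not Apple’s implementation.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for the NCMEC hash database. The sample entry
# is just the SHA-256 digest of empty input, used purely as a placeholder.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def fingerprint(image_path: Path) -> str:
    """Compute a hex-digest 'fingerprint' of an image file.
    (Real systems use a perceptual hash, not a cryptographic one.)"""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def matches_known_material(image_path: Path) -> bool:
    """Check an image's fingerprint against the known-hash set
    before the image would be uploaded to cloud storage."""
    return fingerprint(image_path) in KNOWN_HASHES
```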
Apple regrets confusion over 'iPhone scanning' https://t.co/HLmiBytn4N
— BBC News (UK) (@BBCNews) August 13, 2021
Federighi said the system would not flag photos of a user’s own children in the bath, or any other pornography, because it can only match the “exact fingerprints” of specific known child abuse images. Apple would only be alerted, he said, if a user tried to upload multiple matching images.
According to Federighi, a user would have to upload roughly 30 matching images before the system is triggered; only then is the account flagged so that those specific images can be reviewed. In addition to the iCloud tool, Apple also announced a parental control that users can activate on their children’s accounts.
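A minimal sketch of that reporting threshold follows, reusing the hypothetical matches_known_material helper from the sketch above. The UploadScanner class and its flagging logic are illustrative assumptions, since Apple has not published this code.

```python
class UploadScanner:
    """Toy model of the per-account threshold Federighi describes:
    individual matches accumulate silently, and the account is only
    flagged for review after roughly 30 known images are uploaded."""

    MATCH_THRESHOLD = 30  # approximate figure given in the interview

    def __init__(self) -> None:
        self.match_count = 0

    def record_upload(self, image_path) -> bool:
        """Returns True once the account should be flagged for review."""
        if matches_known_material(image_path):  # helper sketched above
            self.match_count += 1
        return self.match_count >= self.MATCH_THRESHOLD
```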
When this feature is enabled, the system reviews photos sent to or by the child through Apple’s iMessage app. If the machine-learning system determines that a photo contains nudity, it obscures the image and warns the child. Should the child decide to view the photo anyway, the parent will also receive a notification.
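The decision flow described above can be sketched as follows. The contains_nudity flag stands in for an on-device machine-learning classifier, and all names here are hypothetical rather than Apple’s API.

```python
from dataclasses import dataclass

@dataclass
class IncomingPhoto:
    data: bytes
    contains_nudity: bool  # stand-in for an on-device ML classifier

def handle_child_photo(photo: IncomingPhoto, child_chooses_to_view: bool) -> str:
    """Sketch of the iMessage flow: flagged photos are blurred first,
    and the parent is notified only if the child views one anyway."""
    if not photo.contains_nudity:
        return "shown normally"
    if child_chooses_to_view:
        return "shown after warning; parent notified"
    return "kept obscured; child warned"
```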
Criticism
Privacy groups worry that the technology could be expanded and eventually used by authoritarian governments to monitor their citizens. WhatsApp head Will Cathcart called Apple’s move “very concerning,” and US whistleblower Edward Snowden called the iPhone a “spy phone.”
Federighi said the “soundbite” that spread after the announcement was that Apple was scanning iPhones for images. “That is not what is happening,” he told The Wall Street Journal.
“We feel very positively and strongly about what we are doing, and we can see that it has been widely misunderstood,” he said. The tools are due to be added to new versions of iOS and iPadOS later this year.
Image courtesy of pal2tech/YouTube