Apple's Plan to Scan iPhones for Child Abuse Pushes the Boundaries of Privacy Protection
Apple is working with US law enforcement to implement “on-device machine learning” for child safety. The damage that a single pedophile can do is tragic, and while the prevalence of pedophilia in the general population is not known, Wikipedia estimates it at less than 6% of US adult males. Less is known about its prevalence among women, though there are case reports of women with strong sexual fantasies and urges towards children.
The monitoring of photo collections, with automatic alerts to law enforcement, can easily shift from something constructive to an intrusion on the privacy rights against unlawful search established in the Constitution.
- According to TechCrunch, which was briefed by Apple on how the technology will work, parents will be warned if children under 13 send or receive what Apple identifies as sexually explicit photos in Messages (a sketch of that logic follows this list).
- But there’s significant backlash, not only because of what this system actually does, but because Apple has talked so much about privacy and encryption.
- Security experts are raising privacy concerns about how governments could use, and expand the scope of, technology like this.
- Apple posted a blog post entitled “Expanded protections for children,” laying out plans to help curb child sexual abuse material (CSAM). The new “child safety features” will come to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey, and Apple will activate them in the next few months for people in the US.
- Privacy experts responded sharply. “This will break the dam — governments will demand it from everyone,” said Matthew Green, a security professor at Johns Hopkins University. Alec Muffett, a security researcher and privacy campaigner who formerly worked at Facebook, noted on Twitter that Apple’s move was “tectonic” and a “huge and regressive step for individual privacy.”
- A tech reporter tweeted about another valid concern.
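To make the Messages portion of the announcement concrete, here is a minimal sketch of the decision logic Apple has described: an on-device classifier flags a sexually explicit image, the photo is blurred with a warning to the child, and a parent notification is only triggered for opted-in child accounts under 13. Everything here is illustrative, not Apple’s actual API; the `isExplicit` flag stands in for the on-device machine-learning model.

```swift
import Foundation

// Illustrative sketch of the communication-safety check described for Messages.
// `ChildAccount`, `MessagePhotoAction`, and `actionForIncomingPhoto` are
// hypothetical names, not Apple's API; `isExplicit` stands in for the
// on-device ML classifier.

struct ChildAccount {
    let age: Int
    let parentalNotificationsEnabled: Bool
}

enum MessagePhotoAction {
    case deliverNormally
    case blurAndWarnChild
    case blurWarnChildAndNotifyParent
}

func actionForIncomingPhoto(isExplicit: Bool, account: ChildAccount) -> MessagePhotoAction {
    // Non-flagged photos pass through untouched.
    guard isExplicit else { return .deliverNormally }
    // Per Apple's description, parents are only alerted for children under 13
    // whose accounts have the feature enabled.
    if account.age < 13 && account.parentalNotificationsEnabled {
        return .blurWarnChildAndNotifyParent
    }
    return .blurAndWarnChild
}
```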
Scrutiny of Apple’s initiative grew louder over the weekend as an open letter criticizing the move collected more than 5,400 signatures from tech experts. WhatsApp head Will Cathcart said the app doesn’t plan to replicate Apple’s systems because “the approach they are taking introduces something very concerning into the world.”
Then Apple’s head of user privacy, Erik Neuenschwander, held a press briefing to clear the air. According to him, Apple’s approach does not amount to traditional scanning, in which the company’s servers would view every file on the iPhone and learn its contents. Instead, the company uses on-device processing: the iPhone compares each photo being uploaded against a national database of indexed child sexual abuse material (CSAM) and generates a cryptographic “safety voucher” recording whether the image matches a known CSAM file already circulating on the internet. The voucher is then stored on Apple’s servers, but it does not itself reveal the contents of the image file; it can only be decrypted once the iCloud account crosses a threshold of suspected CSAM matches.
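As a rough illustration of the voucher flow described above, the sketch below hashes a photo on-device, checks it against a set of known-CSAM hashes, and seals the result into an encrypted “safety voucher” that reveals nothing on its own. The names (`OnDeviceMatcher`, `SafetyVoucher`, `knownHashes`) are hypothetical; SHA-256 stands in for Apple’s perceptual NeuralHash, and a single symmetric key stands in for the threshold cryptography Apple actually uses, so this only models the rule, not the real protection.

```swift
import Foundation
import CryptoKit

/// Opaque record uploaded alongside a photo; it reveals nothing by itself.
struct SafetyVoucher {
    let photoID: UUID
    let encryptedMatchPayload: Data
}

struct OnDeviceMatcher {
    /// Stand-in for the indexed database of known CSAM hashes shipped to the device.
    let knownHashes: Set<Data>
    let voucherKey: SymmetricKey

    /// Placeholder hash; a real system uses a perceptual hash robust to resizing and recompression.
    func imageHash(_ imageData: Data) -> Data {
        Data(SHA256.hash(data: imageData))
    }

    /// Compare the photo against the database and seal the match result into a voucher.
    func makeVoucher(for imageData: Data) throws -> SafetyVoucher {
        let matched = knownHashes.contains(imageHash(imageData))
        let flag: [UInt8] = [matched ? 1 : 0]
        let payload = try ChaChaPoly.seal(Data(flag), using: voucherKey).combined
        return SafetyVoucher(photoID: UUID(), encryptedMatchPayload: payload)
    }
}

/// Server-side stand-in for the threshold rule: nothing is surfaced until enough
/// vouchers match. In Apple's design the server cannot open individual vouchers
/// at all; this helper only models the counting behavior, not the cryptography.
func reviewableVouchers(_ vouchers: [SafetyVoucher],
                        key: SymmetricKey,
                        threshold: Int) throws -> [SafetyVoucher] {
    let matches = try vouchers.filter { voucher in
        let box = try ChaChaPoly.SealedBox(combined: voucher.encryptedMatchPayload)
        return try ChaChaPoly.open(box, using: key).first == 1
    }
    return matches.count >= threshold ? matches : []
}
```

In this sketch, a caller would construct `OnDeviceMatcher` with the shipped hash database and call `makeVoucher(for:)` for each photo queued for iCloud upload; only when the match count crosses the threshold does anything become reviewable.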
How does Neuenschwander’s explanation persuade Fourth Amendment advocates, or anyone else, that Apple is not conducting an illegal search and seizure? It is the equivalent of a stranger entering your house unannounced, searching for “something,” and putting that “something” in a bag that only they can open. We support the fight against child pornography and the protection of children, but this technique starts down a very slippery slope.
Barry Young