EU’s plan to scan phones for child abuse material is even more invasive than Apple’s disastrous idea

'This document is the most terrifying thing I’ve ever seen.'


Jacob Seitz

Tech

Posted on May 12, 2022   Updated on May 16, 2022, 10:08 am CDT

A new European Union proposal would require tech companies to scan encrypted messages for child sex abuse material (CSAM), angering privacy advocates.

“With 85 million pictures and videos depicting child sexual abuse reported worldwide in 2021 alone, and many more going unreported, child sexual abuse is pervasive,” the proposal announcement said. “The current system based on voluntary detection and reporting by companies has proven to be insufficient to adequately protect children and, in any case, will no longer be possible once the interim solution currently in place expires.”

The proposal, released yesterday, would require “providers to detect, report and remove child sexual abuse material” and to “assess and mitigate the risk of misuse of their services and the measures taken must be proportionate to that risk and subject to robust conditions and safeguards.”

The legislation would require scanning not only for CSAM images but also for grooming material, which would mean scanning the text of messages. Messaging platforms like iMessage and WhatsApp use end-to-end encryption, which makes this type of scanning impossible without weakening that encryption or moving the scanning onto users’ devices.

WhatsApp head Will Cathcart said in a tweet that he was “incredibly disappointed” that an EU internet bill would “fail to protect end-to-end encryption.”

“As is, this proposal would force companies to scan every person’s messages and put EU citizens’ privacy and security at serious risk,” he said. “This is wrong and inconsistent with the EU’s commitment to protecting privacy.”

Cathcart said the new proposal is similar to a screening policy Apple proposed last year, which would have had the company scan users’ photos for CSAM.

“Building a surveillance system to proactively scan all private content was a terrible idea when Apple proposed it and it’s a terrible idea now,” he said.

Apple’s plan—which was announced last year but then quietly shelved—would have attempted to keep end-to-end encryption intact by scanning for CSAM on devices, rather than intercepting or decrypting messages. Still, the proposal was bashed by critics. Over 90 internet rights groups signed a letter to Apple CEO Tim Cook saying that, if the proposal were finalized, Apple would have “laid the foundation for censorship, surveillance, and persecution on a global basis.”
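The on-device approach can be sketched roughly as follows. This is an illustrative simplification, not Apple’s actual implementation: Apple’s system used a perceptual hash called NeuralHash, which matches visually similar images, whereas the standard SHA-256 hash used here only matches byte-identical files. The function names and sample data are hypothetical.

```python
import hashlib

def image_fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash: SHA-256 is used here purely for
    # illustration and only matches exact byte-for-byte copies.
    return hashlib.sha256(image_bytes).hexdigest()

# A device-side blocklist of fingerprints for known illegal images,
# distributed by the provider (illustrative values, not real data).
known_bad_fingerprints = {
    image_fingerprint(b"example-known-bad-image"),
}

def scan_on_device(photo_library: list[bytes]) -> list[int]:
    """Return indices of photos whose fingerprint is on the blocklist.

    The check runs on the device before anything is sent, so message
    encryption itself is never broken -- which is why critics describe
    this as client-side scanning rather than decryption.
    """
    return [
        i for i, photo in enumerate(photo_library)
        if image_fingerprint(photo) in known_bad_fingerprints
    ]

photos = [b"vacation-photo-bytes", b"example-known-bad-image", b"cat-photo-bytes"]
print(scan_on_device(photos))  # -> [1]
```

The design choice critics objected to is visible even in this toy version: once the matching logic runs on every device, the blocklist’s contents are whatever the provider (or a government pressuring it) puts there.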

Matthew Green, a cryptographer at Johns Hopkins, called the EU’s proposal “the most terrifying thing I’ve ever seen” in a series of tweets.

“Let me be clear what that means: to detect ‘grooming’ is not simply searching for known CSAM,” he said. “It isn’t using AI to detect new CSAM, which is also on the table. It’s running algorithms reading your actual text messages to figure out what you’re saying, at scale.”

The proposal will be voted on by the European Parliament and the Council, but it is unclear when that vote will take place.

