
FBI asks Meta to not implement end-to-end encryption, says it will make its job harder

‘A purposeful design choice that degrades safety systems and weakens the ability to keep child users safe.’


Jacob Seitz


A global law enforcement consortium is urging Meta not to go through with plans to fully encrypt messaging on the company’s platforms, saying the system could be used to share and spread child sexual abuse material (CSAM).


The Virtual Global Taskforce, a group of 15 global law enforcement agencies focused on child sexual abuse, said in a statement posted to the United Kingdom National Crime Agency’s website that Meta’s encryption plans could result in the company “blindfolding themselves” to CSAM on their platforms.

The task force includes two U.S. agencies, the FBI and ICE, along with Interpol, the U.K.'s National Crime Agency, and agencies from Colombia, Australia, Kenya, and the Philippines.

“Meta is currently the leading reporter of detected child sexual abuse to the [National Center for Missing & Exploited Children],” the task force said. According to the group, Meta hasn’t provided any indication “that any new safety systems implemented post-E2EE [End-to-End Encryption] will effectively match or improve their current detection methods.”


Meta announced plans in January to fully encrypt messages sent in Facebook Messenger, which would prevent third parties, including law enforcement, from accessing messages sent between users. Meta already rolled out end-to-end encryption on Instagram last year. In a statement to Ars Technica, a Meta spokesperson said the company plans to make all messages encrypted by default by the end of this year.
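In broad strokes, end-to-end encryption works like the sketch below: each user's private key stays on their own device, so the platform relaying a message only ever handles ciphertext it cannot read. This is an illustrative example using Python's cryptography library, not Meta's actual implementation; the key-exchange and cipher choices are assumptions made for the demo.

```python
# Rough sketch of end-to-end encryption, not Meta's actual implementation.
# Private keys never leave each user's device; the relaying server only sees ciphertext.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF


def derive_shared_key(my_private: X25519PrivateKey, their_public) -> bytes:
    """Both parties derive the same symmetric key from a Diffie-Hellman exchange."""
    secret = my_private.exchange(their_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None, info=b"e2ee-demo").derive(secret)


# Each user generates a key pair on their own device; only public keys are shared.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

# Alice encrypts on her device; the ciphertext is all the platform ever relays.
key = derive_shared_key(alice_private, bob_private.public_key())
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"hello, Bob", None)

# Bob derives the same key on his device and decrypts the message.
print(AESGCM(derive_shared_key(bob_private, alice_private.public_key())).decrypt(nonce, ciphertext, None))
```

Because the decryption keys exist only on the two endpoints, anyone in the middle, including the platform itself, sees only scrambled bytes, which is precisely what the task force objects to.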

“The announced implementation of E2EE on Meta platforms Instagram and Facebook is an example of a purposeful design choice that degrades safety systems and weakens the ability to keep child users safe,” the group said.

The statement comes months after Meta launched new initiatives to prevent CSAM on its platforms. In February, the company announced that Facebook and Instagram would help found Take It Down, a new platform designed to assign unique hash codes to underage users’ private images and aid in their removal from the internet. 

The company never possesses or sees the images; instead, Take It Down assigns codes to photos the user selects from their camera roll. The program then submits the hash codes to NCMEC, which distributes them to other participating social media companies and can prevent the sharing of underage private images on those platforms.
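In rough terms, the flow can be sketched as follows: the hash is computed on the user's own device, and only that hash, never the photo, is submitted and distributed for matching. The file names below are hypothetical, and a plain SHA-256 digest stands in for whatever hashing scheme the real service uses, which the article does not detail.

```python
# Illustrative sketch of hash-based matching; only hashes, never images, leave the device.
# A plain SHA-256 digest stands in here for the service's actual hashing scheme (an assumption).
import hashlib
from pathlib import Path


def hash_image(path: Path) -> str:
    """Hash the image bytes locally; the photo itself is never uploaded."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def should_block_upload(candidate: Path, shared_hashes: set[str]) -> bool:
    """A participating platform compares an uploaded file's hash against the shared list."""
    return hash_image(candidate) in shared_hashes


# Hypothetical flow: the user's device submits a hash, and NCMEC distributes it to platforms.
shared_hashes = {hash_image(Path("selected_photo.jpg"))}  # hypothetical local file
print(should_block_upload(Path("selected_photo.jpg"), shared_hashes))  # True -> upload blocked
```

One consequence of matching on an exact digest like this is that any re-capture of the image, such as a screenshot, produces different bytes and therefore a different hash, which is the gap NCMEC says it is working to close.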


According to a statement from NCMEC, the organization is working on “future iterations of the program” to accommodate screenshotted images, which at this point need their own unique hash codes. The organization also said that the hash codes cannot be traced back to a user’s device or personal information.

Take It Down builds on Meta’s StopNCII tool, which operates on a similar concept for adults in the U.K.
