A California state lawmaker wants to ban strong encryption on the grounds that it facilitates human trafficking.
Assemblyman Jim Cooper’s bill would “require a smartphone that is manufactured on or after January 1, 2017, and sold in California, to be capable of being decrypted and unlocked by its manufacturer or its operating system provider.”
The bill would effectively outlaw Apple's latest version of its iOS mobile operating system, which encrypts data on the device by default with a form of protection that even Apple cannot break. Other California-based tech companies, including Android maker Google, are also moving toward strong encryption.
Cooper’s proposal mirrors a New York bill introduced last week that would also require smartphone manufacturers to design their encryption so that law enforcement could break it in the course of an investigation. But while the New York bill’s sponsor cited the threat of terrorism in calling for weakened encryption, Cooper offered a different rationale.
In an interview with Ars Technica, Cooper said that tech companies were stonewalling police and delaying or preventing them from catching human traffickers.
“If you’re a bad guy [we] can get a search record for your bank, for your house, you can get a search warrant for just about anything,” he said. “For the industry to say it’s privacy, it really doesn’t hold any water. We’re going after human traffickers and people who are doing bad and evil things. Human trafficking trumps privacy, no ifs, ands, or buts about it.”
Tech companies and security experts argue that this form of guaranteed access, which amounts to a backdoor in the encryption, jeopardizes Americans’ privacy and security by creating a portal that anyone—not just authorized government personnel—can exploit.
Since the 1990s, government officials have sparred with technologists over whether hardware and software makers should guarantee the government access to encrypted data. This fight, known as the “crypto wars,” gained new urgency late last year after terrorist attacks in Paris and San Bernardino, California.
Some intelligence officials, like FBI Director James Comey, have argued that terrorists hide their plans behind encryption—a phenomenon he calls “going dark”—and that tech companies have a duty to prevent this. Privacy advocates point out that both the United States and France operate mass-surveillance programs designed to pick up indicators of extremist activity and that the failure to prevent the attacks lies with these programs, not tech firms.
Gautam Hans, policy counsel at the Center for Democracy & Technology, called Cooper’s comments about human trafficking and privacy “reductive.”
“Creating vulnerabilities in devices in order to prevent human trafficking is a short-sighted solution—one that makes everyone less secure,” he said in an email. “I would much rather law enforcement use the multiple tools currently at their disposal.”
Ross Schulman, senior policy counsel at New America’s Open Technology Institute, said in an email that the California and New York bills “mistake encryption for the enemy when in fact encryption protects average people every day from identity theft, economic espionage, and fraud.”
The chairman and ranking member of the Senate Intelligence Committee are working on a bill that could ban strong encryption nationally. The ranking member, California Sen. Dianne Feinstein, has repeatedly bucked pressure from Silicon Valley in her home state and endorsed efforts to weaken encryption. She has also proposed other measures that the tech industry loathes, like a bill requiring companies to report terrorist propaganda on their networks.
Google did not respond to a request for comment about Cooper’s bill. Apple declined to comment. A spokesman for California Gov. Jerry Brown (D) did not respond to an email asking whether he would sign the bill if it reached his desk.
“Experts all the way up to the Director of the NSA agree that any gap in the armor weakens the entire system,” Schulman said. “State legislatures need to think very hard before they take [strong encryption] away.”