
Why police want to outlaw Apple’s iPhone encryption

Do iPhone users deserve data protection?


Patrick Howell O'Neill

IRL

Posted on Sep 30, 2014. Updated on May 30, 2021, 12:09 pm CDT

There has been a whole lot of screaming and fearmongering over Apple and Google’s decisions to encrypt their smartphones by default.

The U.S. Department of Justice is now reportedly considering asking Congress to outlaw encryption that could lock out police, a federal law enforcement official told Bloomberg. This, despite the fact that encryption stymied a record nine (yes, nine) police investigations in the U.S. last year, according to federal records.

Apple, for its part, is using the added security of its products as a major selling point. “Unlike our competitors, Apple cannot bypass your passcode, and therefore cannot access this data,” the company said in its newly revised privacy policy.

While that might sound like a bit of marketing mumbo-jumbo, it has sparked outrage from law enforcement around the country. So, what does the new security actually mean for police—and for you?

For iPhone users, the goal is to give you more ownership over your own data. Apple has done this by fully encrypting any iPhone running iOS 8. As soon as you set a passcode on your device, everything on the phone is protected against all eavesdroppers, whether they be nefarious cybercriminals, eager neighborhood cops, or a pickpocket who nabbed your phone.
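Conceptually, the protection works something like the short Python sketch below: a key is derived from the passcode and used to encrypt the data, so without the passcode the ciphertext is unreadable. This is a minimal illustration of the general technique, not Apple’s actual implementation (on a real iPhone, the passcode is also entangled with a key fused into the device’s hardware, so guesses can’t even be attempted off the phone), and the function names here are ours.

```python
# Illustrative sketch only: passcode-derived encryption, not Apple's implementation.
import os
from hashlib import pbkdf2_hmac
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_with_passcode(passcode: str, plaintext: bytes) -> bytes:
    salt = os.urandom(16)  # random salt, stored alongside the ciphertext
    key = pbkdf2_hmac("sha256", passcode.encode(), salt, 1_000_000)  # slow, deliberate key derivation
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)  # 256-bit AES
    return salt + nonce + ciphertext

def decrypt_with_passcode(passcode: str, blob: bytes) -> bytes:
    salt, nonce, ciphertext = blob[:16], blob[16:28], blob[28:]
    key = pbkdf2_hmac("sha256", passcode.encode(), salt, 1_000_000)
    return AESGCM(key).decrypt(nonce, ciphertext, None)  # fails if the passcode is wrong
```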

The 256-bit AES encryption the iPhone offers is a formidable defense. That’s the same encryption the National Security Agency (NSA) uses for much of its own data. According to Wikipedia’s oft-cited estimate, even fifty supercomputers each checking a “billion billion” keys per second would need far longer than the age of the universe to try every possible 256-bit AES key.

Not that an attacker would ever get to grind through keys anyway: with Apple’s optional erase setting turned on, ten wrong passcode attempts wipe the device of all its data.
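For the curious, the arithmetic behind those claims is easy to check. The sketch below treats the attack rate as an assumption for illustration (fifty machines, each trying a billion billion keys per second), not a real benchmark.

```python
# Back-of-the-envelope arithmetic for brute-forcing a 256-bit AES key.
KEYSPACE = 2 ** 256                   # possible 256-bit keys, about 1.2e77
RATE = 50 * 10 ** 18                  # assumed: 50 machines, a billion billion keys/sec each
SECONDS_PER_YEAR = 365 * 24 * 60 * 60

years_to_exhaust = KEYSPACE / (RATE * SECONDS_PER_YEAR)
print(f"~{years_to_exhaust:.1e} years to try every key")  # roughly 7e49 years

# In practice the passcode, not the AES key itself, is the target; and with the
# erase setting described above, an attacker gets at most ten guesses at it.
MAX_PASSCODE_ATTEMPTS = 10
```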

The strength of the defense has been slammed by, among other prominent critics, FBI director James B. Comey, who said it would place users “beyond the law.”

Despite Comey’s protests and the media’s insistence on calling the encryption “NSA-proof,” Apple’s newly secured mobile operating system also keeps out hackers and criminals, groups that Apple users have historically had problems with, most recently in the headline-making Celebgate breach earlier this month, which resulted in hundreds of nude photos of celebrities landing on the Internet.

If Apple can’t build products that people believe are at least as secure as its competitors’, it will lose customers faster than you can say “hacked again.”

“More or less by definition, a backdoor for law enforcement is a deliberately introduced security vulnerability,” Julian Sanchez wrote for the Cato Institute, “a form of architected breach: It requires a system to be designed to permit access to a user’s data against the user’s wishes, and such a system is necessarily less secure than one designed without such a feature.”

“Once you build a back door, you rarely get to decide who walks through it,” Electronic Frontier Foundation activist Eva Galperin explained.

This isn’t simply about locking out the police. It’s about locking out everyone, and giving owners (and no one else) the key to their own data.

And yet this does clearly affect police. Instead of strolling into Apple headquarters with a warrant and getting secret access to the phone, cops will have to take alternate routes. In other words, their jobs just got a bit harder.

First, they’ll have to present a warrant directly to the suspect. Instead of circumventing the alleged criminal, police will take their court order to the suspect’s front door. If a suspect refuses to comply with a warranted search, police still have legal options, most notably holding the suspect in contempt of court. No one can refuse a valid search warrant without risking a severe penalty, so it’s not as though suspects will walk away with a polite “No thanks.”

Second, police can still legally access data in transit. While data on the iPhone itself will be encrypted by default, data flowing over the phone’s cellular connection is still readily interceptable given the proper court order, just as it was with previous versions of iOS.

The second point can’t be emphasized enough: once data leaves the phone, it is far easier to reach. Apple’s cloud storage service, iCloud, remains relatively easy for Apple, and for police armed with a court order, to access. Many people presumably use the service to store exactly the sort of files and contacts that would be useful in a police investigation.

The iPhone’s new default encryption isn’t altogether unheard of. Virtually all of your devices, from laptops to phones, have offered serious encryption options for years, which makes Comey’s criticisms seem overwrought. What’s genuinely groundbreaking is turning that protection on by default.

The importance of every step toward making encryption easier and more accessible shouldn’t be underestimated. Citizens, criminals and otherwise, will now have significant data protection against thieves and police alike, regardless of how tech-savvy they are.

The question is, do you think citizens deserve to have their data protected by default?

Photo by marco_1186/Flickr (CC BY 2.0)
