
Photo via Kārlis Dambrāns/Flickr

The FBI doesn’t need to override encryption if it can just buy existing flaws

The gray market will decide.

 

Kelsey Atherton


Posted on Apr 27, 2016   Updated on Feb 29, 2020, 8:01 am CST

Opinion

Every security system breaks at some point.

Monasteries built on islands in the 6th century were safe until Viking raiders attacked them from the sea in the 9th century. Medieval castle walls held strong against assault after assault until gunpowder propelled cannonballs through rigid stone. Machine guns and trenches held the Western Front static for years until better planning and new technologies like tanks let the Allies punch through once-impenetrable defenses.

In conflict, all defensive systems are eventually beaten, though historically the shift has taken decades or centuries to become visible. When it comes to technology security systems, that process is compressed into a cycle of years, if not months.

The practical implication is that when it comes to keeping information secure on a phone, the older the phone, the likelier it is that someone, somewhere has figured out a way to crack into it. This is especially true for smartphones, which are designed to be replaced by newer, presumably more secure models every year or so. While companies go to great lengths to maintain the security of current models, keeping security up to date requires the user to update the phone regularly, and it only works as long as the company supports the operating system.

All of this is a long way to say: In 2014, the FBI tried to use legal authority under the 1789 All Writs Act to get into a locked iPhone running iOS 7. The magistrate in the case said the FBI had no need to get into the phone, but the FBI kept pursuing the case until this week, trying to get an order to compel Apple to unlock the phone.

Sarah Jeong wrote at Motherboard:

The phone in question runs iOS 7, which means it’s less secure than the phone the FBI paid to break into in San Bernardino, which was an iPhone 5c running iOS 9. Breaking into the San Bernardino phone meant creating custom software that would take 10 to 12 engineers working full time for four to six weeks, according to Apple. But breaking into the phone in Shu Yong Yang, et al would be trivial.

Jeong cites the $200 hacking tool that can bypass iOS 7 security, which police have had access to for well over a year now. In its court order, the FBI wanted Apple to do that work for it, instead of finding and using the simple hacking tool.

Or at least, that’s what the Justice Department wanted to do until Friday, when it suddenly dropped the court order. One possibility, suggested by the FBI, is that the person who knew the passcode suddenly remembered it and told investigators, two years later. It’s also possible that the FBI bypassed the security.

In another recent high-profile case, the FBI alluded to paying at least $1.2 million to gain access to a locked phone. After testifying to its importance before Congress in late March, the FBI suddenly dropped a court order that would have compelled Apple to unlock a phone associated with the San Bernardino shooting. The reason wasn’t simple good will, or a lack of interest in the phone. No, instead the FBI paid a large sum to a group that had already discovered a security flaw in the phones. From Wired:

[FBI Director James] Comey said that all the controversy and attention around the San Bernardino case had “stimulated a bit of a marketplace around the world, which didn’t exist before then, for people to try and figure out if they could break into an Apple 5c running iOS 9.” 

As a result of that attention, “somebody approached us from outside the government and said, ‘we think we’ve come up with a solution.’”


Asked if the FBI is now crowdsourcing a solution to get into the latest version of iPhones, the iPhone 6 and 6s, Comey said no. “[I]t just doesn’t seem to make a lot of sense to me that the way we’re going to resolve a conflict that implicates values and our hardest work is that the government is going to try and pay lots of money to get people to break into devices and find vulnerabilities—that seems like a backwards way to approach it,” he said.

Much like the mercenary siege engineers of old, security profiteers live off the market for new weaknesses. While Congress waits to legislate new, clear rules for law enforcement and computer security, the gray market will decide.

And as the FBI found, if expense is no object, there’s no security that money can’t eventually overcome.

Kelsey D. Atherton is a Washington, D.C.-based technology journalist. His work appears regularly in Popular Science, and has appeared in Popular Mechanics and War Is Boring. Follow him on Twitter @AthertonKD.
