It seemed like Apple dodged a bullet.
One day before the technology giant was set to face off with the United States government over a federal court order demanding it help the FBI unlock a dead terrorist’s iPhone, the Department of Justice called the whole thing off.
The most high-profile case in a long-brewing war over government access to encrypted technology had just taken an unexpected turn. James Comey, director of the Federal Bureau of Investigation, had recently told U.S. lawmakers that the bureau could not unlock the iPhone 5C of San Bernardino shooter Syed Farook unless Apple built a custom operating system that would let federal agents flood the device with password guesses. The FBI gaining access to the locked and encrypted phone without Apple's assistance was thought to be all but impossible; compelling that assistance, Apple and its supporters in the legal fight argued, would set a dangerous precedent.
Civil-liberties advocates and computer security experts breathed a collective sigh of relief. Apple wouldn’t be forced to create software that intentionally weakens the security of its devices, potentially undermining the security of all iPhone users.
“When the government sits on (and quietly exploits) flaws in widely used software, it puts its own surveillance needs over the cybersecurity of the American public.”
Now, there's a new problem: The fact that the FBI found a way into the supposedly secure iPhone without Apple's help means Apple's mobile operating system, iOS, has a weakness that, if it falls into the wrong hands, could wreak havoc on millions of people around the world.
Security experts and Apple say the FBI has an obligation to tell the company how it hacked into Farook's iPhone, a move that would run counter to the law-enforcement agency's preference for investigative offense, so that its engineers can fix the now-famous weakness.
“If the government knows about vulnerabilities in software used by the general public, it should report the vulnerability to the developers responsible for the software, so that the public and its information can be kept as secure as possible from cyberattackers,” Christopher Soghoian, principal technologist and senior policy analyst at the American Civil Liberties Union, told The Daily Dot via email. “When the government sits on (and quietly exploits) flaws in widely used software, it puts its own surveillance needs over the cybersecurity of the American public.”
That the FBI found a security vulnerability, critics say, means criminals, oppressive regimes, and thieves can find and exploit it, too. If Apple knew about the flaw in its code, its engineers could patch the vulnerability, preventing anyone, police and criminals alike, from using it to hack into iPhone users' devices.
The FBI contends that access to encrypted communications and devices is necessary to protect the public from criminal and terrorist threats. But patching whatever flaw the bureau found wouldn't stop the FBI from using the data it uncovered by exploiting the vulnerability, said Susan Landau, a professor of social science and policy studies at Worcester Polytechnic Institute and a former senior privacy analyst at Google.
“Once they’ve seen there is a vulnerability that they can use (or in this case, broken into the phone), there is no loss to them in reporting the vulnerability to the vendor,” Landau said in an email. “Even if the vendor subsequently patches this problem, it will not stop law enforcement from accessing the data in the device they have already hacked.”
It's unknown what method the FBI used to gain access to Farook's phone, or who helped it do so. The prevailing, unconfirmed rumor is that an Israeli forensics contractor, Cellebrite, is behind the hack. Cellebrite has repeatedly declined to confirm or deny its cooperation with the FBI in this case, or to say whether it has a method to extract data from an encrypted iPhone.
The debate over vulnerability disclosures reached the highest levels of the U.S. government in 2014, when Michael Daniel, special assistant to the president and cybersecurity coordinator, wrote a blog post on the White House website effectively defending the government’s right to keep software vulnerabilities secret.
The post followed the discovery of Heartbleed, a fatal flaw in OpenSSL, the encryption library underpinning much of the Internet that protects sensitive data like bank logins and credit-card numbers. The flaw allowed attackers to snatch passwords and other data from vulnerable servers. In the hands of an extremely powerful agency like the NSA, it would be the ultimate tool for hacking a wide array of targets. The NSA denied having prior knowledge of the bug, and the Office of the Director of National Intelligence said at the time that “if the federal government, including the intelligence community, had discovered this vulnerability prior to last week, it would have been disclosed to the community responsible for OpenSSL.”
In his post, Daniel weighed the pros and cons of disclosing software vulnerabilities to technology providers. “As with so many national security issues, the answer may seem clear to some, but the reality is much more complicated,” Daniel wrote. “One thing is clear: This administration takes seriously its commitment to an open and interoperable, secure, and reliable Internet, and in the majority of cases, responsibly disclosing a newly discovered vulnerability is clearly in the national interest.”
Many would characterize Daniel’s claim as disingenuous. According to Soghoian, “all the available evidence we have regarding the FBI’s use of security exploits suggests that they hoard them for offensive purposes.”
Daniel goes on to discuss what's known as the “vulnerability equities process,” a system in which government agencies weigh the consequences of disclosing a discovered vulnerability against the benefits of exploiting it. This process, Soghoian said, “is stacked in favor of offense, and, in my view, fundamentally broken.”
Software vulnerabilities are not created equal. It’s possible the FBI’s exploit is a highly advanced technique that involves expensive tooling and extensive forensics experience, putting it out of the reach of a petty criminal who snatched an iPhone left on the subway. Does the FBI still have an obligation to expose its technique to Apple?
“That DOJ is sitting on an exploit impacting one of the most popular pieces of software in the world … is totally outrageous.”
Unequivocally yes, said Amie Stepanovich, U.S. policy manager at Access Now, a digital-rights advocacy group.
“Everybody will be vulnerable to it,” Stepanovich told the Daily Dot via email. “It’s just a matter of who can actually be targeted by it.”
As Stepanovich explained, “high-level people all over the world use these devices,” from company CEOs to government officials. “These are not only used by the layperson,” she said. “However, even the layperson could be targeted by bad actors for any number of reasons.”
Because of the risks at stake for all iPhone users, said Stepanovich, the FBI should report the vulnerability as quickly as possible.
“Vulnerabilities are never patched immediately,” Stepanovich said. “Patches have to be built, and that takes time, so every day that they are not reporting this vulnerability is another day that Apple will need to figure out how to patch it.”
A failure to report vulnerabilities immediately, Stepanovich argued, “is a dangerous proposition” that leaves millions of people vulnerable to attacks.
The question of whether the government should report security vulnerabilities to software manufacturers extends beyond Syed Farook's iPhone. Soghoian cited a ruling by a judge requiring the FBI to explain how exactly it was able to exploit the Tor browser, a modified version of Firefox that encrypts user traffic over the specialized Tor network, to hack visitors of a child-exploitation website. The FBI is pushing back against disclosing the exploit to the defense, something Soghoian calls outrageous.
“The government used an exploit impacting the Tor browser … in [February] of 2015. They are fighting defense efforts, one year later, to turn over their exploit code (to the defense, not to Mozilla), because they don’t want the vulnerability to be publicly disclosed and fixed by Mozilla/The Tor Project,” Soghoian said. “That DOJ is sitting on an exploit impacting one of the most popular pieces of software in the world, used by hundreds of millions of law-abiding people, is totally outrageous.”
Photo via Michael Himbeault/Flickr (CC BY 2.0) | Remix by Max Fleishman