
White House defends right to keep cybersecurity vulnerabilities secret

"Could we utilize the vulnerability for a short period of time before we disclose it?"

 

Andrew Couts


Posted on Apr 28, 2014   Updated on May 31, 2021, 9:54 am CDT

In an attempt at greater transparency, the White House on Monday defended the federal government’s right to withhold public disclosure of cybersecurity vulnerabilities, like the recent Heartbleed bug, when doing so is in the interest of U.S. national security.

The Obama administration also listed a number of questions the U.S. government says it considers before concealing cybersecurity flaws.

The statement, published on the official White House blog by President Obama’s cybersecurity coordinator, Michael Daniel, follows strong assertions from both the National Security Agency and the White House that, contrary to reports, the federal government had no knowledge of Heartbleed prior to its public disclosure on April 7.

Statement: NSA was not aware of the recently identified Heartbleed vulnerability until it was made public.

— NSA/CSS (@NSA_PAO) April 11, 2014

Discovered by researchers at Google and Codenomicon, Heartbleed is a massive security flaw in OpenSSL, the open-source encryption library used by as much as two-thirds of the Web’s servers, that experts say has put vast amounts of private data at risk since it first appeared in the code in 2012.
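At its core, the flaw was a missing bounds check in OpenSSL’s handling of the TLS heartbeat extension: a client could claim its heartbeat payload was far longer than it actually was, and the server would echo back that many bytes, leaking whatever happened to sit nearby in memory. The sketch below is a simplified Python model of that pattern, not the actual OpenSSL code (which is written in C); the function and variable names are illustrative only.

    import struct

    def handle_heartbeat(request, adjacent_memory, patched=False):
        # The first two bytes are a client-supplied payload length; the rest is the payload.
        (claimed_len,) = struct.unpack(">H", request[:2])
        payload = request[2:]
        if patched and claimed_len > len(payload):
            return b""  # the fix: silently discard requests whose claimed length is too big
        # Vulnerable path: read past the real payload into whatever sits next to it in memory.
        heap = payload + adjacent_memory
        return heap[:claimed_len]

    # A client claims a 65,535-byte payload but sends only five bytes.
    request = struct.pack(">H", 65535) + b"hello"
    secrets = b"...session keys, passwords, certificates..."
    print(handle_heartbeat(request, secrets))                # leaks the adjacent "memory"
    print(handle_heartbeat(request, secrets, patched=True))  # returns nothing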

In the blog post, Daniel says that there are “legitimate pros and cons” to revealing cybersecurity vulnerabilities—a stance that leaves some cybersecurity experts uneasy.

Here’s the key quote:

We rely on the Internet and connected systems for much of our daily lives. Our economy would not function without them. Our ability to project power abroad would be crippled if we could not depend on them. For these reasons, disclosing vulnerabilities usually makes sense. We need these systems to be secure as much as, if not more so, than everyone else.

But there are legitimate pros and cons to the decision to disclose, and the trade-offs between prompt disclosure and withholding knowledge of some vulnerabilities for a limited time can have significant consequences. Disclosing a vulnerability can mean that we forego an opportunity to collect crucial intelligence that could thwart a terrorist attack, stop the theft of our nation’s intellectual property, or even discover more dangerous vulnerabilities that are being used by hackers or other adversaries to exploit our networks.

While the U.S. government has a history of using cyberweapons, Daniel’s statement does not necessarily mean the U.S. is hiding a whole host of vulnerabilities from the public. In February, the White House launched President Obama’s “Cybersecurity Framework,” first revealed in 2013 and touted during that year’s State of the Union Address, which aims to strengthen U.S. critical infrastructure networks by gathering “existing global standards and practices to help organizations understand, communicate, and manage their cyber risks.” Vulnerability disclosures are among those “standards and practices.”

The White House further supported cybersecurity disclosures on April 10—just days after news of the Heartbleed bug hit the Web—by announcing that companies could share cybersecurity vulnerabilities without violating antitrust laws.

Daniel, who appears to reference that move in Monday’s post, wrote of the legal clarification at the time:

We know sharing threat information is critical to effective cybersecurity. Indeed, reducing barriers to information sharing is a key element of this Administration’s strategy to improve the nation’s cybersecurity, and we are aggressively pursuing these efforts through both executive action and legislation. Today’s announcement makes clear that when companies identify a threat, they can share information on that threat with other companies and help thwart an attacker’s plans across an entire industry.

Indeed, Daniel says that it is not in the U.S. national security interest to “stockpile” vulnerabilities. But that doesn’t mean it’s in the U.S. interest to reveal all of them, either.

“Building up a huge stockpile of undisclosed vulnerabilities while leaving the Internet vulnerable and the American people unprotected would not be in our national security interest,” writes Daniel. “But that is not the same as arguing that we should completely forgo this tool as a way to conduct intelligence collection, and better protect our country in the long-run.”

To further the spirit of transparency, Daniel has released a list of questions the government ostensibly asks itself before deciding which vulnerabilities to disclose and which to keep secret. They are as follows:

  • How much is the vulnerable system used in the core Internet infrastructure, in other critical infrastructure systems, in the U.S. economy, and/or in national security systems?

  • Does the vulnerability, if left unpatched, impose significant risk?

  • How much harm could an adversary nation or criminal group do with knowledge of this vulnerability?

  • How likely is it that we would know if someone else was exploiting it?

  • How badly do we need the intelligence we think we can get from exploiting the vulnerability?

  • Are there other ways we can get it?

  • Could we utilize the vulnerability for a short period of time before we disclose it?

  • How likely is it that someone else will discover the vulnerability?

  • Can the vulnerability be patched or otherwise mitigated?

Matthew Prince, cofounder and CEO of Web security and performance company CloudFlare, one of the first companies to respond to the Heartbleed bug, says this approach to cybersecurity is “slippery” due to the potentially conflicting objectives of the NSA, FBI, and other government agencies tasked with protecting national security through both offensive and defensive means.

“I think what you see in this statement is the inherent tension between those two purposes,” Prince told the Daily Dot in a phone interview. “And so long as those purposes are housed under those same organizations, those tensions are going to be present.”

While these opposing objectives may be understandable, says Prince, the refusal to disclose some vulnerabilities inevitably leaves U.S. companies and Internet users at risk.

“Inherently, every vulnerability that the NSA exploits on an offensive basis is potentially a vulnerability that U.S.-based organizations could fall victim to themselves from some attacker that the NSA’s charter states that it should be helping protect, from a defensive basis,” Prince says.

A better approach would be for the NSA to regularly disclose vulnerabilities directly to the cybersecurity and research communities, Prince suggests, and help them roll out patches to fix those flaws.

Prince says he could not recall an instance in which the NSA disclosed a vulnerability to CloudFlare employees, though he notes that other federal agencies, such as the National Institute of Standards and Technology (NIST), may have exhibited this level of transparency in the past.

The real value of Monday’s statement, both Daniel and Prince suggest, lies not in any admission of secrecy or change in policy, but in the fact that the public has any insight into the NSA’s operations at all.

“This is an organization that, until recently, the existence of which was not acknowledged,” Prince adds. “What is notable about the statement is that there was a statement. That is a marked change. And I think that’s a positive outcome—though there’s a long way to go from here.”

Illustration by Jason Reed
