When the Obama administration released its major cybersecurity plan, laying out steps to modernize federal computer systems and prevent another devastating attack like the Office of Personnel Management data breach, it assigned critical tasks to an agency that most Americans likely do not know exists.
The National Institute of Standards and Technology (NIST), a small agency in the Department of Commerce, is a crucial player in the White House’s Cybersecurity Strategy and Implementation Plan, or CSIP, which tasks NIST with recommending ways for federal agencies to recover from “cyber events.”
The plan “directs NIST to provide guidance to agencies by June 30, 2016, on how to recover from a cyber event, focusing on potential scenarios to include, but not limited to, a data breach or a destructive malware campaign.”
Search the 21-page document for “NIST” and you will quickly realize how important the agency is to federal cybersecurity practices. NIST’s work forms the basis for the structure and configuration of every government computer system, from lists of federal student loan recipients to databases of power plant inspection reports.
“The CSIP emphasizes the government-wide adherence to NIST standards and guidelines,” the plan’s authors wrote in their introduction. They also ordered agencies to protect information systems using “policies, processes, and tools, consistent with applicable [Office of Management and Budget] guidance and NIST standards.”
But what are NIST standards? What does this agency do? Far more than you might imagine. In addition to its work on cybersecurity, NIST may be about to change how government and businesses tackle privacy online.
What is NIST?
NIST, based in Gaithersburg, Maryland, plays a key role in creating the framework for how the federal government operates. Since its founding in 1901, NIST has developed standards and processes for government agencies, from encryption algorithms to electromagnetic spectrum measurement systems. The agency also works with various industries and the scientific research community. Americans use some of its standards every day without even realizing it.
NIST’s work on cybersecurity began in the late 1970s, before most people had ever heard the word “cybersecurity”—before most people had a computer at all.
“Our first foray into this really was in the area of cryptography,” said Donna Dodson, NIST’s chief cybersecurity adviser. In the 1970s, with the commercial Internet two decades away, information technology and the security it demanded were largely the domain of the military. There had been some civilian work on cryptography, but it had not been peer-reviewed, meaning that independent experts weren’t sure how good it was. NIST, which was called the National Bureau of Standards until 1988, stepped in to help both the government and the private sector develop stronger encryption.
NIST’s cryptography research eventually produced the Advanced Encryption Standard (AES), among the agency’s most ubiquitous products. AES powers the BitLocker software in Microsoft Windows, the FileVault software in Apple’s Mac OS X, and third-party security tools like the LastPass password manager.
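To see what those products are doing under the hood, here is a minimal sketch of AES encryption and decryption, using the third-party Python `cryptography` package (the library, key, and message are illustrative choices for this example, not anything prescribed by NIST):

```python
# AES in authenticated GCM mode, via the "cryptography" package
# (pip install cryptography). Illustrative only.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # a 256-bit AES key
aesgcm = AESGCM(key)
nonce = os.urandom(12)                     # 96-bit nonce; never reuse with the same key

plaintext = b"sensitive records"
ciphertext = aesgcm.encrypt(nonce, plaintext, None)   # encrypt and authenticate
recovered = aesgcm.decrypt(nonce, ciphertext, None)   # fails loudly if tampered with
```

Disk-encryption tools like BitLocker and FileVault perform essentially this operation transparently on every block they read or write.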
Of course, you’re unlikely to find someone who could identify AES as the cryptography keeping them safe—or who could name NIST as its progenitor. But that suits the people at NIST just fine; at least the standard is out there. “I would not suggest that the average citizen recognizes that they have AES inside their computer or inside their phone,” Dodson said. “But it is a security control that sometimes they’re using and not really knowing it’s NIST’s [work] inside there.”
After NIST writes a standard, the Secretary of Commerce must sign off on it before it becomes an official federal information processing standard (FIPS), which agencies must follow when it applies to their work. “If you’re going to use encryption,” Dodson said, “then you need to use one of our standards on cryptography.”
Despite its crucial role in laying the foundation for the way the federal government conducts its business, NIST is not a regulatory agency and cannot force other agencies to follow its standards. Dodson expressed the common view inside NIST that its non-regulatory role benefited its work, saying it “allows us to work closely with industry, academia, and government.” NIST employees interviewed for this story shied away from expressing a desire for more authority over the rest of the government.
“Since we are not experts on all agencies’ missions and in their mission space,” Dodson said, “we can’t say to them, ‘Absolutely, you must do this.’”
A framework for blocking cyberattacks
While most Americans rarely think about cybersecurity threats unless they’re directly affected, NIST’s information-technology division is working every day to help IT workers and senior managers close doors before rogue actors sneak through them. Dodson said that it was imperative for everyone who relied on technology “to be able to step back and look at your risk posture and think about the kinds of security controls that you have in place to protect that information or to protect that infrastructure.”
“Risk posture” is one of those terms that sounds more technical than it is. An agency’s risk posture is its assessment of the risks it can comfortably take without jeopardizing its mission. Dodson gave the example of an air-traffic-control tower, where “maybe a 30-second delay for certain kinds of authentication could take too long, and given that environment, the risk of the plane coming in could be greater.” NIST tries to write standards that can be applied flexibly, whether agencies can only tolerate 10-second authentication delays or whether their operations can support significantly longer ones.
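One way to picture that flexibility is as a parameter an agency sets rather than a rule a standard dictates. The sketch below is hypothetical (the control names and delay figures are invented, not drawn from any NIST document): each agency states the authentication delay its mission can tolerate, and selects the strongest control that fits.

```python
# Hypothetical sketch: picking the strongest authentication control
# an agency's risk posture can tolerate. Names and delays are invented.
CONTROLS = [
    # (control, typical delay in seconds), ordered strongest-first
    ("hardware token + PIN", 45),
    ("password + one-time code", 20),
    ("password only", 5),
]

def strongest_tolerable_control(max_delay_seconds):
    """Return the strongest control whose delay fits the agency's tolerance."""
    for name, delay in CONTROLS:
        if delay <= max_delay_seconds:
            return name
    raise ValueError("no control fits this tolerance")

# An air-traffic-control tower can't wait 30 seconds; a records office can.
print(strongest_tolerable_control(10))   # -> password only
print(strongest_tolerable_control(60))   # -> hardware token + PIN
```

The standard stays the same; only the tolerance each agency plugs in changes.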
Even before the new cybersecurity plan gave NIST additional responsibilities, President Barack Obama, in Executive Order 13636, instructed the agency to “lead the development of a framework to reduce cyber risks to critical infrastructure.” The cybersecurity framework quickly became one of NIST’s marquee projects.
When you imagine a highly destructive cyberattack, you probably picture malicious code seeping into power plants and overloading them, sending entire regions of the United States into darkness. The targets of such attacks are collectively called critical infrastructure. Besides power plants, critical infrastructure includes phone lines, hydroelectric dams, railway networks, and banks. Unfortunately, many of the computer networks supporting these facilities are poorly secured. If you think an old, unpatched version of Windows could let a hacker do some damage to your parents’ old Dell desktop, imagine the same unpatched system running a nuclear power plant.
“Our role is really [about] how to help agencies and—in things like the framework—industry improve their cybersecurity posture, if you will, using a risk-management approach,” Dodson said. “Our role is really how do you look at risks around cybersecurity, and then what are the types of controls that you can put in place to address those risks.”
Matt Scholl, NIST’s computer security division chief, played a leading role in implementing the executive order. He said that the agency learned a lot from its private-sector partners as it worked with them on standards.
“We learned a lot about their concerns from not just technology but policy, regulation, international issues, supply-chain issues, [and] dependencies upon each other,” Scholl said. “For us, that was extraordinarily valuable. And then as a community, I thought it was good that we all then got together to develop the framework. They got buy-in because they were part of the development process, and then we got knowledge by working with them.”
FireEye, one of the country’s most respected security firms, helped NIST consider the implications of “advanced threat actors”—highly sophisticated malicious cyber actors, especially those run or supported by foreign governments. Orlie Yaniv, senior director of government affairs and policy at FireEye, said her company “started a discussion, which is ongoing, with NIST, on whether what we consider basic cyber hygiene is still the same as it was five years [ago], or whether it will need to evolve over time in response to evolving threat-actor behaviors.”
John Marinho, vice president of cybersecurity and technology at CTIA–The Wireless Association, a major telecom trade group, attended all of the cybersecurity framework workshops and came away very impressed with NIST’s work.
“They did an excellent job in terms of bringing all the stakeholders together, both in the private sector as well as within government,” Marinho said. “Everyone who’s used it comes away, in some sense, profoundly impressed with the quality and the significance of what’s there when it comes to cybersecurity.”
Indeed, in a December 2015 Dell Research survey, 82 percent of federal IT professionals said that their agencies were using the NIST framework, and 74 percent of public- and private-sector organizations using the framework said that it formed the bedrock of their individually tailored cybersecurity approach.
For CTIA, which has a diverse list of members, the key aspect of the framework was its flexibility.
“If you envision a very large national carrier, they have a lot of resources that they can apply to cybersecurity, but then when you try to scale that down to a regional carrier or to a small carrier, you don’t necessarily have the same level of resources,” Marinho said. “It comes back to flexibility and avoiding the tendency of trying to pigeonhole everybody into a particular approach. The framework gives you that flexibility.”
NIST is continuing to improve the framework based on industry and agency feedback. On Dec. 11, it published a formal Request for Information in the Federal Register, asking organizations to explain how they use the framework, what they’ve learned from it, and whether it needs an overhaul. NIST will use that information to prepare for a two-day workshop about updating the framework scheduled for April 2016.
In addition to the highly publicized framework, NIST publishes many other computer-security guides. These guides—the heart of NIST’s standards-setting work—would make good bedtime reading for insomniacs; even by the standards of most government reports, they are unusually dry. But the immensely detailed documents outline how federal agencies should think about the important task of protecting sensitive information. And given how much of the government’s sensitive information consists of Americans’ personal data—from tax records at the Internal Revenue Service to Medicare enrollment forms at the Department of Health and Human Services—NIST’s recommendations matter far more than their dense, technical language would suggest.
From cybersecurity to privacy
NIST had been researching ways to improve cybersecurity risk management for several years when it began to consider another area of risk: privacy. Approximately two years ago, NIST engineers started thinking about what privacy problems needed fixing and why they existed in the first place.
“We started to take a look at the broad scope of privacy online and in information systems and say, ‘What’s wrong in privacy? What’s going on in privacy? We’re not seeing that same sort of consistent approach,’” said Sean Brooks, a NIST privacy engineer.

NIST found that comparing privacy practices across government agencies was very difficult, and comparing government and corporate practices even more challenging. At the time, Brooks said, it was “very difficult to say who’s doing a great job and who’s doing a poor job and therefore learn lessons about how to improve the systems.”
So NIST held a Privacy Engineering Workshop in April 2014, seeking input from government, industry, and academia. Among the trends it identified was what Brooks called “a huge communication gap in the privacy space” between decision-makers (senior personnel and lawyers) and implementers (engineers and product designers). Dodson and her cybersecurity team had found the same thing in their work: people at all levels of the hierarchy weren’t communicating enough about the process. In the privacy space, Brooks said, there were “language barriers.”
But privacy is, in Brooks’ words, “an inherently interdisciplinary problem.” Lawyers can’t formulate privacy policies without understanding the technology in their systems. Engineers can’t design systems without knowing what senior executives want them to do. “You really need to get these fields to talk to one another,” Brooks said.
So NIST’s privacy engineers looked to the cybersecurity team for guidance. What they realized was that privacy, too, needed a risk-management approach. “Folks at the heads of agencies,” Brooks said, “even if they don’t know anything about cybersecurity, they understand risk, they have to make decisions about risk all the time.” Structuring the conversation this way eliminated some of those “language barriers” and gave everyone a common understanding of the problem.
To understand what privacy engineering looks and sounds like without this coordinated approach, consider the word “creepy.” Unsurprisingly, it comes up often in NIST’s privacy work, as agency employees describe systems and processes that they find disturbing. But that word doesn’t tell Brooks and his fellow NIST engineers much about people’s actual concerns.
“If I could ban the word ‘creepy’ from all future conversations about privacy, I absolutely would,” he said. “It is something that I think holds a lot of the conversation back, because instead of talking about very specific things that we’re concerned about—problems that can be fixed—we just have this broad level of discomfort with certain things that happen [in] information systems.”
Risk management, Brooks said, could help agencies move past these vagaries, giving them “a way to identify specific risks and say, ‘Alright, based on your agency’s mission, based on what services you need to provide to the citizens of this country, what are the most important risks for you?’”
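The arithmetic behind that kind of ranking is simple. As a hedged illustration (the risks and numbers below are invented for this example, not taken from any NIST report), an agency might score each privacy risk as likelihood times impact and sort the results, so the most important risks surface first:

```python
# Illustrative sketch: ranking privacy risks by likelihood x impact.
# The risk descriptions and scores are invented for this example.
risks = [
    # (description, likelihood 0-1, impact 1-10)
    ("re-identification of anonymized records", 0.3, 9),
    ("over-collection of citizen data",         0.7, 5),
    ("employee browsing of case files",         0.5, 4),
]

def rank_risks(risks):
    """Order risks by expected severity (likelihood * impact), highest first."""
    return sorted(risks, key=lambda r: r[1] * r[2], reverse=True)

for desc, likelihood, impact in rank_risks(risks):
    print(f"{likelihood * impact:4.1f}  {desc}")
```

A ranked list like this replaces “that feels creepy” with a concrete, comparable statement of which problems to fix first.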
NIST has published less on privacy than on cybersecurity, but it’s finalizing a draft internal report, NIST IR 8062, on evaluating privacy from a risk-management standpoint. Naomi Lefkovitz, a senior privacy policy adviser at NIST, anticipated that the report would be published in early 2016. “That will not only introduce these concepts that we think are missing in the privacy space,” she said, “but also help to lay out a roadmap for future guidance in this space.”
NIST’s work benefits significantly from its strong partnerships with other agencies and with the technology industry. One reason is obvious: NIST is writing standards for equipment that other entities are designing, so it needs to know how those systems work.
“If the standards, the guidelines, the recommendations we write are unrealistic, infeasible, or can’t be implemented using either currently available commercial technology or soon-to-be-available commercial technology,” Scholl said, “then … we’re providing a recommendation … [for] which there is not a technological way to implement.”
In addition to studying what technology is currently on the market, NIST also researches the technology that’s around the corner, again relying on its conversations with the companies that are shaping the technological future. “If we’re not doing R&D in the future technology sets,” Scholl said, “then we won’t be ready to assist folks with recommendations when those technologies emerge, and we’ll always be flat-footed and behind.”
Once it has a workable standard—like the soon-to-be-completed privacy risk-management framework—NIST reaches out to contacts at other agencies and essentially asks them, “Would you like to kick the tires on this and tell us what happens?”
Within the next six to 12 months, Brooks said, NIST will set up partnerships to test the new privacy framework. “As we start to apply this conceptual-level stuff that we’ve built up and really apply it in the federal space, [we’ll see] what holes we can poke in our own [concepts] and see, ‘Does this work for [some agency] in a way that is different than it would work for the USDA?’”
Brooks can already foresee one challenge that NIST and its partners will have to grapple with: the tradeoffs between more stringent security and more user-friendly privacy policies.
Risk isn’t just about security. Closely monitoring employees, for example, introduces privacy-related “risks” to their wellbeing even as it addresses the institutional risk of insider threats. People don’t like to work in offices where they feel mistrusted; if the monitoring is too stringent, they might quit. That constitutes a risk—one some agencies can tolerate while others cannot. This is a prime example of how risk posture can differ between organizations. The National Security Agency might be more willing to tolerate risks to employee comfort than the U.S. Fish and Wildlife Service, because the NSA’s secrets are more sensitive—more in need of protecting at all costs—than those of USFWS.
Standards take time
NIST’s prominent placement in the new federal cybersecurity strategy is just one example of the way cybersecurity has helped raise the agency’s profile. “We have been noticing NIST taking on a bigger role, and we think that that should continue,” said Yaniv. “They are technically savvy, and … they are trusted—or more trusted than other organizations—and they have a demonstrated level of competence.”
Still, NIST’s work will take time to filter down to the public; its engineers do not expect it to directly inspire widespread improvements in personal cyber hygiene anytime soon. But the agency’s nascent privacy effort offers a model for how a small agency can make a big difference.
In an age of seemingly endless hacks and leaks, people are undoubtedly paying more attention to how their personal data is protected, and Brooks said that government agencies should seize that “opportunity” to “respond to that greater attention with more effective privacy protections in information systems.”
Right now, the best indication of how seriously most companies take their privacy obligations is the terms-of-service agreement that they make their users sign. No one reads these documents—their discouraging length and dense legalese are the source of many a joke—but they are critically important as markers of the current state of privacy. They essentially place all of the risk on the end user and let the company wash its hands of that responsibility.
NIST is trying to build better privacy relationships between companies and customers. Brooks said the goal was to create a system where “you’re not just pushing a bunch of risk on users, you’re actually giving them a yes-or-no choice, or a yes-maybe choice, that is actually meaningful and that they can understand.” The benefits of more granular privacy agreements are twofold: Not only are users avoiding “taking on a substantial amount of risk on behalf of the system,” Brooks said, but they’re also “helping the system understand what their intentions are.”
NIST’s privacy work is still in its early stages. NIST IR 8062, the draft privacy framework released in May, is mostly conceptual; a lot needs to be done to turn it into a set of concrete action items that agencies can implement. But NIST is optimistic that it’s on the right track, that it has the White House’s full backing, and that it won’t be long before its draft reports turn into formal recommendations, which turn into real-world processes, which (hopefully) produce better outcomes for the government and all Americans.
“I think we’ve seen the administration entrust NIST with a number of exciting projects, in the wider cybersecurity space, over the last two years,” Brooks said. “Our focus right now is to just do the best work that we can and hope that the best work we do comes with the support that is necessary to make the kind of improvements that federal agencies want to make.”
Illustration by Max Fleishman