Paul E. Black and Lee Badger, National Institute of Standards and Technology
Paul E. Black and Lee Badger serve at the National Institute of Standards and Technology (NIST), the non-regulatory laboratory and agency of the U.S. Department of Commerce that advances standards in the information technology sector. Black is in NIST’s Software Quality Group and Badger is in the Security Components and Mechanisms Group. Black spoke about software updates from the agency’s perspective, and both fielded questions in the discussion period.
Black opened with a discussion of the multifaceted challenges to creating software update infrastructure.
Processing power is one important factor, Black said, because speed and capabilities vary greatly among chips. Some, such as RFID chips, have very little power for updates, while others have massive computing power that larger updates require. In addition, some chips, such as read-only memory chips, have to be physically removed in order to be updated, while Internet-connected chips can be updated wirelessly. For those updates that involve connectivity, network availability can also become an issue. For
phones, cellular connectivity can be assumed, while other devices might be infrequently connected or might need to be updated in areas without reliable cellular service.
As noted in other workshop sessions, the user can be a factor, too, Black noted. Most users do not have the expertise to understand what exactly a security update will do, what to install, and what to avoid.
Software incompatibility is a key concern with security implications, since a mismatch can inadvertently lead to bugs, crashes, or even system failure. For example, a downstream, isolated component might rely on a data format that a new update no longer supports. An update could even introduce new vulnerabilities into a previously secure system, such as when two previously secure modules begin interacting in a new way. Program shepherding, which monitors control-flow transfers during program execution, can also be an important mechanism for detecting potential intrusions during an update, he said.
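The shepherding idea can be sketched as an allowlist check on control transfers. This is only an illustrative toy, with invented function names and an invented policy; real program-shepherding systems enforce the policy at the machine-code level, not in application code.

```python
# Toy sketch of program shepherding: permit an indirect control
# transfer only if its target is on a precomputed allowlist.
# The target names and the dispatch table are invented for
# illustration.
ALLOWED_TARGETS = {"parse_input", "render_page"}

def shepherded_dispatch(target_name, table):
    """Dispatch through `table`, blocking any unapproved target."""
    if target_name not in ALLOWED_TARGETS:
        raise RuntimeError(f"blocked control transfer to {target_name!r}")
    return table[target_name]()
```

A transfer to an approved target proceeds normally; a transfer injected outside the precomputed set is refused, which is the intrusion-monitoring behavior shepherding provides during an update window.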
As noted throughout the workshop, software updates can also alert attackers to vulnerabilities. For example, hackers might be able to compare the old software to the updated version, identify the vulnerability being patched, and attack private information on systems that haven’t been updated yet.
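The attacker's technique here is ordinary diffing. The sketch below, using hypothetical before-and-after source lines with invented function names, shows how a patch makes the fixed condition stand out to anyone comparing versions.

```python
import difflib

# Hypothetical source for a component before and after a patch; the
# function names are invented for illustration.
old_version = [
    "data = read_request()",
    "handle(data)",
]
new_version = [
    "data = read_request()",
    "if contains_overlong_header(data):",
    "    return reject()",
    "handle(data)",
]

# The unified diff isolates the added check, telling an attacker
# exactly which input condition the vendor just fixed.
diff = [
    line
    for line in difflib.unified_diff(old_version, new_version, lineterm="")
    if line.startswith(("+", "-")) and not line.startswith(("+++", "---"))
]
```

Binary patch diffing works the same way in principle, just against compiled code rather than source lines.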
Black suggested that there may also be some overlooked challenges surrounding updates to newer systems like virtual machines, containers, or microservices. Although these are software packages, they are handled differently than typical software. Virtual machines, for example, can execute multiple operating system configurations, depending on their use, and would need all the corresponding software updates to remain secure.
Building on a theme raised earlier in the workshop, Richard Danzig, Johns Hopkins University Applied Physics Laboratory, asked how the growth of cloud computing and machine learning might be affecting this landscape, in NIST’s view. Black responded that the cloud, in one sense, is just another environment for operations, and in fact, the isolation of cloud systems makes them easier to deal with in some respects. Another nuance is that the cloud business model is founded on offering up-to-date, secure, well-configured computing. This model turns operations from a business expense into a profit-making center, which could potentially work in users’ favor because there is perhaps more of an incentive for companies to put resources behind frequent, high-quality updates, Black suggested.
Machine learning (or artificial intelligence) could be used to support security in a number of ways, Black said. For example, these technologies could be used to create more intelligent mechanisms for identifying vulnerabilities, monitoring intrusions, and responding to breaches. Black said his group is experimenting with Google’s open-source software library, TensorFlow, to recognize software vulnerabilities. To be successful, we must learn from both good and bad experiences with applying machine learning in this context, he said.
Finally, the relationship between software vendors and users can also be problematic for appropriate use and updating of software. Most software is licensed to a user instead of sold outright, in order to protect the maker’s intellectual property rights. However, the licensing agreement doesn’t set forth security requirements for the user. This licensing arrangement thus gives software makers “the best of both worlds,” with fewer protections for the consumer, Black said.
Black presented some possible solutions to address some of these key challenges.
One idea NIST is exploring to keep hackers from reverse-engineering patches is a “one-way function,” he said. He introduced a hypothetical scenario with a software component, “S,” that has an input vulnerability, “V.” Typically a patch would be designed to detect a V input and filter it out but otherwise run S as usual. But this approach reveals to hackers that V is the input that triggers the vulnerability. An alternative is to create a patch that applies a one-way function to the input: instead of filtering out V directly, the patch would hash each input and compare the result to a stored value (or range of values). “Because it’s one-way, it’s infeasible to reverse-engineer that and figure out what the vulnerability is from that patch,” Black explained. If such one-way functions were feasible, this type of patch would allow a system to recognize V and reject it without broadcasting its vulnerabilities.
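A minimal sketch of this idea, assuming the vulnerability is triggered by one specific input, can use an ordinary cryptographic hash. Here the byte string standing in for V, and the component itself, are hypothetical; the patch ships only the digest, never V.

```python
import hashlib

# Hypothetical: the SHA-256 digest of the triggering input V is
# computed by the vendor and shipped in the patch. "V" here is a
# stand-in for the real triggering input.
BAD_INPUT_DIGEST = hashlib.sha256(b"V").hexdigest()

def patched_component(user_input: bytes) -> str:
    """Reject the triggering input without revealing what it is.

    An attacker reading this patch sees only the digest; because the
    hash is one-way, recovering the triggering input from the digest
    is computationally infeasible.
    """
    if hashlib.sha256(user_input).hexdigest() == BAD_INPUT_DIGEST:
        return "rejected"
    return "processed"
```

Note that an exact-match digest covers only one specific input; covering a whole class of bad inputs while keeping the one-way property is part of what makes the approach a research question rather than a finished technique.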
Because one-way functions tend to require a lot of computation, Black said it would be ideal to issue a code fix at some point, but that could be done in a way that separates the patch information from the code change and delivers them at different
times. This could help address the challenge of sending an update quickly enough to prevent hackers from reverse-engineering an exploit.
Regarding the idea of a software inventory, suggested earlier in the workshop, Black suggested that such a mechanism would indeed be helpful in shedding light on the building blocks inside complicated software modules, and he noted that NIST is working on enabling a software identification tag (SWID) that could be complementary to this approach. SWID is an International Organization for Standardization (ISO) standard that gives every version of software a unique ID. “It doesn’t solve problems,” Black said, “but at least this way there’s an ISO standard to communicate the information.”
SWID tags could create an inventory of all the software on a system, which would help users know, for example, how many different versions of a piece of software are running or which software needs to be updated first. Facilitating automated updates like this would also help control the costs of software updates, Black suggested.
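The inventory idea can be sketched by parsing a set of tags and grouping versions by product. The tags below are heavily simplified: ISO/IEC 19770-2 defines the full SWID schema, and while name, version, and tagId are real SWID attributes, “ExampleApp” and its versions are invented for illustration.

```python
import xml.etree.ElementTree as ET

# Two simplified, hypothetical SWID tags for the same product at
# different versions.
SAMPLE_TAGS = [
    '<SoftwareIdentity name="ExampleApp" version="1.2.0" '
    'tagId="example.com-ExampleApp-1.2.0"/>',
    '<SoftwareIdentity name="ExampleApp" version="1.3.1" '
    'tagId="example.com-ExampleApp-1.3.1"/>',
]

def build_inventory(tag_documents):
    """Map each product name to the set of versions found on the system."""
    inventory = {}
    for doc in tag_documents:
        root = ET.fromstring(doc)
        inventory.setdefault(root.get("name"), set()).add(root.get("version"))
    return inventory
```

An inventory built this way immediately shows, for instance, that two versions of the same product are installed, flagging the older one as an update candidate.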
Circling back to this notion of software transparency in the discussion, Bob Blakley, CitiGroup, noted that containers, complete systems that contain everything software needs to run and can be easily installed on servers, could help advance transparency because they are discrete parts with known materials, making it easier to see and address any vulnerabilities discovered within them. He suggested that more of these container-based models could meet the need for a “bill of materials.”
Black also addressed the issue of configuration updates, which are more complicated than software updates. Even if the system is properly configured for the old software, an update can introduce new parameters that essentially wipe the old configuration or create a potential mismatch between the old configuration and the new parameters.
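The mismatch Black described can be illustrated with two settings dictionaries; the parameter names here are invented. An update that simply resets to its new defaults wipes the administrator's old settings, while a merge preserves them but must still supply defaults for genuinely new parameters.

```python
# Hypothetical configurations: the administrator's tuned settings,
# and the defaults shipped with a new software version that also
# introduces a brand-new parameter.
old_config = {"timeout_s": 30, "log_level": "warn"}
new_defaults = {"timeout_s": 60, "log_level": "info", "tls_min_version": "1.2"}

# Naive update: the old configuration is lost entirely.
reset_config = dict(new_defaults)

# Merge: old settings win where they exist; new parameters get
# their shipped defaults.
merged_config = {**new_defaults, **old_config}
```

Even the merge is not automatically safe: an old value may be invalid or insecure under the new software's semantics, which is exactly the old-configuration/new-parameter mismatch described above.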
In the discussion, Tony Sager, Center for Internet Security, noted that in his experience with the Department of Defense (DOD), configuration was a major consideration when evaluating the need for updates. If a system’s configuration eliminated the bug that the software update was trying to address, then there might be no need for the update in a particular situation. “Getting that right is hard to do,” Sager said, “but it really is, I think, a key part of the decision.”
Configuration updates typically come with specific instructions from the vendors, who can use NIST’s checklist program to guide what to update, how to communicate updates to users, and how to make sure the settings are correct. In some cases, NIST reviews the checklists, but in others, such as with OpenSSL, the process is centralized and users receive one update message. Checklists can sometimes be out of date, or they might be for updating software only, as opposed to the entire configuration, “but at least the information may get out there,” Black noted.
Another relevant NIST project is the security content automation protocol (SCAP), which is also meant to automate updates. SCAP verifies system details (such as which versions, privileges, or daemons are running) in advance, so that updates run more smoothly and only change what each particular configuration requires.
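The verify-before-change step can be sketched as a precondition check. This is only a toy in the spirit of SCAP: real SCAP expresses such checks in standardized content such as OVAL definitions, whereas the dictionary of system facts below is an illustrative stand-in.

```python
# Toy pre-update check: an update is applied only if every fact it
# requires about the system matches the live state. The fact names
# and values are invented for illustration.
def preconditions_met(system_state, required):
    """Return True only if every required fact matches the live state."""
    return all(system_state.get(key) == value for key, value in required.items())
```

For example, an update written against one library version would be skipped on a machine running another, so the update changes only what each particular configuration requires:

```python
state = {"openssl_version": "1.0.2g", "sshd_running": True}
preconditions_met(state, {"openssl_version": "1.0.2g"})  # True: safe to apply
preconditions_met(state, {"openssl_version": "1.1.0"})   # False: skip
```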
Prompted by a question from Sager, Black elaborated on SCAP in the discussion. Sager wondered whether NIST faces a chicken-and-egg problem in trying to get industry adoption for this idea. Standard naming and enumerations could simplify the problem, while trying to update an unknown number of variations on a theme could make it a much more difficult problem.
Black agreed that industry can be hesitant to adopt new tools, pointing to SWID tags, which have also created a chicken-and-egg problem, as one example. Very few software packages provide SWID tags, because there aren’t a lot of SWID tools out there for them to use in the first place. To remedy this problem, NIST is producing tens of thousands of SWID tags for existing packages to get them into wider circulation.
To the question of industry adoption challenges for SCAP, Badger said that the antivirus sector has relatively good adoption rates, although it’s been difficult to get wider adoption in the industry as a whole. One barrier is that SCAP is complex and has a considerable learning curve. Sager suggested this might be a case in which buyers need to be educated to know what protections to demand, but the standard itself must be sufficiently mature to drive this demand. Buyers like DOD and the National Security Agency were able to ratchet up the demands in their contracts, which led to the creation of the Common Vulnerabilities and Exposures and Open Vulnerability and Assessment Language, for example, noted Sager. But when buyers and vendors lose momentum, the demand can wane.
NIST also is seeking ways to encourage and reward high-quality, rapid, and secure software development, Black said. For example, the agency is running a Secure Toolchain competition, in which it presents a problem and gives teams 1 day to develop a fix. Through many rounds of these challenges, more and better secure tools are created, which raises the security bar higher and higher.
Steven Lipner, an independent consultant, noted that such competitions can have downsides and lead to false conclusions if the lessons from successfully handling “toy problems” are extrapolated too far. Black agreed that this is a risk and said it is a factor
they have tried to account for. While these contests are not likely to draw out solutions that would apply to the development of a new OS, he noted, they are potentially very useful for creating better security solutions for smaller apps, which are often “slapped together” rapidly and often have significant security vulnerabilities.
A recent NIST report, Dramatically Reducing Software Vulnerabilities,1 identifies specific technical methods, such as proof-carrying code and well-analyzed frameworks, to improve software security and avoid potential update mismatches, Black noted. Well-analyzed frameworks, for example, allow updates to insert small bits of code around a framework, instead of replacing an entire piece of software, thus increasing the security of updates.
1 National Institute of Standards and Technology, 2016, Dramatically Reducing Software Vulnerabilities, November, http://nvlpubs.nist.gov/nistpubs/ir/2016/NIST.IR.8151.pdf.