The Software and Systems Division (SSD) has to respond with an adaptive, nimble strategy to congressional mandates imposed on NIST. The division needs to evaluate each mandate within the context of its mission and expertise, leveraging staff and expertise to continue pursuing a mix of tactical and strategic initiatives. This includes recruitment and cultivation of staff with the requisite skills for those initiatives.
The SSD is engaged in several high profile areas, notably in voting, health IT, and cyberphysical systems (CPS). Failure, even in part, in any one of these areas would have profound implications for the SSD, ITL, and NIST. In each case, it is crucial that the SSD, in collaboration with ITL and NIST management, delineate the risks and rewards within the context of an organizational strategy. By defining the stakeholders and success metrics (e.g., customer satisfaction, standard development, uptake and adoption), it can match competence and resources to the risks and rewards.
In line with the charge to the panel provided by the NIST Director, the assessment of the SSD focused on the following criteria:
- The unique NIST mission and its relationship to the broader articulation of the NIST/ITL strategy (i.e., non-duplicative of other academic, laboratory, or industrial efforts). Of these, standards development is the most impactful—followed by the advancement of practice for measurement, evaluation, and interoperation support—because small NIST groups can have extraordinary influence by leveraging a unique body of competence, institutional knowledge, and outreach capability.
- The technical excellence of the project and its team. Concomitantly, projects need to draw on the core intellectual expertise and mission focus of the SSD, with skills aligned with the mission, objectives, and mandates.
- The degree of community involvement, balancing leadership and impartial convening for standards development and community engagement.
ASSESSMENT OF TECHNICAL PROGRAMS AND PORTFOLIO OF SCIENTIFIC EXPERTISE
The projects presented by the SSD staff span a wide range, from individual staff projects that are preliminary research investigations to broader initiatives that are coupled to external communities.
Computational Science Metrology
The computational science metrology effort focuses primarily on computational science and measurement issues related to the analysis of biomedical microscopy data. Analysis of multiscale imaging data is an increasingly important component in many areas, including biomedicine and materials science. In this class of challenging applications, a very large number (10⁹ or more) of objects need to be segmented and tracked over time. Characterizing the agreement between segmentation and tracking algorithms is a crucial and challenging component of this effort, as is the need to quickly integrate and stitch disjoint patches of image data. The SSD computational metrology group has done admirable work in attracting strong scientific collaborators in the stem cell therapy area and in collaborating with them to generate both publicly distributed software and published scientific results.
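A common way to quantify agreement between two segmentation outputs is the Jaccard index (intersection over union) of the resulting object masks. The sketch below is illustrative only; the function and the toy masks are hypothetical, not artifacts of the SSD effort:

```python
import numpy as np

def jaccard_index(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Jaccard index (intersection over union) of two binary segmentation masks.

    Returns 1.0 for two empty masks by convention.
    """
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    union = np.logical_or(a, b).sum()
    if union == 0:
        return 1.0
    return float(np.logical_and(a, b).sum() / union)

# Two algorithms' masks for the same object, differing at one pixel.
m1 = np.array([[0, 1, 1],
               [0, 1, 1],
               [0, 0, 0]])
m2 = np.array([[0, 1, 1],
               [0, 1, 0],
               [0, 0, 0]])
print(jaccard_index(m1, m2))  # 3 intersecting pixels / 4 in union = 0.75
```

In practice the comparison runs per object over very large populations, and the distribution of such scores (not a single value) characterizes algorithm agreement.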
Opportunities and Challenges
The quality of this work is extraordinarily high, but the group would likely increase its impact by supporting a wider range of scientific driving problems. A broader set of microscopy-based applications could be entertained, and the group might also consider metrology issues associated with the analysis of other types of imaging. The inherent strengths of the SSD might be best served by a sharper focus on metrology methodology and metrology-related imaging standards. A common set of issues arises in the analysis of multiscale imaging data across many diverse application areas; the SSD could play an important role by helping scientific groups realize that they face common metrology and algorithmic challenges.
Voting Security and Verification
In the wake of the vote-counting challenges faced in the 2000 Presidential election, the United States started to pay more attention to its voting processes and technologies. Achieving standards-based practices is challenging, because elections in the United States are carried out by more than 10,000 entities, with wide variations in technical capability and in the number of registered voters per entity.1 Few of these entities have the budget or skilled personnel to enable them to evaluate new technologies independently or to compare and contrast processes on their own.
In response to multiple mandates, ITL has assumed an increasingly important role in voting systems since 2002. Voting security and verification is one of the SSD’s long-term priority projects; sustained over many years, it has compiled a strong and steady track record despite considerable oscillations in available funding. The SSD reported that standards developed by the program are being widely adopted. Engagement with the large and diverse community of stakeholders is extensive, and the SSD team is responsive to the voting-technology community.
1 According to NIST data, some Northeastern townships have as few as a few hundred voters, while Los Angeles County supports about 4.8 million voters in 10 languages.
Opportunities and Challenges
Despite the depth and importance of ITL work on voting systems development and implementation, the public profile of the SSD in this area is not high. Among academic institutions, the California Institute of Technology (Caltech), the Massachusetts Institute of Technology (MIT), and Stanford University are better known for their work on voting. The SSD could do more in the development of effective and publicly disseminated software test tools and methodologies.
Challenges remain regarding the mechanisms through which confident assurance judgments can be made for particular voting devices, including how device vendors can effectively support the process. Assurance needs to extend beyond mainstream vulnerability scans and penetration testing. These tests primarily address vulnerabilities already identified by the cybersecurity community and for which there is experience in established mainstream computing systems; in general, they do not provide positive assurances regarding either other exploitable flaws or reliability challenges unrelated to cybersecurity. Voting systems are a mission-critical feature of the U.S. democratic process, and the thresholds for acceptance need to be high, even if this means forgoing added technical features. In particular, acceptance evaluation needs to be accomplished through techniques more invasive than the black-box evaluations typically used to test system-level penetration in business and consumer systems.
ITL is in the important position of facilitating the dialog between the vendor community, which seeks to protect its intellectual property, and voting officials, who seek to make confident judgments regarding the fitness of candidate voting devices. This process of balancing interests requires a high level of expertise as well as an engaged neutral stance and an ability to engage effectively with diverse stakeholders. The SSD voting team has these characteristics, and, consequently, it has a nationally significant role.
There is nonetheless a fragility to the SSD voting team, which consists of a small group of capable and dedicated individuals. Although the team is augmented by staff from the Information Access Division, the Statistical Engineering Division, and the Computer Security Division, it was not apparent during the review what mechanism is used to recruit and develop the necessary bench strength.
Cloud Computing
SSD’s work on cloud computing standards has had a substantial impact on government and industry, both within the United States and internationally. Indeed, SSD’s body of definitions related to the various kinds of cloud computing—its reference architecture—is now internationally accepted, and it has led to a voluntary, consensus ISO standard. As an impartial and regular convener of cloud service providers and consumers, the SSD continues to provide a forum for technical interchange and a venue for development of an evolving series of standards and a reference architecture. Reflecting interest levels and SSD credibility, the working groups on architectures and services, metrics, security, and interoperability continue to attract hundreds of participants.
This work on cloud computing is driven by the relatively high level of commitment and risk associated with cloud adoption decisions, both in industry and in government. Decisions associated with the adoption of cloud computing encompass choices regarding the structure and location of organizational data and the selection of computing paradigms. The principal decisions are architectural, but there are also decisions related to sourcing, with options ranging from vendor clouds (such as from Amazon, Google, and Microsoft) to shared or dedicated organizational clouds. Many criteria influence decisions, and the engineering trade-offs are complex and are conducted in a technological environment of both evolutionary and rapid change.
Cloud computing is significant because it has emerged as the dominant pathway to scale in computing resources. Cloud architectures offer not just the ability to manifest scale but also, separately, the ability to rapidly scale up (or down) in the face of changing computational demands. While “the cloud” can be a pathway to enormous cost savings through resource sharing and load balancing, any sharing of resources creates the possibility of security and reliability issues. Cloud computing is, moreover, a pathway to flexibility and adaptability for data-intensive and computationally intensive systems and capabilities. This flexibility is increasingly important for a wide range of corporate and agency mission systems.
Opportunities and Challenges
In the federal government, cloud adoption is perceived as having both high benefit and high risk. Identifying the decision criteria and navigating the trade-offs that realize the benefits with acceptable risk requires a high level of technical understanding. The SSD is facilitating this process by working with industry to assist in the decision and adoption process, identifying the full range of criteria and, importantly, collaborating to develop the necessary metrics and guidance. This is creating a benefit for a broad range of stakeholders.
SSD expertise and activities in this space have helped shape U.S. government cloud adoption. With respect to U.S. federal cloud strategy, the NIST initiative is the only deep effort, and it has the potential to shape the decision space for cloud engagements across multiple federal agencies.
Software Assurance
Software assurance is one of the most critical challenges for software-reliant systems of all kinds. It is key for cybersecurity defense, safety-critical systems, infrastructural systems, national security systems, and mainstream personal consumer systems. Software assurance failures are unfortunately ubiquitous, even in the most heavily evaluated systems.
The SSD software assurance project has done an excellent job of building useful data sets that can benefit producers of software tools and of raising awareness about the existence of these data sets, which include databases of common software risks and security vulnerabilities and of test cases.
The SSD software assurance project focuses specifically on tools for static software analysis. There are many other critical aspects of assurance practice, embodied in broad-scope models such as the Microsoft Security Development Lifecycle and the Building Security In Maturity Model evaluation framework. These provide frameworks through which a wide range of specific practices can be integrated into an overall process of development, evaluation, and modernization, with process models ranging from traditional linear and V models to DevOps and small-team agile models. Choices that influence the potential to achieve reliable assurance judgments range from the selection of programming languages, tools, and models for requirements and design to testing approaches, inspection, analysis, and runtime monitoring and logging.
Static analysis operates at the level of code, usually source code, although some analyses are directed at object code. Analyses of this kind can address many different categories of defects, and their success can range widely. For each quality attribute (i.e., category of defects) covered, diverse success measures apply, including rates (and nature) of false positives and false negatives, scalability with respect to performance of the analysis, and composability with respect to software components.
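The false-positive and false-negative rates mentioned above can be made concrete by comparing a tool’s reported defect sites against a set of known, seeded defects, as reference test suites enable. The function and the findings below are hypothetical, offered only as a sketch of the measurement:

```python
def analyzer_rates(reported: set, actual: set) -> dict:
    """Score a static analyzer's reported defect sites against known defects."""
    false_pos = reported - actual   # flagged, but not a real defect
    false_neg = actual - reported   # real defect the tool missed
    return {
        "true_positives": len(reported & actual),
        "false_positive_rate": len(false_pos) / len(reported) if reported else 0.0,
        "false_negative_rate": len(false_neg) / len(actual) if actual else 0.0,
    }

# Hypothetical findings: defect sites flagged by a tool vs. defects seeded
# into a test case.
reported = {"util.c:42", "util.c:87", "net.c:13"}
actual = {"util.c:42", "net.c:13", "net.c:55"}
print(analyzer_rates(reported, actual))
```

Scores of this kind are meaningful only per defect category; a tool with excellent buffer-overflow detection may still miss injection flaws entirely.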
Opportunities and Challenges
The software assurance effort at the SSD builds on the Common Weakness Enumeration and Common Vulnerabilities and Exposures resources from the MITRE Corporation, which are inventories of specific kinds of software weaknesses and instances of vulnerabilities. The SSD effort takes on the challenge of linking these inventories with influences on development and evaluation practice.
Given its scope, the SSD effort is at a high level of quality and responsiveness to its stakeholder community. The effort is known among vendors and many adopters as well. The scope is appropriate and important, since the evaluation frameworks, arguably, are prompting more efficient competition and more rapid innovation. There are questions, however, from the portfolio perspective.
For example, many of the other projects in ITL include the development of tests for conformance to the various standard representations and protocols to support interoperation. Most of these tests are black box—that is, predicated on the tested components being opaque to the test managers. This technique is only effective, however, when certain engineering constraints are respected in the development of the black-box component; without them, the results may not be sufficiently predictive. These constraints relate to determinism at all levels of design and implementation and to the use of physical devices.
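The determinism constraint can be illustrated with a minimal black-box conformance check: when an implementation does not map the same input to the same output on every invocation, repeated black-box trials stop being predictive. The specification and implementations below are hypothetical:

```python
import random

def conformance_check(impl, vectors, trials=3):
    """Black-box conformance test: feed known inputs, compare with expected outputs.

    Repeating each vector catches one violation of the determinism constraint:
    an implementation that does not always map an input to the same output.
    """
    for inp, expected in vectors:
        for _ in range(trials):
            if impl(inp) != expected:
                return False
    return True

# A deterministic implementation of a hypothetical checksum specification.
def good_impl(data):
    return sum(data) % 256

# A nondeterministic implementation: black-box results are not predictive,
# since a passing run tells us little about the next run.
def flaky_impl(data):
    return (sum(data) + random.choice([0, 1])) % 256

vectors = [(b"abc", sum(b"abc") % 256), (b"", 0)]
print(conformance_check(good_impl, vectors))   # True
print(conformance_check(flaky_impl, vectors))  # usually False
```

Repetition only detects this one failure mode; nondeterminism that surfaces under timing, load, or device variation is exactly what pushes evaluation beyond black-box techniques.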
More could be done by the SSD in the development of effective and publicly disseminated software test tools and methodologies—for example, for conformance and interoperability testing. The SSD could take a leading role in the development of software test tools.
Computer Forensics
The SSD has several projects related to computer forensics. This is an increasingly important feature of processes associated with criminal investigations and legal discovery as well as the preservation and archiving of electronic documents, data, and other computational assets. Complicating the process is the vast amount of data space dedicated to systems and applications software as well as associated common data assets. Also complicating the process is uncertainty regarding the actions to take, for example, in a law enforcement situation in the immediate moments when computers and mobile devices are seized and potentially volatile data needs to be preserved.
A complete review of the software forensics area could not be conducted, because much of this work is shared with the Cybersecurity Division of ITL, which fell outside the scope of the present review. However, it was noted that the SSD has developed a very large corpus of signatures for applications and standard files. The SSD team is also developing both guidance and test capabilities to facilitate the evaluation of various kinds of forensic tools and capabilities.
Opportunities and Challenges
Matches with items in the corpus of signatures for applications and standard files would enable the forensic process to focus rapidly on data specific to a situation, avoiding the manual process of sifting out common software and data files. The scale of the SSD corpus is growing rapidly. However, there are significant technical challenges to be faced, including the proliferation of versions of frequently released software, the addition of shape-shifting and other resiliency features that make fingerprinting more difficult, and the increasing role of cloud and other remote resources in hosting user data.
This latter challenge could drive the need for a significant paradigm shift from the current approach based on signature matching. This is analogous to the challenge faced in the fingerprinting of malware specimens, many of which no longer have a readily identifiable static form.
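The triage pattern described above can be sketched as a hash-based lookup: files whose digests match the known-file corpus are set aside, leaving only case-specific data for examination. The corpus and file contents below are hypothetical, and real corpora index many millions of signatures:

```python
import hashlib

def triage(files: dict, known_hashes: set) -> list:
    """Return the names of files whose contents are NOT in the known-file corpus.

    files: mapping of file name to raw content bytes (a stand-in for reading
    seized media); known_hashes: SHA-256 digests of standard software/data files.
    """
    return [name for name, data in files.items()
            if hashlib.sha256(data).hexdigest() not in known_hashes]

# Corpus entry: the hash of a stock system file an examiner can safely skip.
stock = b"standard application binary"
known = {hashlib.sha256(stock).hexdigest()}

# Seized media: one stock file, one case-specific file.
seized = {"system.dll": stock, "notes.txt": b"meet at midnight"}
print(triage(seized, known))  # ['notes.txt']
```

The limitation noted above falls directly out of this sketch: any byte-level change to a file, whether a new software version or a deliberate shape-shifting transformation, yields a different digest and defeats the match.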
Cyberphysical Systems and the Internet of Things
The activities in the Internet of Things (IoT) and Cyberphysical Systems (CPS) are a welcome addition to the SSD portfolio, because these domains are important from a technological and industrial point of view. These domains are going to be critical to the future of services and products. The approach taken by the SSD is technically sound and addresses some of the important issues in the area. The industrial relevance of the IoT in the United States is high, as demonstrated by the enormous industrial interest that has resulted in a cacophony of semistandards proposed by many industrial consortia. The consortia were formed to define communication standards, but there is still no clear winner. In this environment, NIST can function as a neutral player to mediate among different camps.
Opportunities and Challenges
On the technical side, the SSD has demonstrated leadership in establishing the smart city grand challenge as well as the CPS Public Working Group (CPS PWG), providing a forum for discussing the CPS reference architecture, security, interoperability, and testing. In this activity, the SSD has gained recognition and engagement in the scientific community. It remains to be seen whether the division can reach the same level of visibility and engagement in the industrial community. It would be beneficial for the SSD to engage the community in leveraging the test infrastructure that it is developing. In this respect, the technical capability of the precision timing researchers is high; they showed excellent knowledge of the field and overall competence spanning many areas.
Electronic Health Records
Multiple major drivers for change face the nation as it seeks to create a sustainable learning health and healthcare delivery system. The nation is moving to an accountable care organization structural model for the health and care of defined populations, with payment based on documented value relating to outcomes, services, and relevant quality and safety. To achieve the desired outcomes, patients and communities need to become active partners, working with relevant interprofessional teams relating to health and care. The nation is also moving toward care based on precision medicine, where care is increasingly based on individual characteristics at the molecular as well as the societal level (e.g., genomics and epigenetics). With the recent major federal investment in electronic health records (EHRs) and data exchanges, there is an expectation that the big data health information infrastructure needed to achieve major progress will be created.
With respect to the big data ecosystem, EHRs and related public health data generators need to be able to create health records from a common data strategy through the nation’s recent investments in health information and communication technology (HICT) and biomedical and health informatics (BHI). Critical elements of a learning health and healthcare system include scalable data needed for the following:
- Care delivery for individuals and populations, supported by standards such as the HL7 Clinical Document Architecture as well as standards for mobile health monitoring technology, so that devices are secure and stable in their performance but also interoperable, enabling the resulting data to scale;
- Payment, particularly considering safety, effectiveness, efficiency, timeliness, patient-centeredness, and equitability;
- Research needs, particularly for system analytics and process reengineering to move toward precise clinical performance; and
- Clinical performance of the workforce, including maintenance of certification and quality performance of individuals and institutions.
NIST has been performing a mandated role as part of the American Recovery and Reinvestment Act (ARRA) Health Information Technology for Economic and Clinical Health (HITECH) legislation. Specifically, the role relates to supporting the Office of the National Coordinator for Health Information Technology (ONC) with EHRs and meaningful use. The current SSD staff comprises capable computer scientists and information technologists but includes no biomedical/health informaticians.
Opportunities and Challenges
It is not clear that the ONC has a comprehensive strategy that allocates the tasks needed for the nation to achieve interoperable EHRs and EHR systems. Therefore, the SSD is faced with making a strategic choice. One choice is to limit its EHR-related work so that it focuses, essentially, only on relevant but narrow technology issues and does not represent itself as working on meaningful use or interoperability components, which intrinsically involve clinical information relating to decision making and clinical care. Alternatively, the SSD could work through a well-defined and circumscribed agenda coordinated with the other relevant key government agencies, including ONC, the National Library of Medicine (NLM), the National Institutes of Health (NIH), the Agency for Healthcare Research and Quality (AHRQ), and the Food and Drug Administration (FDA). Also, the SSD could consider approaching NLM for collaboration and assistance.
Recommendation: The Software and Systems Division not only should participate in the International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC) joint technical committee working group on the Internet of Things (ISO/IEC JTC 1/WG10) and the Institute of Electrical and Electronics Engineers (IEEE) project on the Standard for an Architectural Framework for the Internet of Things (P2413), but also should reach out to the industrial consortia such as the Industrial Internet Consortium and the Open Interconnect Consortium.
Recommendation: The Software and Systems Division should make the framework that the SSD put together for the Smart Grid open to the scientific and industrial communities, as well as tools and techniques developed in university projects such as the PRET (Precision Timed) machines project at the University of California, Berkeley.
Recommendation: The Software and Systems Division should look into growing its approach to timing in connection with Internet of Things applications by considering protocols that are robust to clock drift.
Recommendation: For the electronic health record (EHR), the Software and Systems Division (SSD) should define a clear direction that allows it to either focus on limited objectives or add staff of the appropriate types to meet larger expectations. Because focusing on meaningful use of EHRs is intrinsically clinical in nature, the SSD should consider adding clinical informaticians to its staff.
ADEQUACY OF FACILITIES, EQUIPMENT, AND HUMAN RESOURCES
The SSD project suite spans a wide range that includes improving voting in national and local elections; electronic medical records; and the use of standards in forensics, cybersecurity, and software assurance. Overall, the technical work seems excellent and is conducted by capable staff. There have been substantial accomplishments, especially given the limited available resources and legislative mandates. In addition, based on the projects presented for review, there seems to be an appropriate balance of long-term, short-term, and opportunistic projects. The SSD uses a mix of formal and informal management processes to maintain that balance.
Projects in which the SSD engages in standards activities have the most impact, followed by activities focused on advancing measurement and evaluation capability. These activities draw on a body of expertise and institutional experience at the SSD, magnifying its influence. Given the SSD’s limited resources, this is particularly important.
Opportunities and Challenges
Key SSD personnel seem to have opportunities to pursue new projects within the limited resources available to them. Continued pressure to take on additional projects could necessitate more formal prioritization criteria and processes and could also hinder the ongoing professional development of technical staff, which is essential, given the pace of technological advances.
Because many staff members are committed to multiple projects and most existing projects are expected to continue for extended periods, the SSD is at or near its capacity to undertake new projects. This fragility of human resources creates project and organizational risk. With limited bench strength, the loss of even one or two individuals could endanger or derail several extant projects.
The SSD seems to attract requests from other parts of NIST that are less capable in computer science. The SSD needs to determine whether this is something NIST management wants to encourage, and whether the SSD mission needs to be made more explicit, so that staffing allocations match the mission. Alternatively, these engagements could be limited to maximize human resource flexibility for core projects. The non-personnel resources (e.g., computing infrastructure and laboratory space) seem adequate but limited. There is little room for contraction of work without adversely affecting current programs, and such contraction could limit uptake of new projects, as do personnel constraints. Conversely, there are important opportunities for the SSD to leverage technical expertise elsewhere in ITL, particularly in the area of cybersecurity. With many SSD activities establishing patterns for future technical development, it is important that they be fully informed regarding considerations of security, as well as functionality, performance, and other quality attributes. This applies, for example, to work related to voting, health data management, and CPS.
The SSD lacks the resources or internal expertise to devote additional effort to outreach and dissemination. Consequently, its work is well regarded within the narrow communities with which the SSD directly interacts, but the work is not known in larger contexts. This outreach would be best undertaken with support from higher organizational levels (e.g., ITL or NIST as a whole) if the organization were to adopt explicit outreach goals in a manner similar to other research-oriented organizations (e.g., NSF and NIH). The work on voting, for example, illustrates the importance of this broader outreach, because it touches on technical as well as social issues.
With respect to its work on electronic health records, coordinated with the work of other relevant key government agencies, the SSD could augment its staff appropriately to address the challenges, with assured funding and explicit roles for multiple years. This would also enable better consistency across those involved in establishing standards for data representation and interoperability.
The SSD could consider approaching NLM for collaboration and assistance. It is possible that NLM could provide one or more clinical informaticians who could work on loan to the SSD effort. In addition, the SSD might consider establishing Intergovernmental Personnel Act (IPA) positions to attract academic medical center-based clinical informaticians to the SSD. The absence of in-house clinical expertise is problematic, because the SSD work will almost certainly be conceived as being broader and demand greater expertise than the current skill set of SSD personnel. Adding some clinical informatics personnel would make great sense. The clinical informaticians need not be physicians, because clinical informatics is an interdisciplinary field; a well-trained nurse informatician may be equally effective. The extramural visibility and importance of the HITECH electronic health record initiative is such that it should only fail on sound merits rather than from lack of internal collaboration and coordination across agencies having the relevant expertise.
DISSEMINATION OF OUTPUTS
The SSD is a gem in the crown of the nation’s research laboratories, yet today too few Americans know about its activities and accomplishments, even though they enjoy the benefits of the SSD’s work. A more public face would serve the SSD well. That public face should focus on the SSD’s achievements and their meaning. This visibility would serve as a vehicle for communicating the benefits of the SSD’s activities and as a mechanism for shaping public opinion about the SSD’s priorities and resource needs.
Other federal agencies and laboratories have highly visible communication and social media outreach strategies. For example, NIH has its foundation;2 NLM has its Friends of the Library,3 with a publication for doctors’ offices called MedlinePlus; and the Centers for Disease Control and Prevention has a foundation.4 NSF and the Department of Energy have similar outreach and communication programs to highlight the broad impact of their work on society.
Given SSD’s resources, this outreach could take place in collaboration with ITL and NIST as a whole. Individual divisions are too small, and the cross-couplings of projects are too great, for outreach engagement at the division level to be effective. Active communication is an element of strategy; it shapes the environment in which projects are pursued and resources are allocated.