
2015-2016 Assessment of the Army Research Laboratory (2017)

Suggested Citation:"8 Analysis and Assessment." National Academies of Sciences, Engineering, and Medicine. 2017. 2015-2016 Assessment of the Army Research Laboratory. Washington, DC: The National Academies Press. doi: 10.17226/24653.

8

Analysis and Assessment

The Panel on Assessment and Analysis at the Army Research Laboratory (ARL) conducted a review on August 8-10, 2016, at the Army’s White Sands Missile Range in New Mexico. The review covered three of ARL’s Analysis and Assessment Campaign programs: electronic warfare (EW), cybersecurity, and complex adaptive systems analysis (CASA).

ARL’s Analysis and Assessment Campaign provides tools that increase awareness of materiel capabilities, assesses the survivability and lethality of Army systems, and both improves and simplifies the Army’s decision making. The work in the EW program provides the analysis and assessment capability to operate in an increasingly complex, heterogeneous, and contested electromagnetic environment (EME). The work in the cybersecurity program provides analyses of Army systems that are in acquisition, or that are currently operational, in order to mitigate system vulnerabilities and prevent future susceptibilities. The work in the CASA program is aimed at assessing multiple systems interactions, operational contexts, and networks to help the Army make informed decisions regarding the survivability of networked systems.

ELECTRONIC WARFARE

Accomplishments and Advancements

The ARL EW team is highly capable. The team has demonstrated its understanding of future threats for operations in complex EMEs. The reliance on information technology (IT) and on distributed wireless tactical communication networks will extend the domain of EW beyond traditional radio frequency (RF) radar and electro-optical sensor interactions to defeat opposition weapon and vehicle platforms. With new threats, the analysis and assessment needed will require more nuanced applications of EW and of the related activities of electronic surveillance, electronic attack, and electronic protection, along with countermeasure and counter-countermeasure developments. The team is clearly capable of performing EW analysis and assessment, instrumentation development, and test and evaluation. The team has state-of-the-art laboratories and has developed state-of-the-art instrumentation, which have been integrated with modeling and simulation tools to assess and test U.S. Army RF and electro-optical systems.

The latest simulators for Global Positioning System (GPS) receivers are available to test GPS operations in highly contested EMEs. The simulators can be utilized both in wired bench tests and in free-space testing inside the team's large anechoic chamber. Free-space propagation testing in the chamber provides the capability to test anti-jam beam-forming GPS antennas, which can null reception of jamming signals. The experience of the GPS engineers, as well as how GPS receivers have been tested under jamming conditions, will support the development of approaches to perform EW against a variety of navigation satellite systems. Plans to acquire simulation tools for future testing, such as M-Code GPS simulators and simulators for testing receivers designed for positioning solutions, provide a useful forward path.
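The null-steering principle behind such anti-jam antennas can be sketched with a two-element array: the element weights are chosen so that the responses cancel in the jammer's direction while signals arriving from other directions still sum to a usable response. The sketch below is a minimal illustration, not ARL's hardware; the jammer bearing and element spacing are hypothetical.

```python
import cmath
import math

def array_response(weights, theta, d_over_lambda=0.5):
    """Complex response of a 2-element linear array to a plane wave from angle theta."""
    phase = 2 * math.pi * d_over_lambda * math.sin(theta)
    steering = [1, cmath.exp(1j * phase)]  # per-element phase of the incoming wave
    return sum(w * s for w, s in zip(weights, steering))

def null_steer_weights(theta_jam, d_over_lambda=0.5):
    """Weights that place a spatial null in the jammer's direction."""
    phase = 2 * math.pi * d_over_lambda * math.sin(theta_jam)
    return [1, -cmath.exp(-1j * phase)]  # second element cancels the first at theta_jam

theta_jam = math.radians(30)                          # hypothetical jammer bearing
w = null_steer_weights(theta_jam)
jam_gain = abs(array_response(w, theta_jam))          # essentially zero: jammer nulled
sat_gain = abs(array_response(w, math.radians(-40)))  # a satellite elsewhere still sums
```

With only two elements, one jammer can be nulled; fielded controlled reception pattern antennas use more elements to null several jammers simultaneously.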

The anechoic chamber is a special facility that provides a sufficiently large volume to test platforms with various RF systems mounted on them. The anechoic chamber allows the team to operate and test systems in bands of the electromagnetic spectrum that are prohibited by law for general transmission operation. This facility also enables the testing of systems without radiating signals that could be detectable by overhead satellites, so as to provide signal emission security. In addition, the chamber provides the capability to inject real EMEs from other parts of the world and/or theaters of operations.

The SAGE1 RF modeling and simulation tool leverages the existing, verified, and validated Terrain-Integrated Rough Earth Model (TIREM) software. SAGE development has permitted the design of an RF modeling and simulation tool that meets the specific test and development needs of the Survivability/Lethality Analysis Directorate (SLAD). The tool has been integrated with experimental data to ensure its usefulness. SAGE also supports experiments and testing by using RF propagation simulations to determine the best locations for placing instrumentation in the test environment. SAGE has also been integrated as a component of larger, more complex modeling and simulation tools.

The team has developed RF hardware building blocks for a suite of state-of-the-art EW radio test equipment and instrumentation systems. Developments include a digital radio frequency memory (DRFM) module that is a subsystem of several ARL SLAD EW test instrumentation systems. DRFMs record and play back received signals in real time, and these signals can be used to spoof systems. DRFMs also represent a programmable next generation of EW threat that will be fielded by military opponents. The Army will need to develop techniques to counter this threat, in addition to understanding how to utilize DRFMs in its own operations. A DRFM has been integrated into an aircraft-mountable pod for testing ground defense systems against this major EW threat. The pod was designed to radiate both forward and backward, to double the time the system can radiate on a target as it passes. The aircraft pod DRFM has a traveling-wave tube (TWT) power amplifier in its front end to provide a very powerful jamming signal and will be used to test air defense radar systems. The DRFM is the state-of-the-art technology in threat simulation systems and will be for some time to come.
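The record-and-replay principle of a DRFM can be sketched in a few lines: capture a coherent copy of an intercepted pulse, then retransmit it with an added delay and phase ramp to present a false range/Doppler echo to the victim radar. This is a toy illustration, not ARL's implementation; the pulse shape and parameters are invented.

```python
import cmath
import math

class DRFM:
    """Toy digital RF memory: capture complex baseband samples, then replay
    them with an added delay and phase ramp (range/Doppler deception)."""
    def __init__(self):
        self.memory = []

    def capture(self, samples):
        self.memory = list(samples)  # coherent copy of the intercepted waveform

    def replay(self, delay_samples=0, doppler_cycles_per_sample=0.0):
        out = [0j] * delay_samples  # added delay -> false (longer) range
        for n, s in enumerate(self.memory):
            # phase ramp -> apparent Doppler shift on the replayed pulse
            out.append(s * cmath.exp(2j * math.pi * doppler_cycles_per_sample * n))
        return out

# A chirp-like radar pulse (hypothetical), intercepted and replayed
pulse = [cmath.exp(1j * math.pi * 0.05 * n * n) for n in range(32)]
drfm = DRFM()
drfm.capture(pulse)
false_echo = drfm.replay(delay_samples=10, doppler_cycles_per_sample=0.01)
```

Because the replayed echo is coherent with the original pulse, it passes the victim's matched filter, which is what makes DRFM deception far harder to reject than noise jamming.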

The DRFM module has been integrated into the deployable multi-sensor electromagnetic warfare characterization system (DMECS). DMECS is essentially a very wideband spectrometer that displays collected and received signals through a hypertext markup language (HTML) user interface. DMECS is a passive receiver and does not utilize the transmitter side of the DRFM device. DMECS enables collecting, storing, and viewing the EME during testing events and provides data for later evaluation and analyses of the systems being tested.
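The core function of such a spectrometer, resolving received energy into frequency bins for display, can be sketched as follows. This is a toy discrete Fourier transform over a short synthetic capture; a real system would use a hardware FFT over a much wider band, and the sample rate and emitter bin here are hypothetical.

```python
import cmath
import math

def dft_magnitude(samples):
    """Naive DFT magnitude spectrum (a real receiver would use an FFT)."""
    N = len(samples)
    return [abs(sum(samples[n] * cmath.exp(-2j * math.pi * k * n / N)
                    for n in range(N)))
            for k in range(N)]

N = 64
tone_bin = 5  # a hypothetical emitter sitting at bin 5 of the monitored band
captured = [cmath.exp(2j * math.pi * tone_bin * n / N) for n in range(N)]

spectrum = dft_magnitude(captured)
peak = max(range(N), key=lambda k: spectrum[k])  # the emitter's frequency bin
```

Storing successive spectra over a test event yields the kind of time-frequency record DMECS provides for later evaluation of the systems under test.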

___________________

1 SAGE is a government-owned application, augmented with a state-of-the-art discrete event network simulator, NS-3.


A DRFM module has also been integrated into the optimized modular EW network (OMEN), a network-controllable radio signal generation system. OMEN allows command and control of a network of signal generators to create a distributed, complex RF test environment. The network of OMEN devices can also be used to test EW operations against RF communication networking systems. The ability to capture RF network traffic and then play back and mimic RF network devices shows the impact of a programmable radio system. It also enables the integration of EW with cybersecurity, since it is possible to capture RF-modulated waveforms and rebroadcast the desired waveform features to generate spoofed network radio traffic. This spoofed traffic can cause IT network operations to shut down, introduce malware, or support a cyber electronic attack. The OMEN system has a flexible architecture, with growth potential to generate future and emerging threats. This level of integrated cybersecurity and EW is currently in its infancy, but the building blocks have been developed to assemble and evolve more complex EW systems.

The electro-optical facilities within the EW program include laser vulnerability analyses for eye damage, optical sensor damage, detection of optical sensors, laser jamming, and threat assessments. Of particular interest for sensor and eye damage are new short-pulsed lasers that can generate a wideband white-light pulse exceeding the bandwidth of typical narrowband optical filters. The analyses and propagation of the short-pulse laser utilized commercial off-the-shelf tools, and the shortcomings of this approach were defined and understood. Because standard filter designs employ coatings that block a narrow band of wavelengths, conventional filter coatings are ineffective against the wideband white electro-optical signal. A completely new method is being assessed by the team to create a wideband, fast-reacting filtering solution. The new approach employs carbon particles suspended in a liquid filter; the carbon particles can block all wavelengths of electro-optical light. The filter is placed at the focal point of the optical path, so that the optical path through the filter has a narrow cross section. Methods to increase the carbon particle density at the focal point to adjust the wideband filter attenuation for eye and focal plane array protection are being investigated.
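The attraction of a particle suspension can be illustrated with the Beer-Lambert law: transmission falls exponentially with particle density and path length, and for an ideal broadband absorber such as carbon the attenuation is essentially wavelength-independent, unlike a coating tuned to one band. All numerical values below are hypothetical, chosen only to show the scaling.

```python
import math

def transmittance(particle_density, extinction_cross_section, path_length):
    """Beer-Lambert transmission through a particle suspension:
    T = exp(-sigma * N * L), wavelength-independent for an ideal
    broadband absorber such as carbon."""
    return math.exp(-extinction_cross_section * particle_density * path_length)

sigma = 1e-9  # cm^2 per particle (hypothetical extinction cross section)
L = 0.1       # cm path through the thin cell at the focal point (hypothetical)

results = {}
for density in (1e9, 1e10, 1e11):  # particles per cm^3
    T = transmittance(density, sigma, L)
    results[density] = (T, -10 * math.log10(T))  # transmission and dB attenuation
```

The exponential dependence is why modulating particle density at the narrow focal-point cross section is an effective knob: each factor-of-ten increase in density multiplies the attenuation in decibels by ten.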

The team is also carrying out analyses of hostile fire detection sensing in order to detect and identify various types of muzzle flash from weapons and munitions based on their hyperspectral signature profile. These hostile fire detection systems can provide U.S. forces with the location of opposition weapons fire. When the location of hostile forces is known, implementation of directed counter fire is improved.

The team is also developing hardware simulations of methods to defeat heat-seeking infrared-guided munitions. This development will examine methods to create appropriate countermeasures against these threats. This research will improve various implementations of countermeasure development, especially as threats increase in capability to include hypervelocity missiles.

Opportunities and Challenges

There is a desire to insert analysis and assessment methods into earlier stages of technology development, but when and where to include these inputs in the development process has yet to be determined. The implementation of analysis and assessment in the technology development process needs to be studied to determine when to carry it out. There is probably a sweet spot in the technology development pipeline, somewhat earlier than where analysis and assessment currently enters, that would set the team apart from other test and evaluation organizations. It is worth considering that orders of magnitude more technologies are investigated than will be selected for development. As a result, analysis and assessment are not worthwhile for all possibilities, and a method to prioritize high-risk technologies needs to be developed.


Engineers need to utilize appropriate sources of threat data to support EW systems. Because the threat is changing and technology is advancing at an incredible pace, systems and threats from open-source data need to be considered, in addition to classified sources, in order to develop a broad view of implementations; tactics, techniques, and procedures; and threat hardware. Commercial sources of threats are becoming more readily available due to software-defined radio system kits and the new commercial spectrum management approaches that utilize spectrum sharing. Defining methods to include industrial and academic collaborations that work to counter these evolving threats will improve understanding of current and future threats.

The focus of the team appears to be on traditional EW applications involving radar systems against airborne threats, and the convergence of EW with cybersecurity appears to be limited. The EW team needs to pursue activities that improve the convergence and integration of EW and cybersecurity, and tools that enable this convergence need to be developed. One such tool would move up from the physical layer through the data link layer or medium access control (MAC), network, and higher layers to reconstruct communications packet messages from collected RF signals. Adding the cyber reconstruction of communications packets to a DRFM module would be game changing.

Testing of a system's environment has been limited to the signals in the operational band of the system under test, and the implementation of testing in complex EMEs needs more thought. A strong out-of-band signal can pass through an RF front end to the first active component and cause issues with the operation of a system; for example, the strong signal can generate intermodulation products that interfere with an intermediate frequency or baseband signal. Such vulnerabilities of real systems need to be tested.
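The intermodulation mechanism can be illustrated numerically: two strong tones mixing in a nonlinear front end produce third-order products at 2f1 - f2 and 2f2 - f1, and one of these can land inside the receiver's passband even when both tones are out of band. The frequencies below are hypothetical.

```python
def third_order_intermods(f1, f2):
    """Third-order intermodulation products of two tones mixing in a
    nonlinear front end. These can fall in-band even when f1 and f2
    are both outside the receiver's operational band."""
    return sorted({abs(2 * f1 - f2), abs(2 * f2 - f1)})

# Hypothetical: two strong out-of-band emitters near a 100 MHz passband
f1, f2 = 88.0, 94.0  # MHz, both below the passband
products = third_order_intermods(f1, f2)  # [82.0, 100.0]: 100 MHz lands in-band
```

This is why a complex-EME test plan must include strong out-of-band signals, not just waveforms inside the band of the system under test.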

It is important to note that DRFMs can be employed against communications links and network systems. DRFMs can disrupt layer 2 protocols by replicating network packets to perform denial of service against carrier sense multiple access with collision avoidance (CSMA/CA) MACs and to upset timing in time division multiple access (TDMA) MAC networks. DRFMs in new form factors that will be available in the future include active electronically steered arrays (AESA); multiple input, multiple output (MIMO) arrays; and multifunctional RF systems. These multiple-element DRFM designs will be narrower in bandwidth to support array operations, and their numbers will be high because they will be distributed over the operational area, with each design supporting specific functions and spectrum allocations. DRFMs will be the future of EW, and there will be many suppliers of systems designed for specific applications and threats. Because DRFMs, or DRFM-like systems, will be everywhere, the only defense against them will be networks of DRFMs to detect and gather situational awareness of spectrum operations. Using available systems, analyses can be carried out to investigate vulnerabilities of these DRFMs against themselves in order to develop DRFM countermeasures. In addition to using DRFMs in tests against existing acquisition systems, there is also a need to consider how to defend against (create countermeasures for) these future threat DRFM systems.

Additionally, the GPS research path includes future simulators, which need to account for a wide range of current and soon-to-be-operational satellites. Regarding navigation, the testing of inertial navigation systems integrated with GPS receivers will require the expertise of an inertial navigation systems engineer; this expertise is not currently present in the EW team.

Finally, using the results and data from testing to improve or develop models did not appear to be a universal process within the Analysis and Assessment Campaign. The development of models from test results needs to be considered, as does providing these models to other Department of Defense (DOD) analysis and test and evaluation agencies.


CYBERSECURITY

Accomplishments and Advancements

The cybersecurity team at ARL functions as a blue team for Army acquisition programs and operational units. A blue team is an organization that applies its expertise in cybersecurity to find areas of vulnerability in systems and then provides the systems’ developers or operators with guidance to enable them to improve their security for the future. To this end, the cybersecurity team conducts exploratory development aimed at improving ARL’s capabilities, which are applied in support of the blue team mission. The term blue team is used in contrast with a red team, which focuses only on finding and reporting vulnerabilities.

ARL has established a relatively large cybersecurity team of approximately 50 people. From the team’s description of its activities and accomplishments, it appears to be competent and well qualified. The team provides valuable services to Army programs and organizations, and its services are in demand from various Army organizations. There is, however, more demand for cybersecurity services than the team has capacity to provide; as a consequence, there is not enough time to carry out necessary tool development and maintenance. The team is performing applied research, some of which has the potential to benefit the ARL mission of assessing the security of Army systems.

The team members have completed a large number of industrial and governmental certifications, consistent with DOD requirements. The certification courses are valuable, not for the certification, per se, but for the knowledge that they impart. The certification-related computing environment courses provide good, deep dives into subjects. The security courses also serve as boot camps and provide basic foundational knowledge (especially for new hires) that is beneficial to the team members’ effectiveness in conducting vulnerability analyses and risk assessments.

The team has discovered new and previously unknown vulnerabilities (called zero-day vulnerabilities) in Army systems. These, and other discoveries, have enabled Army organizations to remediate vulnerabilities in developing and deployed systems. The team has also contributed to the secure design of new Army systems by participating in cyber table-top exercises.

Analyses and assessments are carried out using tools such as SAGE and NS-3, as well as other tools to predict and assess the performance of mobile ad hoc networks. SAGE has been used to study the effects of protocol manipulation and exploitation; in particular, assessments have been performed on networks that employ the Routing Information Protocol (RIP). The team demonstrated good use of visualization tools to understand networks and their connectivity, along with the application of best practices, to document network assessment events in real time and play back events that occurred during the assessments. The recorded information is also used to train new staff. Approaches and exploits that have been successful in previous assessments are cataloged so they can be used, or built on, in the future.
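As an illustration of the kind of protocol manipulation assessed in such work, RIP's distance-vector update adopts whichever advertised route has the lowest metric, and RIP version 1 has no authentication, so a forged advertisement claiming a short route can hijack traffic. The sketch below is a simplified model of the update rule, not ARL's actual tooling; the addresses and router names are invented.

```python
def rip_update(table, neighbor, advertised):
    """Apply a simplified RIP distance-vector update: adopt a route when the
    advertised metric plus one hop beats the current metric. RIPv1 carries no
    authentication, so a forged advertisement is accepted like any other."""
    for dest, metric in advertised.items():
        new_metric = min(metric + 1, 16)  # 16 is RIP "infinity" (unreachable)
        if dest not in table or new_metric < table[dest][1]:
            table[dest] = (neighbor, new_metric)
    return table

# Legitimate state: the prefix is reached via router-A at metric 3
table = {"10.0.0.0/24": ("router-A", 3)}

# An attacker forges an advertisement claiming a 1-hop route to the same prefix
rip_update(table, "attacker", {"10.0.0.0/24": 1})
hijacked = table["10.0.0.0/24"]  # ("attacker", 2): traffic is now misrouted
```

The same update logic, driven from a simulator such as NS-3, is what lets an assessment quantify how quickly a poisoned route propagates through a tactical network.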

Opportunities and Challenges

While the cybersecurity team is discovering vulnerabilities in Army systems, it was not clear how effective the team is at discovering vulnerabilities compared to other teams in similar roles. Metrics are not well established in the industry in this area; there is a need for such metrics.

While the team holds numerous industry certifications, more advanced training and development of cybersecurity professionals comes through attending industry events and conferences. The team was not allowed this year to attend Black Hat and Defcon, which are probably the industry’s premier practical (as opposed to academic) cybersecurity conferences. These conferences are where state-of-the-art attacks and vulnerabilities, and the techniques used to find them, are discussed.


The cyber table-top exercises present a limited view of the security of systems under development. System security problems can be introduced at various stages throughout the system life cycle.

It does not appear that the team is taking full advantage of the vulnerabilities it discovers to improve the security of Army systems. Best practice for a newly discovered vulnerability is to generalize it and seek to identify its occurrence broadly throughout systems whose security is of concern. The focus appears to be on remediating discovered vulnerabilities rather than on these next steps. Newly discovered vulnerabilities need to be broadly investigated and, where feasible, approaches to removing similar vulnerabilities need to be created and applied across systems.

To date, there have been limited engagements that required assessing embedded systems (e.g., automotive-style systems connected via a CAN bus). These sorts of engagements are likely to become more important in the future, and a research effort needs to be started on the cybersecurity of embedded systems because they are a major likely source of future security problems. Earlier inputs also need to be provided on security architecture and the design of systems under development where cybersecurity is a concern.

The demand for the services of the team has had the effect of limiting its capacity to develop or maintain a suite of assessment tools and apply best development practices to work on those tools. The team’s software engineering practices (source code management and bug tracking) are not at the level of common industry practice. The team needs to use a standard software development environment, including a source code repository, a version control system, an issue tracker, and, of course, regular backups. The same is true of software developed by other groups.

The team also needs more time and resources to develop new tools and to make current and new tools more effective. So far, the team has been unable to identify the needs or approvals that would be required to release non-sensitive tools to the public as open source, to be consistent with government policy. Release of these tools would benefit the cybersecurity community at large and help to improve the credibility of the team, which directly impacts its ability to recruit staff.

There were limited indications that the team conducts security assessments of ARL systems and tools. In particular, it was not clear that the OMEN system had undergone a comprehensive cybersecurity review.

Security review practices also need to be benchmarked against peer groups from the other services and the National Security Agency (NSA) in order to calibrate the quality of practices and to identify opportunities for improving practices.

Thought could also be given to setting up a cybersecurity future directions group with senior members and perhaps outside advisors. The current system of bottom-up project initiation and direction-setting builds staff enthusiasm but can easily miss areas that experience suggests will become important. Furthermore, the team needs a senior champion and advocate, both to serve as a mentor and technical leader and to provide senior-level advocacy (e.g., for important advanced training activities) with ARL and Army leadership.

COMPLEX ADAPTIVE SYSTEMS ANALYSIS

Accomplishments and Advancements

Complex interacting systems permeate many facets of Army institutional and warfighting operations. Accordingly, incorporating CASA into ARL’s Analysis and Assessment Campaign is likely to provide very valuable insights and important contributions to the Army. This is a very recent endeavor; therefore, this initial review and evaluation provides guidance for this new and important activity.


The goal of the CASA program is to develop a family of simulations and associated analysis suites to provide test beds and to support experimentation. At present, only two models are available: the S4 software (System of Systems Survivability Simulation) and SAGE. S4 permits stochastic scenario exploration, while SAGE is fully deterministic. SAGE is a useful tool for visualizing and, to some degree, predicting the communication performance of agents in realistic environments; for example, SAGE was shown to contribute to simulating EW and cyber impacts on a radio network. The use of OMEN and DMECS to validate the performance of SAGE was also studied.
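The distinction between stochastic exploration (S4-style) and deterministic prediction (SAGE-style) can be illustrated with a toy survivability calculation: the deterministic run returns a single expected value, while Monte Carlo replication returns a distribution of outcomes around it. The scenario, probabilities, and shot counts below are entirely hypothetical.

```python
import random
import statistics

def engagement(p_detect, p_kill, shots, rng=None):
    """One run of a toy survivability scenario: threats defeated out of `shots`.
    Deterministic (expected value) when rng is None; stochastic otherwise."""
    if rng is None:
        return shots * p_detect * p_kill
    return sum(1 for _ in range(shots)
               if rng.random() < p_detect and rng.random() < p_kill)

# Deterministic prediction: one number
expected = engagement(0.8, 0.6, 100)  # 48.0 threats defeated on average

# Stochastic exploration: a distribution of outcomes over 500 replications
rng = random.Random(1)  # seeded for reproducibility
runs = [engagement(0.8, 0.6, 100, rng) for _ in range(500)]
spread = statistics.stdev(runs)
```

A deterministic tool answers "what is the typical outcome," while the stochastic tool also answers "how often does the outcome fall below an acceptable threshold," which is often the survivability question that matters.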

Three primary CASA applications are as follows: counter-improvised explosive device (IED) systems sensor fusion; counter-unmanned aircraft system (UAS) and manned-unmanned teaming; and EW threats to communication networks during a recent network integration evaluation (NIE). These projects are directly relevant, are appropriate for initial efforts, and have produced insightful results.

Opportunities and Challenges

The modeling efforts embodied in S4 and SAGE were initiated almost two decades ago and are narrowly focused on a few operational settings. It is not evident from these initial applications whether, and to what degree, emergent behaviors and/or adaptive learning are being captured or described by these two models. Furthermore, it has not been shown that S4 is filling a gap not already met by other tools, such as One Semi-Automated Force (OneSAF); a strategic pause in the development of S4 is needed until this can be demonstrated.

As the ARL campaigns begin to develop processes to enhance cross-laboratory collaboration, the Analysis and Assessment Campaign is encouraged to develop stronger ties to other parts of the laboratory. The Analysis and Assessment Campaign in its current state appears to be a SLAD campaign rather than a laboratory campaign. Additionally, the CASA program goals and approaches need to be clarified. Such systems analyses need to have a role across the full spectrum of life-cycle challenges, including the post-Milestone C stage,2 where the vast majority of Army costs are incurred. CASA capabilities and expertise could also contribute to other Analysis and Assessment Campaign critical campaign enablers, especially personnel survivability and human systems.

A clear perspective of how CASA contributes and crosses boundaries within the ARL and DOD needs to also be articulated as soon as possible, with buy-in from the relevant constituencies. The current CASA capability is neither adequate, nor well positioned, to engage the wide spectrum of ARL needs.

CASA capabilities could enhance and support the ARL Sciences for Maneuver Campaign areas of logistics and sustainability. Innovative logistics systems and improved technologies for sustainment operations can dramatically reduce the significant logistics burden that encumbers operating forces across major classes of supply (e.g., food, fuel, water, ammunition, and repair parts). Encompassing the entire acquisition life cycle, especially long-persisting logistics and sustainment challenges, could provide an opportunity for engineering analysis to directly support the Sciences for Maneuver Campaign and the full range of force operating capabilities. The campaign also needs to be directed toward the Army standard force-on-force models.

Accurately modeling an adaptive enemy in an action-reaction cycle to better anticipate likely threat responses and rapidly counter them, or even preempt these responses, is an incredibly valuable capability. Such an approach could be incorporated into the CASA activity and complement the vulnerability assessment teams in ARL. ARL needs to devote CASA and other relevant analytical methods to support the vulnerability assessment teams. Additionally, it would offer an opportunity to incorporate a systems-of-teams perspective (the human dimension) as a particularly relevant paradigm.

___________________

2 Milestone C is a milestone decision authority-led review at the end of the engineering and manufacturing development phase. Its purpose is to make a recommendation or seek approval to enter the production and deployment phase. AcqNotes. Retrieved September 9, 2016, from http://www.acqnotes.com/acqnote/acquisitions/milestone-c.
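Modeling an adaptive enemy in an action-reaction cycle can be sketched, very loosely, as an agent that shifts toward whichever tactic has produced the best observed payoff so far (a greedy stand-in for genuine adaptive behavior). The tactic names and payoff values below are invented for illustration only.

```python
def adaptive_enemy(history, tactics=("jam", "spoof", "silent")):
    """Toy adaptive threat: choose the tactic with the highest average
    observed payoff so far; untried tactics are explored first."""
    if not history:
        return tactics[0]
    payoff = {t: 0.0 for t in tactics}
    count = {t: 0 for t in tactics}
    for tactic, result in history:
        payoff[tactic] += result
        count[tactic] += 1
    # Untried tactics score infinity, forcing exploration before exploitation
    return max(tactics, key=lambda t: payoff[t] / count[t] if count[t]
               else float("inf"))

# After observing that spoofing outperformed jamming, the threat adapts:
history = [("jam", 0.2), ("spoof", 0.9), ("silent", 0.1), ("jam", 0.3)]
next_tactic = adaptive_enemy(history)  # "spoof"
```

Running such an agent against a friendly-force model closes the action-reaction loop, letting an analysis anticipate the threat's likely next move rather than scoring only a static enemy.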

Understanding and accurately replicating human decisions in conflict environments and scenarios (e.g., war gaming) has been a significant challenge for the modeling and simulation communities. For agent-based simulations specifically, obtaining credible decision logic is essential for model calibration and, ultimately, accreditation for use. Acquiring and transferring this military knowledge and experience can be difficult to achieve in nonintrusive ways and presents a challenge for complex adaptive systems modeling. Possible sources for obtaining the knowledge and validating decision logic embedded in agent-based models are the pre-command course war gaming exercises at Fort Leavenworth, the Battle Command Training Program, and the Combat Training Centers (e.g., the National Training Center at Fort Irwin). ARL needs to survey and evaluate these potential sources for extracting empirical decision logic from the constructive, virtual, and live simulations used by operating forces. This empirically derived decision logic, which reflects current military doctrine and the military decision-making process, could then be incorporated into agent-based models for the purposes of better understanding and replicating tactical operations, including manned-unmanned teaming and autonomous systems.

Additionally, consideration could be given to joining existing communities of excellence. For example, with respect to the modeling and simulation domain of complex systems, ARL could locally collaborate with the Santa Fe Institute on newly emerging concepts and methods for adaptive systems. They could also collaborate with Sandia National Laboratories for engineering processes, as well as practical applications in neural networks and genetic algorithms, which are used by Sandia’s Center for Systems Reliability (CSR).

Relevant academic institutions and programs to engage include the Massachusetts Institute of Technology’s (MIT’s) Engineering Systems Division and its new Institute for Data, Systems, and Society. The Naval Postgraduate School has strong graduate and research programs in autonomous systems engineering and operations analysis, hosts a unique cross-disciplinary program called Modeling, Virtual Environments and Simulation (MOVES), and annually presents an informative technology refresher update (TRU) on emerging technologies.

Current manpower levels and the skills mix are insufficient in both capacity and capability to adequately support a broader CASA program as it expands across ARL. ARL needs to address the lack of data scientists and operations research analysts within the CASA program; this conspicuous shortfall is a top priority. Furthermore, future project groups need to be supported and sustained by a nucleus of operations research expertise, including data analytics, that can guide multidisciplinary groups with the skill sets relevant to the tasks undertaken. The vital role that operations research and operations research analysts can, and need to, play in this activity cannot be overemphasized.

ARL could also consider establishing, at least on a temporary basis, a small external advisory group with the appropriate mix of disciplines and experience from the information, computational, human, and analytical sciences.

OVERALL QUALITY OF THE WORK

The ARLTAB assessment of this campaign differs from most, if not all, other campaign assessments because the Analysis and Assessment Campaign is intended to be an analytically focused, crosscutting activity rather than a research-focused one. As a result, the assessment criteria differ from those of research-focused campaigns. Nevertheless, the work needs to have technical depth, and the panel needs to be presented with material that exhibits this technical depth. Specific assessment factors for panel visits need to be developed and made known to the staff. These assessment factors need to be guided by the general categories of analytical capability: capacity, utilization of analytical resources, and contributions.

Additionally, ARL needs to apply analytical resources as part of the Analysis and Assessment Campaign to better understand critical relationships and how they impact a wide range of variables, from recruiting standards to battlefield tactics to force design and budgetary allocations. ARL also needs to acquire and/or develop a comprehensive set of analytical capabilities that leverage other modeling, simulation, and high-performance computing capabilities to ensure adequate support for future Analysis and Assessment Campaign endeavors.

The quality of the work on EW systems was observed to be good, and the staff is knowledgeable. The panel, however, did not gain good insight into the technical quality of the modeling and simulation, the experimental design and analysis, or the analytical science. Tools are being developed, and tests are being performed, that answer the questions of Army customers.

The cybersecurity team’s presentations and discussions indicated that the team is competent and contributes to cybersecurity research. Improvements are needed, however, for the overall quality of the research to attain the level of top-tier government or industry cybersecurity assessment or consulting organizations. This observation is based both on the presence of some indications of quality (discovery of new vulnerabilities and demand for services) and the absence of others (contributions to secure design, generalization of findings, and application of software engineering tools and practices).

The challenge of complex interacting adaptive systems is critical and permeates many facets of ARL, and it is commendable that this challenge is recognized and resources are being pooled to address it. In its present form, however, the CASA capability is not well positioned to engage the wide spectrum of ARL needs. Developing a comprehensive vision, purpose, and plan for this activity will help design a more robust software environment and architecture that leverages other developments in modeling, simulation, and high-performance computing within DOD. Such developments will also put the team in a better position to integrate forthcoming efforts in human factors and EW (such as physics-based jamming models for more complex scenarios). A better assessment of the uncertainties associated with enemy-induced jamming and countermeasures, or with environment-induced interference, would also be conducive to better decision making.
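The kind of uncertainty assessment suggested above can be sketched with a simple Monte Carlo treatment. This is a toy illustration, not an ARL model: the signal margin, threshold, and jammer distribution parameters below are illustrative assumptions, not measured values.

```python
# Minimal sketch (assumed parameters): Monte Carlo estimate of how often
# an uncertain enemy jamming contribution degrades a communications link
# below its required margin.
import random

random.seed(1)

SIGNAL_DB = 10.0        # assumed received signal margin (dB)
REQUIRED_SNR_DB = 3.0   # assumed minimum usable signal-to-noise ratio (dB)

def link_degraded(jammer_mean_db=5.0, jammer_sigma_db=3.0):
    # Sample an uncertain jammer contribution and test whether the
    # effective margin drops below the required threshold.
    jam_db = random.gauss(jammer_mean_db, jammer_sigma_db)
    return (SIGNAL_DB - jam_db) < REQUIRED_SNR_DB

trials = 10_000
p_degraded = sum(link_degraded() for _ in range(trials)) / trials
print(f"Estimated probability of link degradation: {p_degraded:.2f}")
```

Reporting a probability of degradation, rather than a single worst-case or best-case number, is what lets downstream decision makers weigh jamming risk against other factors.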

CONCLUSIONS AND RECOMMENDATIONS

Many of the professional staff are native to the local area and tend to have earned their degrees at local universities, the same universities that support the laboratory with research. As a result, the diversity of experience and perspective within the staff can be limited, a potential weakness that can cause emerging developments elsewhere to be missed.

Recommendation. ARL should broaden the perspectives of the Analysis and Assessment Campaign staff members. Approaches to consider include the following:

  • Utilize the ARL Open Campus initiative, which involves establishing relationships with research centers across the country.
  • Utilize the other transaction authority (OTA) consortium acquisition model, which can introduce industry as well as university participation, increasing the diversity of ideas and developments and opening up the technology base.
  • Utilize virtual collaboration and a virtual community of excellence, which may attract the best and brightest without pulling them to a physical location.
  • Support personnel who hold degrees from local universities in obtaining a degree from a leading university in their field outside the local area.
  • Join and participate in existing communities of excellence or establish new ones appropriate to the Analysis and Assessment Campaign mission.
  • Increase engagement with the broader Department of Defense (DOD)/Intelligence Community and thereby increase leveraging of Army/DOD science and technology communities.
  • Encourage coordination with various DOD analyses and test and evaluation communities.
  • Increase interaction with the commercial industry for sharing of methods.

The cybersecurity team was not allowed this year to attend Black Hat and Defcon, probably the industry's premier practical (as opposed to academic) cybersecurity conferences, where state-of-the-art attacks and vulnerabilities, and the techniques used to find them, are discussed.

Recommendation. ARL should take whatever steps are required to ensure that Analysis and Assessment Campaign staff members attend the premier—practical as well as academic—conferences in their area. The cybersecurity team members should attend top-tier industry events, in particular, Black Hat and Defcon, as well as comparable hacker conferences.

To date, there have been limited engagements that required assessing embedded systems (e.g., automotive-style systems connected via CAN bus). These sorts of engagements are likely to become more important in the future.

Recommendation. ARL should start an effort on cybersecurity of embedded systems—they are the wave of the future and, hence, are likely to be a major source of future security problems.
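As a concrete flavor of such work, an embedded-systems assessment often begins by inspecting raw bus traffic. The sketch below parses a raw CAN frame assuming the Linux SocketCAN `struct can_frame` layout (little-endian 32-bit identifier, 1-byte data length code, 3 pad bytes, 8 data bytes); the function names are illustrative, not from any ARL tool.

```python
# Minimal sketch: decoding a raw CAN frame, the kind of embedded-bus
# traffic an assessment of automotive-style systems would examine.
# Assumes the Linux SocketCAN struct can_frame wire layout.
import struct

CAN_FRAME_FMT = "<IB3x8s"  # can_id, dlc, 3 pad bytes, data[8]

def parse_can_frame(raw: bytes):
    can_id, dlc, data = struct.unpack(CAN_FRAME_FMT, raw)
    # Mask off flag bits, keeping the 29-bit extended identifier space.
    return {"id": can_id & 0x1FFFFFFF, "data": data[:dlc]}

# Example: a frame with ID 0x123 carrying two payload bytes.
raw = struct.pack(CAN_FRAME_FMT, 0x123, 2, bytes([0xDE, 0xAD]) + bytes(6))
frame = parse_can_frame(raw)
print(hex(frame["id"]), frame["data"].hex())  # 0x123 dead
```

Because CAN has no built-in authentication, any node that can write to the bus can forge frames like the one above, which is precisely why these systems merit security assessment.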

The modeling efforts embodied in S4 were initiated almost two decades ago and are narrowly focused on a few operational settings. It has also not yet been demonstrated that S4 fills a gap not already met by other tools, such as ONESAF.

Recommendation. ARL should initiate a strategic pause in the development of S4 until it can be shown that this tool will fill a gap not already met by other tools such as ONESAF.

Current manpower levels and skills mix are insufficient in both capacity and capability to adequately support a broader CASA program as it expands across ARL. The lack of expertise in operations research analysis is a conspicuous shortfall. The vital role that operations research and operations research analysts can, and should, play in this activity cannot be overemphasized.

Recommendation. ARL should address the lack of data scientists and operations research analysts within the complex adaptive systems and analysis program. Future project groups should be supported and sustained by a nucleus of operations research expertise, including data analytics, guiding multidisciplinary groups with the skill sets relevant to the tasks undertaken.
