of using structured databases to provide reasonable answers to queries is being supplanted by emerging technologies that are able to interpret, address, and integrate unstructured, natural data by using unstructured, natural-language queries that more closely reflect human language processing and analytic approaches.

Big data as a concept has implications both for the demands placed on humans to understand rapidly growing bodies of information and for the possibility of augmenting human cognition through breakthroughs in computer-based processing. Sensor suites used in science, military applications, and complex system operations can flood a human operator with data, degrading system performance through cognitive overload. Computational systems that augment human cognition are being developed to address this problem. Research is addressing the best ways for people to interact with and control increasingly complex systems, but its findings appear to be limited in their generalizability. An example is the “optimal” cockpit in military aviation: despite nearly 100 years of experience, the best design of cockpit displays, controls, and automation tradeoffs remains controversial. The question of how much a human should be involved in the operation of a system will be critical as HPM technologies associated with big data continue to evolve.

Big data has been made possible by a series of IT advancements, including the reduction in the cost of data storage and computing, the emergence of the Internet, widespread deployment of video and other sensor technologies, advanced computer networks, cloud-computing technologies, social networks, and smart phones. Although the notion of big data is discussed primarily in the context of commercial and business applications, where it promises great advances in productivity, it is clear that big data technologies will be critical for—and beneficial to—military-oriented HPM applications.

Managing big data poses substantial challenges, and the following discussion assumes that organizations, countries, or other entities that make the most progress toward solving the fundamental scalability challenges associated with big data and cognitive information processing will have the best basis for developing game-changing HPM applications.

Scientists and engineers working in the IT sector are making—and will continue to make—substantial improvements in many areas, including high-speed networking, parallel and cluster computing, cloud computing, machine learning, advanced data-storage technologies, the continued scaling of Moore’s law (which states that the number of transistors on an integrated circuit will double about every 2 years), and security technologies. However, as discussed in more detail below, the current trajectory does not support future computing requirements, especially for big data and cognitive information processing. Even today, serious challenges are evident in the scalability and sustainability of current trends. Three examples illustrate the problems: unsustainable energy needs, the explosion of data produced, and the current inefficiency of computing solutions that mimic human intelligence.
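As a point of reference, the scaling implied by Moore’s law can be written as a simple compounding relation (a back-of-the-envelope formulation, assuming a steady 2-year doubling period):

\[ N(t) = N_0 \cdot 2^{t/2}, \]

where \(N_0\) is the transistor count in a reference year and \(t\) is the elapsed time in years. Over a decade this implies a factor of \(2^{10/2} = 32\), which frames the supply side of the scalability problem discussed below.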

Energy needs are a recognized challenge. Although information technology continues to deliver better performance with less power consumption every year, it is now well established that past advances have not kept pace with information-processing demands (Izydorczyk, 2010). Specifically, one study estimated total U.S. data-center energy consumption in 2005 at about 1.2 percent of total U.S. consumption, reflecting growth of roughly 15 percent per year since 2000 (Koomey, 2007). Total power consumption for IT is still low compared with that of other sectors, but a 15 percent annual growth rate (that is, doubling every 5 years) is alarming and suggests that the continued expansion of IT is not sustainable. Clearly, improved IT energy efficiency will be critical to sustainable growth. Major innovations will be required to scale big data technologies to levels where they will provide the anticipated benefits (U.S. Code, 2008).
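The parenthetical “doubling every 5 years” follows directly from compounding 15 percent annual growth, as a quick arithmetic check shows:

\[ t_{\text{double}} = \frac{\ln 2}{\ln 1.15} \approx 4.96 \approx 5\ \text{years}. \]

At the same rate, consumption would grow roughly tenfold in about 16.5 years (\(\ln 10 / \ln 1.15 \approx 16.5\)), which underscores why such growth is viewed as unsustainable.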

The explosion of data is also a recognized challenge. The problem is illustrated by the new radio telescope being developed by the Square Kilometer Array (SKA) Consortium (Crosby, 2012). The telescope will produce 1 exabyte (1 billion gigabytes) of data every day. To put that into context, the whole Internet today handles about 0.5 exabyte per day.
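To make that scale concrete, a rough conversion (taking 1 exabyte as \(10^{18}\) bytes and assuming the output is spread evenly over the day) expresses the SKA output as a sustained data rate:

\[ \frac{10^{18}\ \text{bytes/day}}{86{,}400\ \text{s/day}} \approx 1.16 \times 10^{13}\ \text{bytes/s} \approx 11.6\ \text{terabytes per second}, \]

or on the order of 90 terabits per second of continuous throughput.

It would be naïve to assume that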


