Suggested Citation:"3 Reflections on Resurgence." National Academies of Sciences, Engineering, and Medicine. 2020. Information Technology Innovation: Resurgence, Confluence, and Continuing Impact. Washington, DC: The National Academies Press. doi: 10.17226/25961.

common goal. The model makes the results of that project available to contributors and noncontributors alike (the results are thus largely nonappropriable). It is an important mechanism for capturing, combining, and refining research results and making them available in usable form and at low cost to companies and other organizations. Sometimes academic groups, or academic groups with industry partners, elevate open-source code to the quality needed for widespread adoption; in other cases, start-ups are created to support production-quality development.

An important characteristic of open source is how it can raise the level of abstraction for future efforts of researchers and developers. This fundamental property of IT, the power of abstraction, amplifies the impact of open source, from the statistical libraries of R, which have enabled generations of data scientists, to the role of TensorFlow, PyTorch, Spark, Postgres, and more throughout large-scale computing and machine-learning research.

Academic researchers, by virtue of not being so focused on commercial advantage, are often in a good position to emphasize interfaces and modularity that foster composability and interoperability in the artifacts that stem from their research. These attributes accelerate the impact of research, foster the absorption of research results into diverse sectors, and propel commercial investment by creating network effects. By virtue of the confluence phenomenon discussed above, these effects are not limited to IT companies but are amplified across many U.S. industry sectors.


3
Reflections on Resurgence

The path from discoveries and innovations in fundamental research to economic impact is rarely straightforward. Some trajectories illustrated in this report show research evolving unevenly through many steps, at different sites, involving collaborations from many sources, and achieving significant impact a decade or more after first efforts. Progress is rarely uniform—research activity may slow down if results lag expectations or if essential prerequisites are missing or slow to emerge. But research can also surge when a new idea or technique is found to be powerful, perhaps even applicable to several unrelated research activities under way.

This report calls this uneven, but eventually successful, pattern of research "resurgence." In resurgence, research is at first promising, but ongoing results or the rate of progress then fall behind expectations and can stagnate, attracting few new researchers and diminished funding; later, sometimes a decade or more later, new ideas or capabilities reinvigorate the field, and progress surges ahead to significant impact. Several important research directions in information technology (IT) have shown this resurgent behavior. And there is an important lesson from these examples: It is imperative to sustain research funding for important problems through discouraging periods so that new ideas can emerge and so that researchers can pull in new developments, often coming from other areas of IT or even from other fields, that reignite progress.

This section presents a few examples of recent resurgent research tracks in IT. Each has a unique story, but there are common themes underlying resurgence that illuminate catalytic effects that might apply to all IT research, at least at some point in time. The examples are described below (and documented in Appendix B).

  • Machine learning. Early artificial intelligence (AI) work in neural networks was later revived as the basis for deep learning—enabled by advances in computational power and an abundance of data drawn from a newly interconnected and digitized society of millions of people.
  • Formal methods. Techniques derived from mathematical logic, developed beginning in the 1960s, eventually became practical for improving software quality by identifying errors (bugs) or by proving that a computer program correctly follows its specification.
  • Virtual machines (VMs). 1960s-era industry work on VMs allowed a single mainframe computer to be shared among many users. Virtualization was repurposed by 1990s academic research to allow a single system to run multiple software environments, so that large data centers could flexibly allocate load among servers—the foundation for cloud computing.
  • Virtual reality (VR). Although the first research demonstrations of virtual reality occurred in the 1960s, work in this area proceeded slowly for decades, waiting for parallel advancements in headset displays, image-generation techniques, and methods to measure head positions and user input. Contributing to and benefiting from research in computer graphics, smartphones, accelerometers, and inertial navigation algorithms, VR today is a more than $7 billion worldwide market with applications in gaming, entertainment, automobiles, manufacturing, education, and defense.

Although each resurgence is unique, there are some effects that have contributed to more than one of the advances (and indeed are fundamental drivers of innovation in computing more broadly), including the following:

  • Computing power. In the past 10 years, there has been a surge in computing power available for research and for deployment of new techniques. Large data centers run by companies such as Google, Microsoft, and Amazon have enabled tremendous advances in machine learning. This computational horsepower makes it possible to use vast amounts of data, for example, to train models for better recognition of human speech. Likewise, techniques for proving mathematical statements used in formal methods have also been boosted by more computing.
  • Microelectronics revolution and Moore’s law. Many IT technologies, not just computing, are gated by microelectronic advances, including sensors such as accelerometers, Global Positioning System (GPS) receivers, wireless networking transceivers, network transmission electronics, optical fiber, and optoelectronics. Recent virtual reality headset designs have awaited small, light, bright, flat-screen displays, image sensors for precision position tracking, and many other microelectronic components.
  • Connectivity. Real-world research impact requires pervasive infrastructure, such as cloud computing, cellular telephony, and the Internet. For example, smartphones routinely perform speech recognition and phrase prediction by relying on the connection between the individual device and the global cloud, and they have enabled remote monitoring and management of disease outside the clinic in ways not previously possible.
  • Data. Developments in AI, especially machine learning, have been gated by the availability of training data. Information from billions of Internet users and widespread sensors—together with enormous advances in networking, database, and storage technologies—makes it possible to collect enormous amounts of data. Some training requires data that is labeled (e.g., speech with corresponding text labels); groups have formed to label and curate certain data sets.
  • New IT solutions to emerging problems. As collections of multiple computers became necessary to deliver massive-scale online services, new problems in networking and system management had to be solved. The resurgence of virtualization solved one major problem. To solve another, delivering video-streaming content to millions of viewers, network protocols and routers were optimized. And when cyberattacks on data centers started exploiting vulnerabilities in software, static analysis techniques developed by formal methods research were adapted and extended to find many of the bugs.

Although paths from fundamental research to impact inevitably encounter unexpected stalls, successes, and changes, many important results have emerged from dedicated research support and determined researchers. These advances are frequently gated by, and catalyzed by, changing economic conditions. Decreasing prices and increasing availability of IT components often lead to new progress, especially when launching a product or starting a company. Increased connectivity and adoption generate new data and new business opportunities. This virtuous cycle of new research insights, followed by bursts of research accomplishments, fueled by changes inside and outside of the IT ecosystem, and eventually leading to significant impact, is the story of resurgence.

Looking to the future, there are many promising research trajectories that are, or could become, temporarily stalled on their eventual path to impact. Beneath the "hype cycles" chronicled in the media, one must look to the set of external and internal conditions in the IT community that will propel research to break through. Without sustained support, these research trajectories may "die on the vine," lacking a dedicated community to act on new opportunities and insights.

VIRTUALIZATION

Historians say that history repeats itself. So too does technology. This recycling of ideas from bygone eras is perhaps most apparent in the story of virtualization. Born in the 1960s, virtualization provided an alternative to timesharing, in which all users (jobs) share the same software stack (e.g., operating system), and the computer takes turns running each user’s application. By contrast, virtualization creates the illusion that each job has its own machine. This allows a single machine to run concurrent jobs for different purposes without interfering with each other—for example, a batch-processing job, a transaction-processing job, a job making a backup of disk files, and a job running a new version of the batch-processing system that is being tested.
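The illusion just described can be sketched in a few lines of Python. This is a purely illustrative toy, not how real hypervisors work: actual systems virtualize the processor, memory, and devices, but the essential idea, one real machine time-slicing among guests that each see only their own private state, is the same.

```python
# Toy illustration of virtualization: a "hypervisor" time-slices one
# physical machine among guest jobs, each of which sees only its own
# private state, as if it had the machine to itself.

class Guest:
    def __init__(self, name):
        self.name = name
        self.state = {"steps": 0}   # the guest's private "machine" state

    def run_slice(self):
        # A guest mutates only its own state; it cannot observe or
        # interfere with other guests sharing the real machine.
        self.state["steps"] += 1

class Hypervisor:
    def __init__(self, guests):
        self.guests = guests

    def run(self, slices):
        # Round-robin scheduling of time slices on the one real machine.
        for i in range(slices):
            self.guests[i % len(self.guests)].run_slice()

batch = Guest("batch-processing")
txn = Guest("transaction-processing")
Hypervisor([batch, txn]).run(10)
print(batch.state["steps"], txn.state["steps"])  # prints "5 5"
```

The point of the sketch is the isolation: the batch job and the transaction job each progress on "their" machine without any shared, interfering state.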

Research

IBM introduced virtualization in the 1960s on its System/360 family of computers.1 Although IBM’s CP-67/CMS laid the foundation for the system software of a successful line of commercial offerings, it began as a research effort.2 The two key artifacts of this effort, the Virtual Machine Control Program (CP) and the Cambridge Monitor System (CMS), are the ancestors of today’s hypervisors and guest operating systems, respectively. System/360 and its successors, System/370, System/390, and zSeries, were all supported with VM systems such as VM/370 (introduced in 1972) and ultimately z/VM. Although this product line remained commercially successful for decades, research in virtualization had essentially vanished by the 1980s.

___________________

1 R.J. Adair, R.U. Bayles, L.W. Comeau, and R.J. Creasy, 1966, A Virtual Machine System for the 360/40, Cambridge Scientific Center Report 320-2007, IBM Corporation, Cambridge, MA.

2 R.A. Meyer and L.H. Seawright, 1970, A virtual machine time-sharing system, IBM Systems Journal 9(3): 199-218.


In 1997, Mendel Rosenblum’s group at Stanford University reintroduced VMs to solve an entirely new problem—allowing a multiprocessor (a computer containing several processors) to run multiple commodity operating systems.3 This effort was part of a research project funded by the National Science Foundation (NSF) and the Defense Advanced Research Projects Agency (DARPA) to seek ways to increase performance from multiprocessors. Typically, performance stopped improving after a dozen or so processors were sharing a single operating system, an impediment largely relieved by virtualization. Although the initial implementation was for the MIPS R10000 microprocessor, the group soon developed a VM monitor for the Intel x86, which was becoming the de facto hardware platform of choice. When multicore microprocessors (essentially multiprocessors-on-a-chip) were introduced in 2001, virtualization found even greater opportunities. The combination of the resurgence of VMs, the industry’s standardization around the x86 architecture, and the transition to multicore microprocessors paved the way for a resurgence in virtualization research, for research in networked system design and implementation, and for today’s cloud computing environments.

Just as the 1997 resurgence in virtualization addressed a problem quite different from the one for which VMs were originally designed, today’s use of virtualization differs from Rosenblum’s. Virtualization enables cloud computing—a cloud data center hosts thousands of independent VMs on a smaller number of real machines, each VM with a rich software stack tailored to its needs. Today’s cloud computing environments have replaced many locally run corporate computing environments. Rather than purchasing real machines, enterprises can lease a collection of VMs or even entire virtual data centers. Historically, setting up a data center in an enterprise was a protracted affair involving construction or repurposing of space in which to place the data center, quotes from multiple vendors, long waits for equipment delivery, and a lengthy installation process. Today, one can create a virtual data center on a cloud in a few minutes.

Applications

The ripple effect of cloud computing is enormous. First, it changes business economics; cloud resources are elastic. Enterprises can dynamically change the size of their computing infrastructure to match demand. If you are a seasonal business, you increase your allotment of VMs during your busy season and then shut them down during the slow season. If you do a monthly financial rollup, you need to allocate VMs for only a few hours after the end of the month. If you offer an online game, you

___________________

3 E. Bugnion, S. Devine, K. Govil, and M. Rosenblum, 1997, Disco: Running commodity operating systems on scalable multiprocessors, ACM Transactions on Computer Systems 15(4): 412-447.


can rent more VMs when more players enter your game and scale back when the load diminishes. If you are a start-up, you buy a few VMs to start and buy more as needed, with no long-term capital expenditures. Second, the cloud provider reaps and passes on many economies of scale in areas such as buying computers; purchasing electric power; building, outfitting, and operating data centers; testing and installing software; offering hot switchover to backup data centers; backing up data; and defending against cyberattacks. Third, virtualization changes personnel needs. Historically, if your business had significant IT demands, you had a team of IT professionals, such as systems administrators and computer operators. Cloud vendors either perform those functions for you or provide you tools such as dashboards and schedulers to command your cloud resources. Fourth, the cloud is accessible to an increasingly less technical user base. First-generation cloud providers gave users bare machines; next came machines configured with standard software packages; today, the cloud provides fully developed services as well as cloud-management tools. As the services provided become richer, users need less knowledge about the technology and more knowledge about applications leveraging the technology.
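The elastic sizing described in the first point above can be illustrated with a toy rule that picks a VM count from current demand. All names and capacity numbers here are hypothetical and are not drawn from any real cloud provider's API; the point is only that the leased fleet tracks load rather than peak provisioning.

```python
# Hypothetical sketch of cloud elasticity: size the leased VM fleet
# to current demand, within provider-imposed bounds.
import math

def vms_needed(requests_per_sec, capacity_per_vm=100, min_vms=1, max_vms=1000):
    """Return how many VMs to lease for the current load.

    capacity_per_vm is an illustrative throughput figure; real sizing
    would be measured per workload.
    """
    needed = math.ceil(requests_per_sec / capacity_per_vm)
    return max(min_vms, min(max_vms, needed))

# A seasonal business scales up for its busy season, then back down,
# paying only for what it uses.
print(vms_needed(50))      # slow season -> 1
print(vms_needed(12000))   # busy season -> 120
```

The same rule, run periodically against measured demand, is the core of the autoscaling behavior that replaces long-lead-time capital purchases.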

Impact

The economic impact of virtualization is clear. Cloud services had 2019 worldwide revenues estimated at $227 billion; annual growth for 2020 is estimated to be about 17 percent.4 All but the most specialized computational problems are moving to the cloud. This modern capability traces its roots to industry computing research in the 1960s that was resurrected by academic research decades later, creating vast returns today.

Outlook

New forms of and approaches to virtualization continue to be developed. Virtualization of computation has broadened to include new forms of packaging such as containers. Virtualization is also being used in new contexts. Software-defined networking, a logical offshoot of computing virtualization, has its origins in 1990s research and has more recently been commercialized, standardized, and widely deployed. Virtualization has also been applied more recently to sensor networks, where it is used to share and reuse sensors in a resource-efficient manner. Virtualization is a powerful idea that will surely continue to be recycled well into the future.

___________________

4 Gartner, 2019, “Gartner Forecasts Worldwide Public Cloud Revenue to Grow 17% in 2020,” https://www.gartner.com/en/newsroom/press-releases/2019-11-13-gartner-forecasts-worldwide-public-cloud-revenue-to-grow-17-percent-in-2020.


VIRTUAL ENVIRONMENTS

Virtual environments (VEs) present computer-generated renderings of three-dimensional (3D) virtual worlds on smartphone displays, computer displays, or immersive 3D displays such as headsets.5

  • VR presents VEs using immersive displays, replacing your perception of the physical world around you with virtual elements. The most prominent examples today are viewing 360-degree video, informal education, and games.
  • Augmented reality (AR) and heads-up displays (HUDs) present a virtual image overlaid on your perception of the real world. HUDs present computer-generated information without attempting to align the virtual elements with your view of the physical world. For example, a HUD that projects images of your speedometer on your car’s windshield adds important information to the real-world view through the windshield.
  • More challenging AR applications (sometimes dubbed mixed reality (MR) or extended reality (XR)) carefully align the virtual images with the real-world images so that the virtual elements appear to be part of the real world. The best example is the first-down yard line stripe overlaid on televised football images. More complex applications may visualize computer-modeled objects interacting with the real world, such as a (virtual) table lamp that appears to sit on a (real) coffee table.

Virtual environments and their variants are categorized as a resurgent research track because research originated in the 1960s and proceeded slowly for three or four decades, but then surged. On this “roller coaster,” progress intensified, start-up companies were formed to commercialize new inventions, and the technologies and products are now a major option for consumer computing, particularly for design and entertainment applications. The first VR visualizations did not use computer-generated imagery, but rather tiny cameras steered over a scale model, such as those used to train pilots to land airplanes. The first computer-generated imagery in VR was

___________________

5 General references for this section include the following: The website, “XinReality, Virtual Reality and Augmented Reality Wiki” at https://xinreality.com/wiki/Main_Page, last updated November 25, 2018; K. Marriott, F. Schreiber, T. Dwyer, K. Klein, N.H. Riche, T. Itoh, W. Stuerzlinger, and B.H. Thomas, eds., 2018, Immersive Analytics, Springer, https://www.springer.com/gp/book/9783030013875; and R. Azuma, 2019, The road to ubiquitous consumer augmented reality systems, Human Behavior and Emerging Technologies 1(1): 26-32, https://onlinelibrary.wiley.com/doi/full/10.1002/hbe2.113.


demonstrated by Ivan Sutherland in 1968 in a federally funded research project,6 but the computers and graphics displays were expensive, the headset heavy and uncomfortable, and the images were simply line drawings—suggestive but unpersuasive renderings of “reality.” Until far better computers, image generation, headset displays, and headset-position-measuring methods emerged, VR progress was slow.

But in the past two decades, VE technologies have advanced rapidly and seen significant commercial offerings. For at least the past 10 years, SIGGRAPH, the annual conference that showcases computer graphics technologies, has had dozens of VE/VR/AR demonstrations by researchers, start-ups, and large computer companies. These demos cover a wide range of applications, many of which have unique VR requirements and solutions. But the “holy grail” of VE, a headset as lightweight, effective, and socially acceptable as a pair of eyeglasses, is still an elusive challenge.

Today VE interaction is more than measuring your head position and generating realistic images. More senses can be stimulated, such as binaural audio, touch/feel (haptic feedback), and even smell. VE applications are driven by user inputs from devices such as joysticks, gaming controllers, instrumented gloves, and hand-held “pucks” that report position and orientation. Input devices vary widely depending on the application: flight trainers mock up a cockpit full of switches, knobs, and of course, the control stick.

Research

Today’s virtual environment systems depend not only on research that addressed limitations of the early systems but also on developments imported from research and development (R&D) pursued for goals other than VE. Major developments that have been adopted or adapted for VE include the following: (1) the AR headset for Sutherland’s 1968 system (developed for night helicopter operation); (2) algorithms, hardware, and software to generate real-time, shaded images of 3D models with hidden surfaces removed (commercialization driven by entertainment, computer-aided design (CAD), training, and scientific applications); (3) microelectromechanical systems devices such as gyroscopes and accelerometers (used in smartphones); (4) inertial navigation algorithms used to determine headset position (first developed for missile and aircraft navigation); (5) image-processing algorithms that track reference points to determine the position and orientation of the headset; (6) simultaneous localization and mapping to align measured positions with maps or models (pioneered in robotics); (7) lightweight, low-persistence displays; and (8) gesture recognition (developed primarily for touch-sensitive screens).

___________________

6 I.E. Sutherland, 1968, “A Head-Mounted Three-Dimensional Display,” pp. 757-764 in AFIPS Conference Proceedings, Vol. 33, Part I, doi:10.1145/1476589.1476686.


In addition, research was required to tackle problems specific to VE/VR/AR applications:

  • Reducing delay (latency) between head-position measurements and consequent changes in the displayed image. Predictive tracking algorithms estimate headset position and orientation slightly in advance, to mask some image-generation delay. Dozens of research contributions have addressed this problem specifically.
  • CAVEs, rooms with images projected on all walls, in which a user moves to examine a scene for a totally immersive experience. Only head position, not orientation, is needed to generate appropriate images. CAVEs have been used to treat phobias, such as acrophobia, by systematic desensitization: graduated exposure to a series of situations of increasing intensity.
  • Lightweight headsets using flat panel displays rather than CRTs. These are now standard in all commercial headset products.
  • Headset-position-measuring techniques that allow a large working volume, such as a room of a few hundred square feet. For example, the room’s ceiling contains an array of LED lights that are flashed on in sequence; the headset has several cameras to determine its bearing to whichever LED is on. Large working volumes can also be achieved with inertial navigation, together with techniques for canceling drift.
  • Optics that maximize field of view. Field of view is critical for achieving a sense of presence in an immersive VE. AR applications that require rapid situational awareness need peripheral vision, and thus a wide field of view.
  • High display frame rates and low persistence displays, important especially for rapid motion in games.
  • Development of input devices specifically for VE, such as a 3D mouse (“puck”), instrumented gloves, and body trackers (e.g., Microsoft Kinect, designed to sense the presence of multiple game players in a room).
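The predictive tracking in the first bullet above can be sketched as simple dead reckoning: extrapolate the most recent head-pose sample forward by the expected rendering latency, so the image generated now matches where the head will be when it is displayed. This is an illustrative Python fragment under a constant-angular-velocity assumption; production trackers use richer predictors (e.g., Kalman-filter-based).

```python
# Minimal sketch of predictive head tracking via dead reckoning.

def predict_yaw(yaw_deg, yaw_rate_deg_per_s, latency_s):
    """Predict head yaw `latency_s` seconds ahead, wrapped to [0, 360).

    yaw_deg:            last measured yaw angle (degrees)
    yaw_rate_deg_per_s: last measured angular velocity (degrees/second)
    latency_s:          expected motion-to-photon latency (seconds)
    """
    return (yaw_deg + yaw_rate_deg_per_s * latency_s) % 360.0

# Head turning at 90 deg/s with 20 ms of motion-to-photon latency:
# render for the predicted yaw (~11.8 degrees), not the measured 10.
print(predict_yaw(10.0, 90.0, 0.020))
```

The same extrapolation applies per axis of orientation and position; the shorter the latency being masked, the more accurate the constant-velocity assumption.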

Research in VE/VR/AR has been funded over a long history by NSF, DARPA, the Office of Naval Research, universities, and industry.
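The drift canceling noted in the list above pairs fast but drifting inertial integration with slow, drift-free absolute fixes (e.g., camera-tracked reference points). One common framing is a complementary filter; the sketch below is illustrative, with a made-up gain and gyro bias rather than values from any specific product.

```python
# Sketch of drift cancellation with a complementary filter: blend
# gyro integration (fast, drifts) with an absolute optical fix
# (slow, drift-free).

def fuse(angle_prev, gyro_rate, dt, optical_angle, k=0.98):
    """One filter step; k=0.98 is an illustrative tuning."""
    inertial = angle_prev + gyro_rate * dt   # integrate the gyro
    return k * inertial + (1.0 - k) * optical_angle

# A biased gyro (true rate 0, measured 0.5 deg/s) would drift 5 degrees
# over 1000 steps if integrated alone. Fused with the optical fix, the
# estimate stays bounded near the reference (~0.245 degrees here).
angle = 0.0
for _ in range(1000):
    angle = fuse(angle, gyro_rate=0.5, dt=0.01, optical_angle=0.0)
print(round(angle, 3))
```

The gyro term supplies low-latency responsiveness between optical fixes, while the optical term continuously pulls the estimate back toward truth, which is the essence of the inertial-plus-reference-point tracking described above.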

Applications

3D simulations are a limited form of VR but are characteristic of a wide range of computer applications that model or simulate objects that are visualized with 3D graphics, sometimes using VR techniques to let a viewer inspect the model more closely. For example, an architect might use CAD software on a laptop to display a “walk-through” of the design.

More ambitious VR applications use headsets to immerse the viewer in a VE; for example, making it easier to evaluate designs for high-value products such as automobiles or training operators of expensive or dangerous equipment (airplanes, fighter aircraft, oil tankers, spacecraft) or those carrying out dangerous procedures (surgery, refueling a nuclear reactor, landing on the Moon). VR is used in training medical staff to engage in conversations with each other and with patients about critical and sensitive issues (e.g., Shadow Health7). Some medical students are learning anatomical dissection using VR, an especially welcome option when the COVID-19 pandemic precludes laboratory dissection.8

Immersive visualization is also used to examine data sets by visualizing a graph, map, network, or other form suggested by the data. Today, the greatest consumer market for VR/AR is games. Since about 2010, the rapid increases in capability and decreases in price have led to the emergence of numerous products and games. Early games used fast 3D graphics and hand-held controllers with conventional displays that engage but do not immerse the user in the game. Recent offerings introduce headset designs that offer immersion and greater presence.

Some applications immerse multiple people in a virtual environment. Networked VEs are also used for nongame applications such as SIMNET, a pioneering example of networking military trainers—networked simulators of tanks and aircraft to train commanders to coordinate battlefield warfare. Distance collaboration—for example, a health care team consulting on a patient plan, or even collaborating during surgery—may immerse all participants in a VE relevant to the collaborators’ task.

Impact

The worldwide AR/VR market was estimated at $7.3 billion in 2018, of which $3 billion was in the United States.9 Although there are numerous small companies and consultancies offering products and services, a handful of large companies (e.g., Microsoft, Facebook, Google, Autodesk, Dassault Systèmes) drives the market and accounts for over half of market share. The market is projected to reach $16.3 billion by 2022.

___________________

7 See the Shadow Health website at https://www.shadowhealth.com, accessed July 1, 2020.

8 S. Montanari, 2020, “Body of Knowledge,” Slate, June 23, https://slate.com/technology/2020/06/med-school-cadaver-dissection-virtual-reality.html.

9 Fortune Business Insights, 2020, “Virtual Reality (VR) Market to Reach USD 57.55 Billion by 2027; Emphasis on Advanced Virtual Solutions by Eminent Companies to Support Growth,” press release, September 2, https://www.fortunebusinessinsights.com/press-release/virtual-reality-market-9265; Market Watch, “Virtual Reality Market 2019: Global Industry Size, Demand, Growth, Analysis, Share Revenue and Forecast 2026,” September 12, https://www.marketwatch.com/press-release/virtual-reality-market-2019-global-industry-size-demand-growth-analysis-share-revenue-and-forecast-2026-2019-09-12.


The market shows significant revenues in gaming and entertainment (40 percent), health care, education, automotive, aerospace and defense, and manufacturing. The price of headsets is a decreasing impediment—what cost $40,000 in 2013 is now less expensive than a smartphone; there are estimated to be 20 million headsets in the United States now.

Applications apart from gaming and entertainment are diverse—most are custom designed for specific needs, although they often involve common components. For example, Walmart is issuing 17,000 headsets to store employees for use in training on new technologies, compliance, and soft skills (empathy, customer service).10 Walmart used VR to train employees how to use its new Pickup Tower even before the towers were installed in its stores. The U.S. Army has contracted with Microsoft for $479 million to supply HoloLens headsets and software for its Integrated Visual Augmentation System (IVAS),11 which integrates a number of functions needed by soldiers. Some involve warfighting (night vision, target identification, viewing models of the threat environment derived from overhead surveillance, collaborating with other units on the battlefield); others are more general (training, maintenance). Soldiers in field tests interact daily with software developers to refine the system.

Outlook

The technologies in VR/AR are not mature. Research will continue to tackle unsolved problems and explore new opportunities; examples that are visible today are provided below (for a discussion of challenges and opportunities, see Welch et al.12).

  • Devising socially acceptable headsets and input devices. Will the AR “holy grail”—AR glasses and input devices you can wear all day in all activities—ever arrive?
  • Improving all the performance parameters of headsets—weight, power, field of view, working volume, and merging and aligning virtual and real images.
  • Eliminating sources of discomfort or fatigue in using VR systems—errors in head position sensing, jitter, image lag, proprioceptive errors (e.g., when motion detected by the eye differs from that detected by the inner ear).

___________________

10 J. Incao, 2018, “How VR is Transforming How We Train Associates,” Walmart, September 20, https://corporate.walmart.com/newsroom/innovation/20180920/how-vr-is-transforming-the-way-we-train-associates.

11 S. Sprigg, 2018, “US Army contract for supply of 2,500 ‘IVAS’ headsets,” Auganix.org, November 29, https://www.auganix.org/microsoft-awarded-usd-480-million-us-army-contract-for-supply-of-ivas-2550-headsets/.

12 G.F. Welch, G. Bruder, P. Squire, and R. Schubert, 2018, “Anticipating Widespread Augmented Reality: Insights from the 2018 AR Visioning Workshop,” https://stars.library.ucf.edu/ucfscholar/786/.

  • Simplifying the design and implementation of VR applications, especially the modeling of real-world scenes.
  • Understanding how to develop hardware and software that operate naturally with human users—in other words, the user-centered design of VE applications. For example, in some cases, showing “reality” may not be as effective as an alternative that emphasizes a particular message, such as a cartoon.13
  • Understanding when VR/AR-immersive experiences produce a significant impact in user experience (e.g., learning a new task) compared to less expensive methods.
  • Further exploring the role of haptics in creating a truly immersive experience.

The affordable hardware now available for VE applications creates demand for expanding, diverse work on applications—developing content for VE experiences. Even if a “killer app” emerges, a wide range of custom-designed applications will require custom software. A software ecosystem is likely to grow to support these developments with components such as user-interface frameworks, 3D model and scene creators, iconography for navigating virtual worlds, and so on. As was the case with the user interfaces dominant today for personal computers (dubbed WIMP: windows, icons, menus, pointer), a lot will be learned from experimentation, innovation, refinement, competition, and continuous improvement.

FORMAL METHODS

Formal methods are increasingly used to verify the correctness of computer hardware, software, and systems. This section focuses on their application to software correctness.

Although testing is widely used to identify errors in systems, it cannot verify the absence of errors. The time required to exhaustively test a system, trying every possible combination of inputs, results in a “combinatorial explosion” and a test time that may exceed the expected lifetime of the planet.
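A rough calculation makes the explosion concrete (the test-throughput figure below is an illustrative assumption, not a number from this report):

```python
# Rough illustration of combinatorial explosion in exhaustive testing.
# Assumption (illustrative): one billion test executions per second.
inputs_one_arg = 2 ** 64                 # possible values of one 64-bit input
tests_per_second = 10 ** 9
seconds_per_year = 365 * 24 * 3600

years_one_arg = inputs_one_arg / (tests_per_second * seconds_per_year)
years_two_args = inputs_one_arg ** 2 / (tests_per_second * seconds_per_year)

print(f"one 64-bit input:  {years_one_arg:,.0f} years")    # about 585 years
print(f"two 64-bit inputs: {years_two_args:.1e} years")    # about 1e22 years
```

Even one 64-bit argument takes centuries; a second argument multiplies, rather than adds, to the input space.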

___________________

13 J. Jerald, 2015, The VR Book, Morgan & Claypool Publishers, https://books.google.com/books/about/The_VR_Book.html.


The following are two principal uses of formal systems to check hardware and software designs:

  • Verification, which checks that an implementation correctly adheres to a specification. Although producing accurate specifications is difficult, they are usually much shorter, more intuitive, and more easily inspected than implementations. Formal systems are used both to help refine specifications and to verify implementations against them.
  • Static analysis, which applies formal systems to implementations to look for suspicious elements such as common programming mistakes, violations of rules of electrical design in hardware, and the like. These techniques examine the detailed structure of the implementation; they do not test the hardware or software. Because software tools for static analysis are much simpler, faster, and easier to use than those for verification, they are widely used to catch design errors.
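The structure-inspection idea behind static analysis fits in a few lines. The following illustrative Python sketch (the function name and the single check are this sketch's own, not any production tool's) walks a program's syntax tree without executing it and flags one classic mistake, a mutable default argument:

```python
import ast

def find_mutable_defaults(source):
    """Flag mutable default arguments without running the code."""
    warnings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            for default in node.args.defaults:
                if isinstance(default, (ast.List, ast.Dict, ast.Set)):
                    warnings.append(
                        f"line {default.lineno}: mutable default in '{node.name}'")
    return warnings

sample = """
def log(msg, history=[]):
    history.append(msg)
    return history
"""
print(find_mutable_defaults(sample))
```

Production analyzers apply hundreds of such pattern checks, plus deeper dataflow reasoning, over the same kind of program representation.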

Verification requires a formal specification that captures the intended behavior of the system being examined; a rigorous process, often requiring construction of mathematical proofs, is then used to establish correctness of an implementation. Although specifications are usually shorter and simpler than the hardware circuits and software code that implement them, it is a challenge to get them right and to express them in a formal way that permits verification.

The point of verification is to detect inconsistencies between the millions of implementation details in a typical piece of hardware or software and the comparatively simple specification. But to bridge the gap between implementation and specification, a developer may need to construct hierarchies of ever-more-detailed specifications and verify each level of the hierarchy, each depending on verification of more detailed levels. Verification can thus become a huge effort, even with automated tools that help construct and check specifications and proofs. In practice, verification effort is applied where the risk is greatest—for example, to ensure safety or security. It is not a panacea: Despite formal verification, computer systems may not operate as intended. If a system’s requirements are incomplete, or if they are interpreted incorrectly by the engineer writing specifications for verification, or if they cannot express important requirements (e.g., that the airplane does not crash), the result may be a failure.
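The relationship between a short specification and a detailed implementation can be made concrete with a bounded, exhaustive check, far weaker than proof-based verification but the same in spirit. In this illustrative Python sketch (all names are the sketch's own), a two-part specification of sorting is checked against an insertion-sort implementation over every small input:

```python
from itertools import product

# Specification: the output is ordered and is a rearrangement of the input.
def is_ordered(xs):
    return all(a <= b for a, b in zip(xs, xs[1:]))

def same_multiset(xs, ys):
    return sorted(xs) == sorted(ys)

# Implementation: many more details than the two-line specification above.
def insertion_sort(xs):
    out = []
    for x in xs:
        i = 0
        while i < len(out) and out[i] <= x:
            i += 1
        out.insert(i, x)
    return out

# Bounded exhaustive check: every list of length <= 4 over values 0..3.
cases = 0
for n in range(5):
    for inp in product(range(4), repeat=n):
        out = insertion_sort(list(inp))
        assert is_ordered(out) and same_multiset(out, inp)
        cases += 1
print(f"{cases} cases checked")
```

True verification replaces the bounded enumeration with a proof that covers all inputs at once.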


Research

Formal analysis techniques invented in the 1960s were theoretically sound but impractical: They lacked automated tools and had to be carried out by hand. Through several decades of modest progress, research continued as the growth in complexity of hardware and software made correct and trustworthy computing an ever more important goal. As of 1990, however, verification was still a research topic and not much used in practice.

The most recent two decades, by contrast, have seen vastly increased R&D on a variety of formal analysis techniques and on software to automate their application, and a corresponding increase in the scale and complexity of systems that have been verified. During that same time, the need for formal methods expanded. Concurrency effects in emerging multiprocessors, multi-core microprocessors, and multi-thread programming presented new challenges to verification. Model checking, a formal technique often used to verify correctness of communication protocols used on computer buses or networks, helped address these needs. Amazon Web Services uses formal methods to check configurations of cloud computing components—for example, to ensure that one customer cannot access another customer’s data or interfere with their service. Configuration checking is essential to comply with mandates that secure financial transactions (e.g., Payment Card Industry (PCI) standards).14
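The core idea of model checking, exhaustively exploring every reachable state of a finite model and checking a property in each, can be sketched briefly. The toy model below is entirely this sketch's own construction: two processes sharing a lock acquired by an atomic test-and-set, with mutual exclusion as the checked property:

```python
from collections import deque

# Toy explicit-state model checker (illustrative; model is this sketch's own).
# A state is ((pc0, pc1), lock): each process is "idle", "want", or "cs".
def successors(state):
    pcs, lock = state
    for i in (0, 1):
        if pcs[i] == "idle":
            yield (pcs[:i] + ("want",) + pcs[i+1:], lock)
        elif pcs[i] == "want" and not lock:      # atomic test-and-set
            yield (pcs[:i] + ("cs",) + pcs[i+1:], True)
        elif pcs[i] == "cs":                     # leave and release the lock
            yield (pcs[:i] + ("idle",) + pcs[i+1:], False)

init = (("idle", "idle"), False)
seen, frontier = {init}, deque([init])
while frontier:                                  # breadth-first exploration
    state = frontier.popleft()
    # Safety property: both processes are never in the critical section.
    assert state[0] != ("cs", "cs"), f"mutual exclusion violated in {state}"
    for nxt in successors(state):
        if nxt not in seen:
            seen.add(nxt)
            frontier.append(nxt)
print(f"explored {len(seen)} states; mutual exclusion holds")
```

Real model checkers handle astronomically larger state spaces, often by encoding states symbolically rather than enumerating them one by one.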

Although 20 years ago formal methods research was done almost entirely in academia, several major companies now have groups that do fundamental research and also apply formal systems in their business. In addition to the commercial hardware-analysis tools offered by CAD companies, there are now commercial offerings for analyzing and verifying software.

Although there have been many developments in the field and many contributors to progress, there are three important themes that have galvanized a resurgence of interest in and use of formal methods:

  • SAT and SMT solvers. High-performance solvers for SAT (satisfiability) and SMT (satisfiability modulo theories) have made many different kinds of verification more practical.
  • Analysis and verification tools. Software tools, mostly open source, that more smoothly integrate verification into conventional software engineering workflows.
  • Verified artifacts. Successful verification of bigger systems, which has led to techniques for designing the systems and their specifications to simplify verification.

___________________

14 J. Kagan, ed., and T.J. Catalano, 2020, “PCI Compliance,” Investopedia, https://www.investopedia.com/terms/p/pci-compliance.asp.

FIGURE 3.1 Ten years of competition performance improvements in which more problems are solved in less time. SOURCES: Courtesy of Matti Järvisalo, University of Helsinki, Daniel Le Berre, Université d’Artois, Olivier Roussel, Université d’Artois, and Laurent Simon, University of Bordeaux. See M. Järvisalo, D. Le Berre, O. Roussel, and L. Simon, 2012, The International SAT Solver Competitions, AI Magazine 33(1): 89-92, https://doi.org/10.1609/aimag.v33i1.2395.

SAT and SMT Solvers

At the heart of much verification are “solvers” that search for solutions to a system of logic equations. Although the basic Davis-Putnam-Logemann-Loveland (DPLL) search algorithm,15 developed in 1962, remains the basis of many of today’s solvers, many incremental refinements have been developed, often by observing patterns that arise in practical verification problems. Twenty years of “SAT competitions” have motivated and demonstrated dramatic performance improvements: Competing software must be open sourced and described at a conference accompanying the

___________________

15 M. Davis, G. Logemann, and D. Loveland, 1962, A machine program for theorem-proving, Communications of the ACM 5(7): 394-397.


competition. Figure 3.1 shows 10 years of competition performance improvements—more problems solved in less time.

There are dozens of SMT solvers, most open source, catering to different theories and styles of programming, using different techniques to accelerate search. As with SAT, there is an SMT competition.
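The core DPLL loop, unit propagation plus backtracking search, is compact enough to sketch. The rendering below is illustrative and unoptimized (modern solvers add clause learning, branching heuristics, and careful data structures). Clauses are written as sets of integers, with k meaning "variable k is true" and -k its negation:

```python
# Illustrative DPLL sketch (not a competitive solver): a formula in
# conjunctive normal form is a list of clauses, each a set of literals.
def dpll(clauses, assignment=()):
    clauses = [set(c) for c in clauses]
    # Unit propagation: a one-literal clause forces that literal.
    while True:
        unit = next((next(iter(c)) for c in clauses if len(c) == 1), None)
        if unit is None:
            break
        assignment += (unit,)
        clauses = [c - {-unit} for c in clauses if unit not in c]
    if any(len(c) == 0 for c in clauses):
        return None                      # conflict: this branch is unsatisfiable
    if not clauses:
        return assignment                # every clause satisfied
    lit = next(iter(clauses[0]))         # branch on some remaining literal
    for choice in (lit, -lit):
        result = dpll(clauses + [{choice}], assignment)
        if result is not None:
            return result
    return None

# (x1 or x2) and (not x1 or x2) and (not x2 or x3)
model = dpll([{1, 2}, {-1, 2}, {-2, 3}])
print("satisfying assignment:", sorted(model))
```

Branching by appending a one-literal clause lets unit propagation do all the work of applying each decision; refinements such as conflict-driven clause learning record why branches fail so the same dead end is never revisited.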

Analysis and Verification Tools

In conjunction with advances in solvers, researchers have developed languages for expressing specifications and tools that convert specifications and program code into formulas to pass to a solver. In some cases, a programming language and specification language are developed together (e.g., Dafny16). For established languages, a separate specification language is needed (e.g., JML for Java17), but usually the specification can appear as annotations embedded in the implementation. Tools process specifications and code to prepare equations for one or more SMT solvers; Microsoft’s Z3 is one such open-source solver in wide use today.18
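The flavor of specifications embedded as annotations can be imitated in plain Python with contract-style assertions (a hand-rolled illustration only; tools such as Dafny and JML-based checkers establish these conditions statically, for all inputs, rather than checking them at run time):

```python
def binary_search(xs, target):
    # Precondition, stated alongside the implementation it governs:
    assert all(a <= b for a, b in zip(xs, xs[1:])), "input must be sorted"
    lo, hi = 0, len(xs)
    while lo < hi:
        # Loop invariant: if target is present, it lies within xs[lo:hi].
        mid = (lo + hi) // 2
        if xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    found = lo < len(xs) and xs[lo] == target
    # Postcondition: the result agrees with simple membership.
    assert found == (target in xs)
    return lo if found else -1

print(binary_search([1, 3, 5, 7], 5))   # -> 2
```

A verifier consumes the same precondition, invariant, and postcondition as formulas for an SMT solver and proves they hold on every execution, after which the run-time checks become unnecessary.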

Analysis tools inspect software to detect common coding errors, such as “memory leaks” (allocating memory that is never de-allocated), use of a “null” pointer, and others. These “static analysis” tools require neither specifications nor running the code; they check for a few classes of mistakes that can occur many times in large systems. An exemplar of these tools is Infer, an open-source tool from Facebook that checks for errors in Java, C, C++, and Objective-C. The Facebook team and technology trace back to early research funded by the National Security Agency (NSA) to detect errors in hardware-description-language coding for cryptographic chips.

To support software development, it is common to integrate some of these tools into the development workflow, so that analysis is performed whenever specifications or implementations are modified. Although analysis often reports many false warnings, new approaches such as those of Muse.Dev19 filter the output to show an engineer only those warnings caused by the most recent modification and thus encourage repairs before moving on to other tasks.

___________________

16 Microsoft Research, “Dafny,” https://rise4fun.com/Dafny, accessed July 1, 2020.

17 University of Central Florida, “The Java Modeling Language (JML),” http://www.eecs.ucf.edu/~leavens/JML//index.shtml, accessed July 1, 2020.

18 K.R.M. Leino, 2016, “Dafny: An Automatic Program Verifier for Functional Correctness,” Microsoft Research, https://www.microsoft.com/en-us/research/wp-content/uploads/2016/12/krml203.pdf.

19 See the muse.dev website at https://muse.dev, a spinoff of Galois, Inc.


Verified Artifacts

A growing number of practical software artifacts have been verified to be functionally correct. A few prominent examples include the following:

  • CompCert: a C compiler,20
  • CertiKOS: a concurrent kernel framework,21
  • s2n: an implementation of the TLS/SSL secure communications protocols (see Box 3.2),
  • seL4: a micro-kernel22 designed to support safety-critical applications; the autonomous helicopter mentioned below uses seL4 for its “mission computer.”

Today, computer chip design routinely uses formal techniques, supported by commercial CAD tools. Many open-source software tools are available, as are commercial tools and services for software verification. There are numerous demonstrations and some production uses of verification for complex software systems. An impressive example of applying formal methods to a cyber-physical system is the following: An autonomous helicopter was successfully flown in 2017 controlled by a verified operating system (seL4 “microkernel”) that resisted continuous cyberattack.23 Although seL4 is only about 10,000 lines of C code, it has complex verification conditions because it must enforce stringent isolation and security requirements of an operating system.

Research Advances and Sources of Support

Using logic to reason about computer programs stems from the work of Sir Charles Antony Richard (Tony) Hoare (Oxford University) and Robert W. Floyd (Stanford University) in the late 1960s. Academic research has continued ever since, usually in small teams collaborating with colleagues at other institutions; the research community is international. Progress, especially during the resurgent period of the last 15 years, has been characterized by many incremental steps that are aggregated into ever more powerful techniques and tools. Examples include the following:

___________________

20 X. Leroy, 2009, Formal verification of a realistic compiler, Communications of the ACM 52(7), https://dl.acm.org/doi/10.1145/1538788.1538814.

21 R. Gu, Z. Shao, and H. Chen, 2016, “CertiKOS: An Extensible Architecture for Building Certified Concurrent OS Kernels,” pp. 653-669 in OSDI’16: Proceedings of the 12th USENIX conference on Operating Systems Design and Implementation, https://dl.acm.org/doi/proceedings/10.5555/3026877.

22 See the seL4 Microkernel website at http://sel4.systems, accessed July 1, 2020.

23 G. Klein, J. Andronick, M. Fernandez, I. Kuz, T. Murray, and G. Heiser, 2018, Formally verified software in the real world, Communications of the ACM 61(10): 68-77.

  • The DPLL (Davis, Putnam, Logemann, Loveland) satisfiability search procedure;
  • Applying SAT solvers to (symbolic) model checking;
  • SAT evolutions such as “conflict-driven clause learning”;
  • Combining theory decision procedures with SAT search to yield SMT solvers;
  • Frameworks for proof checking: CVC3, CVC4, LFSC;
  • Formulating more theories for SMT solvers: strings, floating point, bit vectors, regular expressions;
  • SMT extensions to higher-order logic;
  • Competitions to spur advances in SAT and SMT; and
  • A universal commitment to building open-source tools.

Fundamental research has also been sponsored by the computer industry, especially in the wake of the Intel floating-point divide (FDIV) bug in 1994 (Box 3.1). Industrial consortia such as the Semiconductor Research Corporation supported university research, especially in model checking for hardware verification. Industrial laboratories also performed fundamental research: SMT solvers trace back to a procedure developed by Greg Nelson and Derek C. Oppen at Digital Equipment Corporation in 1980.24 Z3, a solver from Microsoft Research, has had wide impact, both as research and as an open-source tool.25 Today, Microsoft, Amazon Web Services,

___________________

24 G. Nelson and D.C. Oppen, 1980, Fast decision procedures based on congruence closure, Journal of the ACM 27(2): 356-364, https://doi.org/10.1145/322186.322198.

25 L. de Moura and N. Bjørner, 2008, “Z3: An Efficient SMT Solver,” Lecture Notes in Computer Science, 4963:337-340, https://link.springer.com/content/pdf/10.1007/978-3-540-78800-3_24.pdf.


Facebook, Google, and others have teams that apply formal systems but also launch or contribute to open-source tools that are widely shared.

Several agencies of the U.S. government have sponsored sustained work on formal systems. NSF has had continuous programs to support academic work in the theory of computation, as well as methods to mechanize and apply techniques to practical problems. NSA and the Air Force Office of Scientific Research were early and adventurous supporters; NSA was an early customer of Galois, Inc. DARPA has supported research in both academia and industry and has undertaken several substantial programs focused on formal systems, including the Cyber Grand Challenge (2015) and High Assurance Cyber Military Systems.26

Impact

The greatest economic impact of formal systems comes not from a market for tools and services, but rather from the economic benefits that are attributable to the use of the tools. Static analysis tools are widely used to eliminate bugs from high-value production software, such as systems run by cloud vendors, financial services companies, communications service providers, widely used software libraries (Box 3.2), and the like.

The economic value to the United States of labor saved by static analysis tools is on the order of $500 million per year. The savings are estimated with the following assumptions: 600,000 software engineers in the United States with $75,000 salaries; half of the engineers are programmers who spend 20 percent of their time debugging; and 10 percent of their debugging time is avoided because bugs have been eliminated by static analysis during coding. A certain amount of static checking is standard during software development, but this estimate would apply if tools like Infer were widely deployed.
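The arithmetic behind the estimate, restated directly from the assumptions above:

```python
# Reproducing the report's savings estimate from its stated assumptions.
engineers = 600_000          # U.S. software engineers
salary = 75_000              # annual salary (dollars)
programmer_share = 0.5       # half of the engineers are programmers
debug_share = 0.2            # fraction of programmer time spent debugging
avoided_share = 0.1          # debugging avoided thanks to static analysis

savings = engineers * programmer_share * salary * debug_share * avoided_share
print(f"${savings / 1e6:.0f} million per year")   # -> $450 million per year
```

The product comes to $450 million annually, which the report rounds to "on the order of $500 million."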

Major information-technology companies are coming to appreciate the power of formal analysis to prevent attacks and losses. As an example, a number of Amazon Web Services features use these technologies, including Amazon Inspector, S3 Block Public Access, IAM Access Analyzer, and Amazon CodeGuru. In addition, well-publicized internal projects such as a proof of the correctness of the key management store and the boot loader use these techniques.27 Microsoft Research pursues major projects in formal systems, including the Z3 SMT solver. Smaller

___________________

26 K. Fisher, J. Launchbury, and R. Richards, 2017, The HACMS program: Using formal methods to eliminate exploitable bugs, Philosophical Transactions, Series A, Mathematical, Physical, and Engineering Sciences 375(2104): 20150401, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5597724/.

27 Amazon Web Services, “Provable Security,” https://aws.amazon.com/security/provable-security, accessed July 1, 2020.


companies offer products and services for formal analysis and verification (Galois and its spin-off, Muse.dev).

Today, there is no large market for formal systems themselves, but the technologies are arguably essential enablers of huge markets in trustworthy computing, especially safety-critical embedded computing systems in cars, trucks, trains, airplanes, and electric power plants. These systems must not only operate correctly; their missions require that they communicate via untrusted networks with other computers that may also be untrusted. Formal systems are already helping reduce the “attack surface” of these systems.

Outlook

Verification is far from a “solved” problem. It is often limited by the completeness of the specifications that can be verified; automated tools are limited in the complexity of the checking they can do; specification writing requires specially skilled engineers, who are in short supply; available verification tools do not always mesh well with software development and maintenance processes; and there is only modest commercial support for verification. While the labor required for verification often makes its cost uneconomical, static analysis tools are effective deterrents to the most common software bugs and continue to spread. And the need for verification continues to grow as systems vital to safety and commerce proliferate.
