a viable exabyte computer can be built without major technology innovations. The innovations will need to include new architectures, new approaches to software, and new hardware technologies, including advanced accelerators, 3-D stacked chips for more energy-efficient computing, novel optical interconnect technologies to optimize data transfer and communication, and new high-performance storage systems based on next-generation tape systems and novel phase-change memory (IBM, 2012).
Finally, mimicking human intelligence in today’s computing environment is recognized as inefficient. The IBM Watson computer is arguably one of the most advanced computing systems that specializes in natural human language and provides specific answers to complex questions at high speeds (Ferrucci, 2012). Specifically, the system “understands” questions posed in natural language, finds information in relevant sources, determines the confidence of different options, and responds with factual answers. Watson applies technologies from different fields, such as machine learning, natural-language processing, information retrieval, knowledge representation, and hypothesis generation. In February 2011, the Watson computer won the quiz show Jeopardy! against two all-time champions. But the cost of such an achievement is high: the system consists of 90 Linux servers, 2,880 Power7 processor cores (3.55 GHz), and almost 16 terabytes (16,000 gigabytes) of random-access memory. The computer fills four racks and weighs more than 10,000 pounds, needs 25 tons of cooling equipment, and consumes around 100 kW of electricity (the consumption of 40–50 households).
Although the Watson computer is one of the most impressive instances of artificial intelligence and cognitive information processing, it is far from matching the capabilities of the human brain, which by comparison uses only a few watts of electricity, weighs around 3 pounds, and fits into the palms of two hands. Specifically, given today’s technology trajectory, which improves the power performance of computing systems by 25–30 percent per year, the Watson computer would still consume about 1,000 W in 15 years, far above the power consumption of a human brain (apart from the fact that the human brain can do much more than the Watson computer).
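The projection above can be checked with a back-of-the-envelope calculation. The only inputs are figures from the text (a starting draw of roughly 100 kW and a 25–30 percent annual improvement in power performance); the code simply compounds that improvement over 15 years.

```python
# Back-of-the-envelope check of the 15-year projection in the text.
# Assumptions (both taken from the passage): Watson draws ~100 kW today,
# and the power needed for the same work shrinks 25-30 percent per year.

START_WATTS = 100_000  # ~100 kW
YEARS = 15

for annual_improvement in (0.25, 0.30):
    projected = START_WATTS * (1 - annual_improvement) ** YEARS
    print(f"{annual_improvement:.0%}/yr -> {projected:,.0f} W after {YEARS} years")
```

Compounding 25 percent yields roughly 1,300 W and 30 percent roughly 500 W, which brackets the text’s figure of about 1,000 W, still three orders of magnitude above the few watts a human brain consumes.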
It is clear from the three challenges discussed above that innovations in computing infrastructure are required if important advances are to be made. To provide context, the committee describes the current computational structure briefly below and then discusses needed changes.
The fundamental reason that the human brain is far superior to any of today’s computers is that the brain operates in a completely different manner from a conventional computer.
Conventional computers are commonly referred to as von Neumann machines; they were developed over 40 years ago for the applications and programming problems that were relevant then. A distinct feature of von Neumann machines is that they share a single bus between the central processing unit (CPU) and memory. Information cannot be exchanged back and forth between the CPU and the memory at the same time, and this makes the bus a bottleneck, commonly referred to as the von Neumann bottleneck (Backus, 1978). Before an instruction or word can be sent over the bus, its address must first be sent over that same bus. The von Neumann architecture results in an instruction-at-a-time processing approach, substantially unlike that of the human brain, which uses a massively parallel system of slower processing units (neurons) that are connected by weights.2 The weights are called synapses; each neuron has about 10³ synapses. The strengths of the synapses are constantly modified by the learning experience and memory. A
2It should be noted that the comparison between the human brain and the von Neumann architecture is recognized as limited in applicability; other characteristics of the human brain slow its responses in some circumstances, such as in socially constrained environments.
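The weight-based, parallel processing described above can be sketched minimally in code. This is an illustrative toy, not the brain and not any real neuromorphic system: only the figure of roughly 10³ synapses per neuron comes from the text, while the threshold activation and the Hebbian-style weight update are standard textbook simplifications introduced here as assumptions.

```python
# Minimal sketch of a single artificial "neuron": it sums ~10^3 weighted
# inputs at once, in the spirit of the massively parallel, weight-based
# processing contrasted with instruction-at-a-time execution in the text.
import random

N_SYNAPSES = 1_000  # the text's figure of ~10^3 synapses per neuron

random.seed(0)  # deterministic toy data
weights = [random.uniform(-1.0, 1.0) for _ in range(N_SYNAPSES)]
inputs = [random.choice((0.0, 1.0)) for _ in range(N_SYNAPSES)]

# The neuron "fires" if the weighted sum of its inputs crosses a threshold.
activation = sum(w * x for w, x in zip(weights, inputs))
fired = activation > 0.0

# A crude stand-in for synaptic plasticity (an assumption, Hebbian-style):
# strengthen the weights of inputs that were active when the neuron fired.
if fired:
    weights = [w + 0.01 * x for w, x in zip(weights, inputs)]

print(f"activation={activation:.2f}, fired={fired}")
```

The contrast with the von Neumann model is that here a single conceptual step consumes all 10³ inputs together, and the "program" lives in the continually adjusted weights rather than in a sequence of fetched instructions.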