government, and with other countries, since even a company as big as GE was unable to move rapidly enough on its own.
But in many cases the old models remained in use in the United States—within the government, industry, universities, and the National Laboratories—and they were not moving fast enough. Other countries were likewise wrestling with how to speed up the innovation process, especially in areas such as energy, job creation, health care, and the environment, where national needs were not being met with sufficient speed, in part because the models are too slow.
It was to consider these questions that the symposium was convened, Dr. Edelheit said, and the current panel would offer three speakers with interesting perspectives on the issue of partnerships, or the lack thereof, among government, industry, and universities in the United States. The first, Ken Flamm, would talk about the case of supercomputers, a technology that was clearly of critical importance to a great number of industries.
University of Texas at Austin
Dr. Flamm, thanking Dr. Wessner for the invitation to speak, said he would use his time to “tell a story” about the supercomputer industry. Some of the material he would present had already been published in an earlier version, having formed the basis for the recommendations of a 2004 report on the future of supercomputing by a National Academies panel on which he served.14
He would begin with the field’s early history, he said, in order to make sure that his listeners understood what is meant by the term “supercomputer” and where the supercomputer came from. The machines built to decrypt code traffic during World War II were the direct precursors of the modern electronic digital computer, and the famous ENIAC machine, built for other purposes, appeared shortly thereafter.
At the computer industry’s very beginning in the 1940s and 1950s, all computers were essentially supercomputers—every new model being the “biggest,