World War II was ultimately won by disruptive advances in technology—the first electronic digital computers, radar, and nuclear weapons, among others. For the first time, the entire scientific enterprise in the United States—in universities, in industry, in research laboratories—was mobilized and harnessed to the war effort, developing new technologies for military use. What had been episodic government support for the development of technologies critical to defense was transformed into a broad and sustained commitment. The wartime compact between American government and industry, to collaborate in developing new technology to serve the national defense, was sustained into the Cold War that followed.
In the United States, many of the great research universities that were to become the backbone of the U.S. innovation system had developed in part with subsidies from the Federal government, in the form of grants of land to the states. One explicit mission given to the land-grant colleges was to advance the useful technical arts, and by the early twentieth century many land-grant schools—MIT, for example—had established important outreach programs connecting their faculty and students to industry. The role of the military in supporting technological development useful for defense had already been well established, but advances in medical technology had also played an important role during the Second World War, and after the war a large-scale program of research grants to universities was established.