THE COMPETITIVENESS EQUATION: THE RESEARCH ENTERPRISE
It will be necessary to have more and better scientists and engineers, but that alone will not be sufficient to ensure America’s ability to compete in the 21st century. Funds must be available to underwrite the efforts of scientists and engineers who decide to pursue careers seeking the new knowledge that in turn creates new jobs. The funds must provide for modern laboratories and instrumentation and must support the conduct of research itself. As President Bush observed, “It’s research that will keep the United States on the cutting edge.”
Although the research establishment in America remains extremely productive, there are ample warning signs when one considers the future. For example:
In 2004, federal funding of research in the physical sciences as a fraction of GDP was 54% less than in 1970. In engineering, it was 51% less.
By the end of 2007, China and India will account for 31% of the global R&D staff, up from 19% as recently as 2004.
The share of US post-doctoral scientists and engineers who are temporary residents has grown from 37% to 59% in two decades.
In 2005, only four American companies were among the top 10 in receiving US patents.
The National Intelligence Council reports that in 2003 “foreigners contributed 37 percent of the research papers in Science, 55 percent in the Journal of Biological Chemistry, and 71 percent in the journals of the American Physical Society.”
For the first time, the world’s most powerful particle accelerator does not reside in the United States; this virtually ensures that the next round of breakthroughs in this fundamental discipline will originate abroad.
In the recent ranking by the Organisation for Economic Co-operation and Development (OECD), the United States is in 22nd place in the fraction of GDP devoted to nondefense research.
Federal annual investment in research in the physical sciences, mathematics, and engineering combined is equal to the increase in US health care costs experienced every 6 weeks.
The National Science Foundation (NSF) has indicated that it can now fund only one in five research proposals that it receives, the vast majority of which are deemed meritorious by peer reviewers. As funds have become scarcer, peer reviewers have been less inclined to allocate grants to younger researchers as opposed to more senior researchers with “safe” track records, even though history shows convincingly that the most significant scientific advances have been attributable disproportionately to younger researchers pursuing cutting-edge, high-risk science. The median age for first grants to individual researchers by the National Institutes of Health (NIH) has recently reached 42 years, and it should not go unnoticed that even greater risk aversion has evolved among many of those who fund research on behalf of US industry.
In contrast with the deteriorating situation in the physical sciences and engineering, America has in recent years made a substantial investment in the biologic sciences, doubling the federal budget for health-related research at NIH over a 5-year period. The impact of the increase in investment in understanding the causes and cures of diseases has been remarkable. However, this gain was eroded as inflation ate away at flat or even declining budgets in the years that followed the buildup—a trend that was reversed once again in the current year’s federal budget. It is, of course, of the utmost importance that increases in the funding of the physical sciences are not accomplished at the expense of investment in the biologic sciences. It is noteworthy in this regard that advances in the biologic sciences, the physical sciences, and engineering have often been highly interdependent, and they are increasingly so. For example, it is said that the human genome could not have been sequenced without the benefit of progress in computers and robotics, and modern medical imaging would not have been possible without advances in computers and mathematics. Correspondingly, promising new fuels could not have become serious candidate energy sources without accomplishments in biology and agriculture.
One might argue that investment in research should be the province of industry because industry is often a principal beneficiary of research and its direct descendant, innovation. In fact, during the past 40 years, as the fraction of the nation’s R&D spending provided by the government steadily declined from two-thirds to one-third, industry made up the difference, increasing its share of the total investment from one-third to two-thirds. Significantly, however, the composition of industry’s effort changed markedly during this period: development, not research, became industry’s priority. Although overall federal investment in research in constant dollars has been increasing, the growth has almost entirely been focused on the life sciences.
There are several reasons for industry’s perhaps counterintuitive behavior. First, there is the inherent possibility that an investment in research may produce no new knowledge at all; research is a risky business. One study in information technology concluded that only one new research “idea” in 500,000 results in a commercially profitable product. Furthermore, even when an effort is successful, there may be uncertainty as to the applicability of a particular research project to a firm’s own competence and business interests. For example, while working in the composites laboratory of an aerospace company—the same firm at which I was later employed—Howard Head conceived the idea for the skis and tennis racquets produced by the firm that now bears his name. The return to society as a whole from investment in research often far exceeds the rewards to the corporate underwriter or performer of an individual piece of research.
In addition to the implicit riskiness and uncertain applicability of investment in basic research, there is always the matter of its long-term nature, not uncommonly involving a decade or more of effort before results can be introduced into the marketplace. That constitutes a significant deterrent to investment by industry, which tends to have a “next-quarter” focus.
One might ask, Isn’t that short-sighted? The answer, of course, is yes; it is very short-sighted. But before condemning industry, consider the following incident that occurred a few years ago at the company where I was employed. Motivated by an unusually large stable of highly promising research opportunities, the company’s management conducted a briefing for Wall Street analysts to inform them of a planned increase in investment in research and the promise this would offer for the company’s future growth and profitability. At the end of the briefing by the company’s president, most members of the audience ran from the room and sold the firm’s stock. The company’s share price dropped by 11% during the next few days, then gradually declined for nearly 2 years before the tide could be stemmed. Shortly after what became known in the company’s research laboratories and executive suite as the debacle on Wall Street, I asked one of the attendees at the briefing what had been said that was wrong. The analyst impatiently responded, “You should know that it takes 10 or 15 years for research to pay off … if it does at all. Your average shareholder owns your stock for about 18 months, doesn’t care what happens to you 10 or 15 years from now, and certainly doesn’t want to pay for it. In fact, by that time the investor will probably own one of your competitors’ shares and would be just as happy if your firm were not competitive.” The analyst then administered the coup de grâce, explaining, “Our firm does not invest in companies with such short-sighted management.”
Is that one example of excessive focus on short-term profits at the expense of long-term substantive gains in the provision of goods and services perhaps simply an anomaly? For insight into that question, consider a result of a survey by the National Bureau of Economic Research: 80% of the senior financial executives questioned said they would be willing to forgo funding R&D in order to meet their public projections of near-term profitability. Then consider that the outstanding value of derivative contracts worldwide recently reached 8 times the value of all the homes and land in the United States and more than 5 times the combined yearly output of all the world’s nations. Patience does not seem to rank highly among the attributes of today’s investors, nor does making money “the old-fashioned way.”
Margaret Thatcher eloquently summarized the significance, as well as complexities, of basic research in her remarks on the overall topic of innovation:
Although basic science can have colossal economic rewards, they are totally unpredictable. And therefore the rewards cannot be judged by immediate results. Nevertheless, the value of Faraday’s work today must be higher than the capitalization of all shares on the stock exchange…. The greatest economic benefits of scientific research have always resulted from advances in fundamental knowledge rather than the search for specific applications…. Transistors were not discovered by the entertainment industry … but by people working on wave mechanics and solid state physics. [Nuclear energy] was not discovered by oil companies with large budgets seeking alternative forms of energy, but by scientists like Einstein and Rutherford.
It has long been recognized that pursuits that are important to the public interest, but have disproportionately large societal returns as opposed to individual returns, often of necessity become the province of government. But in the case addressed herein, US federal support of research in the physical sciences, mathematics, and engineering—when adjusted for inflation—has been stagnant for 2 decades. As already noted, as a percentage of GDP, federal investment in research in the physical sciences and engineering has been reduced by more than half since 1970. The federal government not only will need to increase its investment in research but also will need to find a way to forge closer ties among industry, academe, and government. That will require working arrangements to overcome such inherent barriers to cooperation as industry’s 3-month rhythm (until the next quarterly report) vs academe’s 6- or 8-year operating time constant (the period typically required to qualify for a PhD); academe’s culture of “publish or perish” vs industry’s “publish and perish” mentality; and government’s periodic attacks on both academe and industry generally concerning overhead ceilings, visa policy, and antitrust matters, the latter including such assaults as the one courageously and successfully challenged in court by MIT several years ago.
Even when undertaking all reasonable steps to remain competitive in science and technology, it is unlikely that on the flat earth any nation, even one as wealthy as the United States, can maintain a position of such broad prowess as the United States has enjoyed in recent decades. A few areas can undoubtedly be singled out in which to seek prominence, more areas can be pursued wherein a nation can be a “fast follower” in applying new knowledge, and still more will simply have to be monitored or even forgone. That is, choices must be made—and these will be difficult choices bearing significant consequences. Those making such decisions will no doubt seem to face the sort of dilemma that comedian Woody Allen once described in the following terms: “More than any other time in history, [we] face a crossroads. One path leads to despair and utter hopelessness. The other, to total extinction. Let us pray we have the wisdom to choose correctly!”
The decline in support of basic research in America has been particularly pronounced in the case of the Department of Defense (DOD), an organization that for the latter half of the 20th century was arguably the pivotal underwriter of basic research and innovation in the nation. Examples of commercial products stemming from research investment by DOD are as varied as the Internet, freeze-dried foods, weather satellites, GPS, communication satellites, and nuclear power. But during the past 30 years, the fraction of overall defense research, development, test, and evaluation funds devoted to science and technology has dropped from 20% to 13%. Real funding of basic research by DOD has been essentially flat for 30 years in spite of the growing overall defense budget and the growing importance of technology to national security. Its science and engineering workforce declined from 45,000 to 28,000 during the 1990s alone, according to testimony before Congress by DOD officials.
Competitiveness problems are exacerbated when national security is addressed, a realm in which scientific and engineering leadership, or the lack of it, can have profound consequences. President Bush, echoing the sentiments of several presidents speaking of their own eras, noted that “science and technology have never been more important to the defense of the nation and the health of the economy.” In the aerospace industry, most engineers and scientists require security clearances, the granting of which generally demands US citizenship. During my service as CEO of the Lockheed Martin Corporation, that firm employed over 80,000 scientists and engineers. The defense establishment cannot simply outsource its software to a shop in Bangalore, as many commercial firms can and do. Executives at several US government organizations have told me that whereas they once went to US universities and companies for information about leading-edge technologies, they now find that they increasingly must go abroad.
The recent century’s most decisive new military capabilities—such as the atomic bomb, night vision, stealth, digital computers, precision-guided missiles, nuclear propulsion, precision geolocation, space surveillance, and the airplane—all had their roots in new discoveries and innovation.
America’s national security challenge has been complicated by the nation’s ongoing transition from a manufacturing economy to a service economy. Today, fully 77% of America’s jobs reside in the service sector, which is in general not the arsenal of military might. It may be possible to base a prosperous society on a service economy, but a nation cannot successfully fight a major conflict with a service economy alone. Armored vehicles, missiles, airplanes, sensors, and communication satellites are still among the instruments of survival and success in modern combat; reality television programs, sports extravaganzas, mass-media exposés, audits, and legal depositions are not. And finally there is the all-important underlying issue that a weakened economy may simply be unable to afford the resources needed to defend itself and its interests. The Soviet Union imploded in trying to provide an immense defense capability with an undernourished economy.
Several years ago, before the events of 9/11, Congress established what became known as the Hart-Rudman Commission, of which I was a member, and assigned it the task of examining America’s national security needs in the decades ahead and making any recommendations deemed appropriate. It was assumed by the mass media and many others interested in the effort that the group’s findings would concern the number of air wings, infantry divisions, and carrier battle groups that the nation should maintain to prevail in possible future conflicts. Instead, in its two (sadly prescient) major findings, the bipartisan group warned that a major terrorist attack would probably take place on US soil and produce thousands of casualties, and stated that “the inadequacies of our system of research and education” pose a threat to national security “greater than any potential conventional war that we might imagine.” It noted that, “second only to a weapon of mass destruction detonating in an American city, we can think of nothing more dangerous than a failure to manage properly science, technology, and education for the common good.”