2
Linking the CSTB Community to the Federal Government: Expert Advice for Policymakers

Michael R. Nelson, Moderator
David B. Nelson
Paul R. Young
Howard Frank

Michael R. Nelson
I am Mike Nelson from the White House Office of Science and Technology Policy. I am one of the two people at the White House who work full-time on information technology policy issues. I think it is probably the second most enjoyable job at the White House, after the President's. He has the advantage of not having to report to anybody, except to the American people.
I am actually a geophysicist by training, and it is interesting that I have ended up where I am. I came to Washington from the Massachusetts Institute of Technology about eight years ago on a one-year fellowship. Apparently I contracted "Potomac Fever." I hope it is not terminal. It might seem odd that a geophysicist would be working in this area, but as I have told many of you, the training is actually perfect. Having a sense of geologic time in Washington is very important.
My career in Washington has nearly paralleled the existence of the Computer Science and Telecommunications Board (CSTB) in terms of time. We have had a very interesting, fruitful, and I would say symbiotic, relationship. In Washington, you must have a good mentor if you are going to accomplish anything. I have been blessed with several, and several of them are on the Board. In working on information technology issues during the past eight years, I have worked very closely with the Board. I probably have relied on CSTB's reports as much as anyone and have been able to accomplish a lot more because of the information provided to me by individual Board members and the reports that CSTB has produced.
When I first came to Washington, I started working on earthquake issues. I spent two or three months doing this and realized that these were really hard issues, and they were very depressing. I would start my morning by reading scenarios of a major earthquake in Los Angeles killing 20,000 people. So I started working on global warming issues, where I read scenarios in which 2 million people would die. Then I decided it was much more fun to work on information technology issues: we all know that information technology will solve all of our problems; so this is where I now spend all my time.
The first hearing I organized for Senator Gore, then chairman of the Science, Technology, and Space Subcommittee in the Senate, was on computer technology and high-speed networks. Robert Kahn, among others, testified. Leonard Kleinrock was our lead witness, testifying on CSTB's second report, Toward a National Research Network (1988). It was a very influential, very important report. I know it was influential because I took large portions and inserted them directly into the briefing memo, which was given to all the senators who came to the
hearings. I also inserted parts of that report directly into the legislation that I drafted for Senator Gore. Luckily, plagiarism is legal on the Hill! After that first hearing, several people were astonished at Senator Gore's grasp of both the technology and the issues. This was due in part to the fact that he had read the report, which was released on the same day.

Since that time, I have worked with the Board on a number of critical issues. CSTB has helped us design the high-performance computing legislation and keep the High Performance Computing and Communications Initiative on track. It has helped us deal with issues such as computer security. The Board has helped us evolve the ARPANET into the NSFnet, into the National Research and Education Network, into the Internet, into the Net, into the national information infrastructure, into the global information infrastructure, and into whatever it is we are now creating.

Today, we are going to look at how the Board has influenced policy and how it has worked effectively on the interface between science and engineering and policy making. Being one of the denizens of the interface, I know it is a pretty turbulent, unpredictable place to work. I know that this Board has been very effective in informing and enlightening those of us who have tried to help the policy-making process along.

Today we have an excellent panel, and I should say that I am impressed with the quality of the entire symposium program. This panel is going to take a Dickensian approach to its presentation by looking at CSTB's past, present, and future. The "spirit of CSTB past" will be provided by David Nelson (no relation), who has been working in this area even longer than I have and has been one of CSTB's primary customers. The "spirit of CSTB present" will be provided by Paul Young, who will talk about what is going on now and some of the ways in which the Board is influencing policy making.
The "spirit of CSTB future" will be represented by Howard Frank, since the Defense Advanced Research Projects Agency (DARPA) is always 10 years ahead of the rest of us. He will discuss what might be ahead for the Board. We are going to keep these remarks very short, about five minutes each, so that we can have a full discussion. All of the panelists can talk about the past, present, and future, and they will do so in the question-and-answer period. Our goal is to provide a chronological look at where the Board has been, where it is now, and what directions it might take in the future.

David B. Nelson

I will cover CSTB past. This does feel a little bit like a role in Charles Dickens's A Christmas Carol. As I was preparing this, I thought, "I am putting myself in the role of a peer reviewer." This reminded me of a story that I think most of you know, but it is so good that I will repeat it. This is the classic peer reviewer's report. It reads as follows: "This paper is novel, interesting, and correct. Unfortunately, the part that is novel is not interesting. The part that is interesting is not correct. The part that is correct is not novel." Fortunately, I do not make that judgment when it comes to the work of CSTB.

Before I review some of the Board's efforts from a government standpoint, let me remind you what the federal scene was like in 1986-1987. The Lax report1 had been out for only a few years, and the various agencies were struggling to implement its recommendations. I viewed that as an attempt to cure the VAX disease. What I mean is that every chemistry department could afford a VAX computer; so the standard of excellence in chemistry computation was one VAX unit. Next, DARPA had just begun its Strategic Computing Initiative. Robert Kahn smiles from the audience—he remembers that one well.
The National Science Foundation (NSF) was dealing with the question of establishing supercomputer centers, particularly how to link them together in something that would be called NSFnet. To remind you how history is out of our control, remember the study that said that the NSFnet should run Open Systems Interconnection (OSI) protocols as soon as practical. Think back. There is a lot of water under the bridge on that one.

1 Report of the Panel on Large Scale Computing in Science and Engineering, Peter Lax, chairman, sponsored by the U.S. Department of Defense and the National Science Foundation, in cooperation with the U.S. Department of Energy and the National Aeronautics and Space Administration, Washington, D.C., December 26, 1982.
The Department of Energy (DOE) had responded to the Lax report by consolidating and extending all of its energy research computing, using what is now called the National Energy Research Supercomputer Center, and had just begun the ESnet that went with it. We were struggling with the question of what protocol to run. At just about that time, Cisco came along with multiprotocol routers that allowed us to sidestep this terrible political question—another instance in which technology saves you from your political fate. The National Aeronautics and Space Administration (NASA) was planning the Numerical Aerodynamic Simulator, the numerical wind tunnel. A lot of universities were trying to figure out how their budgets could afford advanced computing. As we know, some could and some could not.

On the Hill, as Michael Nelson has noted, Senator Albert Gore was thinking rather deeply about networking and about the things that networking could do for the country's future. In the interagency setting, the Department of Defense (DOD), DOE, NSF, and NASA were working on the beginnings of what was then called the High Performance Computing Initiative. In the 1987 White House Office of Science and Technology Policy (OSTP) report, which is less well known than the 1989 report, it was referred to as high-performance computing. I believe it was Allan Bromley who said—as the 1989 report was being prepared to launch the HPC Initiative—"Shouldn't there be a second C? Isn't communications becoming pretty important?" As a result of such suggestions, it became known as the High Performance Computing and Communications Initiative (HPCCI).

With this introduction, let me review some of CSTB's work. In terms of quantity, CSTB has been very productive—more than 40 projects since 1987; approximately three to five substantial products per year, including one major study each year. Some of the substantial products included influential colloquia as well as long-term studies.
Next, CSTB has usually been ahead of the issues. I think Joseph Traub already mentioned this fact, and I would certainly agree. If we run very quickly through some of the products, we will see that it is true. As early as 1988, in Toward a National Research Network, CSTB addressed issues in the Gore bill and discussed the value of the research network. Among CSTB's reports, there are several that resonated with me and that spoke particularly to policy issues. Many of you probably have a different list, so this is not intended to be complete or in any way a judgment about the ones that are not on my list.

In Computers at Risk: Safe Computing in the Information Age (1991), CSTB addressed the question of hackers and penetration. It did so in a somewhat low-key way because it considered system design and accidents to be as important as hackers. I would say that these are still issues. Many more people are killed in accidents than in murders and wars. So, while we worry about malicious attacks, our own systems are fairly fragile. Computers at Risk pointed out that inadvertent problems can bring things down as much as malicious penetration.

Keeping the U.S. Computer Industry Competitive was a series of reports (issued in 1990, 1992, and 1995) that conveyed initially a sense of doom and gloom about the computer and electronics industry. The reports did not speak particularly to government programs, although one of the major objectives of the HPCCI was to keep the U.S. computer industry competitive.

Intellectual Property Issues in Software was published in 1991. A House bill is in committee now on intellectual property issues in electronic media, which indicates that this issue does not go away. It also reaffirms the contribution of CSTB in pointing out that we have patents and we have copyrights, and maybe software has to fit in somewhere either between or beyond them.

Computing the Future: A Broader Agenda for Computer Science and Engineering was released in 1992.
We in government thought it was very helpful that the report's first recommendation was to continue to support HPCC. It also spoke to academics, saying that they should broaden academic computer science and engineering. Prophetically, the report recommended broadening HPCC to mission agencies, and this certainly happened. In 1992-1993, several additional agencies signed on to the HPCCI: the National Institute of Standards and Technology (NIST), National Institutes of Health (NIH), National Oceanic and Atmospheric Administration (NOAA), National Security Agency (NSA), and Environmental Protection Agency (EPA) are examples.

National Collaboratories: Applying Information Technology for Scientific Research (1993)—I think William Wulf can claim a great deal of personal credit for this report. Again, it set a direction that we are now following
within agency programs and in the country as a whole. We are trying hard to follow many of the lessons covered in the report.

Information Technology in the Service Society: A Twenty-First Century Lever (1993) bemoaned the problems of productivity and the absence of productivity improvements that could be traced directly to computing advances. We are still struggling with this issue. How do we even count, much less affect?

Then, in 1994, Realizing the Information Future: The Internet and Beyond revisited the first report that CSTB had done about the Internet, going back and saying, "Okay, wow, this was really successful. Now what do we do with it?" One of the things this report emphasized was the importance of the concept of the "bearer service" to allow a broad range of applications to be served by a hardware infrastructure.

Research Recommendations to Facilitate Distributed Work, released in 1994, is particularly close to my heart because the Department of Energy commissioned it. It provided DOE with a good research agenda for how to promote telecommuting, which is, of course, becoming more and more ubiquitous. DOE cares because we are worried about energy usage.

Evolving the HPCC Initiative to Support the Nation's Information Infrastructure, the famous Brooks-Sutherland report, was released in January 1995. Out in the anteroom of this meeting is the chart from that report (reprinted here as Figure 2.1) showing how research has affected practice and the long time frame required for some of it. These were helpful lessons for those who thought that the research we do today should be in products tomorrow, and if it is not, it has somehow failed. The report also says, "Do not put all the computer science eggs in the HPCC basket," as well as, "Adjust what you call HPCC so that you keep the books properly."
Also in 1995, Information Technology for Manufacturing did an excellent job of laying out a research agenda to address how computing can help on the factory floor. The report is, in a sense, a companion to Information Technology in the Service Society; however, it argues that because the shop floor allows better productivity measurement, it is going to be easier to show that computing has actually helped.

Finally, I would say that the work of CSTB past, as I have gone through these reports, has had a substantial influence on the administration, on Congress, and on the research and development community. My little tour through these reports helped me to go back and support that statement by asking: Well, what was really happening in 1986 and 1988 and 1992 and so on? What were the important issues? Did CSTB address them in a productive way? Did its recommendations help?

Paul R. Young

I was very interested in David Nelson's introduction, particularly vis-à-vis NSF in the past (see Figure 2.2). He pointed out that 10 years ago we were struggling with what to do with supercomputing and networking, establishing the NSFnet. I arrived late this morning because I gave some introductory comments to the panel that is talking about the new supercomputer centers' program, the Partnerships for Advanced Computational Infrastructure. Things from the past continue in revised forms. I think a lot of CSTB's reports continue to influence the future and what we are doing in the present.

I was also amused because David mentioned networking, and we are in the middle of a major change in the NSF networking program as we head to high-end networking. As he pointed out, this program has been influenced by the Kleinrock-Clark report (Realizing the Information Future: The Internet and Beyond), which has had a very strong impact not only on policy issues, but also on how we set a research agenda.
Finally, David mentioned that, 10 years ago, we were struggling with the issue of what would become of some initiatives in computing and communications. In 1991, a bill was passed on high-performance computing and communications that goes through 1996 and expires at the end of this fiscal year. The administration and the interagency process struggle with how to redefine this concept and with identifying the future role of federal investments in research in computing, communications, and information. So in some sense, policies evolve; they do not change. CSTB's reports help us to define these policies and to decide what we are going to do about them.

In this context, I was tempted not to discuss the national context in which some of these decisions are made. However, in talking with Anita Jones last night, she suggested some
Figure 2.1 Government-sponsored computing research and development stimulates creation of innovative ideas and industries. Dates apply to horizontal bars, but not to arrows showing transfer of ideas and people. Reprinted from Computer Science and Telecommunications Board, National Research Council. 1995. Evolving the High Performance Computing and Communications Initiative to Support the Nation's Information Infrastructure. National Academy Press, Washington, D.C., Figure 1.2.
Figure 2.2 NSF issues: then and now.
PAST: General-purpose networking for research and education; High-performance computing; Supercomputing; Parallel computing; Grand/National challenges; Computational science and engineering; Education and human resources.
FUTURE: Defining a broad research agenda; Computing systems; Convergence of computing and communications; Human-centered information systems; Computational science and engineering; Cross-cutting research; Education and human resources; Learning and intelligent systems; Learning technologies; Integrating research and education; National infrastructure; High-end computing; Networking; Outreach; Partnerships; International.

things that might be useful. Michael Nelson spoke in his introduction about his experience with geological time and how things get done in Washington. There are some reasons for this. On the administration side of how policy is made, there are a lot of players, and NSF often sits in the middle. There is an Office of Science and Technology Policy, a National Science and Technology Council that works through the Committee on Information and Communications, an HPCC process, and a lot of different subareas that define research agendas. On the other side, through the Department of Commerce, there is the Information Infrastructure Task Force that addresses information technology implementation issues. Somehow, the advice that CSTB provides has to influence and pull all of these together into a common framework.

There is a similar method of arriving at priorities in the National Science Foundation that is unique to NSF. You should not think, of course, that CSTB's audience is just those of us who have the immediate responsibility for budgets. There are a lot of people to please: the National Science Board, an advisory committee, workshops that we hold, proposal pressure, and peer review. Nevertheless, it is clear that external studies are important to us.
Peer review is important, and the most prestigious of these reviews come from the Computer Science and Telecommunications Board. Figure 2.3 addresses some issues that I think are important from an NSF standpoint. They provide some indication of places where we might be looking for help from CSTB, without saying exactly how these needs might arise.

First, we continue to redefine the research agenda within NSF, but we do this across the federal government as well. I am sure Howard Frank will say more about this process. CSTB really helps us to formulate this agenda and explain it to the various stakeholders. Some issues that NSF is particularly concerned with include the following: How can we do more interdisciplinary research, and what is the role of computer science and engineering, including computational science and engineering, in that? What is the proper home? What is the relationship between computer science and engineering and computational science and engineering?

Within NSF, we are currently interested in cross-cutting research that addresses human factors. People from different disciplines—biology, cognitive science, and engineering—are addressing questions such as, What does it mean to learn? What does intelligence mean? What is the distributed nature of this? Not all of this discussion bears directly on information technology, but much of it does, and we are trying to identify the relationships. We are interested in part of the current presidential initiative on learning technologies, a pet project of mine.
Figure 2.3 Illustrative impacts of CSTB studies. NOTE: NSTC = National Science and Technology Council, CIC = Committee on Information and Communications, HPCCIT = High Performance Computing, Communications, and Information Technology (a subcommittee of CIC).

We all know that information technology is going to radically transform education in the twenty-first century, but how? Do we have a road map for this? Let me provide two particular examples. We know that computational methods have transformed how we do much of science and engineering. Computation has joined experimentation and theory as a paradigm for many aspects of how we do science and engineering. When the technology becomes cheap enough and available, it can similarly transform how students learn. It will change the way they think about abstract problems. Do we know what the technical roadblocks are, and do we know what the psychological impacts are? Could CSTB help us address these kinds of questions? I think so.

I would like to come back to collaboratories, an area that William Wulf has pushed, and how they can help science and engineering generally. We are going to have some form of collaboratories across geographic boundaries that will influence education. Are these going to be the same technologies? Are children going to use these in the same way as scientists and engineers? What are the technical roadblocks to achieving this by the year 2010? CSTB could help NSF answer this kind of question.

You see Figure 1.2 of the Brooks-Sutherland report again and again (Figure 2.1 in this volume). You know this figure, you love this figure; I know this figure, I love it. It helps to set policy because it shows a very nonlinear picture of the role of federally funded research in the economy and the nation in general. I have found this slide enormously useful. It enables us to talk with policymakers and the general public about the fact that research has a long life. It is still goal directed.
It enables us to talk about the fact that the process is nonlinear and that the particular goals may be diffuse in the interaction. This figure has been very effective.

CSTB has another study, "Innovations in Computing and Communications: Lessons from History," currently under way. We are looking forward to the results of this study in particular because the government increasingly is moving toward performance-based budgeting. CSTB can help us with this. The initial idea for performance-based budgeting for research across the federal government was that you were supposed to indicate what the output of a particular program would be in the next two to three years. Of course you could measure the number of publications, but this is not what you are after. You could measure impact. Can you predict this in advance? I do not think so.
We need more help in selling the research mission to the public as a whole, to Congress, and to people down the street. I think CSTB can help us with that.

Howard Frank

I have been involved with CSTB and the earlier telecommunications board for quite a few years. Having been in government, I have been involved in two or three different ways. When Mike asked me to speak about what CSTB should be doing in the future, at first I was taken aback a little because, well, how do I know? Then I decided that the key to understanding what CSTB should be doing is to understand what is happening in information technology in general. What I will do is give you a picture of what my personal feeling is. This is not DARPA's position or anybody else's, but what I think the challenges of the future are.

If you looked at the list of topics that David Nelson has talked about, CSTB has been involved in all of the real issues of the past and present—right on top of them—sometimes a year or two before the fact, sometimes a year or two after. The history of CSTB has tracked the history of information technology. So I thought I would tell you where I think the problems are.

I will point out that there is an anomaly here. We are living through the greatest revolution in information technology that the world has ever seen. The economy is robust. The technology sector is leading the stock market. In the last week there was probably a 5 percent rise in the stock market led by the technology shares, and so on. Nevertheless, I think the future may be rather dim.

I think the first challenge is that, in this nation, computer science is basically an obsolete field. Many things that computer scientists have worked on have been wonderful and great. However, some critically important things have been missed. It was good that we had an introduction that talked about information warfare.
When I first became director of the Computing Systems Technology Office in DARPA and began talking to my staff about survivability of the infrastructure, I got the response, "Well, we have already solved that problem. The Internet does adaptive routing." I said, "Wait a second, you do not really know what I mean." A year later, we tried to engage the academic community on the topic. We discovered that there was no academic community engaged in the topic. This is just one example of the fact that computer scientists have become smug in the wonderfulness of the technology that exists—technology that has been created, by and large, in spite of them rather than because of them. It does not mean that there have not been great things that have come out of the academic community, but the academic community certainly has not led.

Long-term information system technology research in this nation is in great danger. This is an area where we have to cry for help right now. Over the past five years, there has been a collapse of long-term research and development in the industrial sector. Nobody knows exactly how big long-term research really is in the industrial sector, because when companies budget research and development, it may look like $20 billion, but the development part may be $17 billion, $18 billion, or $19 billion. We know that the Bell Labs of the past is gone. There is tumbleweed blowing down the halls of IBM's research facilities and in many of the other major industrial organizations as well. In the federal government, we are under tremendous pressure—there is vast misunderstanding of the relationship of long-term research and development to the nation's prosperity and future. This is the same problem that CSTB was looking at 10 years ago, except now it is worse. We probably will not know the results of it for another decade or more.

High-end strategic computing is in danger of collapse.
We have seen great technological success in terms of the high-performance computing program itself. It introduced the concepts of parallelism and scalability into the commercial world. If you look at medium-scale computing, it now reflects the results of the government program. If you look at the very high end, however, the market is in danger of disappearing. The very high end of strategic computing is a government marketplace, not a commercial marketplace, but there is no government research program that yet recognizes that fact. There is tremendous pressure from Congress and from our own internal constituencies to cut back on what had been considered high-end strategic computing for homogeneous computing systems. We really do not have a strategy for how to continue to acquire computers from a marketplace that cannot
afford to spend the R&D dollars necessary for the high-end computation required for many government-unique problems.

Finally, and this is an anomaly, we celebrate the wonderfulness of the Internet when I believe it is now moving into a period of decline in performance. The vast social phenomenon arising from the Internet is yet to take place. We see just the beginnings of this phenomenon. Yet yesterday, David Nelson presented some initial results about performance on the Internet for the research community. We need another study called Toward a National Research Network because we no longer have an adequate national research network. The exploding user population has reduced performance below the levels needed by the research community.

What are the implications of all of this? I think CSTB is one of the only places on earth that can deal with this class of problem. It has the only group of people who understand not only the technology issues, but also some of the social and political issues. CSTB needs to become much more proactive. I want to use one specific example. This was not CSTB, but it was the National Research Council (NRC). When I was on the study committee that looked at the survivability of the U.S. telecommunications system, we came to some pretty significant conclusions in 1988, 1989, and 1990. We decided to entitle our report The Emerging Crisis in the Telecommunications System because we felt that this is what it was. The report ended up by being called The Growing Vulnerability of the Public Switched Networks. That is my point. CSTB needs to become a much more proactive organization that not only helps give insight into policy, but actually helps to beat people on the head until policy is changed.

Discussion

MICHAEL NELSON: Before we go on, I want to add one thing that I forgot to mention. Even though this is the National Academy of Sciences and the National Research Council, CSTB is having a growing impact internationally.
I meet with a lot of people from other countries who want to know how to build up the Internet in their country. They are learning about resources that CSTB has provided, in part because of Marjory Blumenthal's efforts, because CSTB is now up on the Web, and because every time I travel overseas I take approximately 30 pounds of your books and hand them out. I used to take only the little red book, Realizing the Information Future (1994). Now I have to take that and The Unpredictable Certainty (1996). In the past six months, I have handed out dozens of copies in Russia, Beijing, Singapore, and Jakarta. They have been read, and they have influenced a lot of thinking. So I commend you for this effort, and I hope that you will continue to reach out.

EDWARD FEIGENBAUM: CSTB operates within the framework of the commissions and councils of the NRC, which in turn operates within the framework of the National Academy of Sciences. All are extremely slow and conservative organizations, unwilling to say things that make anyone bristle. So a lot of what CSTB might try to do is squashed, or squashed in advance, by this elaborate structure. I want to point out how long it takes to get a CSTB report out. It takes forever. Bob Lucky has a comment in the published abstracts about the fact that we are zero for one on predicting Webs. The World Wide Web was invented, Mosaic was invented, and Mosaic gained 1 million users, all within the time frame of one CSTB report.

MICHAEL NELSON: I would like to second that. Being in the policy-making process, I always want the answer tomorrow, if not yesterday. The only counter to this argument has been that CSTB has been ahead of the curve in identifying the issue that will be hot in two years. This is good, because it takes two years to write the report. I hope the process can move faster. Part of this is our own government's problem. The encryption study, which is coming out soon, has now completed the NRC review process.
It would have been nice to have the report six months ago, but it is going to be incredibly timely. In a way, the need for that report is growing, and I think it is going to influence decisions that are just about ready to be made, so the report is right on target. Some other reports have been a little late, but I think the NRC, and especially CSTB because of its ability to anticipate issues, has done better than some other organizations in Washington.

I should add, however, that we are losing some of the other organizations that have supplemented the work of the Board. The congressional Office of Technology Assessment has disappeared. The Annenberg Program at Northwestern, which did a lot of work on telecommunications policy, has been phased out. So the demand for the products of this Board, and the demands put on it, will be even greater. Any other reaction to that comment?
PAUL YOUNG: I am very sympathetic to the time issue. It would be very nice to cut the time of producing reports in half. That said, a lot of policy in Washington is made by the close of business today, and there is definitely a place for a group that can take a deep, measured approach and think things through very carefully. This really has to be preserved even as the process is speeded up, because it is very easy to shoot from the hip. You see a lot of that.

MICHAEL NELSON: I should also say that there are reports that, although they are toned down a little bit, can still deliver a very powerful message if they are delivered personally by the people who helped write them. The other thing I would like to see more of is taking these reports and really pushing them in the policy process. People who get the reports often do not read them until somebody comes into the room and says, "Here it is. This is why you have to read Chapter 5." This, I think, is something Marjory and the team have been doing more frequently. Many of you have been in my office; you have been in a lot of offices. I think the broader community needs to take these messages and get them to policymakers.

MICHAEL DERTOUZOS: I am not sure that I heard what I heard from Howard Frank. Did I hear sort of a Nostradamus thing? I would like to profoundly disagree, or agree, because I am not sure I got what was said. I agree our field has become narrow, but I see tremendous opportunities ahead. Some predict there will be 1 billion interconnected machines by 2005. I see 15,000 independent software vendor artifacts going for those machines. I see the entire theory of computer science moving away from the single machine and addressing what happens out there when you have billions of machines. We do not have Turing theories for this. We do not have systems theories for it. We do not have software for it. We do not have systems for it. All this has to happen.
We tried doing artificial intelligence things and they did not pan out. This does not mean that they are wrong. It took 250 years to progress from steam to jet power, and in computing we have had only 35 to 40 years.

HOWARD FRANK: Mike, I think you are violently agreeing. The opportunities are there.

DERTOUZOS: But let me have my tantrum. I think there is just a wonderful world ahead, and I am certainly excited about it.

FRANK: There is a fantastic world ahead. It would be nice if we had some of the theory now.

DAVID NELSON: This is the perennial discussion, and it has taken place in mathematics and other fields. My personal view is that it helps to have very good academics working in the field, rubbing shoulders with those who are trying to apply it. In the Department of Energy, we are bringing together mathematics and computer science in jointly funded projects that are trying to do applications. There is, of course, a danger that you may degrade the quality of the research, and I keep my finger on it. I keep asking the people who are managing those programs, and the answer I am getting is no. This is invigorating and stimulating.

WILLIAM WULF: I would like to say just one thing about the issue of the time it takes CSTB to complete reports. I think this is a serious issue, and I keep pushing the staff about it. Part of the problem is internal to the NRC, and that is the part we have the potential to do something about. However, delays frequently occur at the front end. Sometimes CSTB does not get a contract for a long time. This has been a problem on the Ada study, where we are trying to do a fast turnaround.2 So I am going to put in a plug for why it is so important that agencies like yours have provided sustaining funds, core funds, as they are called, because those sometimes allow CSTB to get a quick start on projects.

2. Computer Science and Telecommunications Board, National Research Council. 1996. Ada and Beyond: Software Policies for the Department of Defense. National Academy Press, Washington, D.C.