The Frontiers of Virtual Reality Applied to High Performance Computing and Communications

Maxine D. Brown

Electronic Visualization Laboratory

University of Illinois at Chicago

Our nation is now designing the architecture of the Global Information Infrastructure (GII), yet it is doing so with standard keyboard and pull-down-menu interfaces that are inadequate, particularly when computers can reside anywhere in the world and when the emphasis is shifting from human/computer dialog to human/computer/human collaboration, with the computer supplying real-time data in a shared, collaborative environment. Virtual reality (VR) is, at its most useful, a mode of information display that encourages intuitive navigation and exploration. VR provides views of data and computation that are hard to prespecify in the way currently required by two-dimensional workstation visualization interfaces, and it has garnered interest as an intelligent user interface to the GII. Its real-time, interactive, high-resolution displays, however, are pushing the limits of today's high-performance computing and communications (HPCC) technologies.

The Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago, in partnership with the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign and the Mathematics and Computer Science Division of Argonne National Laboratory (ANL), is involved in a multiyear, national-scale effort to develop collaborations using supercomputers, high-speed networks, VR, and scientific visualization. EVL, NCSA, and ANL have, for several years, created national testbeds—notably at the SIGGRAPH 92, SIGGRAPH 94, and Supercomputing 95 conferences—to showcase distributed scalable computing and visualization and virtual environment applications from academia, national supercomputer laboratories, and industry.

The partnership designed these testbeds as experiments to help ferret out



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




the right research questions, as well as to serve as a cyber laboratory in which to work out answers to these questions and to formulate new questions in turn. How can we integrate HPCC technologies and VR devices seamlessly? What are the current limitations of these technologies? How do computational scientists and engineers want to use them? What tools do they need in order to use VR for problem solving? As partners, we firmly believe that application requirements should drive our research agenda; accordingly, we have been using the testbed concept to create the needed teams, tools, hardware, system software, and human interface models on an accelerated schedule in order to facilitate solutions to National Challenge and Grand Challenge problems.[1]

In 1991 EVL began developing projection-based VR graphical displays for use by computational scientists and engineers. Such a VR display had to help users reach scientific discoveries faster, without compromising the color, resolution, and flicker-free qualities those users have come to expect from workstations. Most important, the VR display had to link remote data sources, supercomputers, and scientific instrumentation in a functional way. In all, the VR system had to offer a significant advantage to offset its packaging. The CAVE,[2] introduced at SIGGRAPH 92, met all these criteria and has been successful in attracting serious collaborators in the HPCC community. The CAVE (Cave Automatic Virtual Environment) is a 10 × 10 × 9 foot, room-sized, multiperson, high-resolution, three-dimensional video and audio environment. Other VR graphical displays include the ImmersaDesk, introduced in 1995, a drafting-table-format virtual prototyping device designed as a single-user application development station, and the Infinity Wall, also introduced in 1995, a paneled, large-screen, high-resolution stereo projection display well suited to large audiences.

EVL then extended its CAVE library and desktop simulator software package to work with all three displays so that projects designed in any one of these virtual environments could be displayed in the others.

EVL, NCSA, and ANL most recently completed a major experiment for Supercomputing 95: the Information Wide-Area Year (I-WAY), an experimental high-performance network linking dozens of the nation's fastest computers and advanced visualization environments. The I-WAY experiment laid the groundwork for scientific community-based access to heterogeneous distributed computing resources, high-speed networks, and immersive VR environments. More specifically, the I-WAY was an effort to provide a wide-area ATM (Asynchronous Transfer Mode) network to support various experimental activities at Supercomputing 95—notably, interactive video, distributed applications, and remote computations for interactive virtual environments and scientific visualization.

Much of the I-WAY's physical networking made use of existing smaller ATM research networks, including AAInet (ACTS ATM Internetwork), ACTS (Advanced Communications Technology Satellite), ATDnet (Advanced Technology Demonstration Network), CalREN (California Research and Education Network), CANARIE (Canadian Network for the Advancement of Research, Industry and Education), CASA (Gigabit Testbed), DREN (Defense Research and Engineering Network), ESnet (Energy Sciences Network), MAGIC (Multidimensional Applications and Gigabit Internetwork Consortium), MREN (Chicago Metropolitan Research and Education Network), and vBNS (Very High-Speed Backbone Network Service). The separate networks were linked with the help of several major network service providers, including MCI, AT&T, Sprint, Ameritech, and Pacific Bell.

The EVL, NCSA, and ANL partnership also staged a unique conference event, the GII Testbed, during which 60 scientific and engineering application groups demonstrated solutions to Grand Challenge and National Challenge problems. These groups used the I-WAY for remote computation and the CAVE, ImmersaDesk, and Infinity Wall for local presentations of results. Overall, the HPCC community showed tremendous enthusiasm, supporting our interest in wide-area, high-performance distributed computing.

[1] National and Grand Challenges were defined by the National Science and Technology Council, established by President Clinton in 1993 to coordinate the federal research and development agenda. Grand Challenges are fundamental problems in science and engineering (e.g., environmental science, computational chemistry, and computational astrophysics) that have broad economic and scientific impact; National Challenges are fundamental applications that have broad and direct impact on the nation's competitiveness and the well-being of its citizens (e.g., manufacturing, health care, education, and access to information). Finding solutions to these problems can be advanced through the use of HPCC technologies and resources.

[2] The CAVE, ImmersaDesk, and Infinity Wall are trademarks of the University of Illinois Board of Trustees.
Now that Supercomputing 95 is over, the partnership seeks to establish a Persistent I-WAY to provide computational scientists and engineers with a model nationwide facility for both production and research.

EVL continues to enhance its VR displays. The primary criteria for these VR systems are that they be network based and that human sharing and collaboration, data distribution, and computational heterogeneity be the core foci. An important evaluation criterion is that these systems be sought after and used regardless of distance; they should be a part of science whether the researchers are in the same room or spread across the nation. EVL also continues to track developments in the simulation and graphics industry. EVL is interested in problems with unique human/computer interface needs—from navigation intensive (i.e., lots of user interaction) to collaboration intensive (i.e., multiple VR sites participating).

We do not yet know how new machine architectures (e.g., distributed shared-memory machines with hundreds of processors), advanced user interfaces (e.g., gaze-directed, gesture recognition), and next-generation graphics engines (capable of rendering more polygons per second and of real-time volume visualization) will stress-test network bandwidth and latency. The research community is just beginning to have enough polygons and texture memory to function at all in VR, as well as to harness the real-time power of supercomputers. With no exaggeration, computational scientists need four orders of magnitude of improvement in graphics and processor performance to do in real-time VR what they now do in frame-at-a-time scientific visualizations. Similarly, 155-Mbps networking is barely enough for intense point-to-point VR with supercomputing; upgrades to OC-48 and OC-192 experimental networks clearly are desirable, especially for multiple VR sessions.

The 10- to 20-year challenges in VR/visualization applied to scientific problem solving are as follows:

- Providing enough anti-aliased image resolution to match human vision (roughly 5,000 pixels across a 90-degree field of view) at rates of 10-48 frames/second.
- Providing audio output matched to the dynamic range of human hearing, as well as mostly flawless voice recognition.
- Developing haptic (touch and force feedback) devices.
- Storing, retrieving, and playing back visualization/VR sessions.
- Filtering and compressing massive amounts of data for presentation and storage.
- Connecting to remote computations and data sources in collaborative "tele-immersion" experiences via high-speed networks.
- Developing algorithms to portray complexity in meaningful ways.
- Providing the security necessary to distribute computing and data at very high speed.

Our goal is to integrate heterogeneous distributed computing environments—that is, supercomputers, remote instrumentation, networks, mass storage devices, and advanced real-time three-dimensional immersive interfaces—into computational science workspaces so that scientists will depend on these systems to discover, communicate, and transmit results.
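The display-resolution and bandwidth figures above lend themselves to a quick back-of-the-envelope check. The sketch below is not from the article: it assumes human visual acuity of roughly one arcminute per pixel and models the network load as uncompressed 24-bit stereo video, both illustrative assumptions. Under them, matching vision over a 90-degree field of view needs on the order of the 5,000 pixels the text cites, and the raw stereo stream at the upper frame rate far exceeds even an OC-192 link (about 10 Gb/s), which is why the challenge list pairs faster networks with data filtering and compression.

```python
# Back-of-the-envelope estimates for the VR display and network demands
# discussed in the text. Assumptions (ours, not the article's): ~1 arcminute
# of visual acuity per pixel, uncompressed 24-bit color, two eyes for stereo.

ARCMIN_PER_DEGREE = 60

def pixels_for_fov(fov_degrees, acuity_arcmin=1.0):
    """Pixels needed across a field of view to match a given visual acuity."""
    return int(fov_degrees * ARCMIN_PER_DEGREE / acuity_arcmin)

def uncompressed_bandwidth_gbps(h_pixels, v_pixels, fps, bits_per_pixel=24, eyes=2):
    """Raw video bandwidth, in gigabits per second, for an uncompressed stereo stream."""
    bits_per_second = h_pixels * v_pixels * bits_per_pixel * fps * eyes
    return bits_per_second / 1e9

# ~5,400 pixels across 90 degrees, close to the article's figure of 5,000.
print("pixels across 90 degrees:", pixels_for_fov(90))

# A 5,000 x 5,000 stereo stream at the list's upper rate of 48 frames/second.
print("raw stereo bandwidth: %.1f Gb/s" % uncompressed_bandwidth_gbps(5000, 5000, 48))
```

Run as written, this yields 5,400 pixels and 57.6 Gb/s, several times the roughly 10 Gb/s of an OC-192 line, so heavy compression or filtering is unavoidable at these targets.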
Bibliography

DeFanti, T. A., D. J. Sandin, G. Lindahl, and M. D. Brown. 1996. High-resolution and high-bandwidth immersive interactivity. Very High Resolution and Quality Imaging Conference, February 1996. SPIE Proceedings 2663:28.

DeFanti, T. A., M. D. Brown, and R. Stevens, guest eds. 1996. IEEE Computer Graphics & Applications 16(4)(July):14-17, 42-84.

Korab, H., and M. D. Brown, eds. 1995. Virtual Environments and Distributed Computing at SC'95: GII Testbed and HPC Challenge Applications on the I-WAY. ACM/IEEE Supercomputing 95 Conference. New York: Association for Computing Machinery. http://www.ncsa.uiuc.edu/General/Training/SC95/GII.HPCC.html

McCormick, B. H., T. A. DeFanti, and M. D. Brown, eds. 1987. Special issue on visualization in scientific computing. Computer Graphics 21(6).

National Research Council. 1995. Pp. 88-89 in Evolving the High Performance Computing and Communications Initiative to Support the Nation's Information Infrastructure. Report of the Computer Science and Telecommunications Board. Washington, D.C.: National Academy Press.

National Science and Technology Council. 1996. High performance computing and communications: Advancing the frontiers of information technology. A report by the Committee on Computing, Information, and Communications. Washington, D.C.: National Science and Technology Council.

Petrovich, L., K. Tanaka, D. Morse, N. Ingle, J. F. Morie, C. Stapleton, and M. Brown, eds. 1994. SIGGRAPH '94 Visual Proceedings. Computer Graphics Annual Conference Series. New York: Association for Computing Machinery. http://evlweb.eecs.uic.edu