The grand challenge problems described in Chapter 2 emerged during breakout sessions and were discussed in more depth and refined during plenary sessions. In subsequent discussions, workshop participants were asked to identify general requirements for the experimental infrastructure capabilities and cyberinfrastructure tools associated with addressing the grand challenge problems. The committee consolidated the results of these discussions into descriptions of 14 distinct networks of facilities, which are presented in this chapter.
A key part of the discussions involved the characteristics of the network of facilities, both experimental and cyberinfrastructure. Networking allows geographically distributed researchers and team members to collaborate as they utilize multiple facilities. Collaboration tools include real-time and asynchronous communication and access to databases, simulations, and experiments. Advanced collaboration tools and social media could allow new types of interaction to develop and could enhance the educational and outreach functions of the network. Workshop participants reiterated that data storage, search, and mining are critical tools for the network. Access to simulation and analysis software, from petascale computers to mobile apps, could leverage a substantial toolset in earthquake engineering and other applications and could free developers to create new applications that meet the demands of the Grand Challenges. Real-time communication with quality-of-service guarantees could allow advances in hybrid simulation and advanced testing methods. Participants envisioned that the user communities for the facilities would encompass a wide range of researchers, practitioners, planners, and other officials, and that the data, models, and information sources would be available and documented for the general community.
In discussing the networks of facilities, participants described the characteristics of a unique community resilience observatory and an “instrumented city” testbed that would create urban-scale laboratories without walls to integrate experiments, simulations, and models of community behavior to advance knowledge about resilience. An advanced observatory and mobile monitoring system could provide data at urban scales to researchers before and after earthquake events. Specifically, the community resilience observatory could offer researchers an opportunity to develop, test, and evaluate different methodologies for quantifying community resilience in different parts of the country; to monitor and track recovery in areas that have experienced major catastrophes; and to ensure that benchmark data for measuring resilience can be standardized across the country. In turn, products from the community resilience observatory could benefit land use planners, emergency responders, and state, regional, and local policy makers in their efforts to better prepare for earthquakes. In addition, an instrumented city could allow researchers to integrate the output from many different sensors (from strain gauges to satellites) to monitor the performance of a city during an actual disaster. The continued collection of data, both before and after an earthquake, could allow not only researchers but also community policy makers to generate critical benchmark datasets for use in quantifying the impact of risk reduction measures for a community.
The experimental facilities suggested by participants encompass testing and monitoring over a wide range of scales, loading regimes, boundary conditions, and rates on laboratory and field (in situ) specimens that would be needed to address the grand challenge problems identified during the workshop. At the material scale, facilities can generate data about the properties and behavior of sustainable materials. At the full scale, facilities can provide urgently needed information about the performance of complete structures, including the effects of soil and non-structural components. The interlinking of multiple sites through methods such as hybrid simulation would allow experiments on the “whole” to be greater than experiments on the “parts.” Participants suggested that cyberinfrastructure tools are essential for capturing, analyzing, and visualizing experiments and for supporting the advanced simulations identified in the Grand Challenges. A simulation center and data synthesis center were identified as separate but interlinked facilities because of their very different services and capabilities.

TABLE 3.1 Linkages Between Facilities and the Five Overarching Grand Challenges.

| Facility | Community Resilience Framework | Decision Making | Simulation | Mitigation | Design Tools |
|---|---|---|---|---|---|
| Community Resilience Observatory | √ | √ | √ | √ | |
| Earthquake Engineering Simulation Center | √ | √ | √ | √ | √ |
| Earthquake Engineering Data Synthesis Center | √ | √ | √ | √ | √ |
| Rapid Monitoring Facility | √ | √ | √ | | |
| Sustainable Materials Facility | √ | √ | √ | √ | |
| Networked Geotechnical Centrifuges | √ | √ | √ | √ | |
| SSI Shaking Table | √ | √ | √ | √ | √ |
| Large-Scale Shaking Table | √ | √ | √ | √ | √ |
| Tsunami Wave Simulator | √ | √ | √ | √ | √ |
| Advanced Structural Subsystems Characterization Facility | √ | √ | √ | √ | |
| Non-Structural, Multi-Axis Testing Facility | √ | √ | √ | √ | |
| Mobile Facility for In Situ Structural Testing | √ | √ | √ | √ | |
Table 3.1 shows how the facilities discussed in this chapter could address the five overarching Grand Challenges described in Chapter 2. As one example from the table, the rapid monitoring facility addresses problems described in the Community Resilience Framework, Decision Making, and Simulation Grand Challenges. The ordering of the facilities does not indicate prioritization.
The community resilience observatory, as envisioned by participants in the “community resilience” breakout group, would encompass interlinked facilities that function as a laboratory without walls. It could integrate experimental testing and simulations with a holistic understanding of communities, stakeholders, decisions, and motivations. The observatory could support basic research on interdependencies among systems, the multiple dimensions of resilience, and analytic tools for resilience measurement that take those interdependencies into account. It could host evolving community data and coordinate models that use those data to produce knowledge about resilience. Participants noted that especially important aspects of this data collection would include comprehensive datasets from past earthquakes that quantify both the direct and indirect impacts of these events; empirical indicators (e.g., socioeconomic information on communities) that measure the resilience or sustainability of communities in past disasters; and tools and platforms (software or social networking solutions) that allow researchers to access and use these data in an open resource framework.
Several participants noted that the concept of a resilience observatory is not new. In 2008, the National Science Foundation (NSF) and the U.S. Geological Survey supported a workshop that brought together leading researchers from the disaster research community to explore the creation of a new NSF observatory focused on resiliency and vulnerability. Such an observatory would address obstacles by “(1) supporting development of long-term longitudinal datasets; (2) investing in the development of data collection protocols to ensure comparable measurement in multiple socio-political environmental settings and across multiple hazards; (3) building on and complementing existing data collection efforts and activities in the public and private sectors; and (4) enhancing the sharing of data throughout research and practice communities” (Peacock et al., 2008).
The observatory concept discussed during this present workshop is similar to that of the 2008 workshop. Participants described this observatory as a virtual clearinghouse for a broad range of data that could be used to monitor, measure, and evaluate the resilience of a community. As discussed at the workshop, these data would be housed in different laboratories across the country and would be accessible by all researchers interested in studying community resilience. The observatory was also seen as a series of testbeds to study post-earthquake recovery in different parts of the country. By examining recovery in different regions, researchers could begin to evaluate the scalability of methodologies and models designed to measure community performance. Finally, the observatory could be used to link researchers from various disciplines in order to study community resilience from a holistic perspective. By taking part in a virtual network from different parts of the country, researchers could better study the physical and socioeconomic factors that affect community resilience.
An instrumented testbed in a high-risk urban environment could provide invaluable data about a community’s resilience to earthquakes. New instrumentation, from strain gauges to satellites, could monitor and measure at multiple scales. For complex lifeline systems, including transportation networks, participants emphasized a need for underground sensing and new monitoring devices that are wireless, self-locating, and self-placing. Leveraging other uses of lasers, imaging, satellites, and networks such as smart grids could contribute to collecting data in a region. As such, an instrumented testbed would allow capturing the response of complete, interconnected infrastructure systems and their interactions with urban systems. A constellation of sensors could be connected to a central data repository (e.g., the Earthquake Engineering Data Synthesis Center). As envisioned by workshop participants, this repository would require new technologies for data management, communication, data fusion, data processing and dissemination, and data sharing. The instrumented city could allow unprecedented research on decision-making processes and support the development and calibration of comprehensive community models. It could be a specific site or region where many of the sensor systems described above are already in place or could be installed as part of other programs.
Massively parallel computers, fast memory access, and large storage in an earthquake engineering simulation center could enable high-performance computations for large-scale modeling and simulation. Such a center could bring together earthquake engineering researchers with experts in algorithm development, computational and statistical methods, and high-end computational and cloud development methodologies to enable transformative advances in simulation. Such a center could include theory-based simulation and multi-scale, multi-component modeling, as well as data-intensive and inverse approaches and hybrids of these paradigms. An interactive visualization capability could be networked and distributed for comparing simulations and experimental data. Participants noted that an important requirement is the capability for regional simulations, including integrated visualization and interactive decision making. Such a simulation center could have 100 Gb/s network connectivity with the Earthquake Engineering Data Synthesis Center (below), and it could leverage high-performance computing services available through national networks. System and application development would be an essential part of such a service, to create the core simulation services and interfaces needed to support further advances in the earthquake engineering community.
An earthquake engineering data synthesis center, as envisioned by participants across multiple breakout groups, would offer the research community a large-scale database system for ingesting data sources from a variety of sensor types, including imaging, remote sensing, video, and information management systems (e.g., BIM, GIS). Such a center could support the execution of models over those data to provide curated reference data, inferred derived information, simulations of normal and disaster scenarios and of mitigation and response, and community services supporting data access and decision support. The center could adopt federation and harvesting of data as a significant mechanism, focus on integrated, derived, and curated data products, and offer advanced search and retrieval based on metadata and action queries on data. Such a rich data source could help researchers understand the response of complete infrastructure systems in a region at multiple scales through networking with sensor galaxies and all experimental and field facilities. The center could provide well-defined abstractions that would empower users to develop tools for data analysis and monitoring to support statistical and inferential discovery.
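The metadata-based search and retrieval described above can be illustrated, in miniature, as a filter over a catalog of sensor records. The sketch below is purely illustrative; the record schema, field names, and values are assumptions, not part of any proposed design.

```python
# Minimal sketch of metadata-based retrieval over heterogeneous sensor
# records, of the kind a data synthesis center might expose to users.
# The schema and field names here are hypothetical.

def query(records, **filters):
    """Return the records whose metadata match every keyword filter."""
    return [r for r in records
            if all(r.get(key) == value for key, value in filters.items())]

# Hypothetical catalog entries pointing at curated data products:
catalog = [
    {"sensor_type": "strain", "platform": "bridge",    "region": "coastal"},
    {"sensor_type": "video",  "platform": "satellite", "region": "urban"},
    {"sensor_type": "strain", "platform": "building",  "region": "urban"},
]

urban_strain = query(catalog, sensor_type="strain", region="urban")
```

A production system would of course layer indexing, federation, and access control over such queries; the sketch only shows the metadata-matching idea.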
Many workshop participants expressed a need for integrated continuous and multi-sensor (e.g., aerial, satellite, unmanned aerial vehicle) observations of communities at various scales (e.g., buildings, neighborhoods, regions, and countries) for characterizing the physical attributes of communities and monitoring the effects of earthquakes (e.g., damage assessment and recovery). These earth observation systems could offer optical as well as dimensional views (3-D using radar and LiDAR [light detection and ranging] sensors) of cities that would quantify attributes of cities including location, type, and density of buildings; location of critical lifeline systems; and natural attributes that could contribute to the vulnerability of an area (e.g., low-lying coastal areas subject to tsunami effects). Many of these networks and systems are already in place, and a number of participants noted that existing resources could be leveraged to accomplish the above objectives. To develop a holistic solution for quantifying the vulnerability and resilience of large cities, many participants stressed the importance of including a remote sensing component.
Participants noted that a rapid monitoring facility could provide the earthquake engineering community with a suite of highly portable sensing and data acquisition tools that could be rapidly deployed to structures, geo-facilities, and lifelines to monitor their stability after seismic events. Included in the deployable facility could be robotic systems that would be capable of sensor placement in partially collapsed structures and in lifeline systems with tight, difficult-to-reach locations. Sensor arrays deployed into critical infrastructure systems could provide a wealth of response data during aftershocks, providing valuable data for future modeling.
There is an emerging range of new, sustainable, highly resilient materials that offer opportunities to change the way infrastructure is designed and constructed. Many of these high-performance materials are being developed for the aerospace and mechanical industries and are not currently appropriate for adoption by the construction industry because of very high prices and limited availability. Participants noted that there is a significant opportunity to partner with materials science facilities to develop and test new construction-grade materials, which might be self-healing, capable of energy capture, or of ultra-high strength, and to understand the use of sustainable materials for earthquake engineering applications. Although existing materials facilities might be appropriate for some of this development, it is likely that augmented or new facilities would also be needed to test these materials under the conditions they are likely to experience when used in construction, accounting for the influence of aging and degradation.
Multiple networked geotechnical centrifuges, each including innovative capabilities for robotic manipulation and actuation within the centrifuge container during the experiment, could allow new types of experimental modeling of landslides (including submarine), liquefaction, and tsunamis. Unique hybrid simulations would be possible through networked facilities, thus enabling a more detailed assessment of interaction effects between structures and foundation systems and large-scale integrated geotechnical failures.
A large-scale, dynamic shaking table designed for soil-structure interaction (SSI) experiments, as envisioned by participants in the “design of infrastructure” group, would enable a significant throughput of SSI experiments to help advance knowledge of this crucial component of earthquake engineering. A large-scale testing system could facilitate study of the interaction between geotechnical conditions and both infrastructure components and building systems. Self-organizing wireless sensors, as well as other new types of sensing strategies specific to SSI, could enable high-resolution assessment of the progression of damage in SSI systems and the development of new strategies for more robust design of structures and infrastructure systems. Hybrid simulation could also provide the realistic, time-dependent loading on specimens that is important for accurate assessment of soil-structure interaction.
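The hybrid simulation technique invoked here couples a numerical model with a physical test: at each time step, the computer integrates the equations of motion while the restoring force is measured from a specimen driven by actuators. A minimal single-degree-of-freedom sketch of that loop follows, with the physical substructure stubbed by a linear spring standing in for the specimen and its actuator/load-cell loop; all names and parameter values are hypothetical.

```python
# Minimal sketch of a hybrid-simulation time-stepping loop (illustrative
# only). A single-DOF numerical substructure (mass + damping) is advanced
# with the central-difference method, while the restoring force comes from
# a "physical" substructure -- here stubbed by a linear elastic spring.

def physical_restoring_force(displacement, k_specimen=2.0e6):
    """Stub for the force (N) reported by the load cell after the actuator
    imposes `displacement` (m) on the test specimen."""
    return k_specimen * displacement

def hybrid_simulation(ground_accel, dt, m=1.0e4, c=8.0e3):
    """Advance  m*a + c*v + r(d) = -m*ag  with the central-difference
    method; the restoring force r(d) is measured rather than modeled."""
    n = len(ground_accel)
    d = [0.0] * (n + 1)                      # displacement history (m)
    for i in range(1, n):
        r = physical_restoring_force(d[i])   # "measure" the specimen force
        p = -m * ground_accel[i]             # effective seismic load
        # Central-difference update for the next displacement command:
        a_coef = m / dt**2 + c / (2 * dt)
        b = (p - r + (2 * m / dt**2) * d[i]
             - (m / dt**2 - c / (2 * dt)) * d[i - 1])
        d[i + 1] = b / a_coef
    return d

# Example with a short synthetic acceleration record (m/s^2):
record = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5] * 4
history = hybrid_simulation(record, dt=0.01)
```

In a real hybrid test, `physical_restoring_force` would command the actuators and read back the load cells in real time, which is why the quality-of-service network guarantees mentioned earlier in this chapter matter for geographically distributed substructures.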
A large-scale shake table facility capable of full-scale structural testing was viewed by a number of workshop participants as being important for addressing the Grand Challenges. They noted that there are significant knowledge gaps regarding the behavior and failure modes of structures that are damaged or partially collapsed. Testing complete structures or full-scale subsystems in multiple directions would allow improved understanding of the response of actual construction and the contributions of lateral and gravity load-resisting systems and non-structural systems. Such a facility could provide fundamental knowledge for understanding complete system behavior and for validating post-earthquake evaluation methods for damaged structures. This knowledge in turn could help determine which structures are safe to occupy and which need to be demolished. As envisioned, this facility would require multifaceted testing capabilities, including hybrid methods, with the capacity to test to collapse. Workshop participants discussed the need for a study of whether it is more effective to construct a new full-scale shaking table or to develop international partnerships, such as a partnership with E-Defense in Japan.
The tsunami wave simulator described by several workshop participants would be a revolutionary new facility combining a tsunami wave basin with the capability to shake the ground to simulate liquefaction and subsidence. Participants noted that fundamental knowledge about large-scale coupling between soil-structure and fluid interaction is lacking, and a combined tsunami and liquefaction wave tank could provide researchers with a better understanding of foundation weakening, scouring, and structural failure, which in turn would lead to improved protection for coastal communities. The wave basin would be at least on the order of 150 feet wide by 250 feet long, would have enhanced absorption boundary conditions, and would be capable of tsunami generation, propagation, and reproduction of local effects on coastal structures.
Many participants noted that to enable the development of more accurate structural models, a networked set of equipment that replicates the effects of corrosion, accelerated aging, and fatigue is needed for the characterization of subsystems, components, and materials. Such a facility could have the capability for multi-axial loading, high-temperature testing, and high pressures. It would need to be able to test full-sized or close-to-full-scale subsystems and components under fully realistic boundary and loading conditions, including rate effects to avoid issues with scaling, and would need to be supported by a comprehensive set of high-performance instrumentation. Such a facility could enable the development of high-fidelity physics-based models for incorporation into simulations of complete structures. It could also enable the characterization of the full lifetime performance and sustainability of structural elements and materials and allow the development of appropriate retrofit and strengthening techniques for existing aging infrastructure.
A significant proportion of the losses following an earthquake are the result of indirect damage to the contents of buildings, rather than damage to the structural frame. A number of participants noted that the requirements of the current seismic qualification codes cannot be fully met with existing facilities,1 highlighting the need for a high-performance multi-axis facility with the frequency range and levels of motion necessary to investigate and characterize the performance of non-structural elements (e.g., partitions) and other content (e.g., shelving, IT equipment, lighting, electrical and mechanical equipment) within a building or other infrastructure. Such a facility would need to deliver very high displacements, velocities, and accelerations so that it could simulate the behavior of floors at any point within a building; however, it may not need to have a very high payload capacity because most non-structural items within buildings are relatively light. Such a facility could permit the development of complete building models, including the building content, and also the development of more robust non-structural elements and equipment that would be significantly less likely to be damaged in an earthquake.
A mobile facility for in situ structural testing, as described by participants in the “design of infrastructure” group, could be equipped with a suite of highly portable testing equipment including shakers, actuators, sensors, and high-resolution data acquisition systems that could be used to test structures, lifelines, or geotechnical systems in place. Examples include modal shakers to introduce dynamic loads on structures, bridges, and soil systems. Additional capability could include large-capacity broadband dynamic seismic wave sources coupled with improved sensing capabilities to allow the high-resolution subsurface characterization essential for regional modeling. Hydraulic actuators capable of in situ lateral loading could provide an experimental capability of testing structures. Intentional and repeatable dynamic loading of buildings, bridges, and other structural systems could allow systems to be dynamically characterized for improved modeling capabilities. Dynamic excitation of geotechnical systems could improve understanding and the modeling of liquefiable soils.
1 For example, IEEE Standard 693-1997, which contains recommended practices for seismic design of substations, cannot be met without significant filtering of the low-frequency content of the signal (Takhirov et al., 2005).