Finally, he felt that the roadmap does not accurately capture the status and future of semantic technologies: they are already in widespread production at NASA, and further advancements may revolutionize how science is done.
Arcot Rajasekar (University of North Carolina at Chapel Hill) gave the second presentation, which focused on integrating data life cycles with mission life cycles. He discussed the challenges of providing end-to-end capability for exascale data orchestration. He noted that NASA holds massive amounts of mission data and that there is a need to share all of these data over long timeframes without loss. He suggested an integrated data and metadata system so that the data remain useful for future users, but observed that there is currently no coherent technology in the roadmap to meet these needs. He believes that the current roadmap showcases the need for data-intensive capability at various levels but provides limited guidance on how to push and pull this technology along. He said the information processing roadmap is very impressive but needs a corresponding “evolutionary” data orchestration roadmap. In his view, game-changing challenges for NASA include policy-oriented data life-cycle management (manage the policies, and let the policy engine manage the bytes and files); agnostic data visualization technologies; service-oriented data operations; and distributed cloud storage and computing. However, he thinks the greatest challenge for NASA is a comprehensive data management system (as opposed to a stove-piped approach for each mission) and noted that the technology already exists; it just needs to be applied. It would be a paradigm shift to move toward an exascale data system that is data-oriented, policy-oriented, and outcome-oriented (i.e., a system that captures behavior in terms of data outcomes).
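The idea of “manage the policies, and let the policy engine manage the bytes and files” can be illustrated with a minimal sketch: administrators state declarative life-cycle rules, and an engine applies them to each data object. All names, rules, and thresholds below are invented for illustration and do not represent any actual NASA or production system.

```python
# Hypothetical sketch of policy-oriented data life-cycle management.
# Humans write the policies; the engine decides what happens to the bytes.
from dataclasses import dataclass


@dataclass
class DataObject:
    name: str
    age_days: int       # time since ingest (illustrative attribute)
    replicas: int       # current replica count
    tier: str = "disk"  # current storage tier


def apply_policies(obj: DataObject) -> DataObject:
    """Apply declarative life-cycle rules to one data object."""
    # Policy 1: every object must have at least two replicas.
    if obj.replicas < 2:
        obj.replicas = 2   # a real engine would trigger replication here
    # Policy 2: objects older than a year migrate to tape.
    if obj.age_days > 365 and obj.tier != "tape":
        obj.tier = "tape"  # a real engine would trigger migration here
    return obj


fresh = apply_policies(DataObject("mission_l1b.h5", age_days=10, replicas=1))
old = apply_policies(DataObject("archive_2001.h5", age_days=4000, replicas=2))
print(fresh.replicas, fresh.tier)  # 2 disk
print(old.replicas, old.tier)      # 2 tape
```

The point of the pattern is that adding or changing a rule in `apply_policies` changes the behavior of the whole archive, without anyone hand-managing individual files.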
Neal Hurlburt (Lockheed Martin) gave the third and final presentation on data systems. He started with a summary of a recent study that found a need for community oversight of emerging, integrated data systems. He believes that the top challenges for NASA include the insufficient interoperability of current data services; the cost of future data systems being dominated by software development rather than computing and storage; uncoordinated development and an unpredictable support life cycle for infrastructure and data analysis tools; and the need for a more coordinated approach to data systems software. However, he thinks that NASA can exploit emerging technologies for most of its needs in this area without investing in development. He believes that NASA’s role should be to develop infrastructure for virtual observatories, establish reference architectures and standards, encourage semantic technologies that integrate with the astronomy and geophysics communities, and provide support for integrated data analysis tools. He sees the widespread use of consistent metadata and semantic annotation as near a tipping point.
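Consistent metadata annotation, the “tipping point” mentioned above, can be sketched very simply: independently developed data services interoperate if they agree on a shared minimal vocabulary and can check records against it. The vocabulary terms and example values below are invented for illustration, not drawn from any actual standard.

```python
# Illustrative sketch of consistent metadata annotation: services that
# share a required vocabulary can validate and exchange each other's
# records. Terms and field values here are hypothetical.

REQUIRED_TERMS = {"instrument", "start_time", "target", "units"}


def validate_metadata(record: dict) -> set:
    """Return the set of required vocabulary terms missing from a record."""
    return REQUIRED_TERMS - record.keys()


record = {
    "instrument": "AIA",                       # hypothetical example values
    "start_time": "2011-02-15T00:00:00Z",
    "target": "Sun",
}
print(sorted(validate_metadata(record)))  # ['units']
```

A data provider could run such a check at ingest time, so that every published record is guaranteed to carry the terms downstream analysis tools depend on.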
Public Comment Session and General Discussion
At the end of the workshop there was some time set aside for general discussion and to hear comments from the audience. This session was moderated by Carl Wunsch (MIT).
Discussion started along the lines of NASA’s role in information technology and processing technology development. It was noted that many of these topics are not unique to NASA and that significant efforts are under way elsewhere, for instance in industry and at commercial companies. Some expressed the view that NASA is more of a beneficiary than a key player in this technology development.
It was noted that a key difference between NASA and commercial endeavors is NASA’s focus on minimizing risk, particularly with regard to flight systems. There was some agreement that much of what is commercially available is not compact enough, reliable enough, or low-power enough to fly in space. Radiation-hardened computing hardware and CCDs were cited as an example: industry is far ahead technologically, but its current products can no longer be used in space, and space technology lags so far behind in those areas that the old components can no longer be purchased in the marketplace. It was suggested that NASA needs to team with DOD, which has deeper pockets and similar objectives. Someone also warned that if NASA declines to develop something on the assumption that commercial interests will do it, when commercial interests will only do so if there is an economic payoff, then NASA and science risk being at the mercy of the market.
The example of radiation-hardened electronics led to some further detailed discussion. It was noted that a lot of computing is now being done with radiation-hardened FPGAs and ASICs, which are readily available. It was mentioned that FPGAs are harder to validate and that every ASIC manufacturer has its own set of simulators, compilers, and related tools. It was suggested that fault tolerance could be approached in a different way. One proposed technology challenge was to develop radiation-hardened designs using current integrated-circuit technology that do not require specialized fabrication facilities.