Review of NASA's Distributed Active Archive Centers

APPENDIX B
Criteria for Review

The following criteria for review provide a checklist of issues for the DAACs and the NRC site visit panels. Not all of them are equally important to all the DAACs, but they are included to ensure that all relevant topics have been covered. Following the review criteria are some suggested measures of performance. These measures are examples of the type of evidence that the DAACs may choose to bring forward to show their success in meeting the review criteria.

Relationship to MTPE Mission

- Maintaining or enhancing position as a primary source of data and information for the global change research community
- Being ready for data streams from EOS and other flight missions
- Establishing and maintaining a working relationship with providers of data products, algorithms, and ancillary information
- Fitting within the MTPE strategic plan

Data/Holdings

- Providing access to data, data products, and information of high integrity and quality (including issuing data updates, notifying users of errors, and facilitating access to data from international sources)
- Fostering the quality of holdings (EOS and non-EOS data, produced by instrument teams or otherwise)
- Ensuring that data and data products are properly documented and that all appropriate ancillary information is available
- Ensuring that data and associated metadata are secure (e.g., data and associated metadata are not lost, destroyed, or insufficiently backed up)

Users

- Characterizing the user community (e.g., Who are they?) and setting priorities among different user community needs. Are the various user communities satisfied?
- Ensuring that holdings keep pace with research needs
- "Marketing" and advertising holdings
- Educating and reaching out to users about the application and significance of data and information holdings
- Fostering a sense of ownership among the users (e.g., promoting active participation by the users in DAAC activities)
- Cooperating with other agencies and international partners to ensure availability of science data to the widest possible user community

Technology and Facilities

- Keeping up with technological advances
- Using appropriate technology within the constraints of the EOSDIS requirements. Is it sufficiently modern? Is it cost-effective?
- Ability of the DAAC to choose the hardware and software technology it uses within the constraints of the EOSDIS requirements. Is there a plan for what equipment to acquire?
- Knowledge and use of appropriate national and international data standards

Management

- Responding to user feedback and emerging data and information needs
- Entraining, hiring, and retaining high-quality personnel
- Implementing effective mechanisms for setting and revising internal priorities (e.g., data handling, processing, backlog absorption)
- Promoting local innovation and initiative
- Ensuring that an adequate level of quality control is maintained in an operational setting
- Participating in an effective process for resolving different priorities among EOS units (federal and nonfederal)
- Interacting and cooperating with other DAACs to address common issues and avoid duplication of effort
- Developing adequate and practical working relationships among responsible agencies and international partners, where applicable
- Leveraging personnel, technology, scientific guidance, et cetera, from a host data center or other scientific institution, where applicable
- Developing and implementing a five-year plan for the DAAC (personnel, computers, etc.)
- Given the extent and quality of data holdings, satisfaction of the users, appropriateness of the technology and facilities, skill of the management, and so forth, is the DAAC cost-effective?

SUGGESTED MEASURES OF PERFORMANCE

The DAACs should have the primary responsibility for demonstrating that they meet the above review criteria. The following measures of performance are some suggestions that may be helpful:

- Statistics of processing user requests (e.g., turnaround time, track record in filling orders, error rates)
- Profiles of users by categories, such as which users dominate the total use; to what types of institutions do occasional users belong; are users seeking or providing information; are they instrument team scientists, other global change scientists, or the general public; are they domestic or foreign users; are they occasional or frequent users; etc.
- Number of publications involving use of data or information provided by the DAAC
- Short list of the most important scientific advances in which the DAAC has had a significant involvement
- Number of users and growth in number and diversity of data requests
- Spirit of initiative (e.g., DAAC innovations affecting global change research)
- Regular interaction with the scientific community (important for quality control, relevance of holdings, setting priorities, and keeping pace with new research needs)
- Leveraging of DAAC activities with resources from parent or other institutions
- Turnover of personnel
- Independence of the User Working Group, the degree to which it has input into the agenda for its meetings, and the degree to which management is responsive to the Working Group's recommendations