In formulating the lessons learned described in Chapter 3, the committee interviewed experts, heard presentations, and created an online questionnaire that was distributed to the broader community. The questionnaire drew responses from individuals in the United States and abroad with an interest in spatial data infrastructure, and proved to be a rich source of opinions from users, planners, and policy-makers. The responses are grouped below by issue.
Lessons Learned Questionnaire Responses
Standardization

Full acceptance by the organization of the need for rigid standardization of its data and information products to agreed international standards
Organizational commitment to internationally agreed metadata standards
[Successful organizations] work within the community to make/improve standards…
[Challenges] Let a thousand flowers bloom – In the past year we have witnessed a global convergence in thinking on how spatial data should be integrated. This is occurring in all technical fields as well as in the library sciences. There are scores (hundreds??) of projects moving forward, and we risk duplicating effort or, worse, creating divergences in standards, protocols, processes, and methods that will make later data integration much more difficult or effectively impossible.
[Challenges] The amount of data. Data from countless agencies and data formats, gathered with varying standards, documented with varying accuracy and amount. It can be difficult to get many people and agencies to agree on standards and work together unless some plan and mutual benefit is in place.
[Challenges] Technical - appropriate standards (metadata, vocabulary)
[Does not work] Implementation of OGC standards suffers from performance issues. There is a lack of leadership.
[Worked well] The development and adoption, though limited, of open standards for geospatial Web Services is a key capability that promotes interoperability. This, in turn, allows for neighboring or overlapping SDIs to work well together without special agreements or translators.
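The interoperability benefit described above comes from every conforming service exposing the same request interface. As an illustrative sketch (the endpoint URL and layer name here are hypothetical, not real services), a single helper can build a valid OGC WMS 1.3.0 GetMap request against any conforming server, with no per-provider translator:

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width, height, crs="EPSG:4326"):
    """Build an OGC WMS 1.3.0 GetMap request URL.

    Because the parameter names are fixed by the OGC standard, the same
    function works against any conforming server.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical endpoint and layer, for illustration only.
url = wms_getmap_url("https://example.gov/wms", "elevation",
                     (-110.0, 35.0, -109.0, 36.0), 512, 512)
```

This is the sense in which neighboring SDIs can interoperate "without special agreements or translators": only the endpoint and layer names differ between providers.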
Data

Scientists must make available data that underpin knowledge products
Federal data are created to some minimum achievable standard
Census using local roads data
Landsat 7…we’ve done the best we could. We need the continuity mission now.
Everything online [what has worked well] - We have seen a paradigm shift in thinking about data, and especially spatial data, in the last four years or so. Prior to this, data owners were generally unwilling to share their data for fear of the data being misused, of losing control over them, of being scooped by others using them, or of others getting credit for them. In the past few years, however, there has been widespread recognition of the value of making one's data more widely available for others to use. This coincides generally with the release of Google Earth and the rapidly growing expectation that everything, including scientific data, should be readily available online at no cost.
[what has not worked well] Mandated uniformity – Everyone has invested vast resources in their databases and spatial data infrastructure. So, when discussion of data integration came up, the fear was that we would all be forced to convert what works for us into some format (and operating system, and server configuration) that would be imposed. Many data providers have custom systems and applications that would be prohibitively expensive to re-do. Plus, with cyberinfrastructure in a constant state of change, how could we adopt a system that would not be obsolete before it was implemented?
[Do differently] Put some good people into cataloging existing reports and data sets. Build better metadata tools. Make management accountable for publishing geospatial data from all projects. Make all projects identify spatial data results, plan for them, and publish them before a project is considered complete.
[Challenges] Geography has spent the last decade trying to justify its existence, rather than meeting customer needs. Many Geography programs have become largely irrelevant, with some very notable exceptions, all of which are long-term commitments of resources focused on data content, such as NED, NHD, NLCD.
[Challenges] The major challenge is to establish an NSDI organization that is viewed by authors as a robust clearinghouse of their spatial datasets. Authors should be glad to submit their data and should be delighted that others will have easy access, instead of having to handle ‘data requests’ every time someone wants the data. The current culture views the NSDI as an unfunded mandate, with a lot of hassles to submit data and very little benefit in return. Trying to establish a new organization - or revamping the existing one - is always extremely difficult, and the establishment of this one is even more difficult because of the lack of basic understanding of its true value.
[What has not worked?] US Topo is a solution looking for a problem. The focus should be more on content, rather than packaging. GeoPDF may satisfy a certain niche, but without excellent content, it serves little purpose.
[has worked] Several national seamless datasets have been very successful, including NED, NHD, NLCD, and NHDPlus. These are providing very useful data that is nationally consistent, well organized, and easy to access.
[not worked] The WRD NSDI node is just a tabular list of 646 datasets - some datasets are listed by theme, such as ag, aquifers, etc.; a lot are listed with obscure names, such as darea, diffus, etc.; some are listed by OFR #, by SIR #, and by WRIR #. How in the world can anyone find what their [sic] after? We need a better way of assigning searchable ‘key words’ to the datasets and tools that can search and retrieve datasets that meet a specified query.
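The searchable-keyword capability this respondent asks for can be illustrated with even a very small inverted index. A minimal sketch (the dataset names and keyword tags below are invented for illustration, echoing the obscure names the respondent mentions):

```python
from collections import defaultdict

def build_index(catalog):
    """Map each keyword to the set of dataset names tagged with it."""
    index = defaultdict(set)
    for name, keywords in catalog.items():
        for kw in keywords:
            index[kw.lower()].add(name)
    return index

def search(index, query):
    """Return datasets matching every term in the query (AND semantics)."""
    terms = [t.lower() for t in query.split()]
    if not terms:
        return set()
    results = index.get(terms[0], set()).copy()
    for t in terms[1:]:
        results &= index.get(t, set())
    return results

# Invented catalog entries standing in for obscurely named datasets.
catalog = {
    "darea":  ["drainage", "area", "hydrology"],
    "diffus": ["diffusion", "groundwater"],
    "aquif1": ["aquifers", "groundwater", "hydrology"],
}
index = build_index(catalog)
```

Once datasets carry keyword tags, a query such as `search(index, "groundwater")` retrieves them regardless of how cryptic their file names are, which is exactly the gap the respondent identifies.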
[not worked] Main sticking points are access to updated, high-quality satellite-derived imagery, and access to sufficient field-based observations of vegetation (i.e., we need 500,000 current georeferenced samples – with sufficient vegetation composition and structure documented - maintained and accessible) to support map development and accuracy assessment.
[Challenges] technical challenges come mainly from a lack of certain critical data sets required to develop robust spatial models. We work across local/regional/national/ continental scales, so access to data that are standardized across these scales presents the greatest challenge.
[Domain] Our work is centered on biological data, including the characterization and assessment of ecological systems and habitats for species of concern. But in order to address this domain successfully, we rely on a wide range of non-biological data inputs, such as imagery (of varying types and resolutions), digital elevation, synthesized climate data (past, current, future), surficial geology, soils, surface drainages, wetland location, hydrography, land use, land ownership, and land use policy.
[worked well] The understanding that data needs to be consistent and of known quality so that decisions can be more easily made on how the data can/should be used.
[didn’t work] We are severely lacking in field-based observation data for about 10,000 plant and animal species in the U.S. that are of conservation concern. With these observations as a foundation, habitat models can be developed to apply to many forms of environmental decision-making processes.
[not worked well] Data content standards and schemas have not been widely adopted for key base data sets. This makes the exchange and co-development of data, at least in the US, more difficult.
Metadata

[what works] State and local grants for data creation and metadata training.
[what did not work well] Difficult metadata standards – It is such a challenge to generate FGDC-compliant metadata that many individuals, programs, and agencies do not even try, and instead have their own internal systems. The head of a geoscience division in a large federal agency told us that, yes, the USGS standards are nice, but they could not invest the time and resources to meet them, so they developed their own in-house way of doing things, because they had to get things done. The chief scientist for one of the world’s largest multinational oil companies described to me how they were scrapping their fourth internal attempt to create a company-wide database, after spending millions on it. Before they could make significant progress, various offices and branches had gone off in other directions because they could not wait, and they had their own needs to address.
[What has not worked?] It has been too hard to develop metadata [for water data], and the process is usually left to the end of a project and then not done. Management bears much of the blame for this, as they have not enforced the requirement to publish metadata, even though it has been required by executive order since 1994. Although much of our work is supported by geospatial data, much of the data supporting that work is never published because of this.
[What has worked?] Publishing metadata through the NSDI node, when it is done, does work fairly well. I can find datasets I published in the past easier with Google than I can find them on my hard drive or backup media.
[not worked well] Outside the Federal government: metadata collection and dissemination are often non-existent.
Distribution, Serving

[what works] The National Map accepting state data and using it on … servers
Having the National Map portal at EDC is very useful
Work with Google [private industry in general] to make it even more useful and friendly. Market expansion for geospatial data could be exponential if the tool people use to access the data is easy enough to use and does enough analysis. (all we need is overlay analysis and we are in the GIS business online)
[What has worked well] Market driven solutions – the pervasive adoption of free online visualization tools for spatial data (e.g., Google Earth, Google Maps, Bing, ArcGIS Explorer, etc) by even the smallest retailer and organization has made it the norm to share (and promote) your data online. Also, data providers find these free tools of tremendous value, so that when more data are made public, more tools and applications become available to use them.
[Do differently] Make the USGS SDI more powerful by giving it better search and data discovery mechanisms. Requires support from the top, a budget, and a dedicated team - not just USGS but from all agencies. Standards need to be defined. Robust software tools need to be developed to create standard metadata, and to provide the ability to search all NSDI nodes. This really requires a lot of coordination from all agencies to enforce the standards so ‘searches’ have the potential to retrieve all data that meets a query.
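One of the "robust software tools" this response calls for is a metadata generator that lowers the cost of producing standard records. As a hedged sketch only: the element names below are a simplified, hypothetical subset, not the actual FGDC CSDGM schema, but they show how a record template can be filled programmatically rather than by hand:

```python
import xml.etree.ElementTree as ET

def make_metadata_record(title, abstract, keywords, origin):
    """Build a simplified, illustrative metadata record as XML.

    NOTE: element names here are a hypothetical subset for
    demonstration, not the real FGDC CSDGM structure.
    """
    meta = ET.Element("metadata")
    ident = ET.SubElement(meta, "identification")
    ET.SubElement(ident, "title").text = title
    ET.SubElement(ident, "abstract").text = abstract
    ET.SubElement(ident, "origin").text = origin
    kws = ET.SubElement(ident, "keywords")
    for kw in keywords:
        ET.SubElement(kws, "keyword").text = kw
    return ET.tostring(meta, encoding="unicode")

record = make_metadata_record(
    "Example drainage-area dataset",  # hypothetical dataset title
    "Drainage areas derived from NHD flowlines.",
    ["hydrology", "drainage", "NHD"],
    "U.S. Geological Survey",
)
```

Records produced this way are uniform by construction, which is what makes the cross-node searches the respondent envisions feasible.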
[What has not worked?] The National Map is [ineffective]. Viewer 2 is better, but it still lacks compelling content. Much of the current content is not much better than we had in the early 1990’s from TIGER/Line and 100K DLGs.
[worked well] New map services are providing data access in new ways. These include NWIS web services, NWIS Mapper, real-time earthquake maps, and StreamStats. StreamStats provides analytical services rather than just raw data, and is a good example of how far the web service model can be taken.
We were able to work successfully with USGS to access global climate, digital elevation, lithology, and other data sets for critical new advancements in classification and mapping of terrestrial ecosystems. Much new work has been advanced in the U.S., Latin America, Canada, and Africa, due to the accessibility of key data sets from USGS.
[Worked well] The idea of centralizing data so that it can be easily accessed.
[Did not work well] Up until the National Map, we had access to localized lists of available spatial data from both the USGS and other agencies. You had to read through title upon title to find things, and getting a grip on what was available for a given geographic region was difficult. You would have to visit each agency where you thought data might be available, and even then the data might not be documented very well.
One-stop portals have not materialized and efforts such as the National Map have failed to reach their potential.
It has worked OK for me. I use several sources for data and those sources are reliable and well documented. However, certain USGS efforts have not reached a level of usefulness (e.g., the National Map).
[Worked well] The increased use of and migration to database technology for storing spatial data.
Tools

[Challenges] Technical - appropriate standards (metadata, vocabulary) and tools (gazetteers, spatial and keyword search) are still lacking. Existing systems need to lessen reliance on proprietary software. Tools to integrate diverse data types need development.
[worked well] Cutting edge development of tiled map services by Google and others
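The tiled map services praised above rest on a simple, fixed addressing scheme: the Web Mercator "slippy map" convention, in which a longitude/latitude pair and a zoom level map deterministically to a tile address. A minimal sketch of that standard conversion:

```python
import math

def latlon_to_tile(lat, lon, zoom):
    """Convert WGS84 lat/lon to slippy-map tile indices (x, y) at a zoom level.

    Standard Web Mercator tiling: at zoom z the world is a 2^z x 2^z grid
    of tiles. Every conforming client and server computes the same
    addresses, which is what makes tile caches shareable across tools.
    """
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.log(math.tan(lat_rad) + 1.0 / math.cos(lat_rad)) / math.pi) / 2.0 * n)
    return x, y
```

Because the tile grid is identical everywhere, pre-rendered tiles can be cached and served to any client, which is a large part of why these services scale so well.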
Public Relations

Under promise and over deliver
The more successful organizations in building SDIs are the ones that have a long history of collaborating with other organizations and ha[ve] a culture which is focused on making data and information available to the broader community.
State liaisons. Relationships in the field cannot be beat…should be some of the most intelligent and motivated folks…part of their performance evaluation (not sure if this is possible) should come from the people they serve in the states.
Partnership maintenance, state and local venues…local professional organizations…representation on state and local geospatial decision making bodies
[Challenges] High expectations - Increasingly, scientists as well as decision makers, business, and the public not only want, but expect, all data to be instantly available online at no cost, and fully interoperable. Such systems are standard on a number of popular network television crime shows, where all data of any kind sought are brought to the desktop instantly and fully integrated, with no need to convert, process, or interpret them.
[Do differently] Demonstrate the results and benefits earlier – Until recently, we have been talking among ourselves primarily and not to the users of the infrastructure. Audiences glaze over instantly with the mention of data exchange standards and semantic ontologies. So, we are starting to showcase what the system will look like and deliver to the average user. A demonstration of the Geoscience Information Network (GIN – http://usgin.org) to the Arizona Legislature in November 2009 was hugely successful, not only in showing decision-makers the potential but to many of our stakeholders and participants who are still somewhat fuzzy about how this will all work and what it will do.
[do Differently] Make the USGS SDI more powerful by giving it better search and data discovery mechanisms. - Requires support from the top, a budget, and a dedicated team - not just USGS but from all agencies. Standards need to be defined. Robust software tools need to be developed to create standard metadata, and to provide the ability to search all NSDI nodes. This really requires a lot of coordination from all agencies to enforce the standards so ‘searches’ have the potential to retrieve all data that meets a query.
Planning

Develop a roadmap that encompasses the business, information, technology, computation and engineering viewpoints, and consistently review and update as required
Well-developed business case that articulates to the organization what the value proposition of SDIs is
Successful projects are done incrementally…low hanging fruit
Successful projects initially focus only on those projects that are staffed by fully committed people.
[do differently] Appropriate data management starts at the planning phase and proceeds through data collection, processing and use. Tools must be provided that reduce the burden to individual projects/users throughout this process - and that ultimately provide them access to more data than would be otherwise available (or easily discovered/accessed).
[Do differently] 1) Promote data lifecycle management objectives and outcomes as performance indicators for federal agencies, 2) create government centers of excellence for highest priority data sets and require cross agency funding mechanisms for collection and maintenance, 3) promote standards-based, optimized, geospatial data service hosting for federal agencies to increase capacity and uptake.
Organization

FGDC or some other entity has not been given adequate authority to carry out the mission it was put in place to do. If it is meant to be successful, it needs to be placed somewhere other than USGS…like OMB.
[What has not worked] Central or concentrated control (e.g., Data Czars) – In the early days of the Web, researchers started creating centralized databases for each domain or sub-domain. These required scouring the archives and literature for analog (“legacy”) data, digitizing them, and building an ongoing capability to update and maintain the repository. Very soon, data providers could be barraged by multiple database owners for copies of their data and constant demands for the latest updates. No one had the time or resources to repeatedly feed the demands of external bodies for their data. As the number, size, and diversity of databases grew rapidly, the communities wrestled with how to share and integrate data from disparate sources. Proposals to ‘coordinate’ data integration or oversee standards were viewed skeptically or with hostility by many as creating the potential for ‘data czars’ to impose their will on the rest of the community. This concern was one of the biggest stumbling blocks to getting community consensus in building cyberinfrastructure for the earth sciences in the past decade.
[Challenges] This new organization is not just [about the] USGS, but all stakeholders from all agencies. Since the current organization is disjointed, it almost appears the past approach was to allow agencies to do whatever they wanted, and the ‘best practice’ would float to the top becoming the de facto standard. But the reality is, nothing floated to the top and it is still disjointed.
Security needs and concerns also challenge most government programs
[worked well] Recognition of benefits of web services
[Challenges] The major challenges are primarily organizational, confounded by financial challenges. USGS has not had consistent leadership with the goals of leveraging our geospatial data and the enterprise licenses. The majority of geospatial issues in the Department of the Interior (DOI) and USGS are the result of too little attention to the fundamentals of data standards and data applications across the spectrum of spatial data services in USGS. There is a partitioning of data collection among themes and funding of these themes, as well as partitioning of support services for geospatial data collections and the research scientists requiring GIS support to use our enterprise license. A very small part of the GIO is able to see the big picture, and the result is that GIS application support has fallen through the cracks of constant reorganization.
[Challenges] The most significant challenges to success of SDI are: 1) Clarity of responsibility and government-wide recognition of the stewardship responsibilities, 2) clear governance with regard to collaborative development and stewardship within and beyond the federal government, 3) greater leverage of public and private data resources and value-add capabilities, 4) lack of wide adoption of Web Services infrastructure.
Organizational Commitment

Executive-level support as well as commitment from senior, middle, and junior levels of staff
Champion who is knowledgeable and respected by the community

Full understanding of the impacts of the introduction of what is essentially a disruptive technology
Collaboration in sharing of data, agreement of standards etc is critical to the development of an SDI
Scientists must make available data that underpin knowledge products
… properly articulated policies can be an enabler for SDIs….needed at the organizational …whole government level.
[Challenges] Sustainability – Hundreds of millions of dollars have been spent on myriad projects that, while individually successful, have not led to the creation of an integrated or sustainable spatial data infrastructure. Hundreds of stove-piped projects have been funded, but too many disappeared when they could not get funding renewals. Or the technology has changed and the results are in obsolete formats or buried on a hard drive somewhere. NSF is requiring new informatics projects to address the question of sustainability, but, having recently reviewed a large stack of proposals on an external panel, I can say the community practitioners are not even close to dealing with this problem realistically or satisfactorily.
[Do differently] Once the new infrastructure is in place, all projects should be required to budget time and money to prepare and submit all spatial data - as intended.
[What has not worked?] The NSDI was initially established in 1994 and was intended to be a repository of all spatial data referenced in reports/publications. I’m not sure how many spatial datasets have been referenced in Water Resources Division (WRD) publications from 1994 to the present, but I would estimate well over 10,000. Keep in mind, GIS started to become mainstream in WRD in the mid-80s. WRD currently has only 646 datasets in the WRD NSDI (http://water.usgs.gov/cgi-bin/lookup/getgislist), so as you see, there is a huge problem getting authors to participate, and I’m glad to see this finally getting addressed. In the authors’ defense, the reasons listed below are why they did not participate.
[worked well] Not much has worked well; no support; standards not well defined; very little guidance; very little incentive; software tools to create consistent metadata lacking; datasets are almost considered a burden, especially large ones; search mechanisms of data in NSDIs lacking.
[Challenges] Cultural - Incentives, if not mandates, need to be provided and a culture needs to be developed that recognizes data management and provision as part of the public trust responsibility of federal and state agencies. This culture will not arise because of theoretical benefits, it will develop when real benefits accrue to users through a) facilitation of data access and use and b) when systems provide relief from burdens of data and metadata development and management.
[do differently] Make a real commitment to Enterprise GIS and geospatial data management, development, and integration. Current support is nominal and based on the minimum support required to fulfill requirements of enterprise GIS licensing agreements.
[do differently] The USGS geospatial programs are primarily outward looking, and driven by what they feel is public demand. This does very little to support USGS science. USGS management needs to define a geospatial science commitment and plan.
Personnel

Tertiary-trained professionals who understand the technology and are respected
Important to accept the high level of technical skills required to develop an SDI. …people become overnight experts…can annihilate a project very quickly.
[do differently] Requirements for, and funding for, comprehensive data management within a shared infrastructure should be explicitly required in funding requests and performance evaluation.
USGS lacks staff that are as skilled as those in the private sector. The USGS is very salary-burdened and as such has limited funds to go to outside vendors who could develop infrastructure.
I think it is important to make sure that USGS researchers have a clear stake in the development and maintenance of world-class databases. In line with one of the recommendations of the NRC report (Finding the Forest in the Trees: The Challenge of Combining Diverse Environmental Data, Committee for a Pilot Study on Database Interfaces, National Research Council, 1995), I think that USGS has to find a way to enable researchers to receive RGE “credit” for ongoing involvement in the development and maintenance of databases. Leaving database development/management to IT people or masters-level scientists will inhibit the researcher-driven experimentation, brainstorming, and interdisciplinary mindset needed for the creation and ongoing development of a database that serves [an] ambitious science agenda.
Resources

Adequate funding but not over funding
Avoid big projects with big funding that promise to deliver everything to everyone
SDIs that have provided economic revenue work…easy to get additional funding. Is economic revenue the only benefit that will work?
It works when funding is applied from the fed – state level supplement long term partnerships between fed and state. And it doesn’t take many $$…
Work with Google [private industry in general] to make it even more useful and friendly
Uncoordinated federal/state/local geospatial budgets and expenditures do not work
Funding geospatial data programmatically rather than strategically does not work
The USGS has not been adequately funded to carry out its mission of civil domestic mapping over the U.S.
What I would do first and immediately is figure out what the SDI is worth in the U.S. and to whom it is worth what? Once you know what everyone does with it, where the gaps are and put a $$ value on closing those gaps you could begin creating the necessary partnerships both programmatically and fiscally to complete a sturdy and useful SDI. We worry so much about the sexy technologies that we forget people just need this stuff to get their jobs done. Those who have worked beside me for years have heard this before. We need to understand the econometrics of our SDI to be able to spread the cost and responsibility in a useful and meaningful way. Maybe we need to get economists and intergovernmental programmatic folks together to monetize the SDI.
[what has not worked well] Non-sustainable business models - Early on, NSF and other agencies funded the creation and population of databases but after a few years it became clear that NSF did not have the mission or the resources to maintain this infrastructure permanently. Data bases shut down for lack of funding. Resources disappeared and people moved on to other projects. Even today, many funding proposals describe their sustainability plans as simply returning to the original funding agency and asking for more money.
[Do differently] Integrate with other domains – To say we would do things differently may be misleading. The problem has been finding resources to do all the things we know need to be done, including integrating our work with that being done in other domains.
[Do differently] Once the new infrastructure is in place, all projects should be required to budget time and money to prepare and submit all spatial data - as intended.
[Worked well] Spatial data infrastructures (SDI) have worked well at the federal level, and have mostly worked well at the state level. With funding problems, SDI has faltered somewhat at the state level, and for the same reason, many counties and other local jurisdictions have had mixed results varying from robust SDIs to non-existent SDIs.
[Not worked well] Outside of the federal government: unfunded mandates for SDI tend to be ignored.
[Challenges] The major challenge is financial: support for SDI requires additional personnel, along with changes in technology and cultural behaviors. Many academic and non-governmental organizations (as well as a number of governmental entities below the federal level) will not undertake participating in SDI unless financial support is available, since it would take time away from existing activities.
[did not work] Broad and generic mandates or reliance on “good will” to drive participation in development of community information resources.
[Challenges] Financial - Data management, provision, and integration are the infrastructure for both science and management applications. The resources to build this infrastructure are lacking.
Coordination

It doesn’t work when there is competition within the state to be the single point of contact, i.e., a state GIS coordinating council. Helping the states get coordinated is a very useful activity for USGS, through their liaisons and field offices. (Suggest NSGIC for these activities - they live and die by coordination and cooperation.)
[does not work] States that are not coordinated and do not have a state-level geospatial coordinating body. There must be an entity that can speak with authority on funding issues for geospatial data at the state level; otherwise fed-state partnerships are very difficult to put together. The state entity must be recognized by state agencies and the executive and legislative branches of government, along with the local governments.
[Challenges] Agency cultural, data, fiscal you name it….silos. I was a fed and a state person for a long time. I know first hand how difficult it is to do intra and inter-agency coordination of anything, let alone intergovernmental cross coordination. But it is critical to the success of an SDI. If geospatial funds and programs were (pipe dream here) coordinated (not consolidated) across the fed level - by OMB – the only people with a big stick in the fed govt. - just the slosh factor of $$ being expended on geospatial activities at the fed level could fund coordination activities at the state level.
I always did think that if we took the lines of business (or whatever the current lingo is at the fed level) not just across the bureaus and down through the fed agencies but on down to the state and local level there would be a logical pathway of responsibilities. In those pathways there is a common need for the same kind of data, geospatial data and practices. How hard would it then be to monetize the value of the necessary data and applications to get the job done at every level it needs doing??
Something like the old a-16 process.[OMB Circular A-16 Coordination of Geographic Information and Related Spatial Data Activities Revised 2002]
[Challenges] Community adoption and buy-in – The geoscience community has been wary of cyberinfrastructure (including spatial data infrastructure) due to concerns over control of and access to data, recognition of data ownership, costs of converting data and systems, mandates, and how decisions are made. Every domain is dealing with similar issues, and coming up with generally similar approaches. Yet, we are all still mostly working within our community stovepipes. We have much to learn from each other and much we can share so we don’t have to duplicate or relearn what others have done. The library sciences in particular are making dramatic strides in aggregating, archiving, and disseminating digital data in a multitude of formats. We have not made the connections yet with them.
Even within the geosciences community, we are only part way there. Our network is based on geological surveys with only a few example external partners. The NSF-funded National Geoinformatics System (NGS) project to evaluate community needs and wishes has been dormant for more than a year. Could they be watching to see how GIN (and NGDS) develop and serve as core elements of an NGS? We also need to nurture preliminary linkages with the biological, oceanographic, atmospheric, and geographic communities, as well as the computer sciences.
[don’t do well] We also need much greater coordination and dialogue across this community to minimize wasted effort and maximize accomplishment of shared goals.
[Challenges] …the most critical challenge stems from the inadequate dialogue and coordination among developers and users of these critical data. This is a combination of policy (e.g., stovepiped federal agencies), cultural (basically, a ‘stovepiped’ mindset), and financial issues (we’re all scrambling for resources).
[Do differently] The FGDC, USGS, and other bodies need to be better supported, more open in membership (i.e., to science NGOs), and empowered to support more robust dialogue, clarify shared goals, and facilitate sharing of financial resources.
[what worked] Development of systems/processes that engage the “user community” in defining requirements and reflect the technical capabilities available to the users. And, in response, focusing on provision of tools that facilitate use of existing systems (FGDC/GOS) by reflecting the particular search, discovery, and access needs of the users. Working with a specific but broad user community (coastal and marine researchers and managers) to develop tools that facilitate integration of data and model output using open source standards in response to identified needs.
|Vision||[Do differently?] Could we have gotten here earlier? – The debates at workshops, forums, professional meeting sessions, and in the corridors over the past decade were part of a process of exploring and testing ideas in a fast-changing technical and social environment. It is only in hindsight that we see where we were heading. But I doubt that, had we presented our current model to ourselves 10 or even 5 years ago, we would have been ready to embrace it. There has been an evolution in thinking that was crucial to developing current models. Based on conversations with colleagues in other fields, and in tracking the literature superficially, it appears that the solid earth geosciences are just a bit ahead of other communities in coming to our present realization and acting effectively on it.
a. Interoperable – data should be seamlessly delivered to desktops regardless of the originating database software, version, operating system, or server.
b. Open-source – data and services need to be compliant with open standards such as those from OGC and ISO. This will help avoid the problem of data that cannot be accessed in obsolete or proprietary software.
c. Distributed – data providers should provide their latest available data directly into the network. They decide what is made public and when. They do not have to continually pass along their revisions to a growing number of data aggregators or central databases.
The data network then looks more like the Web – each provider is responsible for what they want to share. There will be a continuing need for archive and orphan data repositories for data that do not have permanent homes, and for data scavenged from historical and analog sources. But even these central databases will be another layer of distributed nodes in the network.
d. Web-based (SOA) – services and applications are increasingly being served on the Web rather than run on the desktop. This allows the work to be handled by resources far larger than a standard desktop and greatly diminishes bandwidth requirements. Referencing an online resource also means you are using the latest version, as is everyone else.
e. Flexible, dynamic, organic, modular – the system has to be open to users to choose what tools and applications they want to use and to allow them to develop and implement their own applications. Just as there is not only one Web browser, there should not be components beyond the most fundamental standards and protocols that are mandated to users. Technology is moving too fast to be locked into restrictions that will limit the system and ultimately make it obsolete. A modular approach allows anyone with a better idea to link into the network and make their service available. It also means the network developers don’t have to build everything. They can choose among the best work done by others in order to quickly assemble a functioning system, while leaving open the potential for alternatives to be networked.
|f. User-friendly – The first Web sites had to be tediously programmed in HTML, but user-friendly commercial software and ubiquitous free applications now allow everyone to easily and quickly build Web sites, including specialized sites like blogs.
The early stages of the spatial data infrastructure will require fairly sophisticated developers, but the emphasis should be on off-the-shelf cookbooks and guides, and eventually smart applications that almost anyone can use to provide data or services to the network.
g. Community of practice – The changes brought about by the widespread use of digital data delivered via the Web require that we develop new communities of practice: in how we qualify the vast amounts of data that we might otherwise use indiscriminately, and in how we recognize and reward those who provide data and services in data networks.
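The distributed, modular network described in items c and e can be sketched in a few lines: each provider keeps its own data and decides what to expose, while a lightweight registry records endpoints without ever copying the data. This is an illustrative sketch only; the class names, endpoints, and provider names are invented for the example and do not reflect any actual SDI implementation.

```python
# Minimal sketch of a distributed, modular data network (items c and e above).
# All names (Provider, ServiceRegistry, the example URLs) are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Provider:
    """A data provider node: it holds its own data and decides what to expose."""
    name: str
    endpoint: str                              # URL where this provider serves data
    public_layers: set = field(default_factory=set)

    def publish(self, layer: str):
        # The provider, not a central aggregator, controls what is public and when.
        self.public_layers.add(layer)

class ServiceRegistry:
    """A lightweight catalog: it records endpoints, it never copies the data."""
    def __init__(self):
        self.providers = []

    def register(self, provider: Provider):
        self.providers.append(provider)

    def discover(self, layer: str):
        # Return the endpoint of every provider currently offering this layer.
        return [p.endpoint for p in self.providers if layer in p.public_layers]

registry = ServiceRegistry()
azgs = Provider("AZGS", "https://example.org/azgs/wfs")
usgs = Provider("USGS", "https://example.org/usgs/wfs")
registry.register(azgs)
registry.register(usgs)

azgs.publish("geologic_units")
usgs.publish("geologic_units")
usgs.publish("stream_gauges")

endpoints = registry.discover("geologic_units")
```

Because discovery returns live endpoints rather than copies, a provider's revisions are visible to the whole network the moment they are published, which is the point of item c.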
Improved search engines should make it easier to find everything. A web service should index everything we have, allowing users to subscribe to any content desired. The system should be distributed, and should aggregate data sets from Science Centers. The Science Centers would go through a streamlined process to document and publish their data sets, and to set access levels, e.g., local use only, USGS only, or public dissemination. From that point on, content should be automatically harvested and pushed out to the appropriate user groups. The content could be live services, or could be extracted to a local geodatabase, and this could be maintained and updated automatically. Most of the pieces of such a system exist and could be implemented today.
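The publish-and-harvest workflow this respondent describes reduces to a simple pattern: a center publishes a record with an access level, and harvesters for different audiences see only the records that level permits. The sketch below illustrates the idea; the access levels, record fields, and center names are assumptions made for the example, not an actual USGS schema.

```python
# Illustrative sketch of the publish/harvest workflow with access levels.
# ACCESS_LEVELS and the Catalog interface are hypothetical, not a real API.

ACCESS_LEVELS = ("local", "usgs", "public")

class Catalog:
    def __init__(self):
        self.records = []

    def publish(self, center, title, access):
        """A Science Center documents a data set and sets its access level."""
        if access not in ACCESS_LEVELS:
            raise ValueError(f"unknown access level: {access}")
        self.records.append({"center": center, "title": title, "access": access})

    def harvest(self, audience):
        # A public harvester sees only public records; a USGS-internal harvester
        # sees usgs + public; a local audience sees everything.
        visible = {"public": {"public"},
                   "usgs":   {"usgs", "public"},
                   "local":  set(ACCESS_LEVELS)}[audience]
        return [r["title"] for r in self.records if r["access"] in visible]

catalog = Catalog()
catalog.publish("Flagstaff",  "Landslide inventory", "public")
catalog.publish("Menlo Park", "Draft fault traces",  "usgs")
catalog.publish("Denver",     "Working files",       "local")

public_view = catalog.harvest("public")
usgs_view = catalog.harvest("usgs")
```

Once records carry an access level, the "automatically harvested and pushed out" step is just a scheduled call to `harvest` for each subscriber group.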
[Do differently] Put some good people on cataloging existing reports and data sets. Build better metadata tools. Hold management accountable for publishing geospatial data from all projects. Make all projects identify spatial data results, plan for them, and publish them before a project is considered complete.
[Vision] Very simply, my vision for SDI is that it should enable the scientific community to freely access and exchange spatial data with sufficient metadata to allow an interchange.
To look at an image of the U.S./globe, zoom in on an area, and get a listing of ALL the available data for that patch of land. Then be able to view detailed documentation on what the data are and how they should be used, and then be able to download a single geodatabase of that information for the patch of land I am interested in.
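At its core, the "list ALL the available data for that patch of land" vision is a spatial query: find every dataset whose extent intersects a user-chosen bounding box. The toy sketch below shows that query; the dataset names and extents are invented for illustration.

```python
# Toy illustration of listing all datasets covering a chosen patch of land.
# The dataset catalog and its extents are invented for this example.

def intersects(a, b):
    """True if two axis-aligned boxes (min_lon, min_lat, max_lon, max_lat) overlap."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

datasets = {
    "Geologic map of Arizona":  (-114.8, 31.3, -109.0, 37.0),
    "California stream gauges": (-124.4, 32.5, -114.1, 42.0),
    "Gulf Coast wetlands":      ( -97.0, 25.0,  -88.0, 31.0),
}

def list_available(bbox):
    # Everything on the patch of land, regardless of which provider holds it.
    return sorted(name for name, extent in datasets.items()
                  if intersects(extent, bbox))

# A patch around Tucson, Arizona:
hits = list_available((-111.5, 31.9, -110.5, 32.6))
```

A real SDI would run this intersection test against harvested metadata records rather than a hard-coded dictionary, and would hand back service endpoints and documentation links alongside each name.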
USGS has a critical role to play in facilitating dialogue among the federal agency, academic, non-government science, and state agency sectors to clarify shared goals, data standards, and data sharing technology. Success in this area will allow us to collectively maximize the utility of our investments in spatial data.
|SDI should benefit data collectors and users from planning (evaluation of existing data) through collection (standards and requirements), metadata development, archiving, search and discovery, and integration. The system will then be seen not as “overhead” on research activities, but rather as a way to facilitate research, ensure data preservation, and enhance and expand the application and integration of information resources. Performance will be evaluated not simply on the “availability” of data, but on success in enhancing data application to meet diverse research and application needs.
An integrated system that provides readily available information from local to national scales. A one-stop integrated portal would be a nice start. Also, the SDI should have a set of tools and interfaces that permit the integration of data … e.g., downscaled climate data and models.
Promotion and development of fast, reliable Web services that provide discovery of and access to geospatial data. The users will figure out the rest. Better use of, and support for, users of Enterprise GIS tools.
An NSDI that supports the USGS Science Strategy would include relevant base and thematic data that are refreshed at an appropriate rate and yet are maintained as time-accessible snapshots to allow change and context evaluation. The SDI would provide a geographic framework for the publication of most scientific data of the USGS, allowing for easy visual analysis of geographic and temporal phenomena.