
The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.


since significance decision making is subjective, this tool would not be useful.

• An MS Access database. Most of the states already have databases in place. States were not really interested in making changes to their existing databases or creating a new one.

• A Common Electronic Format. The states were very positive about this tool. All noted that it would be useful to have electronic access to historic contexts within the states and from other states.

• ESRI's Geography Network. The states had no comments on this option, as they were not familiar with the network.

IT FOCUS GROUP MEETING

The IT focus group meeting was held at the offices of the National Academies in Washington, D.C., on March 26, 2003. The IT specialists and users attending the meeting included the following:

• Mr. John Byrne, National Register Database Manager, National Park Service;
• Dr. Charles Hall, State Terrestrial Archaeologist, Maryland Historical Trust (SHPO);
• Dr. Elizabeth Hobbs, GIS Technical Lead, Minnesota Department of Transportation;
• Mr. Eric Ingbar, Director of Research, Gnomon, Inc.; and
• Ms. Fennelle Miller, King County DOT--Road Services, King County, Washington.

Kevin Neimond of ESRI briefly attended the meeting to present information on the Geography Network.

The following sections summarize the group's discussion of each of the four tools. The group stressed that regardless of which IT options are advanced to the next phase of the study, it is important to always consider the true life-cycle costs of implementing any IT system. It is also critical to consider the elements used by agencies and consultants to evaluate resources and to build prototypes that include these elements. If spatial data are used, for example, then spatial data need to be included and accessible through these tools.

A Historic Significance Attribute Table

When the members of the focus group first saw the name of this tool and its description, they had an immediate negative reaction. The tool was seen as a mechanical, inflexible, quantifiable method for making what is basically a subjective decision. However, after the URS team described this tool in more detail, the focus group members realized that this was not the case. This tool is not really a simple table for organizing information to calculate a "significance score." It is more of a decision-making application that would potentially contain several tables, each one a tool to make explicit the process of deciding what is and is not eligible for listing in the National Register. The tool is a format for capturing the subjective decisions on resource significance. The focus group recommended referring to this tool simply as a "decision aid," a "preliminary screening tool," or a "tool to organize your argument."

The benefit of this tool is that it creates focused arguments on eligibility that in turn facilitate discussions and consultations among agencies. This tool may also result in the creation of a "dynamic historic context," focusing on the attributes and elements that make a resource significant. It is "dynamic" in that, as the tool is used, it can build on previous significance evaluations and decisions that have been captured in an electronic format. The attributes and elements in the tool can be updated and modified based on these earlier decisions on significance.

The focus group noted that the table should capture the attributes that make a resource significant (i.e., the elements that link a resource to the theme, geographic area, and time period within which the resource is evaluated). Also, placing an X in the boxes within the table is not enough. There should be a mechanism for attaching support documentation to the table, such as photos, reports, and maps. There should also be an attachment for narratives that describe how the resource was inventoried and researched.

In terms of the table's structure, the selection of one component in the table would dictate which other components were relevant to the evaluation of the resource in question. As the group discussed the variations in table attributes that would be required to handle all the different resource types, it became obvious that this "table" was really a set of interrelated tables with a user-friendly interface that guided the user through the evaluation process. This prototype would be an application, not just a set of tables. The application would also produce a report that could include a National Register eligibility concurrence signature line.

To develop this prototype, the focus group recommended that existing contexts be used to create the tables, using, for example, National Register forms as a start. The developed tables should then be tested by a sample of state DOTs and/or SHPOs using selected resource categories, such as archaeological site, historic structure, or historic district. To determine which attributes (drawn from historic contexts) should be used to populate the tables, someone from the participating state agencies should interview fellow agency staff to identify these attributes.

The success of the prototype would be measured by the time saved in using it. To determine the time saved, one would first measure the time it takes for DOTs and SHPOs, as well as any consulting firms they have hired, to produce and review evaluation reports without using the prototype. One would then compare this time with the time it takes for these same organizations to produce and review evaluation reports using the prototype.
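The record structure the focus group describes can be sketched in a few lines of code. This is only an illustration, not the actual prototype: all class, field, and resource names below are hypothetical. It shows the key design point the group raised — an evaluation captures the elements linking a resource to a theme, geographic area, and time period, plus attached documentation and a narrative, rather than computing a numeric "significance score," and it can emit a report with a concurrence signature line.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Attachment:
    """Supporting documentation: photos, reports, maps (hypothetical)."""
    kind: str   # e.g., "photo", "report", "map"
    path: str

@dataclass
class SignificanceEvaluation:
    """One record in a hypothetical decision-aid application.

    Captures the argument for significance rather than a score:
    the historic-context elements that link the resource to a
    theme, geographic area, and time period.
    """
    resource_id: str
    theme: str
    geographic_area: str
    time_period: str
    narrative: str  # how the resource was inventoried and researched
    attachments: List[Attachment] = field(default_factory=list)

    def eligibility_report(self) -> str:
        """Produce report text ending with a concurrence signature line."""
        lines = [
            f"Resource: {self.resource_id}",
            f"Context: {self.theme} / {self.geographic_area} / {self.time_period}",
            f"Narrative: {self.narrative}",
            f"Attachments: {len(self.attachments)}",
            "National Register eligibility concurrence: ______________",
        ]
        return "\n".join(lines)

# Invented example data, purely for illustration.
ev = SignificanceEvaluation(
    resource_id="SITE-001",
    theme="Early transportation",
    geographic_area="Example County",
    time_period="1880-1910",
    narrative="Pedestrian survey plus archival research.",
    attachments=[Attachment("photo", "site001.jpg")],
)
print(ev.eligibility_report())
```

In a fuller sketch, each of theme, geographic area, and time period would be a foreign key into its own lookup table, which is how the "set of interrelated tables" the group envisioned could constrain which components are relevant once one is selected.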
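The evaluation method in the last paragraph amounts to a simple before-and-after comparison of report production and review times. A minimal sketch, with the hours invented for illustration only:

```python
def time_saved(baseline_hours: float, prototype_hours: float) -> tuple:
    """Return (absolute hours saved, percentage saved) for one
    organization's report production and review cycle."""
    saved = baseline_hours - prototype_hours
    pct = 100.0 * saved / baseline_hours
    return saved, pct

# Hypothetical figures: 40 h to produce and review an evaluation
# report without the prototype vs. 25 h with it.
saved, pct = time_saved(40.0, 25.0)
print(f"Saved {saved:.0f} hours ({pct:.1f}%)")
```

In practice the comparison would be run separately for each DOT, SHPO, and consulting firm in the test sample, since baseline times vary by organization and resource category.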