The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.
CHAPTER 7

Testing Process

The Workforce Toolkit prototype has undergone two phases of testing.

7.1 Alpha Test

An alpha test of the Toolkit was conducted with the project panel in April 2008. A demonstration of the Toolkit was given via web conferencing, and panel members were encouraged to access the application via a web link and provide comments. The research team also used the April 2008 AASHTO Subcommittee on Human Resources conference in St. Louis, Missouri, as an opportunity to gather input and conduct early testing of the prototype Toolkit. The project and the prototype Toolkit were introduced at a plenary session of the conference, and session participants provided input on the prototype.

As a result of input from this testing, the research team recognized that the Toolkit needed to be structured to incorporate a broader set of resources, including the following:

- Descriptions of current practices that have proved successful (e.g., agencies that have established new compensation policies to address recruiting of engineers in a tight market). In response to this input, the research team collected a sample set of practices for inclusion in the resource database.
- State DOT policies and procedures for topics related to workforce management. Most state DOTs have a policies and procedures manual for employees; they typically also have documents specifically related to recruitment, compensation and benefits, succession planning, and other topics. To address this concern, the research team collected a sample set of policies and procedures for inclusion in the resource database.
- Organizations whose primary purpose is to support workforce management topics. Most provide access to a wealth of resources related to workforce topic areas of concern to state DOTs. In response to this suggestion, additional organization resources were added to the existing sample set in the database.
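The three resource types above could, for illustration, share a single record format in the resource database. The sketch below is a minimal assumption about such a schema; the class name, field names, and sample entries are hypothetical and do not reflect the Toolkit's actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class Resource:
    """One entry in a hypothetical Toolkit resource database."""
    title: str
    resource_type: str  # assumed categories: "practice", "policy", or "organization"
    description: str
    topics: list = field(default_factory=list)  # e.g., "recruitment", "compensation"
    source_state: str = ""  # contributing state DOT, if any (illustrative field)

# Invented sample entries modeled on the examples in the text.
sample = [
    Resource("Engineer recruitment compensation policy", "practice",
             "New compensation policy addressing recruiting of engineers "
             "in a tight market", ["recruitment", "compensation"]),
    Resource("Succession planning procedures", "policy",
             "State DOT procedures related to succession planning",
             ["succession planning"]),
]

# A shared schema lets all three resource types be filtered by topic:
recruiting = [r for r in sample if "recruitment" in r.topics]
```

A uniform record like this would let the practice descriptions, policy documents, and organization links collected during alpha testing all be searched through the same topic filters.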
7.2 Beta Test

The second phase of testing, conducted in October 2008, involved a comprehensive review of the prototype by 26 individuals. Fifteen reviewers were members of the project panel. Of the remaining 11 individuals, 4 were human resources professionals, 4 were volunteers from the 2008 AASHTO Subcommittee on Human Resources conference held in St. Louis, Missouri, and 3 were transportation professionals from state DOTs suggested by panel members from their organizations. Testers were given 2 weeks to respond and were asked to enter their input in a survey instrument available on the web.

7.2.1 Test Design

Directions for the beta testers included an outline of the scope of the project, including the intended audience. Testers were informed that they were previewing a prototype application that demonstrates the functionality needed for a Workforce Toolkit, and that the application had not undergone the design and testing process required of a robust software application. Additionally, they were notified of situations, such as the Geographic View, where real data was not yet available. They were also informed that AASHTO will host the application after the project is completed and will work with the Subcommittee on Human Resources to determine the level of data completeness.

Testers were provided the link to the Toolkit and asked to walk through the application with the aid of the User Guide available from the site. Specific questions included the following:

- Is this web site useful for your needs? Please explain why or why not.
- Which view will be most helpful to you? Least helpful?
- Do you find this web site easy to use? If not, please explain why.
- Do you recommend any changes to the design or layout of this site (please specify)?
- Are the overviews, descriptions (short and long), and resource description types helpful? If not, please explain why.
- Do you find the resources listed for the selected category useful? If not, please explain why.

Testers were also asked to notify the research team if they encountered any problems while using the application (e.g., the application stops functioning or freezes, the results shown do not match the selected category).

7.2.2 Test Results

All six respondents who used the survey instrument replied that the Toolkit was useful. Comments included observations that it was easy to use, well organized, and contained relevant information.
The majority of testers identified the Top Ten DOT Needs as the most helpful view, with one respondent indicating that it distills the most relevant and necessary information. The Faceted Search view and the Functional View were also popular. Text Search and the FAQ were each mentioned as helpful by one tester. Two respondents felt the Geographic View was least helpful, which may relate to the fact that the view does not yet contain accurate data. State Practices, Faceted Search, FAQ, and Forum were each identified as least useful by at least one respondent; one tester mentioned that the Forum view and the Video view would be "a nice bonus." The fact that one tester found a view most helpful while another identified the same view as least helpful indicates the value of offering a variety of search approaches.

Five testers replied that the descriptions were helpful and well written. The sixth respondent did not reply to this question.

Beta testers offered additional feedback on the Toolkit through the survey instrument, by email, and by phone. The research team categorized this feedback by type (programming, editorial, content, or question) and determined which actions (address within the project, include as a recommendation for future versions of the site, or address in the final report) were appropriate within the scope of the project.
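The triage described above, sorting each feedback item by type and assigning it a disposition, can be sketched as a simple tally. The type and action labels come from the report text; the sample items and the counting logic are illustrative assumptions, not the team's actual records.

```python
from collections import Counter

# Categories named in the report.
FEEDBACK_TYPES = {"programming", "editorial", "content", "question"}
ACTIONS = {"address", "future recommendation", "final report"}

# Invented sample items for illustration: (type, assigned action).
feedback = [
    ("programming", "address"),            # e.g., a view freezes
    ("editorial", "address"),              # e.g., a typo in a description
    ("content", "future recommendation"),  # e.g., request for more practices
    ("question", "final report"),
]

# Validate labels, then tally how much work lands in each bucket.
for ftype, action in feedback:
    assert ftype in FEEDBACK_TYPES, f"unknown type: {ftype}"
    assert action in ACTIONS, f"unknown action: {action}"

by_action = Counter(action for _, action in feedback)
print(by_action["address"])  # items the team would fix within the project
```

Tallying by assigned action rather than by type makes the scoping decision explicit: only the "address" bucket represents work committed within the current project.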