
Managing State Transportation Research Programs (2019)

Chapter: Chapter 1 - Literature Review

Suggested Citation: "Chapter 1 - Literature Review." National Academies of Sciences, Engineering, and Medicine. 2019. Managing State Transportation Research Programs. Washington, DC: The National Academies Press. doi: 10.17226/25436.


1.1 General Observations

This Synthesis seeks to rise above one-off differences in the structures and processes of state DOT research programs and to comment more holistically on how patterns of differences relate to program quality and value. In this context, the literature revealed significant gaps in what is systematically known about state DOT research programs and confirmed that few tools, resources, or guidance are available to state DOTs on how to manage research programs effectively.

The study also found that the literature frequently contains anecdotal information about the differences between state DOT research programs, usually statements or comments by participants in peer exchanges or open-ended responses to surveys. Such anecdotes are helpful for identifying issues that may warrant more detailed investigation, potentially through the survey in this synthesis project. This study recognizes that information cited from peer exchanges is based on participant statements and may not be substantiated. When available, the study identifies quantifiable information such as "X out of Y research programs" or "XX% of research programs."

Some subjects may not require survey questions because data may already be available through previous research or survey efforts. Exceptions are cases where the existing research is either dated or has too small a sample size. The Synthesis accordingly attempts to identify prevalence, that is, to quantify distributions to the extent possible by identifying information such as "the number of programs that do X or have Y feature" through a combination of existing data and information gathered through the survey.

With these observations in mind, the following sections discuss specific findings from the literature in the areas of program capability, program management, program quality, and program value.
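Prevalence counts of the kind just described ("X out of Y research programs," "XX% of research programs") can be tabulated directly from coded survey responses. The following is a minimal sketch; the response data and the field names (`state`, `has_rsp`) are entirely hypothetical and are not drawn from the actual survey instrument:

```python
# Sketch: tallying "X out of Y (XX%)" prevalence from coded survey responses.
# All records and field names below are hypothetical, for illustration only.

responses = [
    {"state": "A", "has_rsp": True},
    {"state": "B", "has_rsp": False},
    {"state": "C", "has_rsp": True},
    {"state": "D", "has_rsp": False},
]

def prevalence(records, feature):
    """Return a prevalence statement for a boolean survey feature."""
    total = len(records)
    count = sum(1 for r in records if r.get(feature))
    return f"{count} out of {total} research programs ({count / total:.0%})"

print(prevalence(responses, "has_rsp"))
# → 2 out of 4 research programs (50%)
```

The same tally generalizes to any yes/no feature gathered by a survey, which is how a synthesis can move from anecdote to distribution.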
1.2 Program Capability

Program capability is the ability of a state DOT to undertake research, in terms of its goals, knowledge, expertise, resources, organizational structure, and external relationships. This section examines program capability through research direction, research expertise, and research funding.

1.2.1 Research Direction

Executive Leadership

DOT executive leaders possess both an "inside" and an "outside" view of their DOTs by virtue of their interactions with many different stakeholders, including other DOT executives, practitioners, legislators, and so on. These leaders are thus well placed to help set research priorities, communicate findings, and translate them into practice. Based on the existing literature, executive engagement appears to be an important enabling factor for research programs. However, there is variation in the degree to which transportation agency leaders are systematically engaged with, or aware of outcomes from, the research program.

One factor that obscures or limits executive engagement is where the research program is placed in the overall DOT organizational structure. If several organizational layers (i.e., direct reports) separate senior leaders from the manager of the research program, the activities of the research program may not be readily visible. In the absence of other measures or processes to engage leadership, this lack of visibility can prevent a research program from informing decision making, translating research into practice, gaining access to resources, and maintaining overall accountability.

One important way to engage senior leaders, ensure that the research program is visible, and bring in both inside and outside perspectives is through the development and adoption of a research strategic plan (RSP). This type of plan provides focus to the research program's broader initiatives for enhancing both the scope and visibility of its research. The process of developing a strategic plan (i.e., the planning itself) is often as valuable as the plan. The literature suggests that RSPs are not used universally, though many programs say that they recognize the importance of an RSP and will be developing one. In a 2014 survey of 22 states conducted by MnDOT, half of the states said that they did not have a research strategic plan or had suspended updates to their plan (MnDOT 2014).
The survey showed that RSPs generally fall into one of the following categories: setting a research agenda, research program execution, research program process improvement with performance measures, and research program process improvement without performance measures. Existing RSPs tend to focus on broader agency issues such as marketing, partnerships, process improvement, research management, and implementation. Less prominent elements include training initiatives for staff, technological improvements, data management, and products or services. A variation of the RSP approach is to include research in a DOT-wide strategic plan, with emphasis on research capability and program components. To understand how DOT executive leaders engage the research program, and how their engagement relates to organizational structure, to alternative mechanisms such as RSPs, or to both, the survey includes several questions on research direction.

Research Advisory Committees

Some research programs have formed research advisory committees (RACs) or similar advisory groups to inform the strategic direction of research at their agencies. RACs often oversee needs assessment and topic or needs prioritization. They tend to be less involved with the day-to-day execution of research but are often engaged again at a later stage, as research nears completion, to review results or outcomes. This visibility enables them to provide direction and a feedback loop. Figure 5 describes Wyoming DOT's RAC structure as an example of a RAC providing research direction (WYDOT 2014). Some agencies have more than one RAC or technical steering committee, one for each major area of research. In general, such advisory groups allow senior leaders, technical specialists, and external partners to provide research direction and ensure alignment with agency goals as well as with broader community trends.
At the national level, AASHTO's RAC supports the AASHTO Special Committee on Research and Innovation (R&I) by promoting research quality and the application of research to state transportation systems. Specifically, RAC members promote the value of research, research implementation, technology transfer, and peer exchanges.

Center for Accelerating Innovation

Innovation initiatives play a major role in setting a research program's direction. FHWA established the Center for Accelerating Innovation (CAI) in 2012 to help identify and prioritize innovations. The CAI works with the FHWA Office of Innovation Program Delivery (OIPD) to understand the state of the practice and the innovation implementation goals of various states and partners. The CAI sponsors three main programs: Every Day Counts (EDC), the Accelerated Innovation Deployment (AID) Demonstration program, and State Transportation Innovation Councils (STICs). These programs are funded through the FAST Act to help promote innovation.

The EDC program serves as an "On-Ramp to Innovation" by identifying target EDC innovations, best practices, and data to help develop, launch, and administer strategic innovation deployment. The program identifies underutilized but proven innovations every 2 years to "shorten the project delivery process, enhance roadway safety, reduce traffic congestion, and improve environmental sustainability" (FHWA 2018). Performance goals and technical team support are also established for implementation and adoption of each innovation. Since EDC began, each state has used at least 14 of the 43 innovations promoted through the program, as of June 2018 (FHWA 2018).

To encourage EDC innovations, FHWA created the AID Demonstration program as a funding incentive under the Technology and Innovation Deployment Program to accelerate the implementation and adoption of proven highway innovations. Approximately $10 million is expected to be made available for each fiscal year from 2016 through 2020 under the FAST Act. Successful projects are required to document the process, benefits, lessons learned, and methods to support rapid adoption of the innovation as standard practice after the project is completed.
Figure 5. Example of how a research advisory committee structure provides research direction (Source: WYDOT 2014):

Wyoming's Research Advisory Committee reviews, evaluates, and prioritizes all Research, Development & Technology Transfer (RD&TT) proposals and problem statements. The Committee's evaluation determines whether a proposal has sufficient merit to warrant further study. If warranted and funding is available, the Committee submits the research project for executive staff approval. Wyoming's RAC is led by the WYDOT Research Manager as a permanent, non-voting RAC chairperson. The RAC consists of WYDOT members (other than a non-voting FHWA representative); members include a State Bridge Engineer, State Field Operations Engineer, Chief Engineering Geologist, State Highway Development Engineer, State Highway Safety Engineer, Lands Management Engineer/Program Manager, State Materials Engineer, State Planning Engineer, GIS/ITS Program Manager, State Traffic Engineer, and two engineers from each district. Additionally, the State Program Manager and an FHWA representative serve as permanent, non-voting members. During a RAC meeting on October 8, 2014, the RAC discussed project scoring sheets and how to insert outcome and output measures into the process. Members felt that implementation and technical transfer, economic development, goals, and implementation needed to be added. These decisions change which types of research projects are funded and the direction of the overall program in supporting them.

The CAI also sponsors STICs, each a cross section of transportation stakeholders from government, industry, and/or academia that can help "comprehensively and strategically consider all sources of innovation" to enable rapid technology transfer and accelerated deployment of innovation (FHWA 2017). STICs provide a key external perspective for state DOTs on which effective innovations are currently being used in other contexts and work to translate them into wider practice, which helps set the research direction for state DOTs.

STICs were initiated in 2009 as part of the EDC program, under which nominations of effective, innovative, and underutilized technologies and processes are sought. STICs are funded by the FAST Act, which was signed into law in 2015. The FHWA provides annual funding of up to $100,000 and technical assistance for STICs as part of the STIC Incentive Program. As of April 2016, STICs were in place in all 50 states, the District of Columbia, Puerto Rico, the U.S. Virgin Islands, and the jurisdictions associated with the Office of Federal Lands Highway (Harman 2017).

Most DOTs expressed interest in creating a better culture of innovation by enhancing engagement with STICs (Peabody 2017). However, there is a lack of consistency in how STICs are organized from state to state, that is, in who participates on a STIC and how STICs operate (Harman 2017). A representative of the state's FHWA division or the state DOT typically serves as chair, and membership may include representatives from metropolitan planning organizations (MPOs), local transit and tollway authorities, public works departments, trade associations, construction companies, consultants, and other federal agencies. The state DOT's chief engineer or executive officer is, in many cases, actively engaged, which is important for creating a research culture of innovation that improves on the status quo. Past literature has not compared the types of stakeholders comprising each STIC nationwide. Additionally, research programs vary in their participation in STICs; these differences have not been collectively studied. STICs allow DOTs to share knowledge in order to innovate more rapidly, and research staff members help strengthen the linkages between research and the implementation of innovations. The Massachusetts and Utah STICs are two representative examples of how STICs provide leadership on innovation initiatives:

• The Massachusetts STIC makes research implementation decisions based on the recommendations of MassDOT's Review, Evaluate, Accelerate, Deploy, and Innovation (READi) Committee.
The STIC subsequently assembles a deployment team to develop an implementation plan for each of these initiatives. At quarterly STIC meetings, the deployment team reports on progress made. Some innovations that the Massachusetts STIC has implemented include "design-build contracting, e-construction technologies, smarter work zones, and ultra high-performance concrete connections for prefabricated bridge components" (FHWA 2017). The Massachusetts STIC plays an active role in setting the research direction of MassDOT by providing a rapid technology transfer forum and accelerating deployment.
• Similarly, the Utah Department of Transportation (UDOT) recently revamped its STIC, giving the research division a more active role in coordinating innovation activities. The research division is also responsible for broader communication within UDOT for national committees and an Innovation Working Group to promote and champion innovative ideas, which complements the restructuring of the STIC to harmonize these responsibilities. The UDOT STIC also focuses on securing executive leadership, identifying subject matter experts to lead various innovation initiatives, leveraging research advances through participation in the TRB Annual Meeting and EDC Summit, and "planning in advance of grant and incentive submission deadlines" (FHWA 2017).

More research on STICs is needed to understand which stakeholders comprise different STICs, the level of involvement state research programs have within STICs, and what role STICs play in research implementation.

Universities

Universities provide a distinct set of skills, knowledge, and expertise. Because university principal investigators often have access to specialized labs, equipment, facilities, and knowledge sets, they are in a position to assist DOTs with a wide range of research topics along the spectrum from basic to applied.
Furthermore, many public universities develop close collaborative relationships with DOTs over time, allowing them not only to build on past research relationships but also to exercise thought leadership and knowledge building through university transportation centers (UTCs), jointly sponsored conferences, and so on.

The Office of the Assistant Secretary for Research and Technology (OST-R) of the U.S. DOT is charged with advancing the deployment of crosscutting transportation technologies. OST-R includes the Bureau of Transportation Statistics; the Intelligent Transportation Systems Joint Program Office; the Office of Research, Development and Technology; Positioning, Navigation and Timing and Spectrum Management; the Transportation Safety Institute (TSI); and the Volpe National Transportation Systems Center. The Office of Research, Development and Technology administers the University Transportation Centers (UTC) program, among others.

UTCs are classified into national, regional, and Tier 1 UTCs for purposes of grants awarded through the FAST Act. These centers consist of consortia of universities formed to pursue specific transportation research topics. The Council of University Transportation Centers (CUTC) was established in 1979 to provide a forum for universities and UTCs to interact with government and industry. CUTC works to promote "university research, education, workforce development, and technology transfer as essential to the nation's transportation system" (CUTC 2018).

University faculty and research staff may also participate in DOT research activities through advisory roles on RACs, STICs, conferences, research workshops, and in other venues. Through these activities, universities help set the research direction for state DOTs. For example, research workshops gather members of a state DOT, FHWA, other government agencies, universities, and private organizations to review, evaluate, and prioritize research needs.
The Utah Transportation Research Advisory Committee (UTRAC) hosts an annual research workshop at which these members come together to identify and prioritize research topics affecting the DOT. The New England Transportation Consortium Pooled Fund [TPF-5(373)] is another example in which universities help select and develop research projects on a voluntary basis, without a guarantee of research awards, providing a valuable external perspective on the state of surface transportation research (NETC 2002).

Private Sector

The private sector is also a valuable partner to state DOTs and can provide guidance or knowledge on industry trends, implementation challenges, feasibility, cost structures, and other areas. Like universities, the private sector can participate in a myriad of DOT-related research activities such as RACs, STICs, conferences, and research workshops. These activities allow strategic guidance and involvement by the private sector. STICs, for example, can introduce innovative technologies and processes discovered in the private sector into the public realm. Private companies can also shed light on the impacts of certain innovations on their industries and the implications for the state's economy.

1.2.2 Research Expertise

Research expertise is critical to the capabilities of a research program, whether sourced internally within the program, outside the research program but within the DOT, or external to the agency. This section addresses research expertise through research program skill sets, workforce recruiting and training, and external procurement of expertise.

Research Program Skill Sets

Many state DOTs struggle with workforce issues related to succession planning, workload management, and recruiting staff from the right disciplines. Research programs are no exception and require a different skill orientation compared with other DOT positions, often making it more difficult to recruit for these positions. State DOTs' research programs depend on a reliable labor pool with three main types of skill sets:

1. Fundamental research background and/or technical expertise needed to conduct research and/or ensure its technical quality, with strong scientific rigor and inquiry;
2. Support functions needed to manage or support the research program, such as contracting, legal, accounting, and other administrative services; and
3. Management and communication expertise to foster collaboration among different research stakeholders and successfully administer research contracts.

DOTs can either staff internally or access expertise through contractual and procurement arrangements. For example, in some states research is conducted within the research program, and these states rely on technical specialists who are DOT employees. In others, research is conducted by other divisions within the DOT, and technical specialists may be employed by the DOT but not by the research program. Many types of research are also contracted out to consultants or academic institutions. Obtaining technical research expertise is, therefore, an important need for research programs across the board, even though programs access that expertise in different ways.

State agencies have different perspectives on which skill sets are most needed in their research project managers. This difference may coincide with the types of research that different agencies undertake. Louisiana's project managers are all engineers, whereas Nevada emphasizes "behavioral competencies" over technical degrees (WSDOT 2017). Studies have not evaluated the correlation between a research program's method of building a research team, a project manager's qualifications, and the type of research conducted.
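If a future survey captured both a program's staffing model and its project managers' backgrounds, the unstudied correlation noted above could be examined with a simple cross-tabulation. The sketch below uses invented records and category labels ("in-house", "contracted", "engineering", "behavioral") purely to illustrate the mechanics; none of it comes from actual survey data:

```python
from collections import Counter

# Hypothetical survey records: (staffing model, project-manager background).
# Category labels are invented for illustration only.
records = [
    ("in-house", "engineering"),
    ("in-house", "engineering"),
    ("contracted", "engineering"),
    ("contracted", "behavioral"),
    ("mixed", "behavioral"),
]

# Cross-tabulate: number of programs in each (model, background) cell.
crosstab = Counter(records)

for (model, background), n in sorted(crosstab.items()):
    print(f"{model:>10} x {background:<12} {n}")
```

A table of cell counts like this is the minimal first step; with enough responses, it would show whether in-house programs really do lean toward engineering-trained project managers.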
Caltrans (2011) conducted a peer exchange to solicit ideas on the characteristics and skill sets of individuals who are successful at accelerating the adoption of innovation at state DOT research programs. Agencies at this peer exchange stated that successful technology transfer champions are those who:

• Have strong marketing and communication skills
• Are able to plan and run effective, efficient meetings
• Are good brokers of information and resources
• Are strong negotiators
• Have persistence, passion, and drive
• Have people skills
• Understand the technical aspects of a project but can also create and implement a successful marketing plan
• Serve as a conduit between technology experts and all others, including stakeholders within the organization, potential adopters of innovation, and the public
• Are able to recognize gatekeepers and what drives them to accept or reject change
• Are trustworthy and credible, with strong personal working relationships
• Are empowered to work across organizational lines and hold a position that offers access to many different levels of the organization
• Are comfortable working within chaos and have public relations skills
• Are able to think outside the box (understand that there is more than one way to get from A to B)

To gauge these characteristics, an organization can apply a DISC analysis, as suggested by the Minnesota Department of Transportation at this peer exchange. DISC (dominant, inspiring, supportive, and cautious) is a quadrant behavioral model used to examine the behavior of individuals in their environment and is one example of a personality test that can be employed to understand staff personalities. The model has two dimensions: level of extraversion/introversion and the degree of being task focused or people focused (Marston 1928). Another personality assessment is the Five Factor Model (commonly known as the "Big Five"), which focuses on Extraversion, Agreeableness, Conscientiousness, Neuroticism, and Openness to Experience.

The role of a research program manager and that of a researcher are different and require different skills. Most skills suggested during the California peer exchange (Caltrans 2011) are less technical in nature and more focused on management and communication. These are complemented by the critical thinking skills gained through a research background, while the need for technical expertise in the research subject area depends on how the agency structures the conduct of research. Regardless of whether the research program conducts in-house research, however, its staff need a fundamental research background to ensure the technical quality of research, with strong scientific rigor and inquiry. This background can be obtained through advanced degrees with research education and/or through professional research work experience.

The core functions of Maryland DOT's research program staff, presented at New Jersey DOT's 2011 peer exchange, focus on the following areas:

1. State Planning and Research Program Management
2. State Planning and Research Project Management
3. University Partnerships
4. National Research Program Coordination
5. Technology Transfer
6. Office of Policy and Research Office-wide Functions
7. State Highway Administration-wide Functions (New Jersey DOT 2011).

Maryland DOT does not conduct in-house research.
Some of the core responsibilities of its research program staff include monitoring expenditures, fulfilling compliance and report- ing requirements, reviewing project proposals, facilitating communication between research- ers and DOT technical offices, reviewing reports for publication, disseminating research results, administering university contracts, serving as representatives on national research pro- grams, conducting literature searches upon request, assisting on survey administration, and other tasks. Research programs that conduct in-house research still require these soft skills but tend to emphasize technical expertise more than those that do not conduct in-house research. The Louisiana Transportation Research Center conducts in-house research, and research project managers generally are staff engineers with technical expertise in the area of study. These project managers are responsible for providing technical advice and guidance; identifying field test locations or sections through coordination with districts or other offices or agencies; scheduling meetings; developing and distributing meeting minutes; monitoring projects to ensure that they stay within the scope, time, and budget; completing and reviewing project reporting and research documents; making recommendations on any project modifications; reviewing and evaluating subject matter content; coordinating implementation activities; assessing research implementation; and rating effectiveness of the research team in contract delivery (LTRC 2016b). Although possessing core research skills and experience is important, research program staff also require management and communication expertise to successfully deliver research projects. European transportation research organizations have also wrestled with the issue of integrating research skill sets. For example, the Hellenic Institute of Transport in Greece takes a broad approach by emphasizing multidisciplinary knowledge—social, economic, and

environmental—and the application of this knowledge to new transportation systems and services. Research is enhanced by professionals who are knowledgeable about other "parallel" fields such as
1. Socioeconomics and human behavior
2. Environment and climate change
3. Computer software, including new applications, programming languages, and new operating systems
4. Research management—administrative and management activities to meet research contract requirements
5. Management of staff, both researchers and professional, technical, administrative, and clerical support staff
6. Commercial, legal, and administrative aspects of transportation innovation for successful research implementation.

This institute suggests that training transportation research professionals in cross-disciplinary subjects can build practical expertise to help research program staff be successful in their daily work activities (Giannopoulos 2015). Previous research has not synthesized the desired skill sets of DOT research program staff across the country. However, a fundamental research background or technical expertise to ensure the technical quality of research with strong scientific rigor and inquiry; research administrative support functions such as contracting, legal, accounting, and other administrative services; and management and communication expertise are all core skill sets necessary to successfully deliver research projects.

Workforce Recruitment and Training

Recruiting appropriate personnel is fundamental to a research program's ability to carry out its mission and tasks. The District Department of Transportation's (DDOT's) research strategic plan includes a goal to enhance the visibility of its research program by communicating its activities, services, and research.
One of the near-term actions for this goal includes "better defining the roles and expectations for subcommittee members and recruiting members from each administration and key branches" (DDOT 2013). Finding the right expertise for different tasks is not straightforward. Washington DOT found that staffing reductions and the retirement of experienced technical advisory committee (TAC) members resulted in a decline in TAC participation. MassDOT shared its practice of encouraging existing TAC members to bring another person from their business area to help develop that person's capability for future TAC membership (WSDOT 2017). However, inadequate succession planning is not limited to TACs; VTrans reported difficulty recruiting qualified staff for its research program positions (VTrans 2016). At the same time, VTrans' "retiring staff are taking a significant amount of personal capital and institutional knowledge with them, and no clear succession plan has been established." Issues of recruitment prevail at programs both large and small and should be addressed to fully realize a state DOT's capability for conducting research. An AASHTO survey on mentoring programs found that 56% of 18 state respondents had a mentoring program in place for workforce development (Arizona DOT 2013). However, this survey covered the entire state DOT and is not necessarily representative of research programs because of the variance in their size and nature. Systematic research on recruitment and training for research is sparse. However, anecdotal remarks at peer exchanges show broad acknowledgment that recruiting and retaining talent from specific disciplines continues to be an industry-wide and DOT-wide issue.

Workforce training to build expertise specifically for research staff members has been developed through the Ahead of the Curve training program. This curriculum is run by TRB Task Force ABG05T: Mastering the Management of Transportation Research and Training Program. The program aims to enhance the knowledge, skills, and abilities of transportation research and innovation staff. The four core courses are
1. Making Research Relevant
2. Running the Program
3. Delivering the Program
4. Program Quality Improvement

Twelve electives are also available: Effective Problem Statements; Performance Measurement; Information and Knowledge Management; Advocating/Being a Champion; Innovation Management and Risk Management; Funding; Scientific Methods; Intellectual Property, Innovation, and Technology Transfer; Strategic Planning for Research; Building Trusted Credible Partnerships; Continuous Quality Improvement; and Program Design (Cambridge Systematics 2016).

Universities and UTCs

State transportation research programs leverage the specialized expertise of university professors and university transportation centers (UTCs) through research workforce training such as internships, shadowing programs, and other formal and informal programs. This allows research programs to transcend their expertise limitations, given the vast wealth of research expertise available in higher-education institutions. One example of a university partnership with a research program is highlighted by Vermont's 2016 Peer Exchange. Historically, the University of Vermont conducted approximately 40% of VTrans' research and was "essentially . . . an exclusive university partner" (VTrans 2016). There were concerns that the agency was overlooking expertise at other schools because of its reliance on the University of Vermont.
Because of significant staff turnover in its research program, a suggestion was made to rebalance this university partnership by having the University of Vermont provide or compete for research services within a potential target range of funding instead of being allocated a fixed amount (VTrans 2016). On a broader level, partnering with the university allows VTrans to gain access to research resources it might not otherwise have, such as federal and corporate grants, along with university transportation centers. Washington similarly has a tri-party agreement among WSDOT, Washington State University, and the University of Washington, and research is frequently a shared effort between more than one party.

Private Contractors

Research programs use consultants and engineering firms for executing research, implementing research findings, and scaling up deployments. Such contractors help build research program capability by providing temporary specialized services on a project basis. State DOTs may reduce the procurement burden through the use of an "on-call contract," a prearrangement between a DOT and contractors for a set type of tasks required by the DOT. For example, Caltrans' contract management department states that on-call contracts are used when
• In-house functional resources are inadequate to handle the anticipated workload or a specific activity.
• The project manager knows the general types of products or services needed, but circumstances prevent the contract manager from setting a definitive location or timetable for work to begin. Using an on-call contract provides the flexibility to use contract services on an as-needed basis.

• The scope of work is defined at work breakdown structure (WBS) Level 6 or lower. ("WBS Level 6 or lower" means a task planned and scheduled at a more detailed level, not a lower number.)
• Most products and services can be completed in a relatively short time (approximately 3 years after contract execution) (Caltrans Architectural & Engineering Contract Management, n.d.).

1.2.3 Research Funding

The program capability of a state transportation research program is affected by the funding sources on which the program depends and by the process requirements for receiving and spending those funds. Funding increases program capability when an agency can organize itself to manage the compliance requirements of its various research funding sources, which can be extensive. Previous research has not systematically and broadly assessed how various state DOTs are funded and how funds are managed by different agencies. This section provides an overview based on case examples and other materials in the existing literature.

Funding Sources and Distribution

State DOTs obtain research funding through one or more of the following main categories:
1. Federal State Planning and Research (SPR) Program Part B: States must set aside 2% of their total apportioned dollars from the Highway Trust Fund to support interstate maintenance (IM), the National Highway System (NHS), the Surface Transportation Program (STP), Congestion Mitigation and Air Quality (CMAQ), the Highway Bridge Replacement and Rehabilitation Program (HBRRP), and the Highway Safety Improvement Program (HSIP). Of the 2% Highway Trust Fund set-aside, a minimum of 25% must be spent on research-related activities (23 U.S.C. § 505, State Planning and Research).
2.
Discretionary Grants: In addition to the SPR Program Part B, various federal discretionary grants exist for state transportation research, such as the AID program, State Transportation Innovation Council grants, FHWA Research and Innovation program grants, and so on. These programs each have different application and qualification procedures for obtaining research funding.
3. Transportation Pooled Funds: Pooled funds are cooperative programs administered by FHWA and AASHTO that allow FHWA, states, municipal or metropolitan agencies, colleges/universities, private companies, and other organizations to partner on mutual transportation-related problems through research, planning, and technology transfer activities. More than one agency, college, or private company must commit funds and other resources to the activity (FHWA 2019).
4. State-Allocated Funding: Some state legislatures have set aside research funds specifically for certain transportation research needs, such as roadside safety or local road research. States are also required to provide a 20% match for SPR Part B funds, but pooled funds are exempt from this requirement.
5. Research Partnerships with Local Research Centers or Universities: Local research centers and universities often have access to funding that state DOTs may not be qualified to obtain on their own. For certain research topics, joint research efforts with universities allow state DOTs to undertake research in a more cost-efficient manner and with the specific expertise of relevant academic personnel.
6. Sponsored Research: Other entities such as state and federal agencies—for example, the U.S. Geological Survey and the Environmental Protection Agency—along with private corporations with transportation research needs may also appropriate funds to state DOTs to conduct certain qualified research and activities. Other divisions of the state DOT may also sponsor research that is administered by the research program.
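The SPR Part B arithmetic described above can be illustrated with a short worked example. This is a sketch of the percentages stated in the text (2% set-aside, 25% research minimum, 20% state match); the $500 million apportionment figure is invented for illustration and does not correspond to any actual state.

```python
# Hypothetical worked example of the SPR Part B arithmetic described in
# the text (23 U.S.C. § 505): 2% of a state's apportioned Highway Trust
# Fund dollars are set aside for SPR, at least 25% of that set-aside must
# go to research-related activities, and SPR Part B funds carry a 20%
# state match (pooled funds exempt). The apportionment is an invented figure.

def spr_set_aside(apportionment: int) -> int:
    """2% SPR Part B set-aside from the apportionment."""
    return apportionment * 2 // 100

def spr_research_minimum(apportionment: int) -> int:
    """Minimum research spending: 25% of the 2% set-aside."""
    return spr_set_aside(apportionment) * 25 // 100

def state_match(spr_funds: int) -> int:
    """20% state match required for SPR Part B funds."""
    return spr_funds * 20 // 100

apportionment = 500_000_000  # hypothetical state apportionment, in dollars
print(spr_set_aside(apportionment))               # 10000000
print(spr_research_minimum(apportionment))        # 2500000
print(state_match(spr_set_aside(apportionment)))  # 2000000
```

Under these assumptions, a $500 million apportionment implies a $10 million SPR set-aside, of which at least $2.5 million must support research, with a $2 million state match on the set-aside.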

Previous research studies have not conducted systematic analyses of state DOTs' sources of funding, fund amounts, or proportions of the total research budget. Additionally, since many of these funds are near-term or 1-year programs, existing research available on state agency funding portfolios is dated. Some states self-publish detailed information on research program spending. Based on these self-reports, the research funding of Minnesota, California, and Washington is described below to illustrate programs with very different sources of funding, which result in different organizational structures and processes for managing funds (Figure 6). Minnesota DOT's research program is funded by comparable amounts of federal, state, and local dollars (MnDOT 2017a). In contrast, Caltrans' Division of Research, Innovation, and System Information (DRISI) relies heavily on FHWA SPR funding relative to state funding (Caltrans 2017). A third configuration is research funding that comes partially through transportation pooled funds, such as Washington DOT's program, in which 42% of research funding is funneled through this channel (WSDOT 2015). These examples show how state DOTs can have very different sources of funding, fund amounts, and proportions within their total research fund portfolios. Where states obtain and spend research dollars subsequently leads to different ways of organizing to manage these funds and comply with their requirements.

Figure 6. Research funding sources by state. (Source: State Research Annual Reports; TPF: Transportation Pooled Funds.)

Compliance Requirements

Funding sources often have compliance requirements to ensure that the resources are used in the manner intended. By requiring certain types of accounting, reports, and declarations, funding bodies can assess how and when funds are used, and the purposes to which they were

allocated. As discussed earlier, there is wide variation in funding sources, and most programs receive funds from more than one source. In this context, different compliance requirements for different funding sources can require significant administrative effort. Furthermore, additional organizational structures, processes, and back-end systems such as accounting might be required. The methods employed reflect agency resources, organizational structure, fund portfolios, and lessons learned over time. The next sections summarize the requirements and structures of funds management for SPR funds, multistate pooled funds, local cooperative projects, and reserve and discretionary funds.

State Planning and Research Funds. Federal funds through the SPR program come with certain mandates for state DOT research programs. Federal regulations indicate that 25% of SPR funds should be used for research, development, and technology activities, which include highway, public transport, and intermodal systems (23 CFR § 420.107). Part of these requirements includes preparing an annual SPR Part B work program for FHWA that lists all the research projects using federal funding and the associated activities. Additionally, Part 420 sets 5-year-cycle requirements to conduct peer exchanges and to review states' management processes for research, development, and technology programs using federal funds. These requirements might be managed by the research program itself, as in Florida, or by different divisions of the state DOT, depending on the agency's organizational structure. Which functions manage these funds at each state DOT is not currently known. Knowing how different research programs manage the requirements of each of their funding sources is important to determine the effects on program capability.
Additionally, previous literature focuses exclusively on the researchers themselves but leaves out intra-agency resources (legal, financial, procurement, and so on). These support functions are structured differently at various state DOTs and support compliance efforts for different research funding sponsors.

Multistate Pooled Funds. As discussed previously, some states rely on transportation pooled funds for substantial research funding, such as Washington DOT, with over one-third of its research funding supported by transportation pooled funds. Pooled funds are not a new source of funding, but states can leverage the program for cross-collaboration on mutually beneficial research topics. Transportation pooled funds require management coordination between different state DOT systems. The most recent summary of pooled fund studies is the Minnesota Peer Exchange held in 2007. This report provides state suggestions such as allocating funding for transportation pooled fund projects on an annual basis, allowing the research manager to commit funding year-round. Suggested administrative practices include good record-keeping of all pooled fund solicitations and the rationale for participation and non-participation, limiting SPR funding of perpetual projects to 3 years, and using standard overhead rates and travel policies, among other practices (MnDOT 2007). To track funds under the pooled fund arrangement, both WSDOT and the Connecticut Department of Transportation (CTDOT) use a "checkbook-style" spreadsheet to stay updated on federal fund balances, while some states also track research funds within other functional areas of the agency (MnDOT 2007).

Local Cooperative Projects. Local cooperative projects between state DOTs and local agencies can also provide additional research insight. Yet, cost-sharing practices between these two parties are not well understood.
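The "checkbook-style" tracking described above amounts to a running-balance ledger of commitments and expenditures. The sketch below illustrates that structure only; the class, field names, project label, and amounts are all hypothetical and do not reflect WSDOT's or CTDOT's actual spreadsheets.

```python
# Hypothetical sketch of a "checkbook-style" ledger for tracking federal
# pooled fund balances, as described in the text. All names and figures
# are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class PooledFundLedger:
    project: str                          # e.g., a pooled fund study label
    balance: float                        # remaining federal funds
    entries: list = field(default_factory=list)

    def record(self, date: str, description: str, amount: float) -> float:
        """Record a deposit (positive) or expenditure (negative); return the new balance."""
        self.balance += amount
        self.entries.append((date, description, amount, self.balance))
        return self.balance

ledger = PooledFundLedger(project="Hypothetical TPF study", balance=0.0)
ledger.record("2019-01-15", "Annual commitment", 100_000)
ledger.record("2019-03-02", "Invoice: Phase 1 report", -35_000)
print(ledger.balance)  # 65000.0
```

Each row carries the running balance, which is what lets a research manager see at a glance how much federal funding remains committed but unspent.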
As of 2007, a small majority of state respondents stated that some type of cost-sharing "laws, policies, guidelines, and/or cooperative program[s]" existed (MnDOT 2013). However, only 38% of respondents had defined policies or programs for these cooperative projects with local agencies, and only four respondents indicated that these policies or programs address cost sharing (MnDOT 2013). States that have defined cooperative project policies or programs found them to be "very effective in addressing cost participation." It is not clear which methods for addressing cost sharing are generally seen as most effective, meriting further study.

Reserve and Discretionary Funds. Maintaining reserve and discretionary funds for immediate needs provides additional flexibility in dynamic research areas. Though transportation research may be identified, selected, and prioritized, some research needs fall outside of this typical sourcing method. On-call research contracts are one mechanism that allows states flexibility in responding to short-turnaround and urgent research needs for "immediate impact research," as participants suggested in the Oregon and New Hampshire peer exchanges (Oregon DOT 2014, NHDOT 2016b). Further research should examine which states have these measures in place and how they affect the capability of the research program. This study begins to address these literature gaps: how research programs comply with research funding requirements, and how the structures of intra-agency resources support these compliance measures, to more comprehensively understand current process management practices for research funds. The study illustrates how the program capability of a state transportation research program is affected by the sources and structures of funding on which the program depends. Specifically, issues emerge such as:
1. Challenges in meeting funding requirements
2. Sources of research funding and where funds are spent
3. Maintaining discretionary funds for immediate research needs
4. Linkages between funding and performance (research evaluation and implementation)

1.3 Program Management

Program Management comprises the processes and protocols in place to execute, manage, and deliver the research function over the research lifecycle. The research lifecycle consists of six distinct stages in a feedback loop: assess needs, select topics, award projects, manage projects through research support services and other program support services, disseminate findings, and implement research.
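The six-stage feedback loop can be sketched as a simple cyclic sequence. The stage names follow the text; the code structure itself is only an illustrative assumption, not an artifact of any state DOT's process.

```python
# Illustrative sketch of the six-stage research lifecycle feedback loop
# described in the text. The cycle wraps from "implement research" back
# to "assess needs".

LIFECYCLE = [
    "assess needs",
    "select topics",
    "award projects",
    "manage projects",
    "disseminate findings",
    "implement research",
]

def next_stage(stage: str) -> str:
    """Return the stage that follows `stage` in the feedback loop."""
    i = LIFECYCLE.index(stage)
    return LIFECYCLE[(i + 1) % len(LIFECYCLE)]

print(next_stage("implement research"))  # assess needs
```

The modular wrap-around is what makes this a feedback loop rather than a linear pipeline: implementation results feed the next round of needs assessment.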
This section addresses the program management practices of state DOTs through the six stages of the research lifecycle.

1.3.1 Assess Needs

The Program Capability dimension addressed "who" is (or could be) involved in setting research direction. This section further explores the processes for identifying specific needs. Agencies generally state that processes for identifying and prioritizing research needs should be aligned with the broader strategic vision of the agency and the goals of the research division to ensure coherence. However, the choice of process can vary depending on the goals of the research program. Understanding the goals of research within each state DOT is, therefore, an important prerequisite to understanding how a program chooses to identify and prioritize research needs. The literature specifically analyzing the research goals of DOTs is limited, however. States obtain research input through both top-down and bottom-up needs assessment in order to align their research with strategic aims. Past literature states that top-down methods are more effective for ensuring strategic alignment with key agency goals and objectives (Special Report 313, 2014). However, bottom-up issue identification is also important to ensure that on-the-ground issues are sufficiently addressed by agency research. An open solicitation process for the submission of research topics can help ensure that state DOTs understand the research needs of transportation stakeholders. An AASHTO survey revealed that 96% of 26 total respondents stated that their research project selection process included an open solicitation process, so that both those internal and external to the state agency were eligible to submit research topics (Oregon DOT 2011). This indicates that state DOTs recognize the importance of outside feedback in assessing transportation research needs.

Some mechanisms for assessing research needs include the development of a research strategic plan, participation by cross-divisional agency committees, the creation of problem statement lists, and so on. Oklahoma, for example, utilizes a cross-divisional research steering committee (similar to a research advisory committee) to prioritize topic statements. Additionally, the recent adoption of problem statement lists by many agencies has been used to prioritize research topics. These lists are put together with the assistance of TRB standing committees. Figure 7 indicates three different methods for topic solicitation, used by the Ohio, Georgia, and Nevada DOTs (NHDOT 2016b). Georgia DOT's change from a primarily external solicitation method to one that is internal made a "huge difference in the quality of the research ideas and the impact of the projects that get funded" because these research topics are better aligned with department needs and the strategic goals of "policy/workforce, asset management, mobility, and safety" (NHDOT 2016b). Much of the literature covers the existing methodologies in place for assessing research needs. What is less understood is who sets the overarching agency goals and objectives to which these research needs should be aligned. This Synthesis moves past the technical mechanisms for needs assessment and seeks to understand who defines agency needs from top-down and bottom-up needs assessment approaches.

1.3.2 Select Topics

As with needs assessment, research topic prioritization should align with the strategic objectives of the research program. This process usually involves dedicated committees, though the processes vary by state. Different committee challenges arise, however, and the literature has not analyzed how the composition and policies of these committees affect the effectiveness of the prioritization process.
Figure 7. Topic solicitation method examples (Source: NHDOT 2016b).
• Ohio DOT: Internal research needs are identified only through DOT staff members, except for an external research topic solicitation from students, which funds a few projects each year.
• Georgia DOT: Research problem statement solicitation is primarily through internal DOT staff members, a change from primary solicitation through universities in 2008. One external problem statement solicitation is issued per year.
• Nevada DOT: Problem statements are accepted from all sources as long as a project champion within the department is identified and willing to support the project, ensuring that the project aligns with department needs and goals.

For example:
• Mississippi DOT uses a research advisory committee that identifies problem statements and a technical advisory committee subject matter expert to oversee each specific research project. Unique to Mississippi, an elected Commission is in place to approve all research project expenditures as well as contract extensions (Caltrans 2016).
• Oklahoma DOT uses a system similar to Mississippi's through a research steering committee, which convenes periodically to review, discuss, and prioritize topic statements for research funds, though without a Commission to approve these expenditures (VTrans 2016). However, some challenges include lack of participation and deterring input from diverse stakeholders

because of the formality of the prioritization process, as well as the committee's decisions not being accountable to a strategic plan. What is not revealed is the committee's composition, which would help determine its correlation to overarching agency goals and objectives. Inactive committees are a pressing issue at many state DOTs (VTrans 2016). Another research challenge that occurs in some states is path dependence, in which research tends to "conform to pre-existing department ideas" (Caltrans 2016). New Mexico DOT revamped its research prioritization process, among other functions, by creating a new type of research oversight committee drawn from a cross section of the agency rather than just agency top leadership. This was a fundamental shift in New Mexico's practices, which worked in the state's context by steering the New Mexico DOT away from its routine emphasis on pavements, materials, and 4-year research partnerships with universities. This example shows that focus groups with members from different divisions can be effective. In light of discrepancies between studies, the question of who selects research topics would benefit from additional research. This Synthesis surveys state DOTs on who is involved in the research selection process to better understand what mechanisms have been effective in different contexts.

1.3.3 Award Projects

Once research needs have been assessed and the state DOT has selected specific research topics, the agency makes decisions about which research projects are awarded. Research projects may be conducted internally, externally, or through a combination of internal and external resources. Thus, the process for selecting research projects can vary among agencies. This section discusses project selection committees, criteria for selecting projects, and researchers' perceptions of the project selection process.
An AASHTO survey found that project selection committees generally consist of a mixture of agency top leadership, agency mid-level managers, agency front-line supervisors, and university faculty, in order of decreasing frequency. Four percent of respondents did not use a committee to select projects, and 31% stated that other figures were committee members as well (Oregon DOT 2011). Criteria for selecting projects consisted of "topic areas currently important to your agency" 89% of the time, service improvements such as safety, system reliability, and mobility considerations 62% of the time, and concerns about cost savings and cost avoidance 46% of the time. Half of the state DOTs surveyed stated that their criteria for selecting research projects do not significantly change over time (Oregon DOT 2011). In this same survey, 70% of state respondents agreed that the "credibility of our research project selection process relies more on the credibility of the decision makers than on the use of objective criteria" (Oregon DOT 2011). A follow-up question revealed that 79% of state DOTs were confident that their selection process was effective in selecting projects most likely to provide maximum value to the agency, and 90% of respondents stated that the process engages people who are "best suited and equipped to be making research project selection decisions" (Oregon DOT 2011). The Research Project Selection Process AASHTO Survey (Oregon DOT 2011) was conducted because a peer exchange by Oregon DOT found responses such as this one: "Research Advisory Committee decision process was perceived as a black box. Many participants commented that they didn't understand why one project was selected and another not" (Oregon DOT 2009). Oregon DOT transformed its research advisory committee so that it consisted mainly of the state DOT's top leaders to instill greater credibility into the decision-making process.
More nuanced findings were developed in this survey, with 52% of 24 respondents finding the credibility of

decision makers more important than the use of objective criteria and 79% feeling that their agency's method of selecting projects was likely to give the agency maximum research value (Oregon DOT 2011). Survey responses indicated that researchers have mixed feelings about how research is prioritized and awarded through the research advisory committee. However, these surveys do not detail what types of objective criteria are normally put in place. Additionally, no analysis has examined whether a correlation exists between trust in the research selection process and the specific composition of a research advisory committee (i.e., does the presence of senior leadership within a research advisory committee lead to greater trust in the project selection process?). More analysis of how projects are awarded would advance understanding of how state DOTs manage their research lifecycle.

1.3.4 Manage Projects

Research project management consists of numerous components, including workload management, oversight of research, tracking of projects, and ancillary functions that support research being done across the agency. Research project management tends to involve some of the roles in Figure 8. This study uses this terminology with the clarification that exact titles may vary by state agency. In many circumstances, more than one of these roles may be served by the same individual. The process of conducting research once projects have been awarded is framed differently across the country. The Association for Project Management's president remarked that project management is the process of "getting things done" (APM 2017).
Figure 8. Research project management role definitions.
• Research Program Manager: Individual with responsibility for administering the research program and its research portfolio.
• Research Project Manager: Individual with responsibility for overseeing a specific research project's budget, tasks, and deliverables.
• Technical Lead/Subject Matter Expert: Individual with responsibility for conducting or overseeing the technical components of a research project.
• Project Champion: Individual who supports and advocates for the success of a research project; may be the same individual who submits the research for consideration (sometimes known as a Research Customer) or another individual with a stake in the research outcome.
• Research Advisory Committee: Committee with responsibility for advising research through activities such as setting research vision and strategy, and prioritizing research topics or problem statements.
• Contractor or Consultant: A university or non-university individual under contract to conduct the actual research on a project if the research component is outsourced.

In a university setting, project management can be phrased as the "process of getting a project completed on time,

within budget, to the desired level of quality, and in a university environment where it is almost unheard of to be able to please everyone" (Johnson 2013). Transportation research programs are similar to academia in balancing a diverse set of stakeholders; the difference lies in the applied nature of the research. Research differs from many other functions within a state DOT in that its workload fluctuates over time. The capacity to manage research programs depends on funding programs and project scope, which can make it difficult to balance the workload satisfactorily. Wisconsin's 2013 Peer Exchange Report highlighted that capacity for research management is partially dependent on the structure of the funding program and on whether the DOT manages the program internally or an external partner manages the research program (Wisconsin DOT 2013). In what ways do research support services (library, data, training, other) enhance research program management? Regardless of the specific characteristics of a research library, it is important to understand the library's current role in the research process and how it may have changed over time. The storage and maintenance of transportation research publications provide a resource for state transportation research programs in conducting literature reviews, reading research findings, and implementing programs. Research libraries are housed in different locations, including departments of transportation, university transportation centers, universities, and other locations. They also vary in functionality. A 2015 AASHTO survey of transportation libraries revealed that 75% of libraries are housed at DOTs (40 responses provided), while the remaining are located at universities or other locations (Oklahoma DOT 2015). These libraries are supplied with materials by DOTs 85% of the time, while universities contribute materials 52.5% of the time.
As part of external partnerships, 32 of the 40 libraries are members of a transportation network, whereas the remaining 6 are not. The library scope also differs throughout the country, with 11 libraries housing mainly transportation research reports, 15 providing both research and some other transportation materials, and 14 holding both research and most other transportation area materials. This underlines that transportation libraries are purposed differently. Indeed, 26 of the libraries utilize SPR funds, and 31 have one or more full-time equivalents. Substantial research has been conducted to understand the different structures and characteristics of research libraries across the country.

In addition to a research library, technological tools can support research process management efforts. Many states have developed or are in the process of developing data warehouses to integrate roadway, safety, and asset data (CTDOT 2017). Systems vary nationwide and may include a GIS database, enterprise software for business intelligence, a linear referencing system utilizing a source database, an integrated data repository, and spatial and tabular integration. Software systems differ by the needs and organizational structure of each state agency. However, such systems are critical to how research is managed. For example, DDOT's 2013 Research Strategic Plan noted a need to create specific organizational structures for data reporting so that the data can be easily and consistently integrated into a data warehouse (DDOT 2013).

Various database systems are used by research programs to manage projects. A 2013 AASHTO research management survey (LTRC 2013) asked state DOTs about the types of database systems used.
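The kind of research program management database these surveys describe can be illustrated with a minimal sketch. The schema and field names below are hypothetical, invented for illustration; they are not drawn from any state DOT's actual system, and only show the sort of project-tracking store a program might keep alongside its spatial and tabular data.

```python
import sqlite3

# Hypothetical schema for a research program management database;
# table and column names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE research_projects (
        project_id  TEXT PRIMARY KEY,
        title       TEXT NOT NULL,
        status      TEXT CHECK (status IN ('proposed', 'active', 'complete')),
        budget_usd  REAL,
        start_date  TEXT,   -- ISO 8601; SQLite stores dates as text
        end_date    TEXT
    )
""")
conn.execute(
    "INSERT INTO research_projects VALUES (?, ?, ?, ?, ?, ?)",
    ("SPR-2024-001", "Bridge deck sealant evaluation",
     "active", 250_000.0, "2024-01-15", None),
)

# A report a research program manager might run:
# count of active projects and their committed budget.
row = conn.execute(
    "SELECT COUNT(*), SUM(budget_usd) FROM research_projects "
    "WHERE status = 'active'"
).fetchone()
print(row)  # (1, 250000.0)
```

As the Michigan DOT comment quoted below suggests, any such store would need to be adaptable as each state modifies its own programs and processes; the sketch shows only the common core of tracking status, budget, and dates per project.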
This survey provides an update and a qualitative perspective from state DOTs following the 2008 AASHTO survey, in which 67% of states (43 responses) stated that they had a research program management database (WSDOT 2008).

“Each state has such a unique system for planning, approving, managing, and evaluating projects it would be a great challenge to address this issue at a national level. To be successful,

a tool would need to be adaptable as each state modifies its programs and processes in the future” (Michigan DOT, LTRC 2013). This statement from the Michigan DOT exemplifies the challenge with technological tools: no one size fits all. A desire for technological tools such as a “better spreadsheet approach” was voiced by the District DOT as well (LTRC 2013). Surveys such as the 2008 and 2013 AASHTO surveys have provided a breakdown of the types of systems in use by state DOTs at those points in time, their features, and their project tracking capabilities. Much of the literature has examined which IT systems are being used to support upgrading of the project management process.

1.3.5 Disseminate Findings

How is research disseminated, and how does it affect the research lifecycle?

Once research projects have been completed, disseminating research findings to specific audiences is a key part of increasing the quality and value of the research program as a whole. The Vermont 2016 Peer Exchange Report revealed that Vermont's primary method of publishing research was on the VTrans website, which is also known as “passive” dissemination (VTrans 2016). This is echoed by a 2015 AASHTO survey on the “value of research,” which shows that website marketing of research is by far the most common method, followed by conference presentations (Value of Research Task Force 2015). Other dissemination methods include brochures, annual reports, at-a-glance reports, webinars, research newsletters, articles in external newsletters, articles in industry periodicals, trade show booths, project technical summaries or briefs, Facebook, LinkedIn, Twitter, blogs, YouTube videos, and e-mail lists (Value of Research Task Force 2015). The most common planned marketing products were at-a-glance reports, webinars, and YouTube videos.
While this AASHTO survey focuses on the marketing of the research program itself, dissemination of research is a natural method for marketing research programs by presenting valuable research findings to stakeholders and the general public. The Ohio Peer Exchange Report in 2017 revealed that the Iowa Highway Research Board disseminates research findings through the county engineers' association, and that demonstration projects following the completion of a research project have been effective as a first step in implementation efforts (Ohio DOT 2017a). Minnesota DOT's Local Roads Research Board hosts conference booths where preloaded flash drives with research findings are handed out (Ohio DOT 2017a). Interactive conference presentations through trivia games and other participatory activities for the audience are used to enhance engagement (Ohio DOT 2017a).

1.3.6 Implement Research

Implementation Processes

Implementation tracking can be difficult following the initial deployment of results. Tracking systems cited by the Montana 2017 Peer Exchange report include Minnesota's ARTS system developed in-house, Ohio's ARMS system developed in-house using .NET technology, and Utah's Access system (Seeber and Hirt 2017). Using tracking systems can introduce new issues such as contractor dependence, the initial learning curve and associated new documentation efforts, coordination with other DOT systems, IT support, and the costs of continual development and maintenance. The responsibility for implementation reporting varies by state but typically involves the state transportation research office, the functional area or champion, or the initial researcher.

It is uncommon for a position to be dedicated to implementation, though participants agreed that appointing an in-house, well-connected official within the state DOT to coordinate implementation as a key component of his or her job enables a higher likelihood of implementation success (Seeber and Hirt 2017). Ultimately, the decision to implement typically stems from senior leadership.

Funding for Implementation

Mechanisms for implementation funding also differ among agencies, with some agencies strategically selecting research projects based on immediate implementation potential (e.g., Arkansas), which enables early planning for implementation budgets (CTDOT 2016). Funding a designated position for research implementation demonstrates a recognition of the time, funding, and designated workforce required for implementation (Florida DOT 2013). A committee can also be helpful for overseeing implementation funding allocations, such as the Research Implementation Committee of Minnesota's Local Roads Research Board. This committee uses a consultant to help implement findings, typically on a 3- to 4-year contract with five to eight implementations per year. This approach echoes what some participants stated in the Montana 2017 Peer Exchange: the same researcher should not lead implementation, because implementing research findings differs significantly from the original research and requires someone with depth of experience in research implementation itself (Seeber and Hirt 2017). Others stated that implementation recommendations should come from the agency rather than from the researcher, because the agency itself must accrue the benefits of the research. SPR funds cannot be used for research implementation, but research testing and development to determine whether implementation is feasible are eligible for SPR funds.
Depending on whether implementation can be defined at the beginning of the research project or must be determined at its end, funding specific to implementation can be handled separately from the initial research project. Participants noted that this separation is important because committing funds before the feasibility of implementation is understood may unnecessarily tie up funds (Seeber and Hirt 2017).

1.4 Program Quality

Program Quality is the rigor and diligence with which the DOT adheres to scientific principles and best practices and is efficient and effective at pursuing research. This section assesses the oversight structure, quality assurance and quality control processes, and performance measures of the research program.

1.4.1 Quality Assurance and Quality Control

Principles and Mechanisms for Research Quality

Delivering research that is rigorous and diligent in adhering to scientific principles and best practices, with effectiveness and efficiency, requires set principles for quality. Much of the literature has focused on various principles for research quality and on accountability mechanisms to ensure that these principles are met. Although principles for research quality are similar among many states, how states organize themselves to meet these standards of quality differs.

Box 1 provides examples of research quality principles and accountability mechanisms for how states achieve research quality, taken from the Four-States Virtual Research Peer Exchange and the Ohio DOT Research Peer Exchange. The Kansas and Missouri Research Peer Exchange on Improving the Quality and Timeliness of Research Reports (Texas A&M Transportation Institute 2017) provided a number of guidelines for improving research quality (Box 2). However, accountability mechanisms must be linked to personnel who set the benchmarks for quality. The relationship between organizational and reporting structures and research quality should be assessed in greater detail.

Box 1. Examples of Research Quality Principles and Accountability Mechanisms for How States Achieve Research Quality

Principles for Research Quality (Texas A&M Transportation Institute 2017):
• Adherence to the DOT's key research areas
• Clear and achievable project scope
• Capable researchers, ideally with a relationship to the DOT
• Report clarity and accuracy
• Valid findings
• Timeliness of research
• Ongoing communication with the DOT
• Preference for implementable research

Accountability Mechanisms (Ohio DOT 2015b):
• Dedicating a Project Manager to control and closely supervise the project
• Identifying a Research Champion
• Tracking work progress and implementation through technological tools
• Allowing sufficient time in the early stages of the project
• Tying deliverables to payment
• Conveying prestige in doing research
• Focusing on the usability and implementability of research early and identifying its barriers
• Withholding budget assignment until the project is securely supported
• Instituting a proposal template and report requirements
• Maintaining flexibility and allowing modification in research
• Pre-publishing critical findings
• Establishing a stable reviewing process
• Enlisting the help of an editor and communications staff member

Box 2. Guidelines for Improving Research Quality

1. Ensure research projects address the priorities of your department.
   a. Consider broadening the scope of standard research practices to understand why the research is being done. Transportation can be a tool to solve problems, rather than the problem itself.
   b. When selecting projects for funding, consider the agency's strategic plan and focus areas, the value of the research, and the return on investment. Highlight high-priority and high-profile research projects.
   c. DOTs are encouraged to track involvement and show the value of participating on panels or acting as a PI/PM on a research project.
2. Establish procedures that ensure active oversight, and a set of warning signs for the entire project lifecycle that signal if the project is going off track, to avoid surprises.
   a. Consider developing or utilizing a kick-off meeting checklist to ensure all aspects of the project are covered, and stress the importance of communicating staffing issues early in the process. Include the schedule of deliverables and project milestones in the proposal or contract.
   b. Establish performance measures.
3. Precise explanations and a clear objective are needed at the beginning of a project to produce a sound final product.
   a. Consider providing a “model” product to use as an example of expected deliverable quality.
   b. Define what “quality” means to the program and how it should be applied to deliverables.
4. Strong final reports are the result of a series of critical steps in the research process, and they fully reflect the strength and coherence of that process.
   a. Concise reporting is a key element, including the length and volume of the report.
   b. Report quality is a key factor as well, including technical editing for grammar, punctuation, and formatting errors.
   c. Consider hosting a meeting or conference call for university researchers to discuss report writing, editing, and publication requirements and to set expectations. Consider mandating technical writers and editors, or providing a list of approved editors.
   d. Consider outsourcing communications and marketing responsibilities if in-house staff does not have the required expertise.
5. Consider including procedures in the contract or master agreements for canceling a research project for poor performance or other reasons. Consider imposing consequences for late deliverables.
6. Explore retainage options or contract language to prevent PIs from proposing on new RFPs until deliverables have been submitted.
7. Consider discussing technology transfer procedures and documentation in the contracting phase and including them in the implementation plan for a research project. The DOT PM and the PI work together to develop technology transfer; consider requiring it for each project.
8. Assess overhead rates for both public institutions and private industry, and review the impact on research project budgets.
9. Outline the requirements and expectations for data management plans at the beginning of a project. Determine when to discuss the cost of storing and securing data pre- and post-publication.
10. Explore options for a repository and define reasonable standards. Utilize DOT and research librarians. National guidance is needed for data management to avoid multiple sets of standards in different states.
11. Recognize project champions, reward volunteers, and show appreciation for staff involvement in the research program.

Source: Texas A&M Transportation Institute 2017.

Contracting Quality

An AASHTO 2015 Contracted Services Survey (MDT 2015) found that of 23 respondents, 9 states provide payment by deliverable for contracted services, 1 “sometimes” provides payment through this mechanism, and 13 do not. This is an example of a quality assurance/quality control practice used by some state DOTs to promote program quality specifically for projects contracted out. The following is a statement from the Arizona DOT on payment by deliverable:

Yes, as a project manager, I do. It was intended that all project managers do this, but it has come to light that the practice is not universal. Moving forward, we expect that paying by deliverable will be standard practice. After we evaluate proposals, a consultant is tentatively selected. We then ask the consultant to refine the proposed work plan, budget, and schedule, breaking tasks/deliverables into subtasks that can be invoiced (see attached sample). The subtasks are tied to deliverable, approvable, and—as much as possible—tangible products. This allows the consultant to invoice more frequently. Before a consultant is given notice to proceed, ADOT and the consultant agree to what will be performed and delivered, for what price, and by what date. We do end up being flexible on the due dates, but rarely on anything else. We pay for completion of a task/subtask only after the PM approves it as meeting our expectations, which means that a deliverable may need revision or other types of additional work before a consultant can submit an invoice. We do not pay just because a first draft of a chapter or survey questionnaire, for example, is simply delivered. Perhaps because of this practice, I have not had to address the situations your questions imply. Our current research contract requires a 20% retainage. Our longstanding practice was to hold that amount until all study tasks were complete and approved. However, over the past year or so we were informed that federal regulations require us to release retainage within a short period of time after making payment on an invoice.
(MDT 2015)

The statement from the Arizona DOT reveals practices for paying by deliverable, although this practice does not yet occur universally within the DOT. Notably, this practice does not only pay for the deliverable but builds in accountability mechanisms for quality. The AASHTO survey further revealed that a typical practice of state agencies is to withhold the last 10% of payment until acceptance of the final deliverable. However, some federal agencies, such as the U.S. Geological Survey and the U.S. Army Corps of Engineers (USACE), do not allow the flexibility to withhold payment. Thus, practices vary depending on the types of research programs and funding mechanisms on which the state DOT depends.

Whether payment by deliverable enhances quality is examined in another AASHTO survey, “Increasing Accountability to Ensure Good Quality Reports” (Ohio DOT 2015a). Of 24 respondents, 88% admitted that they have received poor-quality research reports from researchers (both university and non-university contractors), with only 2 respondents stating that they have not. Although this measure of “poor quality” is subjective and the study did not define what constitutes a “good” or “poor” quality report, many state DOTs referenced grammatical and editorial processes needed prior to final acceptance of the deliverable. Along the same lines, only slightly more than half of the state respondents stated that they have published guidelines on what constitutes a “good report.” The findings of this survey were mixed: 50% of states have enforced nonpayment due to the poor quality of reports, typically involving the final 10% to 15% of the fee, and 46% of them found this mechanism effective for improving quality.
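The payment mechanics described in the Arizona statement and the surveys can be made concrete with a small sketch. The 20% retainage follows the figure quoted in the Arizona statement, but the function itself and the subtask fees are hypothetical illustrations, not any agency's actual invoicing rule.

```python
def deliverable_payment(fee: float, retainage_rate: float = 0.20):
    """Split an approved deliverable's fee into the amount paid now
    and the amount withheld as retainage. The 20% default mirrors the
    rate quoted by the Arizona DOT; per the federal requirement noted
    in the statement, the withheld amount would be released shortly
    after invoice payment rather than held to contract end."""
    withheld = fee * retainage_rate
    return fee - withheld, withheld

# A hypothetical contract invoiced as three approved subtasks of
# $10,000 each, with a 20% retainage on each invoice:
paid_now, held = 0.0, 0.0
for subtask_fee in (10_000, 10_000, 10_000):
    paid, withheld = deliverable_payment(subtask_fee)
    paid_now += paid
    held += withheld

print(paid_now, held)  # 24000.0 6000.0
```

Paying per approved subtask, as the statement describes, lets the consultant invoice more frequently while the retainage preserves leverage for quality until each deliverable is accepted.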
A New Hampshire AASHTO survey on contracted services within peer exchanges found that 58% of 19 states surveyed have used a contractor to conduct their peer exchange and, remarkably, that 100% of respondents stated that they would recommend their consultant, other than in unfeasible situations where exclusive partnerships exist (NHDOT 2016a). Across these surveys, mixed feedback arises on the use of consultants and the quality and value of their services. A broader and unassessed question is how the quality of research is determined by each state DOT. Guidelines on what constitutes a “good” report provide a good starting place, as do grammatical and editorial standards. However, validation and peer review processes for maintaining the quality of state transportation research have not been systematically studied.

1.4.2 Performance Measures

Measuring the results of research and its benefits is necessary to understand process efficiency as well as the overall quality of the research program. However, research evaluation currently lacks scientific rigor, and many state DOTs may not consistently measure the actual impact of research on their transportation systems. This is partially due to the lack of quantitative and qualitative measures available for determining program-wide value in a comprehensive and implementable manner. In Report 512 by the Southeast Transportation Consortium, 88% of respondents stated that no guidelines or methods existed to evaluate research benefits, while the remaining 12% misunderstood the survey question and likewise had no guidelines or methods (Ashuri et al. 2014). Although the findings of this report are based on specific measures, it is clear that major obstacles exist in developing systematic performance measures for research programs. The process of evaluating research programs on the basis of their process, quality of research, and value to the transportation field is dependent on funding.

NCHRP Synthesis 300, published in 2001, provided insight into the performance measures used by state agencies at each stage of a research program lifecycle (Sabol 2001). The study found that many agencies used a “check-off” approach to research evaluation, meaning that research evaluations were conducted for compliance rather than for learning and continuous improvement (Sabol 2001). Since then, performance measures have been widely researched through various peer exchanges, AASHTO surveys, and other TRB reports and have been implemented at more sophisticated levels across agencies. However, much less research has been done on the effectiveness of research evaluation.
Factors cited as interfering with successful research evaluation principally involve data scarcity, the difficulty of quantifying benefits until many years after research has been completed and researchers have moved on to new projects, intangible qualitative benefits, and the diverse interests of the different stakeholders involved in the research program (LTRC 2016a).

1.5 Program Value

Program Value is the usefulness of new and enhanced knowledge gained through research and the degree to which it is applied to enhance the transportation system. This section analyzes how program value is maximized through two mutually reinforcing components: communicating and capturing the value of the state transportation research program.

1.5.1 Value Communication

How does the research program market its value?

Communicating value in research programs remains central to sustaining funding for continuous improvement in transportation “safety, mobility, and infrastructure” (Zmud et al. 2009). While research helps inform important decisions and solves problems in this process, the contributions of the research program are not always evident. Arizona DOT Director John Halikowski spoke at the 2011 AASHTO Annual Meeting about how “research must be pervasive throughout the organization and throughout daily operations” (AASHTO Annual Meeting 2011). At the same meeting, Utah DOT Executive Director John Njord referenced the role of research in relieving state agencies of the worst effects of funding constraints by “finding effective and efficient solutions” through innovative technologies and solutions. The Accelerated Bridge Construction (ABC) techniques in Utah, for example, helped minimize traffic disruption when bridges were moved into place. These types of demonstrable, high-impact research benefits that contribute to agency needs and goals should be highlighted to leadership, the wider state DOT, and stakeholders.
Anecdotal stories on the value of research such as the ones shared at the AASHTO Annual Meeting in 2011

were presented by executive leadership at the state DOTs, a detail that should not be lost. There is a lack of previous research on the ultimate effect of executive engagement on program value through efforts such as research program marketing. Research implementation is highlighted, but it is not clear what role executive leadership played in the research lifecycle.

Program value must be realized separately from the value of individual research projects through systematic marketing of the research program. A research marketing plan, and accountability mechanisms with managers who are ultimately responsible for the visibility of the research program itself, are two example activities that can strengthen the value of the program. Past literature such as NCHRP Report 610 highlights communication campaigns to convey the value of transportation research projects or programs through a five-step process of context, strategy, content, channels, and style of communication (Zmud et al. 2009). Good communication practices are those that

1. Involve communication professionals
2. Understand the audience
3. Demonstrate a tangible benefit
4. Recognize that timing is relevant
5. Build coalitions
6. Build two-way relationships
7. Tailor packaging to fit the purpose and audience

The AASHTO Value of Research Task Force of 2015 followed up on NCHRP Report 610 by surveying the marketing efforts of various state DOT research programs. This study provides a framework for how state DOTs currently market their research programs, as well as some of the challenges to implementing the practices suggested by NCHRP Report 610 (Zmud et al. 2009). As discussed in the “Disseminate Findings” section under “Program Management,” “passive” communication vehicles are the most common methods. This includes publishing research reports online on a state DOT website.
Other dissemination methods include brochures, annual reports, at-a-glance reports, webinars, research newsletters, articles in external newsletters, articles in industry periodicals, trade show booths, project technical summaries or briefs, social media, YouTube videos, and listserv e-mail blasts.

Active approaches demonstrate tangible benefits at crucial times and tailor the outreach packaging to fit the purpose and audience to maximize the intended knowledge translation and impact. For example, the Iowa Highway Research Board disseminates research findings through the county engineers' association as well as through demonstration projects once a research project has been completed (Ohio DOT 2017a). The California DOT hosts a unique video conference series titled “Research Connection,” where researchers and practitioners purposefully set aside time to exchange information and transfer knowledge. A combination of both active and passive approaches is thus more likely to enable knowledge translation.

The AASHTO survey on the Value of Research (Value of Research Task Force 2015) showcases the top research marketing challenges of state DOTs (Figure 9). The primary challenge in this survey is “adequate staffing (too few or not enough time)” (Value of Research Task Force 2015). However, forms of research program marketing differ from one another, with some being less time or resource intensive than others. This analysis does not dig deeper into what types of staffing structures and processes exist or whether they can be utilized in more effective ways to generate greater value communication and enhance program value.

1.5.2 Value Capture

The perceived value of a state transportation research program is affected by communication context, strategy, content, channels, and style. However, research programs do not generate value just through good marketing practices. Real program value must be “captured”

by maximizing the impact of new knowledge discovered through research. This most commonly comes in the form of research implementation.

Research Implementation

Definitions of implementation vary among state agencies but generally consist of “using what we have learned,” as the Ohio DOT states in the Montana Implementation Survey (MDT 2017). The definition varies by agency due to differences in the nature of research: the implications of research can range from a broad policy change to a new training session to the use of a new type of construction material. In Montana's Implementation Survey (MDT 2017), the New Mexico DOT stated that it does not currently have a full definition of implementation but uses this as a starting point: “the systematic application of research results into accepted and sustained . . . organizational business practice.” The Virginia DOT uses a similar approach while adding examples, stating that implementation is considered achieved when a “research recommendation becomes a part of standard operating procedures through a policy memorandum, spec change, agency guidance document, manual, or other similar document” (MDT 2017).

The Georgia DOT remarked that its current working definition follows the Ohio DOT's but adds alignment with the DOT's strategic goals; it is the only definition in the Montana 2017 survey that provides such a reference. The agency states that implementation is “the incorporation of research findings into a new or revised agency policy, procedure, specification, standard drawing or work method, to advance progress toward achieving agency strategic goals” (MDT 2017). Common to all these definitions is the idea of putting research findings into practice through a new, standardized practice.
Figure 9. State DOT top research marketing challenges. (Source: Value of Research Task Force 2015.)

Perspectives on Implementability

Although many studies have surveyed state DOTs on various questions relating to their definitions of implementation and their implementation rates, past literature has not given equal weighting to four dimensions of implementation:

1. Definition of implementation—when is something considered to be implemented?
2. Applicability of implementation—which types of projects should be implemented?

3. Feasibility of implementation—how plausible is it for a project to be implemented?
4. Responsibility for implementation—who should be responsible for research implementation?

In a 2016 AASHTO survey, the Indiana DOT stated that its agency has a 100% implementation rate due to internal requirements to sign commitment forms for implementation prior to developing a proposal (CTDOT 2016). In contrast, many states do not keep track of implementation rates, though the reasons behind this are not clear. Washington State has a 60% implementation rate, and the agency's perspective is that “[immediate implementation potential] is only one factor in projects selected for research. Not all research can or should be implemented” (CTDOT 2016).

Implementation Plans

Steps to early-stage implementation planning, as detailed in NCHRP Synthesis 461 (Harder 2014), are listed in Box 3. Implementation plans are not uniformly required during the research lifecycle across the country. Florida DOT's 2013 Peer Exchange conducted a survey on research implementation strategies. Minnesota DOT's Local Roads Research Board, for example, provides researchers with opportunities to transition their findings into practice upon the development of an implementation plan by a Technical Advisory Panel. The Iowa Highway Research Board similarly does not require a completed plan for implementation, though it does ask for a business plan or a technology transfer and implementation section in final research reports (Ohio DOT 2017a). While implementation plan requirements have been highlighted in past peer exchanges, a systematic review of how they are incorporated into the research selection process or post-project review has not been conducted to date.

Box 3. Early-Stage Implementation Planning

Exploration and Adoption: What are the various methods for implementing this type of practice?
Program Installation: What structures and resources are eventually necessary to accomplish implementation?
Initial Implementation: What considerations are important during early use of the new practices? Will these require changes and commitments?
Full Operation: What considerations are important as changes are experienced and scaled up? How will the organization and practitioner learn the new way of doing things?
Assessment: What is the evaluation plan, and over what length of time, to determine if the new practice is beneficial?
Sustainability: What will ensure the new practice has long-term survival and continued effectiveness?

Responsibility for Implementation

Implementation requires an initial evaluation of a project's readiness for implementation by subject matter experts and experienced practitioners in roles such as project champion or member of a technical advisory panel. If these experts have been involved since the beginning of the research lifecycle, they are likely able to ascertain readiness for implementation at multiple stages during the lifecycle and to advise on the implementation plan as research is completed. While readiness depends in large part on the technical outcomes of research, factors such as funding for deployment, the ability of champions and practitioners to pursue the process, and other competing needs for time and resources typically become considerations.

Figure 10 lists the personnel responsible for managing implementation at transportation agencies, taken from the Montana DOT's study in 2017 (MDT 2017). The arrangements range along a spectrum from passive (i.e., leaving implementation completely to practitioners/research customers) to active (rigorously managed and coordinated processes across portfolios of projects).

Tracking and Evaluating Implementation

Tracking the implementation process—from implementation project selection to project performance assessment over time—is another common challenge that agencies face. This challenge is associated with factors described earlier, such as the degree to which implementation is passively or actively managed, the amount of funding available or set aside specifically for implementation, the expertise to oversee implementation over time, and attention to specific stages of implementation as a project evolves. A systematic process and supporting infrastructure, such as a database or tracking system, can make it easier to track implementation.
Implementation is often an iterative process; projects evolve over time and must be adapted to available resources and sometimes to new findings in related research. Given this evolution, as well as the long time it takes to observe the impact of implementation, some agencies concentrate their available resources and effort on priority projects and set a time frame of 3 to 5 years after implementation has commenced to assess and report on performance. They also focus on crafting tailored performance measures that are "unified, user-oriented, scalable, systematic, effective, and calculable" (Florida DOT 2013).

A method developed by Louisiana's LTRC for recording project implementation status, from project initiation through 5 years after the project concludes, is used by at least five DOTs (Colorado, Virginia, Indiana, Florida, and Washington). However, this approach merely flags projects that are in the process of implementation or recommended for implementation; performance is not truly assessed for the flagged projects.

Among the states that do have a performance tracking procedure, Virginia DOT uses return on investment along with projects delivered on time and within budget. Missouri DOT tracks the number of projects completed on time and within budget and reports that almost 90% of its research projects have been completed on time in recent years. Illinois DOT tracks research projects by monetary outcomes as well as by whether they reduce person-hours (quantitative) or improve internal processes (qualitative). As with assessment of research project performance, a mix of quantitative and qualitative indicators, tailored to the project and woven together in the form of an implementation narrative, is more likely to make the performance assessment credible. Performance measurement can be rigorous without monetizing or quantifying every aspect of a project or program; rigor does, however, imply careful and systematic documentation.
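To make the quantitative indicators above concrete, here is a minimal, hypothetical sketch of how a program might aggregate on-time and within-budget percentages and a program-level benefit-cost ratio. The dictionary keys are illustrative assumptions, not any agency's actual schema, and monetizing benefits, which is the hard part in practice, is simply taken as given.

```python
def program_metrics(projects: list[dict]) -> dict:
    """Aggregate simple quantitative indicators of the kind cited above.

    Each project is a dict with illustrative keys:
      on_time (bool), within_budget (bool), benefit ($), cost ($).
    """
    n = len(projects)
    on_time = sum(p["on_time"] for p in projects) / n
    within_budget = sum(p["within_budget"] for p in projects) / n
    # Program-level benefit-cost ratio: total estimated benefits over
    # total research cost.
    bcr = sum(p["benefit"] for p in projects) / sum(p["cost"] for p in projects)
    return {
        "pct_on_time": 100 * on_time,
        "pct_within_budget": 100 * within_budget,
        "benefit_cost_ratio": bcr,
    }
```

Numbers like these would form only the quantitative strand of the implementation narrative described above; the qualitative strand (process improvements, before-and-after documentation) cannot be reduced to a single figure.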

Who is responsible for the implementation of research results?

California: Project Managers
Colorado: In process of hiring an Implementation Engineer
District of Columbia: Technical staff in the requesting department
Florida: Functional Area Champions; monitoring by the Research Center Performance Coordinator
Georgia: Research Implementation Manager
Illinois: Research Implementation Engineer
Indiana: Project Managers
Iowa: Office associated with the project
Louisiana: The Project Review Committee or the Research Champion, with assistance of the Technical Liaison
Maine: Project Champion
Maryland: Technical office that requested the research
Minnesota: Practitioner or an expert in the topic area
Missouri: Project Manager and the technical research champion
Nevada: Implementation falls to internal "customers"
New Hampshire: Project Champion
New Jersey: Research project customer who initiates a project with a problem statement
New Mexico: Research Implementation Engineer
North Carolina: NCDOT customer is ultimately responsible for implementation of research; monitoring by the Research Implementation Manager
Ohio: Technical expert responsible for that type of work; Project Managers are involved with project implementation
Oregon: Research Customer
Texas: Project Managers
Utah: Research Division: (1) Innovations and Implementation Manager, (2) Research Implementation Engineer
Vermont: None, but hope all do implementation
Virginia: Implementation Coordinator, but all have a recognized role in implementation
Washington: Subject Matter Experts (updated after the Montana 2017 Implementation Study)

Source: Montana 2017 Implementation Study (MDT 2017).
Figure 10. Examples of who is responsible at agencies for managing implementation.

Examples include details on the "before and after" attributes of a system and the impacts or changes due to the implementation of a research finding.

Evaluating the Research Program

Much research has been conducted on performance measures for research programs. NCHRP Web-Only Document 127: Performance Measurement Tool Box and Reporting System for Research Programs and Projects (Krugler et al. 2006) provides a list of 20 common types of performance measures, based on outcome, output, efficiency, resource allocation, and stakeholder metrics, as seen in Box 4.

Box 4. Performance Measures

1. Return on investment or benefit-cost ratio;
2. Lives saved;
3. Construction, maintenance, and operations cost savings;
4. Reduction in crashes;
5. Reduction in system delays;
6. Positive environmental impact;
7. Quality of life enhancement;
8. Safety enhancement;
9. Level of knowledge increased;
10. Management tool or policy improvement;
11. Public image enhancement;
12. Technical practices or standards upgrades;
13. Leadership;
14. Percent of projects/products implemented;
15. Percent of projects completed on time;
16. Percent of projects completed within budget;
17. Number of contractors;
18. Number of contractor partnerships;
19. Percent of satisfied customers; and
20. Contribution to the overall mission of the department.

Many states appear to track research program performance, and most do so with minimal or moderate levels of effort (Missouri DOT 2014). Figure 11 shows the tools most often used to collect these data. States also appear to apply performance measures readily, although the applications vary: the Iowa DOT cites them to "determine program effectiveness," the Montana DOT to "make a difference in organizational business processes," and the Louisiana DOT uses them in individual performance evaluations (Missouri DOT 2014).

Figure 11. Tools used to collect performance measures, by agency count. The tools reported include Excel, surveys, Access, project management software, custom-built software, websites, government databases, SiteManager, and state DOT software. Source: Missouri DOT 2014.

The literature also describes marketing practices for research programs and monitoring of research effectiveness, such as the Value of Research Task Force Marketing Survey (Value of Research Task Force 2015). That survey finds that the most common monitoring methods are customer feedback and reviews with management and working groups; the least common are user surveys and reviews with consultants or universities. Recent research on how research programs continuously improve and enhance value through these monitoring and evaluation mechanisms allows this study's analysis to delve into the structural and procedural aspects of maximizing and sustaining value in research programs.

1.6 Conclusion

Previous Synthesis reports, peer exchanges, AASHTO surveys, and annual reports published by state DOTs shed light on many important facets of state transportation research programs along the dimensions of program capability, program management, program quality, and program value. Much past research provides anecdotal evidence of practices that achieve excellence in process management, research quality, and research value, which is helpful for thinking about different approaches to research in different contexts. This study supplements the research described in this literature review by more broadly identifying the structures and processes in place that enhance program capability, program management, program quality, and program value in conducting transportation research. To this end, this study moves past qualitative comparisons between a few programs and examines the distribution of structural and procedural differences across state transportation research programs.

TRB’s National Cooperative Highway Research Program (NCHRP) Synthesis 522: Managing State Transportation Research Programs identifies the current state of practice of managing state transportation research programs. The report highlights existing resources, desired individual skill sets, core competencies, and structures that are in place for departments of transportation (DOTs) to manage and conduct transportation research, especially federally funded research.

In essence, NCHRP Synthesis 522 addresses how transportation agencies organize and manage their research programs to strive for quality and positive impacts on the transportation system over time (value). The report includes a four-dimensional framework to analyze and shed light on how state DOT research programs with differences in agency needs, resources, and constraints are able to produce programs of high quality and value.

State transportation agencies conduct applied research with a goal of ultimately creating new knowledge to enhance the transportation system. Agency research as an activity requires special skills and capabilities—it convenes practitioners, scholars, and policy makers to identify and pursue the knowledge that is most needed.

These and other attributes of research make it unlike other DOT functions such as planning, programming, construction, maintenance, and operations, even though it eventually enables agencies to perform those functions. The payoffs and innovative outcomes of research can be significant and valuable, although they are rarely immediate.
