TABLE 35
SATISFACTION WITH CURRENT RIDERSHIP FORECASTING METHODS

Level of Satisfaction        No. Agencies Responding    Agencies Responding (%)
Satisfied                              11                         31
Partially satisfied                    12                         34
Not satisfied                          12                         34
Total responding                       35                        100

TABLE 36
DESIRED IMPROVEMENTS TO RIDERSHIP FORECASTING METHODS

Improvement                                                            No. Agencies Responding    Agencies Responding (%)
Availability and/or accuracy of input data at the appropriate scale              22                         81
Accuracy of the results                                                           16                         59
Inclusion of more predictive variables                                            11                         41
Less time-intensive methodology                                                   11                         41
Flexibility to address a wider variety of situations                              11                         41
Simplification of the procedures                                                   8                         30
Other                                                                              7                         26
Total responding                                                                  27                        100

... of improvements desired, this question was open ended. Table 37 summarizes the results. Improvements to input data and methodology were mentioned most frequently. There is a need for greater data availability, more current data, and data at a more detailed level. Methodology needs were more diverse, reflecting that agencies are at different stages in developing their forecasting methods. Among the specific responses were greater sophistication, more consistency, and easier-to-apply models. "Approaches" is a catch-all category that includes adopting written guidelines, basing ridership forecasting on industry standards and best practices, and allowing alternative-specific constants in FTA procedures.

TABLE 37
ONE IMPROVEMENT TO RIDERSHIP FORECASTING METHODOLOGY

Improvement                                No. Agencies Responding    Agencies Responding (%)
Input data                                           11                         44
Methodology                                          10                         40
Approaches                                            3                         12
In-house staff expertise/understanding                2                          8
Linkages (GIS, regional indicators)                   2                          8
Total responding                                     26                        100

GIS = geographic information system.

LESSONS LEARNED

Roughly half of all survey respondents shared lessons learned from the process of developing and using ridership forecasting methodologies. The lessons learned can be grouped into seven broad categories, as shown in Table 38. Responses are summarized by category below.

• Caution regarding results: be realistic in ridership estimates; use a range and confidence level, because specific predictions are almost always wrong; review model results with peers, other corridors, and elasticities; temper results with experience; a full understanding of current ridership behavior is critical for forecasting. (A brief numerical sketch of the range idea follows this list.)
• Simplify the approach: focus on one or two tools, for synergy and to avoid conflicting forecasts; trend forecasting and professional judgment can be as accurate as regression and econometric models; in-house expertise is more effective and less expensive than consultants.
• Caution regarding data and application: understand the limits of the data being used; use trip generation rates with care, because they may not apply across the metropolitan area; use caution in applying regional model outputs at a different scale (e.g., route or station level); AFC data overcome limitations of survey- and census-based origin/destination data, particularly the out-of-date issue.
• Communication and partnership: inform and cooperate with other local agencies (such as the MPO) and with peers within the transit industry.
• Develop local factors: forecast models from external sources do not work well; they are complicated, time-intensive, and data-intensive, and they provide inferior results; local elasticities are preferred over industry averages; use experience and results from the past.
• Simplify the model: car ownership and income do not provide enough improvement to warrant the time and difficulty of acquiring the data at the appropriate scale.
• Other: smaller versus larger agencies. For smaller agencies, trip rates and population and employment numbers can suffice; for larger agencies, network impacts are important, so impacts should be evaluated on a systemwide basis. (A simple trip-rate sketch for the small-agency case follows Table 38.)
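Two of these lessons, using locally developed elasticities and reporting a range rather than a point prediction, lend themselves to a small numerical illustration. The sketch below is not drawn from the synthesis: the fare levels, ridership figures, and elasticity values are hypothetical placeholders, and the calculation is simply a constant-elasticity fare model evaluated across a low/central/high elasticity range.

# Illustrative sketch only; all values are hypothetical, not from the synthesis.
# Applies a locally estimated fare elasticity across a plausible range so the
# output is a forecast interval rather than a single point prediction.

def ridership_after_fare_change(base_riders, old_fare, new_fare, elasticity):
    """Constant-elasticity (log-form) fare model: R1 = R0 * (F1 / F0) ** e."""
    return base_riders * (new_fare / old_fare) ** elasticity

base_riders = 1_000_000            # current annual boardings (hypothetical)
old_fare, new_fare = 2.00, 2.25    # proposed fare change (hypothetical)

# Bracket the locally estimated central elasticity with plausible bounds
# instead of relying on a single industry-average value.
elasticity_range = {"low impact": -0.25, "central": -0.35, "high impact": -0.45}

for label, e in elasticity_range.items():
    forecast = ridership_after_fare_change(base_riders, old_fare, new_fare, e)
    change = forecast / base_riders - 1
    print(f"{label:>11}: {forecast:,.0f} annual boardings ({change:+.1%})")

Presenting the three results together keeps the forecast honest about its uncertainty, consistent with the advice to use a range rather than a single prediction.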
TABLE 38
LESSONS LEARNED

Lessons Learned                            No. Agencies Responding    Agencies Responding (%)
Caution regarding results                             7                         37
Simplify the approach                                 4                         21
Caution regarding data and applications               4                         21
Communication and partnering                          2                         11
Develop local factors                                 2                         11
Simplify the model                                    2                         11
Other                                                 7                         37
Total responding                                     19                        100
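The small-agency point under "Other" can be sketched in the same spirit. The trip rates and zone totals below are hypothetical placeholders rather than values from the synthesis; the calculation simply applies daily transit trip rates to population and employment in a few service-area zones, with no network model.

# Illustrative sketch only; trip rates and zone totals are hypothetical.
TRIPS_PER_RESIDENT = 0.12   # daily transit trips per resident (assumed)
TRIPS_PER_JOB = 0.08        # daily transit trips per job (assumed)

zones = [
    # (zone name, population, employment) -- hypothetical service-area zones
    ("Downtown", 4_500, 22_000),
    ("Eastside", 18_000, 3_500),
    ("Westside", 12_500, 5_000),
]

total_trips = 0.0
for name, population, employment in zones:
    trips = TRIPS_PER_RESIDENT * population + TRIPS_PER_JOB * employment
    total_trips += trips
    print(f"{name:>8}: {trips:,.0f} estimated daily transit trips")

print(f"Systemwide: {total_trips:,.0f} estimated daily transit trips")

For a larger agency, a zone-by-zone sum of this kind misses network effects such as transfers and interactions among routes, which is why the lesson recommends evaluating impacts on a systemwide basis.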