
Currently Skimming: Pages 37-53


From page 37...
The survey results provided an overview of the transit agencies' evaluations of their processes and issues in relation to transit service evaluation standards. After a review of these results, six agencies were chosen as case examples.
From page 38...
Nine process improvement teams were created around individual performance areas with specific metrics assigned to each team. A basic description of the transit agencies included in the case examples (ridership, revenue hours, and peak vehicle requirements for all services operated) ...
From page 39...
... estimate passenger arrivals to each station by the minute using its automated fare collection system. Therefore, schedule adherence is now based on passenger wait time (measured by the percentage of passengers who wait the scheduled headway or less for a train to arrive) ...
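The wait-based standard described above can be sketched in code: schedule adherence becomes the share of passengers whose observed wait did not exceed the scheduled headway. This is a minimal illustration, not the agency's actual implementation; the function name and the sample data are hypothetical.

```python
def wait_time_adherence(wait_times_min, scheduled_headway_min):
    """Percentage of passengers whose wait was no longer than the
    scheduled headway (the wait-time schedule adherence standard)."""
    if not wait_times_min:
        raise ValueError("no passenger wait observations")
    on_time = sum(1 for w in wait_times_min if w <= scheduled_headway_min)
    return 100.0 * on_time / len(wait_times_min)

# Illustrative example: 6-minute scheduled headway, five observed
# per-passenger waits in minutes (data invented for demonstration).
print(wait_time_adherence([3.0, 5.5, 6.0, 7.2, 4.1], 6.0))  # → 80.0
```

In practice, the per-passenger waits would be inferred from minute-by-minute fare-collection arrival counts and actual train arrival times rather than observed directly.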
From page 40...
... and jobs given access to the network and percentage of transfers) ...
From page 41...
... challenge. As mentioned above, in a resource-constrained environment, prioritizing one standard often occurs at the expense of another.
From page 42...
Corpus Christi Regional Transportation Authority, Corpus Christi, Texas. The Corpus Christi Regional Transportation Authority (CCRTA) provides an integrated system of public transportation services in the Corpus Christi metropolitan region.
From page 43...
Agency Attitude: The executive staff are knowledgeable and supportive of the performance evaluation process. The Operations Division is very involved in the process.
From page 44...
... rail vehicles in maximum service. Average weekday ridership in 2016 was 345,143, and annual ridership was 103.34 million.
From page 45...
The performance evaluation process is transparent: everyone comprehends why RTD looks at specific routes. The businesslike approach, combined with the use of classes of service for jurisdictional equity, is understood by board members and stakeholders, although it does help to remind them periodically.
From page 46...
Keys to Success
• Staff who recognized the need for performance measures and took the time and effort to develop the performance evaluation process.
• Willingness to evolve as the need arises.
From page 47...
Are Some Standards More Important? On-time performance and farebox recovery ratio are important standards.
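Both standards named here are simple ratios: farebox recovery is fare revenue as a share of operating cost, and on-time performance is the share of trips operated within the on-time window. A minimal sketch, with invented figures for illustration only:

```python
def farebox_recovery_ratio(fare_revenue, operating_cost):
    """Fare revenue as a fraction of operating cost."""
    return fare_revenue / operating_cost

def on_time_performance(on_time_trips, total_trips):
    """Fraction of trips operated within the on-time window."""
    return on_time_trips / total_trips

# Hypothetical figures: $2.5M in fares against $10M operating cost,
# and 90 of 100 trips on time.
print(farebox_recovery_ratio(2_500_000, 10_000_000))  # → 0.25
print(on_time_performance(90, 100))                   # → 0.9
```

Agencies differ in what counts as "on time" (the early/late window) and in which costs enter the denominator, so the inputs matter as much as the arithmetic.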
From page 48...
... operates Sound Transit's regional express bus service and Link light rail in King County, along with Seattle Streetcar. According to the 2016 NTD data, KCM's service area population is 2.117 million.
From page 49...
... 3. Increasing frequency, and 4. ...
From page 50...
... mobility. Therefore, when resource allocation is tied directly to performance (as it is at KCM) ...
From page 51...
... ways to sidestep the discomfort associated with confronting an identified problem head-on. Management must be willing to accept the available data and what it says, and to take corrective action based on the analysis that is possible.
From page 52...
On-time performance is a longstanding concern for the agency. "Dig and find" was the approach taken by each team, and this team found technology-related issues that affected the reporting of on-time performance.
From page 53...
... new performance evaluation system was implemented, ridership was being reported separately by the planning and operations departments (the totals did not agree), data would be pulled at different times each month, and definitions of certain metrics were unclear.
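The reporting mismatch described above can be made concrete: if two departments total the same boarding records against different monthly cut-off times, their figures will not agree. The sketch below is purely illustrative; the departments' actual systems, cut-offs, and figures are not known from this text.

```python
from datetime import datetime

# Hypothetical boarding records: (timestamp, boardings counted).
boardings = [
    ("2016-03-31 22:15", 412),
    ("2016-03-31 23:40", 198),  # late-evening trips
    ("2016-04-01 00:20", 57),   # owl service operating past midnight
]

def monthly_total(records, cutoff):
    """Sum boardings recorded strictly before the given cut-off time."""
    return sum(n for ts, n in records
               if datetime.strptime(ts, "%Y-%m-%d %H:%M") < cutoff)

# One department closes the month at midnight; the other includes
# owl service through 3:00 a.m. on the first of the next month.
planning_total = monthly_total(boardings, datetime(2016, 4, 1, 0, 0))
operations_total = monthly_total(boardings, datetime(2016, 4, 1, 3, 0))
print(planning_total, operations_total)  # → 610 667
```

Agreeing on a single cut-off and a shared metric definition, as the new evaluation system evidently required, removes this class of discrepancy.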
