8
Performance Standards and Educational Cost Indexes: You Can't Have One Without the Other

William D. Duncombe and John M. Yinger

Introduction

Performance standards and educational adequacy have been at the center of recent debate on educational reform. Many states have implemented new performance standards, often based on student test scores, and a district's state aid is sometimes linked to its success in meeting the standards (Clotfelter and Ladd, 1996). National politicians have debated the merits of a nationwide testing program, which is a way to obtain comparable performance indicators across states. In addition, several state supreme courts have ruled that their state constitution requires a system enabling all school districts to reach an adequate performance level (see Minorini and Sugarman, Chapter 6 in this volume), and state aid programs can be used to provide all districts with the funds they need to reach a performance level that is thought to be adequate—or some higher standard.

Performance standards are designed to encourage more effective educational practices, particularly in school districts that are currently not performing well, by holding school districts accountable. The trouble is that a district's performance is influenced not only by the actions of its administrators and teachers but also by factors outside of its control, such as the nature of its student body. A recent article in The New York Times expresses this concern very clearly. In a discussion of report cards and school rankings, now used in 35 states, this article points out that "because such rankings are often based exclusively on test scores, which give only a partial snapshot of a school's performance, some educators worry that schools may be unfairly blackballed, especially those with high populations of poor children" (Steinberg, 1998).




Thus, a focus on performance is inevitably unfair unless it can somehow account for the impact on performance of factors that are outside the control of school officials. Without such an accounting, some schools get credit for favorable conditions that were not of their making and other schools get blamed for unfavorable conditions over which they have no control. In order to be fair, school report cards and performance-based state aid systems must distinguish between poor performance based on external factors and poor performance based on school inefficiency.

Similarly, a state aid program that brings districts up to a minimum spending level, which is called a foundation program, cannot ensure that a performance-based adequacy standard will be reached in all districts unless this spending level is much higher than the amount a typical district needs to reach an adequate performance. This problem reflects the fact that school district performance is also influenced by the cost of education, which varies widely from district to district based on wage rates, student characteristics, and other factors that are outside the control of district officials. Existing state aid formulas either ignore these factors altogether or else use ad hoc corrections, such as "weighted pupil" counts, that account for them only partially at best. As a result, a foundation program that provides enough revenue for an average district to meet an adequate performance standard leaves a high-cost district short, often far short, of the revenue it needs.

In this chapter, we explain in detail why a performance standard, whether it is set at a level that defines an adequate education or at some other level, must go hand in hand with an educational cost index; we discuss the alternative methods for estimating educational cost indexes; and we show how high-cost districts can be brought up to a performance-based state adequacy standard by incorporating these cost indexes into a foundation aid program. Our analysis is illustrated with data from New York State. We find, for example, that the large central city districts must spend two to three times as much as the average district to reach the same performance standard.

The Conceptual Foundations of Educational Cost Indexes

An educational cost index is designed to measure how much a school district would have to spend, relative to the average district, to obtain any given performance target. Some scholars, including Guthrie and Rothstein (see Chapter 7 in this volume), have used the term "cost index" to refer only to differences in input prices across districts. However, we use the term to refer to a comprehensive accounting of the reasons why some districts must spend more than others to achieve any given performance level—a definition that, as we will show, involves far more than just input prices. After an introductory example, this section explains the relationship between educational performance measures and cost indexes, and discusses, in general terms, the factors that influence educational costs. The following section reviews alternative methods for estimating educational cost indexes.

An Introductory Example

Before turning to educational performance and costs, it may prove helpful to explore an example from everyday life that involves the same concepts. The example is the service of providing comfortable shelter. A natural measure of performance for this service is the indoor temperature. This is not, of course, the only dimension of comfort. In some contexts, one might want to know about the extent to which rain leaks in; in others, humidity might be a concern. As with any performance standard, however, some simplification is necessary, and the indoor temperature appears to measure the dimension of comfort that is of most concern to most people in this country. Thus, we will focus on temperature as a measure of performance.

Now suppose some national consumer group interested in comfortable shelter sets a standard for adequate performance at 72 degrees. A natural question to ask is: How much would it cost to achieve this standard in different parts of the country? To answer this question, one must consider the technology with which comfort is provided. This technology is straightforward: a household must purchase various inputs, such as a furnace, natural gas, and insulation, that can deliver or preserve heat or cold and thereby provide an indoor temperature that is different from the outdoor temperature. The impact of these inputs depends, of course, on the outdoor temperature, as well as on other environmental factors, such as the wind.

The cost of achieving the comfort standard therefore depends on two factors. First, it depends on the prices of inputs. The price of natural gas is not the same in all parts of the country, for example, and people in high-price places must pay more to obtain the same amount of natural gas. Second, it depends on the environment. During the winter, people in Minneapolis face much lower outdoor temperatures than do people in San Diego, so it costs them more to bring the indoor temperature up to the standard. To put it another way, people in Minneapolis must purchase more inputs to reach the standard. In another season (or for other cities), one obviously would have to consider the cost of bringing down the indoor temperature when the outdoor temperature is above 72 degrees. A comfort cost index that reflected only gas price differences could be developed, but it obviously would provide an incomplete accounting of costs because it would ignore the outdoor temperature. It follows that a comprehensive comfort cost index must consider both the prices of inputs and the harshness of the environment. As we will see, these are exactly the factors at work in determining the cost of education.

The factors discussed so far are all outside a household's control; the same cannot be said for everything that influences comfort. Households may, for example, buy an inefficient furnace, use a relatively expensive type of fuel, neglect to have their furnace maintained properly, or neglect to install the proper weatherstripping. These decisions all affect how much a household spends to obtain comfort, but they should not influence a measure of the cost of comfort, which, as we use the term, is based solely on factors outside a household's control. This leads us to another concept, namely efficiency, which is the extent to which a household uses best-practice methods to achieve comfort. In some contexts, it may be helpful to measure inefficiency and to separate its impact on comfort spending from the effect of cost factors. However, it would be inappropriate to let household choices influence a cost index, so any method to obtain a cost index must be insulated from the effects of household choices, including those that determine efficiency.

The key question for our purposes is: How could one determine the cost of meeting the performance standard in various locations? One way to proceed would be to select a certain type of house with certain heating and insulation characteristics as a standard and then use engineering studies to determine the amount of fuel it would take to keep this house at 72 degrees under the average weather conditions that are experienced in each location. The cost of meeting the standard is the amount of fuel required multiplied by the cost of fuel in that location. This approach has the advantages that it directly accounts for household choices (by using a standard house) and that it can be based on extensive information on weather conditions. It also has the major disadvantage, however, that it requires a detailed engineering study that is usually not available.

An alternative approach, which is analogous to the educational cost indexes discussed below, is to gather information for a sample of locations on (1) average household fuel bills, (2) actual indoor temperatures obtained by households, (3) a few key measures of weather conditions, such as average temperature or heating degree days, (4) the cost of the main input, namely fuel, and (5) average household choices that might influence fuel efficiency, such as whether they maintain their furnace annually and whether they use the appropriate weatherstripping. An analysis of this information using regression analysis, which is a standard statistical procedure, can then reveal the extent to which input costs and weather conditions affect spending for fuel, holding constant both actual temperatures and household choices. This analysis leads directly to a cost index, which is defined as the amount a household would have to spend in each location, relative to a household in the average location, to obtain a given indoor temperature, under the assumption that the household makes efficient choices about its heating/cooling system. If, for example, Minneapolis has a cost index of 200, households there would have to spend twice as much as households in the average city to obtain the same level of comfort. This approach is more abstract than the previous one in the sense that it does not consider all the details of heating technology, but it captures the main features of the problem and has the great advantage that it can be implemented with readily available data.

In combination, a comfort standard and a cost index reveal how much households in each community would have to spend to meet the standard, based on factors outside their control. In principle, this information also could be obtained from a production study conducted in each community, which would reveal the set of inputs needed to achieve the standard, along with information on each community's input prices. The regression approach provides the same information at much lower cost by determining how spending varies with input costs and environmental conditions, controlling for actual comfort outcomes.

Measuring Educational Performance

With education, as with home comfort, one cannot set a performance standard without selecting a way to measure performance. To put it another way, one cannot determine whether a school has met a performance standard unless its performance can be observed and measured. Policymakers may wish to avoid this choice, because selecting a standard is inevitably somewhat controversial. No set of performance standards can capture all aspects of learning, and schools may respond to specific standards by "teaching to the test" or otherwise shifting their resources to meet the standard at the expense of other legitimate objectives. Nevertheless, this choice cannot be avoided. Any policy to enhance school performance involves, either explicitly or implicitly, a specific performance measure. The trick is to select performance measures that are rich enough to capture success in a range of educational activities.

For the most part, the selection of a performance measure is based on the judgment of politicians and educational policy officials, perhaps with some input from scholars. The most common measure is based on some kind of test score, such as an average elementary reading or math score. A dropout rate is another widely used measure at the high school level.

To set a performance standard, policymakers must select both a measure of performance and the level of performance school districts are expected to meet. For example, all school districts might be expected to achieve a certain average test score or to ensure that a certain percentage of their students score above some standard reference point on a certain test. Standards of this type can be set for a single indicator or for a set of indicators. For example, school districts might be expected to have a certain average test score and a certain graduation rate.

We have developed an alternative approach, which selects performance indicators on statistical grounds. In particular, this approach determines which performance indicators are valued by voters, as indicated by their correlation with property values and school spending. This approach, which is explained in detail in Duncombe et al. (1996) and Duncombe and Yinger (1997), results in an index of educational performance. This index is a weighted average of the performance indicators that are found to be statistically significant, where the weights reflect the value voters place on each indicator.1 In the case of New York State, this approach leads to an educational performance index based on three performance indicators: the average share of students above the standard reference point on the 3rd- and 6th-grade Pupil Evaluation Program (PEP) tests for math and reading, the share of students who receive a more demanding Regents diploma (which requires passing a series of exams), and the graduation rate. These indicators reflect a wide range of school district activities, including both elementary and secondary teaching and programs designed to promote student retention, and they reflect the degree of success at both the high and low ends of the student performance distribution.

Although these indicators are identified by an objective, statistically based procedure, they do not, of course, summarize all educational activities by a school district. Like all other approaches to measuring performance, this approach makes the problem manageable through some simplification. Moreover, this approach results in a performance yardstick, but it cannot determine the point on the yardstick that school districts should be expected to meet or that defines an adequate performance. As with other approaches, the selection of the performance target must be based on the judgment of public officials.

Separating Factors In and Outside the Control of School Officials

Either indirectly, as in the case of district report cards, or directly, as in the case of a performance-based aid system, performance standards are intended to boost a school district's incentive to use effective educational policies. The problem, however, is that actual performance is influenced not only by the decisions of school officials but also by factors outside their control. Thus, some districts find it easy to meet a standard even if they are very inefficient, whereas others find it impossible to meet a standard even if they are more efficient than other districts. It is neither fair nor effective for a state to reward districts that achieve high performance (or to punish districts that perform poorly) based on factors that are outside their control. A fairer approach is to reward districts that perform well despite external obstacles, such as concentrated poverty, and punish districts that do not perform well despite favorable circumstances.

The indoor-temperature example presented earlier may help to make this point because it explains how external factors work. Just as some communities face relatively high fuel prices and harsh weather, which raise the cost of meeting any comfort standard, some school districts face relatively high input prices (such as teacher salaries) and relatively harsh educational environments, which raise the cost of meeting any educational performance standard. Thus, the key to removing external factors is to calculate an educational cost index; as we use the term, such an index measures the impact of input and environmental costs, not just input prices. This cost index plays a key role in public policy; it is not fair to expect a high-cost district to achieve the same performance as other districts unless it is given enough resources to compensate for high costs.

The Role of Input Prices

In the case of education, the most important input is teachers, so in constructing a cost index, it is vital to account for teachers' salaries. Secondary inputs, such as school facilities, also play a role in delivering education, but data on the prices of these inputs are generally not available. In some cases, data on administrators' salaries are available, but these salaries are so correlated with teachers' salaries that they add little to the analysis. Like almost all the literature, therefore, we will restrict our attention to the role of teachers' salaries.

A cost index is designed to measure the impact of factors outside the control of school officials. It is not appropriate, therefore, to use actual teachers' salaries directly in constructing a cost index, because those salaries reflect both the generosity of the school district, a factor over which district officials have control, and the underlying labor market conditions, which school officials cannot influence. A cost index should reflect the fact that some school districts are located in high-wage labor markets, where they must pay high salaries to attract people away from other school districts or away from the private market, and it should reflect the fact that the external conditions in some school districts are so harsh that teachers will not come there without receiving "combat pay"; but it should not reflect the fact that some school district administrators pay higher salaries than necessary to attract their teachers, because they are poor negotiators or for any other reason. A cost index should not be affected by school districts' choices, so the influence of school officials on teachers' salaries poses a challenge to anyone who wants to construct a cost index. Fortunately, however, well-known statistical procedures can separate the impact of school officials on teachers' salaries from the impact of external factors and produce a cost index based only on factors outside the control of school officials. These procedures are discussed in a later section.

The Role of Environmental Factors

The home comfort example reveals that the cost of meeting a performance standard depends not only on input prices but also on the environment in which the relevant services are provided. This lesson carries over into education, as school districts with a harsher educational environment must pay more to obtain the same educational performance as other districts. This section explains the impact of environmental factors on educational costs and shows how this impact can be estimated using widely available data.

The key role of environmental factors, also called fixed inputs in the literature, was first identified in the Coleman Report (Coleman et al., 1966), which showed that a student's performance on standardized tests depended not only on his or her own characteristics and family background but also on the characteristics and backgrounds of the students in his or her class. All else equal, for example, a student's performance is likely to be lower if she comes from a poor family or if a large share of her classmates come from poor families. This finding translates into a statement about educational costs. If performance declines as student poverty increases, then a district with a high poverty rate cannot achieve the same performance as a district with a low poverty rate without running programs (which, of course, cost money) to offset the impact of poverty.

The important role of environmental factors in educational production has been verified by dozens of studies. A review of many early studies is provided by Hanushek (1986). Good recent studies, such as Ferguson (1991), Ferguson and Ladd (1996), and Krueger (1997), use school- and student-level data and provide a more detailed analysis of the relationship between student/family characteristics and student performance. The study by Ferguson and Ladd, for example, finds that a student's 4th-grade educational performance (on reading and math tests) is affected by, among other things, the share of students receiving a free lunch (a measure of poverty), the share of adults in the district with a college degree, a measure of student turnover, and district enrollment. These studies are analogous to engineering studies that link detailed weather conditions and indoor comfort in each type of house.

Production studies focus on the impact of environmental factors on a measure of performance, such as a test score, holding constant inputs selected by the school, such as the student/teacher ratio. These studies imply that costs are higher in school districts with a harsher educational environment, but they do not estimate cost differences directly. Moreover, the results of these studies vary significantly, depending on the methodology, the quality of the data, and other factors, so that even if the results were translated into cost differences across districts, these differences would vary widely from one study to the next.

Another set of studies shifts the focus to educational costs. These studies, which are analogous to a study of spending for home heating across a sample of communities, determine the extent to which districts with a harsh educational environment, as measured by the characteristics of their students, must pay more to achieve the same performance as other districts, where performance is measured by a set of performance indicators. These studies include Bradbury et al. (1984), which looks at all local spending, including spending on education, as well as studies by Ratcliffe et al. (1990), Downes and Pogue (1994), Duncombe et al. (1996), and Duncombe and Yinger (1997). These studies build on a well-known general treatment of environmental factors by Bradford et al. (1969).

At one level, these cost studies are equivalent to production studies; any statement about production can be translated into a statement about costs and vice versa. In practice, however, the cost approach has several advantages over the production approach as a tool for informing state education policies. First, cost studies focus on school districts, which are the focus of state policy, instead of on individual students.2 Second, the cost approach makes it possible to examine a range of performance indicators simultaneously instead of one performance indicator at a time. Third, the data required to implement the cost approach are widely available. This does not imply, of course, that the cost approach resolves all the controversies about the nature of educational production that have been debated in production studies. It does imply, however, that the cost approach is a practical alternative to the production approach, one that makes it possible for state officials to design an educational finance system that accounts, in a reasonable way, using up-to-date data, for key features of educational production in their state.

Perhaps the most crucial feature is variation in the educational environment across school districts. Existing cost studies all demonstrate that a harsher educational environment, as characterized by high rates of poverty and single-parent families, for example, results in a higher cost to obtain any given performance level. Just as the harsh weather "environment" in Minnesota ensures that people who live there must pay more during the winter than do people in San Diego to maintain their houses at a comfortable temperature, the harsh educational "environment" in some school districts, particularly big cities, ensures that those districts must pay more than other districts, sometimes much more, to obtain the same educational performance from their students.

State educational officials are often aware that environmental factors matter. For example, a report on the status of the state's schools by the New York State Education Department says that "Five indicators, each associated with poor school performance, are useful for identifying students at risk of educational disadvantage: minority racial/ethnic group identity, living in a poverty household, having a poorly educated mother, and having a non-English language background" (State University of New York, 1997:3). However, states' performance standards and state aid programs do not take account of these environmental factors in any systematic way. As a result, these programs do not provide sufficient revenue to high-cost districts to allow them to meet the same performance standard as other districts.

The role of environmental factors also is widely ignored in the debate about the relative effectiveness of public versus private schools. Existing studies focus on whether differences in performance between students in private and public schools, if any, can be explained by the possibility that students who attend private schools (or their parents) are more motivated than students who attend public schools (or their parents) (see, for example, Witte, 1996; and Rouse, 1997). This is, of course, a vital issue, but for policy purposes an equally important and largely ignored issue is whether existing differences in the performance of students in private and public schools are due to environmental factors or to school policies. If, for example, performance in city public schools is lower than in private schools because those public schools have more concentrated poverty among their students, then sending all city public school children to private schools would only export their poverty, undermine the educational environment in private schools, and, perhaps, have no impact on student performance. To put it another way, a finding that some private schools perform better than some public schools even after accounting for differences in student motivation gives no insight whatsoever into the impact on performance of a massive move away from public schools toward private schools, which would dramatically shift the educational environment in both types of schools. Thus, more research is clearly needed on the impact of environmental factors on the cost of private education.

Alternative Methods for Calculating Educational Cost Indexes

Several different methods for calculating educational cost indexes have been proposed by scholars. This section explores the strengths and weaknesses of several key methods, and it compares the indexes that result when each method is applied to data for New York State.

Input Prices

Some scholars have proposed that educational costs be measured with an index of input prices, usually just teachers' salaries. Because teachers are by far the most important input in producing educational performance, teachers' salaries do, indeed, have a major impact on educational costs. However, a teacher salary index, by itself, has three major flaws as a measure of educational costs.

First, teachers' salaries reflect differences in teachers' experience and education, which are associated with quality differences across teachers. One cannot claim that a school district has high costs whenever it decides to hire teachers with extensive experience or with graduate degrees. Ideally, salaries that apply to teachers of comparable quality should be compared.

Second, as noted earlier, teachers' salaries at a given quality level can be influenced by the decisions of school officials. A cost index is intended to measure factors outside the control of school officials, so it should not reflect their bargaining skill or their generosity to teachers. A cost index based solely on teachers' salaries will give the misleading impression that generous school districts are forced to pay more than other districts to obtain the same performance, when in fact their higher spending is entirely of their own making.

Finally, a teacher salary index ignores the role of the environment altogether. A school district with a harsh educational environment must spend more than other districts to obtain the same performance, even if teachers' salaries are the same everywhere. Thus, an index based on teachers' salaries leaves out one of the key sources of variation in educational costs across districts, namely environmental factors, and therefore understates educational costs in districts with a relatively harsh educational environment.

Adjusted Input Prices

Some scholars have recognized the first two problems with a cost index based solely on teachers' salaries and have suggested an alternative approach based on predicted salaries (Chambers, 1978, 1995; Wendling, 1981). This approach uses regression analysis to separate the impact on teachers' salaries of internal factors under district control from the impact of external factors, and then predicts salaries holding the internal factors constant. In a typical study, the internal factors include teacher experience, education, and certification, as well as the district's salary structure, and the external factors include the wage level in the surrounding labor market and the classroom environment that confronts teachers in each district. This approach explicitly recognizes that conditions in some schools are so harsh that teachers must receive "combat pay" to work there. In other words, equally qualified teachers will not come to those schools unless they are paid more than they would be paid at other schools where the private wage scale is the same.

This approach solves the first two problems by constructing an index of predicted teachers' salaries.3 One might at first conclude that it also solves the third problem because it accounts for the impact of environmental factors on teachers' salaries. As explained earlier, however, environmental factors affect not only the price of inputs but also the quantity of inputs required.

This point can be illuminated by returning to the nonschool example at the beginning of this chapter. A notion similar to "combat pay" could arise in the provision of home comfort if the price of the key input, namely natural gas, depends on the weather.4 Suppose, for example, that colder weather requires more maintenance of natural gas pipelines and hence leads to a higher price for gas in colder places. A cost index for comfort based on the price of natural gas would clearly capture this phenomenon. However, such a cost index would not capture the fact that, to obtain any given comfort standard, households in a colder climate not only must spend more per unit of gas but also must purchase more gas than households in a warmer climate. Similarly, to achieve any given performance standard, school districts with a harsh educational environment not only must pay more to attract teachers but also must hire more teachers (or spend additional money on other educational programs) than schools with an average educational environment. In short, a cost index based on teachers' salaries, even if it is predicted on the basis of external factors, including environmental ones, ignores an important source of variation in educational costs and understates costs in districts with a harsh educational environment.

[...]

TABLE 8-4  Comparison of Predicted Performance Under Different Foundation Formulas Relative to State Average Performance in 1991 for New York School Districts

                                Performance-Based Aid System with Direct Cost Indices
                                -----------------------------------------------------
                      Actual    Endogenous   No Efficiency   Exogenous    Expenditure-Based
Class of District     Outcomes  Efficiency   Index           Efficiency   Aid System

S* = 25th percentile (76.2)
Average               100.0     104.1        102.1           100.6        103.0
Downstate
  Small cities         83.0      85.4         90.1            90.0         78.4
  Suburbs             118.5     123.7        125.7           126.0        120.0
Upstate
  Large cities         39.7      68.7         59.7            71.0         40.9
  Rural                89.4      95.4         89.2            91.5         93.1
  Small cities         82.2      90.1         83.0            86.3         86.1
  Suburbs             104.1     104.8        101.5           102.4        107.4

S* = 50th percentile (97.4)
Average               100.0     122.7        117.6           117.7        118.9
Downstate
  Small cities         83.0      99.2        104.6           103.1         87.1
  Suburbs             118.5     146.4        149.4           148.7        135.8
Upstate
  Large cities         39.7      87.0         80.9            89.3         47.6
  Rural                89.4     115.8        106.4           108.1        110.9
  Small cities         82.2     110.8        101.2           103.3        100.7
  Suburbs             104.1     119.4        114.1           113.0        122.4

S* = 75th percentile (120.4)
Average               100.0     155.9        150.8           149.0        153.3
Downstate
  Small cities         83.0     138.2        141.9           139.2        123.3
  Suburbs             118.5     195.6        198.4           197.4        180.3
Upstate
  Large cities         39.7     104.7        100.0           106.5         64.2
  Rural                89.4     145.1        135.7           135.0        142.3
  Small cities         82.2     136.7        128.3           127.2        128.7
  Suburbs             104.1     148.6        143.6           139.9        155.4

NOTE: All grants require approximately the same state budget to fund as the aid system in 1991, $3.65 billion. S* is the student performance target that defines the aid system; the value in parentheses after each panel heading is that target expressed on the performance scale. Student performance is expressed relative to the state average performance in 1991; a value of 100 equals this average. All aid plans in this table require districts to assess the minimum tax rate set by the state.

The first column of Table 8-4 shows actual district performance in 1991 under the existing New York aid system, in effect an expenditure-based foundation plan with the minimum expenditure (not performance) level set at approximately the 25th percentile of the 1991 expenditure distribution and with various hold-harmless and minimum-aid provisions. The second column of Table 8-4 presents results for a performance-based foundation plan with a required minimum tax rate and endogenous efficiency. The performance increases above their existing levels are most dramatic for large, upstate central cities. The average performance for these three cities with the most generous plan, 104.7, is slightly more than 2.5 times as large as their current performance, 39.7. This new performance level falls short of the target S*, 120.4, because the increased aid drives the efficiency level in these districts below the baseline level. This performance-based foundation plan also boosts performance in all other classes of district, although not by such dramatic amounts.

Implementing this aid plan requires an understanding of cost indexes, an explicit decision about the acceptable level of inefficiency, and the estimation of cost indexes controlling for efficiency. Existing state plans do not take any of these steps, so Duncombe and Yinger (1997) also simulate three alternative foundation plans based on less complete information. The simplest foundation plan follows Equation 1, with no recognition of costs or efficiency. The results for such a plan are presented in the last (fifth) column of Table 8-4.

Because the implicit expenditure target in the current New York foundation plan is set at about the 25th percentile of the current expenditure distribution, a comparison of the first and last columns in the first panel of the table largely reflects the impact of eliminating hold-harmless and minimum-aid provisions and pooling all lump-sum aid into a foundation formula. These steps would modestly increase aid (and performance) in upstate cities, both large and small, and decrease aid substantially (with little impact on performance) in downstate cities and suburbs. The average impact on rural districts and downstate suburbs would be minimal.

Bringing in the results in the second column, we can see that a performance-based foundation goes much farther than an expenditure-based foundation in shifting aid toward large cities and thereby boosting their performance. It does not go nearly as far, however, in shifting aid away from downstate small cities and suburbs, a result shown by their high performance. Largely because they face very high labor costs, these downstate districts tend to have high costs, a fact that is missed by an expenditure-based plan. The current system of hold-harmless and minimum-aid provisions serves some of the same purpose as a cost correction by boosting aid to these districts, but it goes too far in this direction and does not ensure fair treatment either within these districts or between these districts and others.

Table 8-4 also reveals that even the most generous expenditure-based foundation plan leaves large cities far short of any performance target, even with a required minimum tax rate. In fact, the most generous such plan, in the last panel of Table 8-4, helps large cities but still leaves them at a performance level well below the 25th percentile of the current distribution!

Related simulations in Duncombe and Yinger (1998b) make the key point here in a different way. Consider the notion of a "performance gap," defined as the sum across districts of the amount by which actual district performance falls below the performance standard, weighted by the number of students in the district. Duncombe and Yinger show that with the foundation level (and implicit performance standard) set at the 25th percentile of the 1991 performance distribution and a required minimum tax rate, an expenditure-based foundation plan would close only 36 percent of the current performance gap in New York. In contrast, a comparable, equal-cost performance-based foundation plan would close 84 percent of the performance gap (and would close 100 percent of the gap if all districts met the baseline efficiency standard). The point should be clear: expenditure-based foundation plans, which are used in most states, leave many high-cost districts short of even a minimal performance standard.

A state cannot implement a performance-based aid program without estimating a cost index. As noted earlier, an aid program for municipal services, including education, based on an estimated cost index was implemented in Massachusetts (Bradbury et al., 1984), and school aid programs based on estimated cost indexes are presented in Ratcliffe et al. (1990) and Downes and Pogue (1994). However, the cost indexes estimated in these cases do not control for efficiency. Thus, we now examine performance-based foundation programs that incorporate a cost index estimated without controlling for efficiency and that implicitly assume, following Equation 2, that all districts are efficient. A cost index estimated in this way is biased, because the omission of an efficiency variable biases the coefficients of the included cost variables, but it takes a large step toward recognizing the role of input and environmental cost factors.

Results for these programs, presented in the third column of Table 8-4, reveal that in most cases adding a cost index closes a large share of the gap between the expenditure-based foundation in the fifth column and the complete performance-based foundation in the second column. Under the most generous plan (75th percentile), for example, adding a biased cost index raises performance in upstate large cities from 64.2 (column 5) to 100.0 (column 3), compared to the complete-information performance (column 2) of 104.7.

In contrast, the foundation plan based on a biased cost index leads to higher aid and higher performance for downstate small cities and suburbs than either the expenditure-based foundation or the complete-information foundation in the second column. As explained earlier, this result mainly reflects the large, negative correlation between efficiency and wage rates; because of this correlation, leaving efficiency out of the cost equation biases upward the coefficient of the wage variable and hence biases upward the cost index in places, like downstate districts, with high labor costs.18 In effect, therefore, an aid program based on a biased cost index rewards the downstate districts for their inefficiency. This is, of course, an inappropriate outcome.

This result poses a serious challenge to policymakers and researchers. Aid formulas based on simple cost indexes of the type that have been presented in the literature appear to be a big step in the right direction, but this step has a price. To the extent that efficiency is correlated with cost factors, a standard cost index will reflect inefficiency as well as costs, and an aid formula based on it will favor inefficient districts as well as high-cost ones. In New York, this effect does not boost aid to big cities, which despite their reputation are relatively efficient, but instead boosts aid to downstate small cities and suburbs, which tend to be inefficient by our measure. Obviously the relevant correlations could vary from state to state, so these results cannot determine whether this type of plan would reward the same types of district for inefficiency in other states. Nevertheless, the possibility that the plan rewards inefficiency clearly undercuts its appeal.

As noted earlier, one simple step a state can take to recognize the role of efficiency is to bring in the concept of baseline efficiency. All this step requires is identifying an efficiency level that is regarded as acceptable. This approach recognizes that virtually no districts will be able to achieve perfect efficiency, so that spending greater than S*Ci is needed to bring district i up to the S* performance target. Compared to the previous approach, therefore, this approach focuses more aid on higher-need districts. The fourth column of Table 8-4 shows the impact of a foundation plan with a biased cost index but with a baseline efficiency level set at the 75th percentile of the current efficiency distribution. This plan takes another small step toward the complete-information plan in the second column. In the downstate districts, the entry in column 4 generally falls between the entry in column 3, which has no correction for efficiency, and the entry in column 2, which is based on an unbiased cost index. Moreover, performance in large cities would actually be slightly higher with this plan than with our preferred plan because the biased cost index exaggerates the cost impact of high wages in these districts.

The simulations in Table 8-4 all involve the same state budget, namely, the actual New York State educational aid budget in 1990-91. A natural question to ask is: What would happen if the budget increased? In the case of foundation plans without a minimum tax rate, the effect of a higher state budget can be dramatic. With a foundation plan of this type, many districts set tax rates below the level needed to achieve the foundation level of spending (and, in the case of performance-based foundation plans, of performance); that is, they use some of their state educational aid to fund noneducational programs or to cut taxes. Some portion of any additional state aid will be devoted to education and will therefore boost districts' educational spending and performance. In the case of performance-based foundation plans with a minimum tax rate, however, additional state aid has no such effect. Somewhat ironically, in fact, additional state aid leads to a small decrease in performance (Duncombe and Yinger, 1998a). The minimum tax rate requirement ensures that all districts raise enough revenue to fund the performance standard if they meet the baseline efficiency standard. Additional state revenue is therefore not needed to meet the standard. However, a district's efficiency is influenced by the amount of aid it receives, and higher aid generally leads to lower efficiency. As a result, the increase in the state budget shifts the burden of financing education from local governments to the state, with no change in the amount of revenue available for education, and, in the process, makes school districts a little less efficient. This drop in efficiency results in a small drop in performance.19

Conclusions

An extensive literature establishes that both school district and student performance depend not only on factors that school officials control, such as the student/teacher ratio, but also on factors that are outside their control, including input prices, such as regional wage rates, and environmental factors, such as concentrated poverty. It follows directly that the cost of education is not the same in every district, with higher costs in districts in higher-wage labor markets or with a harsher educational environment. A shift to educational performance standards, whether these standards are simply targets or are embedded in a foundation aid program, can be neither fair nor effective unless it recognizes this variation in the cost of education. This shift cannot be fair to districts that, through no fault of their own, face harsh educational environments, and it cannot be effective because it hands out rewards and punishments that are not related to the contributions of school personnel.

Scholars have identified a variety of methods for measuring the cost of education, all of which have limitations. The simplest reasonable methods, which are indexes of teachers' salaries predicted on the basis of conditions in the local labor market and in a district's schools, fail to recognize that districts with a relatively harsh educational environment must hire more teachers (or purchase more of other inputs) than other districts to achieve the same performance. The most comprehensive methods, which recognize the role of environmental factors and control for school district efficiency, involve some complex, hard-to-explain steps. Nevertheless, the literature demonstrates that cost variation across schools is very large and cannot be ignored. Policymakers and scholars need to continue the search for sensible, practical ways to measure educational costs and incorporate them into performance-based educational policies.

Notes

1. Strictly speaking, this interpretation of the weights depends on the assumption that educational performance is provided at constant cost (Duncombe and Yinger, 1997). This assumption cannot be avoided without very complex statistical procedures, and it is employed in virtually all the educational finance literature, even though it has not been adequately tested. One implication of this assumption is that a district's cost index does not depend on its level of performance. The performance indicators we selected had an adjusted R-squared of at least 0.10 with variables typically found in an education demand equation, including income and tax share. In addition, we ran a factor analysis on an array of performance indicator measures, and the scree plot indicated three distinct performance dimensions, which are very similar to the three measures we actually use.

2. In principle, the methods discussed here could be applied at the school level and perhaps even at the classroom level. Cost studies at the school level might prove to be helpful for understanding performance disparities within districts, which is an important issue in many large cities. However, the data for such applications have not yet become available and, even if available, would not yield district-wide cost indexes because they could not consider some elements of cost, such as the salaries of administrators and counselors.

3. This statement may be a bit too strong, because this method cannot adjust for teacher quality differences that are known to school officials (and reflected in salaries) but unknown to the researcher. For example, school officials may be able to judge teacher quality through letters of recommendation or the interview process; however, such qualitative data about teachers are generally not collected on a consistent basis. As a result, this approach, along with the more general approaches in later sections, may overestimate the generosity of some districts with relatively high-quality teachers (based on unobserved factors).

4. An analog to the first problem with using teachers' salaries could arise in this example if households could select different grades of natural gas. There is no analog here to the second problem with using teachers' salaries, however, since households do not negotiate over natural gas prices.

5. During the 1980s, a state aid program for municipalities in Massachusetts incorporated a regression-based cost index (Bradbury et al., 1984). This program has since been discontinued.

6. The 1996 New York State aid formulas are described in State Aid to Schools (State University of New York, 1996).

7. One might be concerned that income and preference variables, such as average education, are themselves endogenous. If so, using these variables as instruments could lead to biased coefficients and biased cost indexes. With our New York State data, however, leaving out these instruments has virtually no effect on the cost indexes. The correlation between the index we report and either an index without the income instrument or an index without the preference instruments is 0.99 or above.

8. Applications of this technique to education typically ignore environmental cost factors, and therefore incorrectly conclude that high-cost districts are "inefficient" (Bessent and Bessent, 1980). Ruggiero (1996) shows how to incorporate environmental factors into the DEA calculations, but also shows that the technique breaks down if more than one or two such factors are introduced. Several other approaches to efficiency estimation use various ad hoc statistically based adjustments for multiple environmental factors (Ray, 1991; Grosskopf et al., 1997; Duncombe and Yinger, 1997).

9. Two new instruments are included to account for the endogeneity of efficiency: (1) a dichotomous variable for whether the district is a city district (city districts are not required to use an annual budget referendum); and (2) the percent of district employees who are executives, managers, or professionals. The first of these variables is clearly exogenous, since rules regarding referendums are set by state law and there has not been any change in the city classification in decades. The second instrument, like the instruments discussed in note 7, might not be exogenous. However, if this instrument is excluded from the regression, the estimated cost index is hardly affected; its correlation with our preferred index is 0.99.

10. Because we rely on the 1990 census data, we cannot determine the stability of these cost indexes over time. This is a good issue for future research.

11. The enrollment in a district is influenced, of course, by district consolidation or district splitting. The U-shaped relationship we estimate implies that it may be possible to lower educational costs by consolidating small districts and splitting up large ones. We do not consider these possibilities here, but consolidation is examined in Duncombe et al. (1995). In the long run, enrollment in a school district may also be influenced by the decisions of school officials, as parents make residential choices and choices about private school based on school quality and property tax rates. We know of no study that considers this possibility, and it is an important topic for future research.

12. We do not attempt a full analysis of spending on students with disabilities, and in fact our dependent variable, approved operating expenditure as measured by the state, does not include all such spending. For an analysis of the cost of special education, see Chaikind et al. (1993). Broader measures of students with disabilities also complicate the analysis because they may be influenced by the way a school district determines which students have disabilities (Lankford and Wyckoff, 1996). We restrict our analysis to environmental variables, such as the poverty rate and the share of students with severe disabilities, that appear to be, for all practical purposes, outside the control of school officials.

13. These results are not driven by New York City. A regression analysis that excludes New York City and Yonkers (columns 1 and 2 of Table 8-1) results in cost indexes, both for New York City and for other districts, that are similar to those in Tables 8-2 and 8-3.

14. Historically, New York has used a modified foundation formula, but the current formula mixes elements of a foundation formula and a power-equalizing formula. In effect, the current formula appears to act as a power-equalizing formula for districts with spending levels in the middle of the spending distribution. The state aid formulas are described in State University of New York (1996).

14. Historically, New York has used a modified foundation formula, but the current formula mixes elements of a foundation formula and a power-equalizing formula. In effect, the current formula appears to act as a power-equalizing formula for districts with spending levels in the middle of the spending distribution. The state aid formulas are described in State Aid to Schools (State University of New York, 1996).

15. The key problem with this approach is that it does not provide researchers with a method for identifying this average-cost district. This problem is not recognized by Guthrie and Rothstein (see Chapter 7 in this volume).

16. As noted earlier, New York City has the highest costs in the state, but it also now receives below-average aid per pupil. New York City also has a very high share of the pupils in the state, so if it is included in the simulations, it receives a very large share of any performance-based aid program and little is left over for other districts. Results for other states are unlikely to be so dominated by one district, so we focus on simulations without New York City. The cost index used in these simulations is based on the regression results reported in columns 1 and 2 of Table 8-1. Besides being estimated without New York City and Yonkers, this cost regression differs from the one reported in columns 3 and 4 (which was used for our preferred cost index in this chapter) because of several data modifications made since publication of Duncombe and Yinger (1997).

17. Table 8-4 is a revised version of Table 6 in Duncombe and Yinger (1997). The revisions were necessary to correct a minor programming error. This error did not affect any other tables, nor did it affect any of the substantive conclusions in the original Duncombe and Yinger paper.

18. All cost coefficients are biased when efficiency is left out of a cost model, but in our equation the bias in the wage variable is particularly dramatic.

19. This analysis ignores tax distortions. A shift to more state aid (and, in New York at least, a more balanced education finance system) could result in efficiency gains, in the form of less tax distortion, that are large enough to offset the efficiency losses in education.
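As a rough illustration of the two formula types mixed in New York's aid system (note 14), and of the kind of cost-adjusted foundation grant used in the simulations (note 16), consider the following sketch. The function names and parameter values are hypothetical, and the actual New York formulas are far more complicated.

```python
# Stylized versions of the two formula types discussed in note 14, plus a
# cost-index adjustment of the kind used in the chapter's simulations
# (note 16). All parameter values are hypothetical.

def foundation_aid(cost_index, property_value,
                   e_star=8000.0, t_star=0.015):
    """Per-pupil foundation aid: the state guarantees spending of e_star,
    scaled up or down by the district's cost index (mean = 100), minus
    what the district can raise at the required tax rate t_star."""
    return max(0.0, e_star * cost_index / 100.0 - t_star * property_value)

def power_equalizing_aid(tax_rate, property_value, v_star=400000.0):
    """Per-pupil power-equalizing aid: the state tops up local revenue so
    the district's chosen tax rate yields what it would raise on a
    guaranteed per-pupil tax base v_star."""
    return max(0.0, tax_rate * (v_star - property_value))

# A high-cost, low-wealth district versus a low-cost, wealthy district.
print(foundation_aid(cost_index=130, property_value=200000))      # 7400.0
print(foundation_aid(cost_index=90, property_value=600000))       # 0.0
print(power_equalizing_aid(tax_rate=0.02, property_value=250000)) # 3000.0
```

In the foundation version, a higher cost index raises the spending level the state guarantees, so a high-cost, low-wealth district receives more aid per pupil than a low-cost, wealthy one.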

ACKNOWLEDGMENTS

This chapter was originally prepared for a conference sponsored by the Committee on Education Finance, National Research Council, Irvine, California, January 30-31, 1998. The authors are grateful to Janet Hansen, Sunny Ladd, Bob Schwab, Steve Sugarman, and other participants in that conference for comments on the original draft.

REFERENCES

Bessent, A., and E. Wailand Bessent
1980 Determining the comparative efficiency of schools through data envelopment analysis. Educational Administration Quarterly 16:57-75.
Bradbury, K.L., H.F. Ladd, M. Perrault, A. Reschovsky, and J. Yinger
1984 State aid to offset fiscal disparities across communities. National Tax Journal 37:151-170.
Bradford, D., R. Malt, and W. Oates
1969 The rising cost of local public services: Some evidence and reflections. National Tax Journal 22:185-202.
Chaikind, S., L.C. Danielson, and M.L. Brauen
1993 What do we know about the costs of special education? A selected review. Journal of Special Education 26(4):344-370.
Chambers, J.
1978 Educational cost differentials and the allocation of state aid for elementary and secondary education. Journal of Human Resources 13:459-481.
1995 Public school teacher cost differences across the United States: Introduction to a Teacher Cost Index (TCI). Pp. 21-32 in Developments in School Finance, 1995. Washington, DC: National Center for Education Statistics.
Clotfelter, C., and H.F. Ladd
1996 Recognizing and rewarding success in public schools. Pp. 23-64 in Holding Schools Accountable: Performance-Based Reform in Education, H.F. Ladd, ed. Washington, DC: The Brookings Institution.
Coleman, J.S., E.Q. Campbell, C.J. Hobson, J. McPartland, A.M. Mood, and F.D. Weinfeld
1966 Equality of Educational Opportunity. Washington, DC: U.S. Department of Health, Education and Welfare.
Courant, P., E. Gramlich, and S. Loeb
1995 A report on school finance and educational reform in Michigan. Pp. 5-33 in Midwest Approaches to School Reform, T.A. Downes and W.A. Testa, eds. Chicago: Federal Reserve Bank of Chicago.
Downes, T., and T. McGuire
1994 Alternative solutions to Illinois' school finance dilemma: A policy brief. State Tax Notes (February 14):415-419.
Downes, T., and T. Pogue
1994 Adjusting school aid formulas for the higher cost of educating disadvantaged students. National Tax Journal 47:89-110.
Duncombe, W., J. Miner, and J. Ruggiero
1995 Potential cost savings from school district consolidation: A case study of New York. Economics of Education Review 14:356-384.
Duncombe, W., J. Ruggiero, and J. Yinger
1996 Alternative approaches to measuring the cost of education. Pp. 327-356 in Holding Schools Accountable: Performance-Based Reform in Education, H.F. Ladd, ed. Washington, DC: The Brookings Institution.
Duncombe, W., and J. Yinger
1997 Why is it so hard to help central city schools? Journal of Policy Analysis and Management 16(1):85-113.
1998a An analysis of two educational policy changes in New York: Performance standards and property tax relief. Pp. 99-136 in Educational Finance to Support Higher Learning Standards, J.H. Wyckoff, ed. Albany, NY: New York State Board of Regents.
1998b School finance reform: Aid formulas and equity objectives. National Tax Journal 51:239-262.
Ferguson, R.
1991 Paying for public education: New evidence on how and why money matters. Harvard Journal on Legislation 28:465-498.

Ferguson, R., and H.F. Ladd
1996 How and why money matters: An analysis of Alabama schools. Pp. 265-298 in Holding Schools Accountable: Performance-Based Reform in Education, H.F. Ladd, ed. Washington, DC: The Brookings Institution.
Grosskopf, S., K. Hayes, L. Taylor, and W. Weber
1997 Budget-constrained frontier measures of fiscal equality and efficiency in schooling. Review of Economics and Statistics 79:116-124.
Hanushek, E.
1986 The economics of schooling: Production and efficiency in public schools. Journal of Economic Literature 24:1141-1177.
Inman, R.
1979 The fiscal performance of local governments: An interpretative review. In Current Issues in Urban Economics, P. Mieszkowski and M. Straszheim, eds. Baltimore: The Johns Hopkins University Press.
Krueger, A.B.
1997 Experimental Estimates of Education Production Functions. Working Paper #379. Princeton, NJ: Princeton University, Industrial Relations Section.
Ladd, H.F., and J. Yinger
1991 America's Ailing Cities: Fiscal Health and the Design of Urban Policy. Baltimore: The Johns Hopkins University Press.
1994 The case for equalizing aid. National Tax Journal 47:211-224.
Lankford, H., and J. Wyckoff
1996 The allocation of resources to special education and regular instruction. Pp. 221-257 in Holding Schools Accountable: Performance-Based Reform in Education, H.F. Ladd, ed. Washington, DC: The Brookings Institution.
Miner, J.
1991 A Decade of New York State Aid to Local Schools. Metropolitan Studies Program Occasional Paper No. 141. Syracuse, NY: Syracuse University, Center for Policy Research.
Ratcliffe, K., B. Riddle, and J. Yinger
1990 The fiscal condition of school districts in Nebraska: Is small beautiful? Economics of Education Review 9:81-99.
Ray, S.C.
1991 Resource-use efficiency in public schools: A study of Connecticut data. Management Science 37:1620-1628.
Reschovsky, A., and J. Imazeki
1998 The development of school finance formulas to guarantee the provision of adequate education to low income students. Pp. 121-148 in Developments in School Finance, 1997. Washington, DC: National Center for Education Statistics, U.S. Department of Education.
Rouse, C.E.
1997 Private School Vouchers and Student Achievement: An Evaluation of the Milwaukee Parental Choice Program. Unpublished manuscript, May 1997.
Ruggiero, J.
1996 On the measurement of technical efficiency in the public sector. European Journal of Operational Research 90:553-565.
State University of New York and The State Education Department
1996 State Aid to Schools: A Primer. Albany, NY: The State Education Department.
1997 New York: The State of Learning, Statewide Profile of the Educational System. Albany, NY: The State Education Department.

Steinberg, J.
1998 Underachieving schools are shamed into improvement. The New York Times (January 7):B7.
Wendling, W.
1981 The cost of education index: Measurement of price differences of education personnel among New York State school districts. Journal of Education Finance 6:485-504.
Witte, J.F.
1996 School choice and student performance. Pp. 149-176 in Holding Schools Accountable: Performance-Based Reform in Education, H.F. Ladd, ed. Washington, DC: The Brookings Institution.