
Measuring Access to Learning Opportunities (2003)

Chapter: 4. Strengthening the E&S Survey Data

Suggested Citation:"4. Strengthening the E&S Survey Data." National Research Council. 2003. Measuring Access to Learning Opportunities. Washington, DC: The National Academies Press. doi: 10.17226/10673.

4
Strengthening the E&S Survey Data

The primary purpose of the E&S survey has been to provide the Office for Civil Rights (OCR) with data to use when targeting compliance review sites or when investigating complaints. Because OCR attempts to settle complaints through informal procedures and rarely issues findings as part of a response to a complaint, it is unclear whether and to what extent OCR or complainants use the data.

The lack of strong information about use raises the issue of the basic purposes of the E&S survey and what can be done to strengthen it, both technically and in addressing policy concerns. Although the committee recognizes that the original purposes of the survey continue to be important, many educational and societal changes have occurred since its inception. The state of the art in data collection, analysis, and coordination with other data systems has also advanced greatly over that period. These factors, plus the passage of the No Child Left Behind Act of 2001, which includes substantial accountability provisions for states and local school districts, argue for expanded availability and use of the E&S survey data.

The No Child Left Behind Act focuses attention on the outcomes of educational practice by requiring testing in reading, mathematics, and science and related accountability for results and by requiring that the results be disaggregated by several categories, including gender, race, ethnicity, English proficiency, disability status, and economic disadvantage. The E&S survey could help provide a fuller picture of the educational process with information on educational inputs that describe the access of students to quality opportunities to learn. Halpern (1995) suggests that OCR has strongly focused on frequency counts of racial categories and that a reliance on this approach will have limited value unless it is supplemented by a focus on the quality of the educational opportunities that students experience.

This chapter highlights ways in which the E&S survey could be strengthened and therefore be made more useful to OCR and others concerned with ensuring access to learning opportunities. The committee offers several ways to strengthen the survey in three broad categories: methodology and technical issues, content, and use.

METHODOLOGY AND TECHNICAL ISSUES

Field Testing and Respondent Validity Studies

The E&S survey instruments are somewhat complex and require respondents to collect a substantial amount of detailed information—on enrollments and dropout rates, children with disabilities, racial and ethnic categories, disciplinary events, testing, student assignment, athletics, and teacher certification. The capability of districts and schools to collect the required information efficiently and accurately varies greatly. Some districts have computerized student identification and data management systems in which most of the needed information is routinely collected and analyzed, while other districts and schools may use a paper system to collect some or all of the data. Still others collect only the data that the state requires once a year for fall enrollment procedures and gather no additional information. In addition, as noted above, several of the questions may not be easily understood by respondents or may ask for information that could be interpreted differently by different jurisdictions.

In addition to issues related to the various data collection systems is the question of which employees at the school or district level actually complete the survey. The committee heard from members who have responsibility for data collection in their jurisdictions that there are substantial differences in roles and responsibilities among respondents. In some cases, a data manager may complete the forms; in other cases, an administrator (e.g., school principal or assistant principal) is charged with the responsibility. In still other instances, a clerk from the central office of the district or school may be given the task. A strength of the survey is that it requires a signed certification that the data are accurate. While this requirement enhances the probability of accurate data, it does not necessarily ensure that the data were collected and reported in a consistent and reliable manner.

These variations in the way the E&S survey is administered can be expected to affect the overall reliability of the information collected. The degree of this effect is not known, but minimizing unreliability is critical. The committee suggests two courses of action. First, OCR should include an extensive field-testing component as changes are made in the survey, to help ensure that respondents understand the nature of the questions and how to complete the forms; state and local student testing programs follow a similar practice, regularly conducting item tryouts and field tests of new and revised assessment instruments to be sure that students can understand the test questions and respond appropriately. Second, OCR should conduct validity studies to determine whether the information being collected is, in fact, valid. Such a study would sample schools and compare the information they supply with existing documentation; discrepancy rates could then be calculated, and instances in which no backup documentation exists could be tabulated. This type of validity study would give OCR an idea of how accurate the survey results really are. If a full-scale validity study is not feasible, OCR should consider implementing a recommendation from Croninger and Douglas (2002) to conduct a series of small-scale focus interviews to determine how school and district administrators complete certain key tables in the survey.
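The discrepancy-rate tabulation described here is a simple calculation. The sketch below illustrates it for one survey item across a sample of schools; the record layout and figures are hypothetical:

```python
# Sketch of a validity-study tabulation: compare each school's reported
# value for one survey item with its backup documentation.
# Records with no documentation (None) are tallied separately.

def discrepancy_rate(reports):
    """reports: list of (reported_value, documented_value or None).
    Returns (share of documented reports that disagree,
             count of reports with no backup documentation)."""
    undocumented = sum(1 for _, doc in reports if doc is None)
    documented = [(rep, doc) for rep, doc in reports if doc is not None]
    if not documented:
        return 0.0, undocumented
    disagree = sum(1 for rep, doc in documented if rep != doc)
    return disagree / len(documented), undocumented

# Hypothetical sample: reported enrollment counts vs. district records.
sample = [(120, 120), (85, 90), (40, None), (300, 300), (55, 56)]
rate, missing = discrepancy_rate(sample)
```

Here two of the four documented reports disagree with the records, and one school has no backup documentation at all.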

Tracking Trend Data

One common purpose of large data collection systems is to provide trend information over time. It is certainly logical to expect that a survey that has existed since 1968 would have some capability to provide long-term trends. At present, this capability seems quite limited. The data are not accessible in a common electronic format and therefore cannot be easily retrieved or manipulated. The computer tapes and printouts of descriptive information from surveys administered from 1968 to 1992 were not stored in a central location and were effectively lost for several years before being located again in 1998. Although the data have been transferred from the tapes to computer disks, the data for the various survey administrations are in incompatible formats, as they were compiled with computer software and hardware that are now obsolete. For this reason, OCR currently cannot access and analyze most E&S data longitudinally (Peter McCabe, former director, E&S survey, personal communication, 2002).

For historical purposes, OCR should undertake efforts to compile past data for comparable survey questions since its 1968 inception. OCR also should consider the benefit of building a data system that will strengthen the capacity to allow for easily generated, accessible trend data for reporting categories in the future.
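Compiling past data for comparable questions implies building a crosswalk from each year's item codes to a common set of variable names, then stacking the years into one long-format trend table keyed on a stable identifier. The sketch below illustrates the idea; the years, item codes, and variable names are hypothetical:

```python
# Sketch of harmonizing survey years into one trend table. Each year's
# idiosyncratic item codes are mapped onto common variable names via a
# crosswalk (hypothetical codes and variables shown).

ITEM_CROSSWALK = {
    1992: {"q9": "hs_test_required"},
    2000: {"tbl12a": "hs_test_required"},
}

def to_trend_rows(year, records):
    """Map one year's records onto the common variable names."""
    mapping = ITEM_CROSSWALK[year]
    return [{"year": year, "district_id": r["district_id"],
             **{common: r[old] for old, common in mapping.items()}}
            for r in records]

trend = (to_trend_rows(1992, [{"district_id": "0101234", "q9": 0}]) +
         to_trend_rows(2000, [{"district_id": "0101234", "tbl12a": 1}]))
```

Once stacked this way, a district's values for a comparable question can be followed across administrations regardless of how each year's form numbered the item.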

Data Projection Methodology

A technical report on the E&S survey prepared for OCR (U.S. Department of Education, Office for Civil Rights, 2000b) addressed the issue of state and national projections from the reported E&S survey data. The statistical validity of the survey samples has varied from year to year. For most years, the samples have been of sufficient quality to allow for projections to the state and national levels. However, the surveys collected in 1969, 1971, 1973, 1982, and 1990 did not yield statistically valid samples, so defensible projections could not be constructed.1 Other problems with the survey administered in 1996–1997 also made these data unreliable (Glennon, 2002, p. 213; Peter McCabe, personal communication, 2002).

OCR should routinely use appropriate sampling methods to ensure that projections to state and national levels can be produced. In recent years, the projections, when they have been calculated, have provided a “best estimate,” but the methods currently used do not allow calculation of confidence intervals that would clearly identify the degree of error associated with the projections. In addition to developing the methodology to routinely provide projections, OCR should produce and report confidence intervals.
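For a simple random sample, attaching a confidence interval to a projected proportion is straightforward. The sketch below uses a normal approximation; the E&S survey's actual design is stratified, so a production estimator would need design-based variance, and the counts shown are hypothetical:

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Point estimate and normal-approximation 95% confidence interval
    for a proportion from a simple random sample. (A design-based
    variance estimator would be needed for a stratified sample.)"""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)
    return p, (p - z * se, p + z * se)

# Hypothetical: 1,200 of 4,800 sampled students fall in some category.
est, (lo, hi) = proportion_ci(1200, 4800)
```

Reporting the interval alongside the point estimate tells users how much sampling error surrounds each projection.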

Adjusting Protocols

The committee discussed a number of issues related to the changing demographic nature of the country and the different initiatives that states and schools have implemented in response. As an example, the committee finds that the collection of classroom-level data (Item 13) could be a major strength of the survey, but both the protocol for data collection and ambiguity in the wording of the question limit its value. Here we address the protocol issue; the problem of ambiguous wording is addressed separately, below.

The survey protocol has historically placed limits on which schools complete the classroom portions of the survey. OCR collects classroom-level data at the entry and exit grades (e.g., grade 1 and grade 6) for elementary school pupils in schools in which minority enrollment is greater than 20 percent and less than 80 percent. Classroom-level data collected for these schools include the grade level, ability grouping, and number of students by race, ethnicity, and English proficiency.

1 According to the technical report (U.S. Department of Education, Office for Civil Rights, 2000b), the survey for 1990 produced a sample that could be projected to the national level but was not valid for state-level projections.

The data collection procedure used by OCR eliminates elementary schools that have fewer than 20 percent or more than 80 percent of their students from racial and ethnic minority groups, as well as all middle schools, junior high schools, and high schools. The committee notes that this protocol might have made sense when the demographic composition of the nation and its schools was less complex, but it is inadequate for the multiethnic demographics of schools in the 21st century. OCR should consider changing the data collection protocols so that all elementary schools selected in the sample would supply classroom data.
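The current protocol amounts to a filter on minority enrollment, and the committee's proposal amounts to dropping that filter for all sampled elementary schools. A sketch, with hypothetical school records:

```python
# Sketch of the current vs. proposed classroom-data protocol.
# The school-record fields are hypothetical illustrations.

def must_report_classroom_data(school, expanded=False):
    """Current protocol: only elementary schools with minority enrollment
    strictly between 20 and 80 percent report classroom-level data.
    Proposed protocol (expanded=True): every sampled elementary school
    reports, regardless of its demographic composition."""
    if not school["is_elementary"]:
        return False
    if expanded:
        return True
    return 20 < school["minority_pct"] < 80

schools = [
    {"is_elementary": True, "minority_pct": 95},   # excluded today
    {"is_elementary": True, "minority_pct": 50},   # included today
    {"is_elementary": False, "minority_pct": 50},  # high school: excluded
]
current = [must_report_classroom_data(s) for s in schools]
proposed = [must_report_classroom_data(s, expanded=True) for s in schools]
```

Under the expanded rule, the heavily minority elementary school in the first record would no longer fall outside the data collection.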

Data Editing

When administering any large-scale survey, such as the E&S survey, data editing is important. Because the administration and operation of the survey have not been well funded, data editing has frequently suffered. Glennon (2002) indicated that the data from the 1996 survey, actually administered in 1997, turned out to be unusable. Officials responsible for the survey have indicated that there were insufficient funds to do proper editing of the data that were received and that, in times of budget shortfalls, data editing is frequently a casualty.

Croninger and Douglas (2002) conducted analyses to look at the prevalence of high-stakes testing. They found that the data they used from Tables 12A and 12B of the E&S survey for 2000 had not been adequately edited or verified by OCR. They had to make certain logical assumptions about treating the data to do their analyses; these inferences could have been avoided had proper editing been done and documentation provided. Even when data editing has been done, the contractor has in the past carried out the editing with little or no oversight from OCR. In the future, OCR should place a priority on editing the data that are collected in order to have the best datasets available for use by OCR, advocates, parents, and researchers.
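Many of the edits in question are simple logical-consistency checks that can be automated before a dataset is released. A sketch, with hypothetical field names:

```python
# Sketch of automated edit checks of the kind OCR could run on incoming
# school records before release. Field names are hypothetical.

def edit_check(record):
    """Return a list of edit-failure messages for one school record."""
    problems = []
    counts = record["enrollment_by_group"]
    if any(c < 0 for c in counts.values()):
        problems.append("negative enrollment count")
    if sum(counts.values()) != record["total_enrollment"]:
        problems.append("group counts do not sum to total enrollment")
    if record["students_suspended"] > record["total_enrollment"]:
        problems.append("more suspended students than enrolled students")
    return problems

record = {"total_enrollment": 400,
          "enrollment_by_group": {"black": 120, "hispanic": 80, "white": 210},
          "students_suspended": 35}
issues = edit_check(record)  # group counts sum to 410, not 400
```

Records that fail such checks could be flagged and returned to the district for correction rather than released with documented but unresolved inconsistencies.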

CONTENT

Changes in Existing Survey Items

In fulfilling its charge, the committee examined the survey items to determine whether improvements to specific items would enhance the validity and reliability of the survey. The committee identified seven topics as prime candidates for revision: children with disabilities, testing, high school completion, student assignment, advanced placement classes, interscholastic athletics, and teacher qualifications.

Children with Disabilities

Item 10.1 (Table 10.1) collects information on children with mild, moderate, and severe retardation by race and ethnicity. Item 10.2 (Table 10.2) collects information on children identified as emotionally disturbed and with specific learning disabilities, by race and ethnicity. Both items also ask for information about the educational placement of these students (i.e., the percentage of time spent outside of a regular classroom), but no data on the race and ethnicity of students in different educational placements are requested. These items would be strengthened by collecting race and ethnicity data for educational placement.

How students with disabilities are served makes a huge difference in their future prospects (see Vaughn et al., 2000; Swanson, 1999; National Research Council, 2002b, pp. 324–328), and the quality of services available for children with disabilities varies widely. In many cases, students are well served. But, not infrequently, students are identified as having disabilities and sent to groups or classrooms that actually reduce their learning opportunities (see Garcia Fierros and Conroy, 2002). Misassignment can be the consequence of teachers’ inability to serve difficult-to-teach students (see Harry et al., 2002). There is abundant evidence that black, American Indian, and Hispanic students are disproportionately identified as having certain types of disabilities (Finn, 1982; National Research Council, 2002b; Losen and Orfield, 2002b).

Testing

High-stakes tests are those that a student must pass before being allowed to advance to the next grade or graduate from high school. Retention in grade often increases the likelihood that affected students will drop out of school or learn less than if the need to accelerate their learning had been addressed in other ways (National Research Council, 1999c, pp. 128–132). Item 12 asks about the use of high-stakes tests for grade promotion and high school graduation.

Tables 12A and 12B ask whether testing information is used as a “sole” or “significant” criterion for grade promotion and for high school graduation, respectively. The meaning of “sole criterion” and “significant criterion” is ambiguous, and Item 12 would be improved by clarifying these terms. In addition, the item does not ask for the number of students who fail the test the first time. As a result, the current information includes repeat test takers who have failed one or more times; this confounds the results, and the meaning of the data becomes unclear. The importance of improving this question is heightened by the provisions of the No Child Left Behind Act, which mandates increased testing during the elementary and middle grades.

Student Assignment

As noted in the discussion of protocol, above, Item 13 elicits information on the classroom assignments of students from different racial and ethnic groups and of students with limited English proficiency. The item also asks whether any students are “ability grouped for instruction in mathematics or English-Reading-Language Arts” in that classroom. Information on classroom assignment is requested for the lowest and highest elementary grades only.

The civil rights concerns emanate from evidence that many students who are “tracked” on a continuing basis into separate classrooms or groups within classrooms because of their below-grade-level performance continue to lose academic ground in these settings, that minority students are disproportionately “tracked” into low-ability classes, and that such practices may produce “within-school segregation” (see Oakes, 1990; Mickelson, 2001). Experts agree, however, that some approaches to ability grouping (or, more accurately, performance grouping) can serve important educational purposes, provided they enlarge, rather than restrict, opportunities to learn (see Slavin et al., 1994). So knowing that students are grouped by ability is not, by itself, sufficient to identify discriminatory or ineffective practices.

Also, the question does not ask how many students of each racial or ethnic group are in the upper or lower ability groupings. The definitions and instructions for ability grouping are complicated and could be confusing to respondents at the school level. Specifically, student assignment data are organized according to teacher identification codes. This can be ambiguous, particularly in the upper elementary grades in which students are more likely to be taught by more than one teacher. Also, as noted in the protocol section above, the fact that only schools in which minority students constitute more than 20 percent but less than 80 percent of total enrollment are required to complete Item 13 is problematic.

Information about classroom assignment and ability grouping is extremely important, but the current wording of this item severely limits its value. The item could be greatly improved with minor modifications: a clearer definition of ability grouping, the addition of information about the subject matter being taught (e.g., is it a language class for English-language learners?), and information about the racial and ethnic composition of students in the highest and lowest groups. In addition, all sampled schools should be required to provide information on classroom assignment.

Advanced Placement Classes

When students are motivated to learn, the opportunity to engage with rigorous curricula often leads to higher achievement (Adelman, 1999). The absence of such learning opportunities restricts what and how much students learn and gives an advantage to those students who do have access to more demanding courses and programs. Item 14 asks about advanced placement (AP) classes offered by the school. However, the survey does not provide a denominator that is more specific than the overall racial and ethnic composition of the school. For example, information is not cohort or grade specific, which makes estimation of the percentages of students in each racial and ethnic group who are in AP classes less precise than it might otherwise be. Also, the question is limited to AP classes and does not collect information on other advanced courses of study, such as International Baccalaureate and honors programs. The item should be expanded to include other advanced study programs and to gather data specifically on groups by grade or cohort.

High School Completion

Graduation from high school is, of course, a critical step toward college or well-paying jobs. Item 15 asks about the forms of high school completion (diploma and certificate of attendance or completion) offered by the school. The race and ethnicity data do not provide specific denominators (e.g., the number of students from each racial and ethnic group entering high school), so it is not possible to determine the percentage of students in each group who graduate. There also seems to be considerable ambiguity in the meaning of the types of completion certificates. For example, some states have begun to offer a Certificate of Initial Mastery and a Certificate of Advanced Mastery to their students, a trend that may accelerate as a result of the No Child Left Behind Act of 2001. Gathering data specifically on groups by grade or cohort, as well as clarifying the definitions of the types of completion certificates, would strengthen the item.
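With entering-cohort denominators, the group-specific completion rate the item cannot currently support becomes a simple calculation. A sketch with hypothetical figures:

```python
# Sketch of the cohort-based completion rate the survey cannot now
# support: it requires a group-specific denominator (the entering
# ninth-grade cohort), which the item does not collect.
# All figures below are hypothetical.

def completion_rates(entering, graduating):
    """Percent of each group's entering cohort that graduated."""
    return {g: 100 * graduating[g] / entering[g] for g in entering}

entering = {"black": 200, "hispanic": 150, "white": 400}    # grade 9 cohort
graduating = {"black": 150, "hispanic": 120, "white": 360}  # diplomas, 4 yrs later
rates = completion_rates(entering, graduating)
```

Without the entering counts, only the mix of graduates can be described, not the rate at which each group completes high school.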

Interscholastic Athletics

Item 16 asks about the number of different sports and teams offered at the school and whether the sports that are offered include males only, females only, or both male and female participants. The item elicits useful information on gender equity, but adding information on the race and ethnicity of students would also be valuable. Although many school districts under long-standing school desegregation court orders have been required to demonstrate equitable access to extracurricular activities for black students, data have not been routinely collected to allow comprehensive monitoring. The E&S survey has collected information on student participation in interscholastic athletics since 1994 to monitor Title IX issues, but information on the race and ethnicity of student participants has not been collected for Title VI purposes. Collecting this information, either in this question or as an additional item, would provide useful data to OCR and others concerned with equity in extracurricular learning opportunities.

Teacher Qualifications

The school-based learning opportunity that accounts for the greatest variation in student achievement is quality teaching (see Sanders and Horn, 1995; Sanders and Rivers, 1996; Hedges, Laine, and Greenwald, 1994; Darling-Hammond, 1997a; Ferguson, 2000). To measure this, Item 17 asks how many full-time teachers employed by the school meet requirements for a standard certificate. With the field of teacher certification changing rapidly in the states and the No Child Left Behind Act requiring that every child have a fully qualified teacher, the committee questions whether the concept of a “standard certificate” has become ambiguous. The language of the question should be clarified or possibly changed to include whether teachers are teaching in the content field or specialty for which they were trained.

A related issue with respect to teachers is their years of teaching experience. Teacher inexperience is negatively related to teacher effectiveness, at least in the first 3–4 years of teaching. Since minority students, those with disabilities, and those with limited English proficiency are often more likely to be taught by novice teachers, teacher experience should be a subject of the E&S survey.

Items That Might Be Eliminated

The committee discussed whether some items could be deleted to help streamline the survey and to rid it of questions that are not actionable or cannot be cast into language that would elicit clear and useful responses. Items 6 and 6a on the District Summary Report (ED 101) ask how many students were identified as pregnant during the previous academic year and how many of them are not currently in school. The committee concludes that these questions are not particularly helpful in generating actionable data for OCR enforcement, nor do they connect directly to civil rights concerns.

USE

Improving Access to Survey Data and Survey Findings

The OCR E&S survey has been useful to a wide variety of users, including state and federal agencies, education advocates, civil rights attorneys, and academic researchers. Although the data have been used for important purposes, they could be more widely and productively used.

Historically, the data files have been difficult to access and utilize in complex examinations. Recently, however, OCR has made some significant strides in making information more accessible to the public and to education advocates. OCR has placed some of the information on its website so that users can access the data and query them in simple ways. This action should make the information much more useful to the general public, and OCR should be commended for taking this step.

The fledgling effort should be evaluated to determine whether visitors to the website find the information and formats provided useful. It would also be helpful to learn whether users are content with the level of analysis currently available or whether other, more detailed analyses are desired. Also, given the sporadic nature of OCR's data editing of the survey, there should be an examination of possible data errors and of whether any errors found are substantial enough to discourage the public from using the data.

There are a number of actions that OCR could consider that would improve access to the survey data. Several of these possible solutions would require a greater allocation of resources within OCR. Some actions, however, may not require additional monetary resources but rather would demand more coordination and cooperation between offices within the Department of Education.

Formatting the Data to Make It Easier to Use

Some of the academic researchers commissioned by the committee (e.g., Ready and Lee, 2002; Croninger and Douglas, 2002; see Appendix A) found the files problematic to use, prompting several attempts to have the datasets provided in different formats. Eventually some of the researchers figured out ways to use the files or were provided with “flat files”2 for certain years, which included much of the raw data in a form that could be more easily transformed into other statistical formats for analysis. The process was not a smooth one and required the researchers to spend much time struggling to make the system work.

For research purposes, E&S data should be provided either as flat files with detailed codebooks or as well-labeled statistical package files (e.g., SPSS, SAS, or Stata). If the data are made available in any one of these forms, researchers can then use transfer software (e.g., Stat/Transfer or DBMS/Copy) to translate the files to any other package. Good documentation of such procedures is essential.
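A flat file plus a detailed codebook is enough for researchers to rebuild labeled datasets in any package. The sketch below shows codebook-driven parsing of a fixed-width flat file; the column layout and variable names are hypothetical:

```python
# Sketch: read a fixed-width flat file using a codebook, producing
# labeled records that any statistical package could import (e.g., via
# CSV). The layout and variable names below are hypothetical; a real
# codebook would define them.

CODEBOOK = [            # (variable, start column, width)
    ("district_id", 0, 7),
    ("school_id", 7, 5),
    ("enrollment", 12, 6),
]

def parse_flat_file(lines):
    records = []
    for line in lines:
        rec = {name: line[start:start + width].strip()
               for name, start, width in CODEBOOK}
        rec["enrollment"] = int(rec["enrollment"])  # numeric field
        records.append(rec)
    return records

rows = parse_flat_file(["0101234SC001000412", "0101234SC002000388"])
```

Because the codebook fully specifies the layout, the same file can be loaded into SPSS, SAS, or Stata without guesswork about column positions or variable meanings.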

The National Center for Education Statistics (NCES) is the primary federal statistical agency for the collection, analysis, and dissemination of data related to education in the United States and other nations. NCES has a large portfolio of data collection projects, including surveys in early childhood, elementary and secondary education, international indicators, and postsecondary issues, as well as the National Assessment of Educational Progress, and it is a major source of educational data and information for the public. NCES has the professional staff and the experience needed to anticipate problems and issues that may arise as users of the E&S survey data attempt to secure and analyze the data. OCR should discuss with NCES the possibility of including E&S survey data in the datasets it makes available to the public.

OCR should also investigate whether selected findings could be published in other NCES or Department of Education documents that routinely get wide dissemination to states, school districts, and the public. One example of such a document is the department’s annual Condition of Education report, which presents key findings from a wide range of data collection vehicles.

Training and Support for E&S Survey Users

It is not unusual for users of large datasets, like the E&S survey, to need training and support to make maximum use of the information. For some users, it may be enough to access the less complex data displays that currently exist on the website. For more experienced users of the basic data, as well as researchers who want to do complex secondary analysis of the datasets and link them to other, non-OCR datafiles, additional support may be warranted to facilitate the work. Obstacles may include how to treat missing data, how to avoid misinterpretations arising from nonobvious definitions of the variables, and how to handle inaccuracies. Online tutorials and hard-copy data manuals are possible ways of making the data more useful to a wider range of analysts. NCES may be able to provide some assistance to OCR, since that agency has a solid history of providing web access and data analysis tools for its products. NCES has also conducted data analysis workshops for state education agency personnel and for university and private-sector secondary researchers, an activity OCR should consider.

2 A flat file is a text file that is not tied to any particular computer program.

Connections to Other Data

The E&S survey is but one of many surveys conducted by the U.S. Department of Education. It serves a unique set of purposes, but it also contains questions that other surveys ask in similar, if not identical, ways, particularly questions about special education students, since the department’s Office of Special Education and Rehabilitative Services collects extensive data on services. NCES also collects education data for a wide variety of purposes, some of which overlap with the content of the E&S survey. OCR should work with these agencies to identify and eliminate redundancy. Moreover, as previously noted, Halpern (1995) suggests that OCR does not adequately measure curricular and programmatic changes that may be connected to discriminatory practices; rather, it emphasizes collecting frequency counts. On its own, the E&S survey clearly cannot collect enough in-depth information on the quality of curricula and opportunities to learn to produce a full picture, but its data could be integrated with other data collection efforts to achieve that goal (see Appendix D for a further discussion of this use).

Links to the PBDMI

As mentioned earlier, a major new effort is under way in the Department of Education to consolidate the collection and maintenance of administrative data used for program management and policy. The initiative, known as the Performance-Based Data Management Initiative (PBDMI), began with top management support from major offices in the Department of Education, including OCR and the Offices of Elementary and Secondary Education, English Language Acquisition, Vocational and Adult Education, and Special Education and Rehabilitative Services. The intention is to lay out the information needed by each program against the statutory, regulatory, and other required information to ensure that only critical information is identified. The PBDMI will launch many activities, including a demonstration project designed to link the department’s various sources of state demographic, academic, and funding information to support analysis of educational performance and achievement. The plan is to transform the current data gathering process, which has numerous and sometimes duplicative collections, into a series of state-federal data exchanges with a central data repository. The PBDMI is expected to achieve partnerships with state systems beginning in 2003 and, given sufficient funding, full implementation by 2005.

The PBDMI effort is consistent with previous recommendations to OCR. For example, the National Research Council (2002b) recommended that the Department of Education conduct a single, well-designed data collection effort to monitor both the number of students receiving services through the Individuals with Disabilities Education Act and the characteristics of those children of concern to civil rights enforcement efforts. The PBDMI has the opportunity to recognize the value of collecting programmatic data from schools on access to high-quality educational services and resources and to ensure that the public has access to that information. How the E&S survey is revised and continued, in coordination and cooperation with the PBDMI, will be critical to its future utility.

Implementation of the PBDMI is likely to affect the E&S survey, although precisely how is unclear, since the initiative is still under development. OCR should participate fully in the PBDMI discussions to ensure that the goals of the E&S survey data collection are represented in the initiative’s implementation. When fully operational, the PBDMI may offer a unique opportunity to present E&S survey data in a way that enriches both the OCR data and key educational information collected by the department.

Analysis and Dissemination

Easy access to simple tabulations of E&S survey data would be very useful to those concerned with the educational opportunities of minority students, students with disabilities, and students with limited English proficiency, as well as to advocates of gender equity. The publication of such work may also encourage researchers to conduct more extensive analyses of the data. Selected data should be published in both electronic and print forms to allow for maximum access and dissemination.

Components of the survey could be disseminated in various ways. In addition to the suggestions above that selected findings be published in the department’s annual Condition of Education report and that survey data might be part of the PBDMI process, OCR should consider requesting funds to continue and expand its data reporting efforts so that it could at least conduct simple analyses of the survey data in the year they become available. OCR should also consider publishing the findings in ways that allow people to examine trends over time. In addition, given the emphasis now being placed on the accountability provisions of the No Child Left Behind Act, OCR should consider posting on its website a full report of tabulations showing how opportunities to learn are allocated to students of different backgrounds. This would allow analysts to balance the testing outcome data required by the act with solid information about access to the learning opportunities and resources described by the E&S survey.

