
Appendix E
Coordination of Strategies for Collecting Data

Data collection strategies must be planned in such a way that there is effective coordination in conducting the activities to develop indicator variables, and in cooperating with other organizations, such as the Center for Education Statistics and the National Assessment of Educational Progress (NAEP), that are collecting similar data. The committee therefore recommends that the collection of data for indicator objectives and for other ongoing and planned data collection activities be coordinated to the maximum extent feasible and effective. Indefensible multiplication of surveys or excessive burden on individual respondents would increase the costs of the program and decrease response rates. Effective coordination can lead to reduced costs through possible reduction in sample sizes and to improvements in data quality through sharing of frame development and maintenance, sampling operations, and improved training and quality control in survey administration. The purpose of this appendix is to discuss some of the considerations in achieving such coordination. The specific approaches and designs should be developed by those assigned responsibility for the new activities.

Elements of data collection activities that affect the feasibility of coordination are:

- Target population
- Data elements

- Methods of data collection
- Frequency of data collection
- Sample design (related to data collection methodology and frequency)
- Time of year (related to data elements and time cycles within the school year)
- Respondent burden and its impact on the cooperation of sample respondents

Much of the thrust of the committee's recommendations is in the direction of developing new assessment tools and descriptive data. It is far from certain that they will require separate vehicles. However, coordination with other assessment and data collection activities may depend upon the feasibility of compromise between the parallel surveys and gradual shifting to the new tools. Along with the development and testing of data items, some attention must be given to logistical concerns.

The committee considers the assessment of student learning to be of primary importance. For this purpose, the committee suggests testing of students at three grade levels, for example, grades 4, 8, and 12. The committee also recommends that data for additional key indicators about the students and about their teachers be obtained, as well as data to provide several supplementary indicators. These key indicator variables might include, for example:

- For students: semesters of science and mathematics taken by students in the 12th grade; time per week spent on science and mathematics study by students in the 4th and 8th grades.
- For teachers: knowledge of the subject matter they are expected to teach.

Collection of these additional data, linked to student learning, can serve two purposes: (1) to provide descriptive statistics about students, teachers, and schools with regard to the distribution of factors linked to student learning and (2) to help understand and explain differences in student achievement. For the first purpose, linkages between individual students and their teachers are not necessary.
Samples of schools, teachers, and students (and possibly their parents), not necessarily linked to the sample of students selected for testing at specified grades, can be used to provide general descriptive statistics. Consideration should be given to coordinating these samples with the Elementary/Secondary Integrated Data Systems (ESIDS) program being developed by the

Center for Education Statistics. Preliminary specification of sample sizes for schools, students, and teachers could be based on design parameters used by the Center for Education Statistics. In any event, however, steps should be taken to ensure that the data would be comparable to corresponding data from schools and teachers associated with the sample of students selected for testing.

For the second purpose, the committee recommends that the sample be tied to students classified by race or ethnicity, gender, grade level or age, socioeconomic status, type of community, and region or state. It must be recognized that, realistically, it will not be possible with the levels of effort now represented by ESIDS or NAEP, for example, to provide enough data for all cross-classifications of these variables. It will not even be possible to provide for, say, cross-classification of race or ethnicity and gender with equal precision in every cell. One design approach would be to set limited goals but establish designs that could readily be extended; for example, a design for national data that could be expanded to provide state-level data.

Testing of students might be coordinated with NAEP, depending on the time needed. NAEP now requires about an hour of student time, and this could not be reduced substantially without jeopardizing it. Coordination would then depend on the feasibility of adding to the time per student or of sampling additional students.

Certain new activities are likely to be special efforts, although in each case it would be desirable to associate them with an organization having related data collection or analysis responsibilities. Among them are the following:

Recommendation by Committee:
- Salary survey
- Federal support of science and mathematics education
- Support of scientific bodies
- Observation of classroom processes
- Construction of curriculum frameworks

Existing Organization/Activity:
- U.S. Census Bureau
- National Science Foundation
- International Association for the Evaluation of Educational Achievement
- National Assessment of Educational Progress
- Elementary and Secondary Integrated Data Systems
- Linkage to developing teacher evaluation programs, in the R&D phase
- American Association for the Advancement of Science
- Mathematical Sciences Education Board
- National Council of Teachers of Mathematics
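The cell-precision concern raised above can be illustrated with a back-of-the-envelope calculation. The sketch below is purely illustrative; the overall sample size and the number of categories per classification variable are assumed for the example and do not come from the report:

```python
import math

# Illustrative (assumed) figures: a national student sample of 30,000
# cross-classified by several of the stratification variables discussed
# in the text fragments quickly into sparsely populated cells.
total_n = 30_000                  # hypothetical national sample size
cells = 5 * 2 * 3 * 4 * 9        # e.g., race/ethnicity x gender x grade x community type x region
n_per_cell = total_n / cells      # average sample size in each cross-classified cell

def se_proportion(p: float, n: float) -> float:
    """Standard error of an estimated proportion p from a simple random sample of size n."""
    return math.sqrt(p * (1 - p) / n)

# Precision for a 50% proportion: national estimate vs. a single cell.
se_national = se_proportion(0.5, total_n)
se_cell = se_proportion(0.5, n_per_cell)

print(f"cells: {cells}, average n per cell: {n_per_cell:.1f}")
print(f"SE of a 50% proportion, national: {se_national:.4f}")
print(f"SE of a 50% proportion, per cell: {se_cell:.4f}")
```

Under these assumed numbers, five modest classification variables already produce over a thousand cells averaging fewer than thirty students each, so per-cell standard errors are more than thirty times the national figure. This is the arithmetic behind the committee's caution that equal precision in every cell is unattainable and that designs should instead set limited goals while remaining extensible.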