Systems for State Science Assessment (2006)


Index

A

Accommodations, 8, 12, 138-140, 168, 208

Accountability

California system, 151

classroom-based assessment, 33, 167

in content standards, 62

CRESST quality standards, 149, 157

designing assessments for, 35, 78

equity and resource issues, 136, 141, 143

and inclusion, 168

instructionally supportive tests, 32-33, 63

monitoring effects of, 157

NCLB goals, 11, 16, 158

use of assessments for, 19, 61, 120

Achieve, Inc., 56, 153

Achievement standards. See also Science achievement

alternate, 138

content standards and, 68

cut scores, 72, 73-74, 75

funding tied to, 141-142

importance, 72

key characteristics, 72-73

methods for setting, 76, 170

NCLB requirements, 2, 12, 54-55, 72, 74, 75, 141-142, 146, 167

performance levels, 72-73, 74, 76, 99, 156, 170

recommendations, 170

supplementary data, 75

validation of, 75-76

variability in, 74-75

Adequacy. See Equity and adequacy issues

Administration of tests, 119-120, 194

frequency, 12

Alaska, 57

American Association for the Advancement of Science, 17, 19, 38, 111, 133, 153

American Chemical Society, 17

American College Testing Program, 151

American Educational Research Association, 148

American Federation of Teachers, 56, 58-59, 64, 128

American Physical Society, 17

American Psychological Association, 148

Ancillary skills, 139-140

Assessment, 3-4. See also Classroom assessment; Designing science assessments; Implementing science assessment systems; Interpretation of assessment results; Performance assessments; Quality of assessments; Reporting assessment results; Science assessment systems; Systems approach to assessment


content standards aligned with, 30, 35, 57, 58, 68, 69

defined, 16

district-level, 31, 45, 49

triangle concept, 81, 86-87, 89

Atlas for Science Literacy, 69, 133

Atomic-molecular theory, 18, 42, 95, 106-110

Augmented norm-referenced assessments, 12

Australia, 36, 124

B

Benchmarks for Science Literacy, 42-43, 55, 93-94, 133

Big ideas in science

and assessment design, 79, 81, 106-112

content standards organized around, 3, 40, 56-57, 65, 66, 69-71, 110-112

curriculum framework, 109-110

knowledge organized around, 39-40

Bundling activities, 29

Buros Center for Testing, 153

C

California, 30, 64, 151

Center for Research on Education, Standards, and Student Testing, 149, 157

Check sheets, 49

Classroom assessment, 26

for accountability purposes, 33, 167

design considerations, 33-35, 86-87, 91, 95, 120

inquiry, 45

practice and feedback opportunities, 49-50, 120, 132, 158

of prior knowledge, 48-49

professional development and, 126-127

quality standards, 148-149

sample, 33-35

of science literacy, 48-50, 52

Classroom Focused Multi-Level Assessment Model, 34

Colorado, 151-152

Commission on Instructionally Supportive Assessments, 32

Competency standards, 130

Concept mapping, 29, 40, 49, 111

Connecticut, 45, 46-47, 120

Consortium for Policy Research in Education, 65

Constitutional concerns, 142

Construct

modeling approach, 86, 87-88, 89-90

specification, 87, 90, 91-94, 111

Constructed-response items, 33, 94

Content knowledge

advisory groups, 6, 112, 116-117

context-bound, 40-41

improving, 166-167

organizing around big ideas, 39-40, 57, 65, 66, 69-71, 79, 81, 106-112

research needs, 166-167

transfer of, 40, 50, 52

Content standards

accountability component, 62

and achievement standards, 68

assessment-related information in, 30, 35, 57, 58, 68, 69

clarity, detail, and completeness, 2, 3, 57-58, 63-65, 156, 169

cognitive validity, 105

conceptual framework, 2, 65, 67, 69

curriculum aligned with, 57, 63, 65, 67, 68-69, 105

district-level models, 31

and instructional planning, 67, 68-69

key features, 57, 62-68

learning performances and, 3, 91-94, 111

lesson support materials, 68-69

as model of learning, 2, 67

NCLB requirements, 54, 56, 146, 167

organizing around big ideas, 3, 40, 56-57, 65, 66, 69-71, 110-112

performance expectations, 2, 68

review and revision, 2, 9, 19, 61-62

rigor and scientific correctness, 65, 67

scientific terminology as, 57, 60-61

scope, 65, 66

state variation in, 55-61

supplementary guidance material, 68-71

Core assessment, 31

Council for Basic Education, 58-59 n.4

Council of Chief State School Officers, 55, 137, 188

Creative writing, 29

Criterion-referenced assessments, 12, 33, 115

Curriculum. See also Instruction

assessment design linked to, 109-110, 206

big ideas in science as framework, 109-110


content standards aligned with, 57, 63, 65, 67, 68-69, 105

system perspective, 22, 24

Cut scores, 72, 73-74, 75

D

Data management, 133

Delaware, 57, 99, 104, 105

Designing science assessments

accommodations, 139-140

accountability component, 32-33, 35, 63, 78

activity selection process, 89

ancillary skills identified in, 139-140

approaches, 17-18, 27-31, 77, 78-81, 82-85, 106-112, 153, 207

assessment triangle concept, 81, 86-87, 89

backward design process, 95-98, 100-103

bias and sensitivity reviews, 140, 155

building blocks, 89-90, 91-104

classroom assessments, 86-87, 91, 95, 120

cognitive validity, 104-105

collaboration in, 112

competency standards, 130

computerized system for, 34-35, 89, 131

conceptual framework, 88-89, 205-209

construct modeling approach, 86, 87-88, 89-90

construct specification, 87, 90, 91-94, 111

curriculum linked to, 109-110, 190, 206

developmental approach, 78-81, 82-85, 106-112

distractors, 95, 100-101

evaluation and monitoring, 140, 154-156, 158

evidence-centered approach, 77, 87-89

evolutionary biology example, 110-112

field testing, 140, 155

influences on committee thinking, 81, 86-90

instructionally supportive accountability tests, 32-33, 63

item design, 90, 94-98, 100-103, 108, 109, 110

language and vocabulary considerations, 140

learning performances and, 3, 91-94, 95-98, 100-103, 108, 109

learning progression and, 3, 18, 77, 78, 79-80, 82-85, 106-112

learning theory and, 110-112, 150

matter and atomic-molecular theory example, 18, 106-110

measurement models, 17, 86-87, 89, 90, 95, 99, 102-103, 139, 167

outcome space, 90, 98-99

presentation process, 89

program evaluation context, 89

purpose of assessment and, 3-4, 5, 86-87, 91, 95, 191

questions for states, 52-53, 112-113, 162, 163

research needs, 159, 166, 167

response processing, 89, 109, 110

rotating key concepts, 32-33

sample designs, 31-35

science literacy and, 1, 50-53

standards aligned in, 89, 91, 109, 110, 154-156, 158, 167, 191

summary scoring process, 89

systems approach, 4, 21, 32-35, 77, 87, 150, 153, 161, 206

target skills identified in, 139-140

task selection, 94-98

technology support, 34-35

test delivery architecture, 89; see also individual formats

time limits, 140-141

universal design, 139

Diagnostic assessments, 33, 89

Disabilities, students with, 137, 138, 168. See also Accommodations; Inclusion

Distractors, 95, 100-101

District-level

assessments, 31, 45, 49

content standards, 31

E

Editorial Projects in Education, 56

Education administrators, 128, 130

Education journalists, 129

Education Leaders Council, 121

Elementary and Secondary Education Act. See Improving America’s Schools Act

England, 36

English language learners, 137, 138, 148, 168. See also Accommodations; Inclusion

Equity and adequacy issues

accountability and, 136, 141, 143

children’s equity, 142

inclusion, 136, 138-141, 145, 165, 168

interpreting assessment results, 7-8, 136, 137


opportunity to learn, 7-8, 24, 136-138, 142, 145, 165

questions for states, 145, 165

resources, 141-144, 145, 165

school finance burdens, 141-142, 143-144

taxpayer equity, 142

Evaluation and monitoring. See also Quality of assessments

accountability effects, 8, 157

achievement standard-setting methods, 170

alignment of assessment and standards, 152-154

assessment development, 140, 154-156, 158

challenges, 158-159

consequences and uses of assessment systems, 150-154, 157-158

as feedback, 22, 147

questions for states, 159-160, 165

reliability of scores, 151-152

reporting of results, 156

research needs, 152, 153

standards review and revision, 2, 19, 61-62, 169

systems approach, 150, 154-158, 170

validity of gains, 150-151, 157

Evolutionary biology example, 110-112

F

Fieldwork, 29

Finance issues, 141-142, 143-144

Florida, 57

Force Concept Inventory, 94

Fordham Foundation, 56, 58, 59

Formative assessment, 26

France, 36

G

Germany, 36

Guidance material

with content standards, 68-71

H

High-stakes testing, 27, 32, 137, 138, 141, 144, 150, 157, 159

Hybrid tests, 31, 35

I

Illinois, 57

Implementing science assessment systems. See also Test development/developers

administration of tests, 12, 119-120, 194

advisory groups, 6, 112, 116-117

continuous improvement plan, 115

contractor issues, 112, 121, 123, 195-196

data management, 133

deadlines for, 1, 12, 13, 17

developing the structure, 117-121; see also Designing science assessments

documentation, 117-118

example, 122

frequency of administration, 120

identification of purposes, 117, 118-119

needs analysis, 114-116

online administration, 132

professional development, 119, 125-130, 135, 164-165, 168-169

questions for states, 134, 163-164

reporting results, 19, 115, 123-125, 135, 164-165, 194

scoring, 132, 194-195

support system for, 133

technology support, 131-133

Improving America’s Schools Act, 11

Inclusion

accommodations, 8, 12, 138-139, 168, 208

advice to states, 139-141

equity issues, 136, 138-141, 145, 165, 168

questions for states, 145, 165

research needs, 168

Indiana, 57

Inquiry. See Scientific inquiry

Instruction

assessment design linked to, 4, 138, 155, 190, 206

content standards and, 67, 68

lesson support materials, 68-69

systems perspective, 22, 24, 119, 141, 206

teaching to the test, 26-27, 150

Instructionally supportive accountability tests, 32-35, 63

International influences, 23

Interpretation of assessment results

accommodations and, 8, 138-139, 168

equity and adequacy issues, 7-8, 136, 137

identifying strategies for, 116

validity, 7-8, 26, 148, 149, 150-151, 168


Interstate New Teacher Assessment and Support Consortium, 128

Item design, 90, 94-98, 108, 109, 110. See also individual formats

Item response theory, 99, 102, 103

J

Japan, 36

K

Kentucky, 30, 150-151

Kinetic molecular theory, 79, 81

Knowledge. See also Content knowledge; Prior knowledge and misconceptions

declarative, 105

procedural, 105

schematic, 105

strategic, 105

L

Laboratory experiments, 50

Large-scale standardized tests, 86-87, 91, 94, 95, 103, 115, 118, 120, 121, 125, 126, 131

Learning. See also Science learning

assessment linked to, 48, 49, 110-112, 150

content standards as model of, 2, 67

theory, 48, 110-112, 150

Learning performances

on atomic-molecular theory, 95

designing science assessments, 3, 91-94, 95-98, 100-103, 108, 109

differential survival example, 93-94

item creation from, 95-98, 100-103, 108

scientific practices that serve as basis for, 92-93

standards elaborated through, 3, 91-94, 111

Learning progressions

defined, 48

designing assessments with, 3, 18, 77, 78, 79-80, 82-85, 106-112

developing, 48

errors and misconceptions identified through, 98, 100-101

matter and atomic-molecular theory, 18, 106-110

reporting results, 123, 124

research needs, 48, 125-126, 166

standards elaborated through, 3, 69-71

Literacy. See Science literacy

M

Maine MeCAS, 117, 118, 119, 132

Maryland, 30

Mathematics, 19, 57, 59, 139, 141, 143, 151-152, 153, 157

Matrix-sample tests, 30, 31, 35

Measurement models, 17

assessment triangle, 86-87, 90

evidence-centered design principles, 89

item response theory, 99, 102, 103

large-scale assessments, 103

multidimensional item response theory, 95, 102

reliability, 99, 167

Memorization, 50, 51

Mid-continent Research for Education and Learning, 55-56

Milwaukee school system, 115

Minority and low-income populations, 144

Misconceptions. See Prior knowledge and misconceptions

Modeling/simulations, 29, 34

Monitoring. See Evaluation and monitoring

Multiple-choice formats, 33, 44, 49, 52, 94, 95, 98, 99, 100-101, 118, 152

N

National Assessment of Educational Progress (NAEP), 3, 12-13, 35, 75-76, 120, 151

National Association of Secondary School Principals, 129

National Center for Education Statistics, 137

National Center on Educational Outcomes, 139

National Conference of State Legislatures, 129

National Council on Measurement in Education, 128, 148

National Education Association, 128

National Education Goals Panel, 58-59

National Research Council, 19, 38, 86, 111, 168

National Science Board, 38

National Science Education Standards, 1, 31, 42-43, 55, 136, 137, 148

National Science Foundation, 1


National Science Teachers Association, 17, 38

Nebraska STARS, 33, 45, 119

Nevada, 57

New Jersey, 57

New York, 45, 120

New Zealand, 36

No Child Left Behind Act, 118, 170

access to teachers, 137, 141

achievement standards, 2, 12, 54-55, 72, 74, 75, 141-142, 146, 167

content standards, 54, 56, 146, 167

deadlines for implementing assessments, 12, 13, 17

evaluation and monitoring requirements, 146, 169

goals, 4, 7, 11, 16, 45, 136, 141, 157, 158

inclusion requirements, 8

Peer Review Guidance, 147

professional development requirements, 168

reporting requirements, 19, 123

science requirements, 1, 4-5, 11, 12-16, 27, 31, 157, 161, 204-205

validation of assessments, 155

Norm-referenced assessments, 12, 115

O

Observing students, 29, 45, 127

Open-ended items, 52, 115, 118, 124, 132, 139

Opportunity to learn, 7-8, 24, 136-138, 142, 145, 165

Oral examination, 49

Ordered Multiple Choice, 95, 98, 100-101

Oregon, 132

Outcome space, 90, 98-99

P

PADI (Principled Assessment Design for Inquiry), 89, 131

Paper-and-pencil tests, 29, 44, 115, 124, 131

Peer assessments, 29, 32

Performance assessments, 52

achievement standards and, 72

applications, 94

buoyancy concept example, 99, 102-103

classroom-administered, 115-116

open-ended tasks, 118

scoring rubrics, 45, 49, 99, 102-103, 131

Performance categories, 33

Performance expectations, 2, 68

Performance standards, 72-73, 74, 76, 99, 156, 170

Physical science, 32

Plate tectonics theory, 41, 64

Porter, Andrew C., 153

Practical investigations, 29, 50

Practical tests, 29

Presentations, 29

Prior knowledge and misconceptions, 48-49, 67, 68-69, 77, 95, 98, 99, 100-101, 127, 128

Problem solving, 29

Professional development

alignment with standards, 2, 152

assessment literacy, 33, 119, 125-130, 158, 168-169, 208

certification standards and, 7, 128, 129, 169

education administrators, 128, 130

in-service programs, 7, 126, 128

NCLB requirements, 168

preservice programs, 7, 126, 169

questions for states, 135, 164-165

recommendation, 169

systems approach, 2, 24, 152

teachers, 6-7, 125-128, 166, 168, 169, 208

and use of assessment results, 125-127, 129, 130

Program evaluation, 89

Programme for International Student Assessment, 23

Progress maps, 124

Project 2061, 17, 153

Q

Quality of assessments

AERA/APA/NCME, 148, 209

continuous improvement plan, 115

CRESST accountability standards, 149, 157

deeper conception of quality, 150

NSES, 148-149, 209

validity of inferences, 147, 167

R

Reading, 19, 141, 143, 151-152, 153, 155, 157

Reporting assessment results, 208

comparison groups, 124

computerized data management, 133

disaggregated group, 139


format, 123-124, 125

for inquiries, 45

interpretive material, 124-125

as learning progressions, 123, 124

monitoring plan for, 156

NCLB requirements, 19, 123

needs analysis, 6, 115

progress maps, 124

public availability, 19, 24

questions for states, 135, 164

research needs, 125

samples of student work, 124, 133

standard-specific, 123

for subgroups, 136, 157-158

subscores, 67, 124

test development questions, 194-195

uncertainty and error information, 125

use of results and, 123

validity of interpretations, 124-125

Requests for proposals

authority, 189

background and contextual information, 190

budget, 189

contract period, 189

eligible offerers, 189

laws, rules, and guidelines, 189

ownership of test items, 189-190

questions to be addressed, 189-190, 191-192

specification of products, 190

test development, 190

timeline, 190

Research needs, 9, 18-19, 48, 125-126, 152, 153, 159, 166-167, 168

Resource allocation, 24

equity and adequacy issues, 8, 141-144, 145, 165

minority and economically disadvantaged districts, 143

school finance burdens, 141-142, 143-144

teacher demand and supply, 8, 141, 143, 144, 157

use of assessment results for, 129

Response processing, 89, 109, 110

Rhode Island, 57, 58

S

Sanctions, 11

Scaling, vertical, 35

Science achievement

multiple measures of, 30-31, 115, 119, 167

Science assessment systems. See also Designing science assessments; Implementing science assessment systems

classroom-based, 26, 33-34

coherence in, 5-9, 25-27, 75, 122, 126, 152, 158, 161

constructs, 38; see also Science achievement; Science literacy; Scientific inquiry

economic issues, 132

feasibility studies, 157

federal support of, 9, 170

feedback in, 25, 26, 49, 120, 147

goals for learning aligned with, 30, 44

high-quality, 28, 167

instructionally supportive accountability tests, 32-33, 63

international examples, 35-36

intrastate collaboration, 34-35

multiple measures and measurement approaches, 5, 12, 17, 27-31, 119, 125, 147, 167

NCLB requirements, 1, 4-5, 11, 12-16, 27, 31, 157, 161, 204-205

psychometric and practical considerations, 35

questions for states, 36-37, 162

sample designs, 31-35

science literacy and, 49-52, 69-71

statutory requirements, 11, 12-16

Science education system

coherence in, 24-25, 54

effects of science assessment, 21

goal of, 22, 54

influences on, 23-24

standards and, 22

Science for All Americans, 136

Science learning

developmental nature of, 45, 48-49, 77, 106-112, 150, 166-167

evolutionary biology, 110-112

learning progression and, 46, 77, 106-112

matter and atomic-molecular theory, 42, 106-110

measurement approaches, 30, 86

prior knowledge and misconceptions, 48-49, 67, 68-69, 77, 95, 99

professional development needs, 166

progress maps, 78, 79-80, 82-85, 124

research recommendations, 166-167

standards as a model of, 2, 67

Science literacy

assessment of, 48-52, 91, 112


coherence in science education system and, 24

common elements, 38-39

content knowledge, 1, 39-41

defined, 38-39

inquiry and, 1, 17, 42-45, 46-47, 167-168

memorization and, 50, 52

national priority, 1, 11, 38

questions for states, 52-53, 162

understanding science as a way of knowing, 1, 41-42, 50-51

Science standards. See also Achievement standards; Content standards; State science standards

alignment with science assessments, 89, 91, 109, 110, 154-156, 158, 167

NCLB requirements, 54-55, 56

role of, 54-55

validity of methods used to set, 153, 167

Scientific inquiry, 32

abilities associated with, 44

approaches to, 43-44, 131

assessment methods, 9, 17, 44-45, 91, 120, 131, 167-168

content standards, 1, 43-44, 144, 167-168

defined, 42

research needs, 168

soapy water experiment, 46-47

Scoring/scores

accommodations and, 168

combining two years of, 152

comparability across years and formats, 153, 156

computerized, 132

evaluation and monitoring, 151-152

implementing, 132, 194-195

multiple-choice formats, 98

open-ended items, 132

performance tasks, 98-99, 124

reliability from year to year, 151-152

rubrics, 45, 49, 99, 102-103, 104, 120, 131, 132

subscores, 67, 124

summary scoring process, 89

test development questions, 194-195

validity of interpretations, 124, 151, 157

Self-assessments, 29, 49

Southern Regional Education Board, 129

Special needs students, 116. See also Inclusion

Standards. See Science standards; State science standards

Standards for Educational and Psychological Testing, 78

State Coalition for Assessment of Learning Environments Using Technology (SCALE Tech), 34

State science assessment programs. See also individual states

status of, 13 n.4

strategies, 26, 27-31

State science standards. See also Achievement standards; Content standards; Science standards

AFT evaluation of, 58-59, 61

elaborating for practitioners, 68-71

Fordham Foundation evaluation, 58, 59, 61

high-quality elements, 2, 54, 59, 60-68, 167, 169

inquiry component, 42-43, 45, 91

questions for states, 76, 163

review and revision, 60-62, 104, 169

specificity, 3, 24-25, 91, 147-148, 152, 169

variations among states, 56-58

Strand maps, 69, 70-71, 133

Student portfolios, 29

Student profiles, 29

Subgroups, reporting results for, 136, 157-158

Summative assessment, 26

Sweden, 36

Systems approach to assessment, 4-5

challenges, 33

characteristics of systems, 21-22

design stage, 4, 21, 32-35, 77, 87, 150, 153, 161

evaluation and monitoring component, 150, 154-158, 170

feedback loops, 22

fundamental issues, 36, 153

instruction and, 22, 24, 119, 141

rationale for, 16

science assessment system, 25-27

science education system, 22-25

T

Target skills, 139-140

Taxpayer equity, 142


Teachers. See also Classroom assessment; Professional development

assessment competence, 6-7, 23, 24, 124-128, 129, 132

certification and licensure, 128

evaluating assessment design, 2, 155, 208

quality and availability, 8, 137, 138, 141, 143, 144, 157

salaries, 144

scoring and evaluating test responses, 132

Teaching to a test, 26-27, 150

Technical Advisory Councils

implementing assessments, 116-117

Technology support

data management, 133

designing assessment systems, 34-35, 89, 131, 209

implementing assessment systems, 131-133

item banks, 133

learning environments, 34

online administration, 132

research needs, 131

scoring, 132

Test administration, 194

Test development/developers. See also Designing science assessments

commercial test publishers, 91, 121, 123, 187-201

curriculum standards, 190

grade levels, 190

industry characteristics, 187

interface with current program, 191-192

perspectives from test publishers, 200-201

practical tips, 187-201

prime contractor vs. multiple vendors, 188

quality standards, 148

questions to be addressed, 192-193

relationship with contractor, 121, 197-200

requests for proposals, 69, 121, 123, 188-192

responsibility for, 121, 122

score scales, 72

supplementary guidelines with content standards, 69

technical and quality standards, 190

Third International Mathematics and Science Study, 23, 99

Third International Mathematics and Science Study—Repeat, 105

Time limits for tests, 140-141

U

Universal design, 139

U.S. Department of Education, 61-62, 125, 168, 169, 170, 187

Use of assessments, 16

for accountability purposes, 19, 61, 120, 209

administration level and, 120

coherence in assessment systems and, 25-27

competency standards for, 130

design considerations, 3-4, 5, 86-87, 91, 95, 191

evaluation and monitoring, 150-154, 157-158

guidelines for, 129

identifying and documenting, 117, 118-119

multiple assessment strategies and, 27-28, 30

professional development and, 125-127, 129, 130

for promotion and graduation, 137

reporting of results and, 123

for resource allocation, 129

scores and scoring, 124, 151, 157

Utah, 57, 60-61

V

Validation/validity

of achievement standards, 75-76

assessment design, 104-105

cognitive, 104-105

of content standards, 105

gains in scores, 150-151, 157

interpretation of assessment results, 7-8, 26, 124-126, 148, 149, 150-151, 157, 168

reporting of results and, 124-125

Validities of Science Inquiry Assessments, 44

W

Washington state, 65, 66

Webb, Norman L., 153

Wixson, Karen K., 153

Written tests. See Paper-and-pencil tests



In response to the No Child Left Behind Act of 2001 (NCLB), Systems for State Science Assessment explores the ideas and tools needed to assess science learning at the state level. The book provides a detailed examination of K-12 science assessment, looking specifically at what should be measured and how to measure it.

Along with reading and mathematics, the testing of science is a key component of NCLB—it is part of the national effort to establish challenging academic content standards and develop the tools to measure student progress toward higher achievement. The book will be a critical resource for states that are designing and implementing science assessments to meet the 2007-2008 requirements of NCLB.

In addition to offering important information for states, Systems for State Science Assessment provides policy makers, local schools, teachers, scientists, and parents with a broad view of the role of testing and assessment in science education.
