
Index

A

Accountability

linkage with assessments for classroom learning, 283–287

versus instructional guidance, 223–224

ACT-R research group, 185

Adventures of Jasper Woodbury video series, 275

Alternative assessment practices, 29–30, 242–248

AP Studio Art, 246–247

DIAGNOSER, 96, 247–248

Maryland State Performance Assessment Program, 245–246

National Assessment of Educational Progress (NAEP), 244–245

American Institutes for Research, 208

AmericaQuest, 267

Analysis

of errors, 207

ethnographic, 101

microgenetic, 100–101

of protocols, 99–100, 207

of reasons, 207

AP Studio Art, 246–247

Aptitude, assessment of, 37

Aristotelian perspective, 206

Arithmetic

KeyMath diagnostic test, 139

models of learning, 94–95

Assessment, 15–54.

See also Alternative assessment practices;

Classroom formative assessment;

Educational assessment;

Modern assessment practices;

Purposes of assessment

as an instrument of reform, 24–25

analyzing existing, 12, 303–304

of aptitude, 37

balanced system, 253–257

basing on contemporary foundations, 30–32

contributions of measurement and statistical modeling to, 5–6, 110–172

of control-of-variables strategy, 216–217

curriculum-embedded, 13, 243

defined, 20

developmental, 136–137, 190–192, 250

fairness in, 214–218, 240–241

future of, 292–294

implications of brain research for, 107–109

implications of cognition for, 71–72

implications of expertise for, 90–92

implications of learning models for, 96–97

implications of observation methods for, 101–102

implications of subject-matter expertise for, 79

integrating models of cognition and learning with, 92–97

limitations of current, 14, 26–29, 310–312

linked with curriculum and instruction, 51–53


matrix sampling, 13, 307

moderation, 117

potential future role of Bayes nets in, 164–165

in practice, 7–9, 220–259

precision and imprecision in, 42

publicizing its importance in improving learning, 14, 312–313

quality of feedback in, 234–236

and reasoning from evidence, 2–3, 42–43

reporting results of, 212–214

rethinking, 17–35

scientific foundations of, 55–172

static nature of current, 27–28

summative, 38

tasks, 116

using to assist learning, 7–8, 37–38

Assessment design, 173–288

enhancing overall process of, 270–271

funding research into improved, 11–12, 299–301

implications of new foundations for, 6–7, 176–219

inevitability of trade-offs in, 222–223

task-centered versus construct-centered approaches, 194

Assessment instruments.

See also Large-scale assessment

developers of, focusing on cognition, observation, and interpretation, 13, 305–306

task sets and assembly of, 200–202

Assessment systems, 252–257.

See also BEAR assessment system

balance between classroom and large-scale assessment, 252–253

Assessment triangle, 19, 44–51, 263–271, 282, 296

cognition, 44–47

cognition-interpretation linkage, 51

cognition-observation linkage, 51, 263–269

interpretation, 48–49

observation, 47–48

observation-interpretation linkage, 51, 269–270

relationships among the three vertices of, 49, 51

Associationist perspective. See Behaviorist perspective

Australia’s Developmental Assessment program, 190–192

B

Balance-scale problems, solving, 49–50

Balanced assessment systems, 253–257

approximations of, 257

between classroom and large-scale, 252–253

coherence of, 255–256

comprehensiveness of, 253–255

continuity of, 256–257

Base rate probabilities, 161

of subprocedure profile, 161

Bayes nets, 154–165

mixed-number subtraction, 156–164

potential future role in assessment, 164–165

Bayes theorem, 155

BEAR. See Berkeley Evaluation and Assessment Research Center

BEAR assessment system, 115–117

sample progress map from, 119

sample scoring guide for, 118

Behaviorist perspective, on knowing and learning, 61–62

Beliefs. See Student beliefs

Berkeley Evaluation and Assessment Research (BEAR) Center, 115

Blueprints, 116

Brain research

cognition and, 104–109

cognitive architecture and, 68–69

into enriched environments and brain development, 105–107

into hemispheric specialization, 104–105

implications for assessment, 107–109

Bridging research and practice, 294–296

C

CGI. See Cognitively Guided Instruction

Change, models of, 128–134, 165–168

Changing expectations for learning, 21–25

higher standards and high-stakes tests, 23–25

societal, economic, and technological changes, 22–23

Chess experts, meaningful units as encoded by, 74–75

Children

assessing problem-solving rules of, 46–47


coming to understand the whole number system, 180–181

naive conceptions of, 83

Piagetian stages of development, 151

strategies for simple addition, 86

Classical test theory (CTT), 120–121

Classification, social context of, 88

Classroom assessment, 8–9, 225–241

balancing with large-scale assessment, 252–253

cognitively based approaches to, 230–233

fairness in, 240–241

learner’s role in, 236–240

new forms of, 12, 301–303

quality of feedback, 234–236

teacher’s role in, 234

transforming, 226–228

Classroom Connect, 267

Classroom formative assessment, 38

facilitating, 272–274

mandates and resources placing increasing emphasis on, 14, 310–312

CLP. See Computer as Learning Partner

Cognition, 65–72

analyzing in existing assessments, 12, 303–304

and brain science, 104–109

developers of assessment instruments focusing on, 13, 305–306

implications for assessment, 71–72

importance of a model of, 6, 178–192, 229–230

as part of the assessment triangle, 44–47

Cognition-observation linkage, 51

complex problem solving, 265–269

concept organization, 265

enhancing, 263–269

theory-based item generation, 263–265

Cognitive architecture, 65–69

and brain research, 68–69

long-term memory, 67–68

working memory, 65–67

Cognitive coherence

among curriculum, instruction, and assessment, 271–283

of balanced assessment, 255–256

Cognitive complexity, of science tasks, 210–211

Cognitive elements in existing measurement models

enhancement through diagnostics, 137, 142–147

incorporation of, 134–147

progress maps, 136–142

Cognitive models of learning

of arithmetic, 94–95

for assessing children’s problem-solving rules, 46–47

and debugging of computer programs, 95–96

implications for assessment, 96–97

integrating with instruction and assessment, 92–97

and intelligent tutoring, 93

Cognitive perspective, on knowing and learning, 62–63

Cognitive sciences

defined, 20

making advances available to educators, 11, 299

Cognitive structures

adding to measurement models, 147–152

psychometric modeling of, 152–165

Cognitively Guided Instruction (CGI), 95, 234

and assessment, 230–231

Collaboration, recommendation for multidisciplinary, 11

Committee on Learning Research and Educational Practice, 294

Committee on the Foundations of Assessment, 1, 17, 291

Comparable validity, 214

Competencies. See Student competencies

Complex problem solving, 265–269

Complex solution strategies, analysis of, 270

Comprehensiveness, of balanced assessment system, 253–255

Computational modeling and simulation, 99

Computer as Learning Partner (CLP), 278–279

Computer programs, debugging, 95–96

Concept organization, 265

Conceptual frameworks, 271

Conceptual scheme, to guide thinking and discussion, 11

Concurrent verbal protocols, 99

Conditional independence, 114

Conditional inference

methods of, 218

versus unconditional, 215–216

Conditional probabilities, 159

Connecticut Common Core of Learning Assessment Project, 210


Connectionist networks, model of long-term memory, 67–68

ConQuest software, 134

Construct-centered approach, to assessment design, 194

Construct variables, 112

Contemporary assessment. See Modern assessment practices

Content-process space, 208

Continuity, of balanced assessment system, 256–257

Continuous latent variables, modeling of change in, 133–134

Control-of-variables strategy, assessment of, 216–217

Coordinated systems, of multiple assessments, 252

CoVis (Learning Through Collaborative Visualization) project, 279

Criterion-referenced testing, 214

Crystallized intelligence, 66

CTT. See Classical test theory

Cultural norms, impact on expertise, 90

Curriculum

assessment linked with, 51–53

developers of, creating tools to facilitate modern assessment practices, 13, 306–307

formative assessment in, 228–229

Curriculum and Evaluation Standards for School Mathematics, 209

Curriculum-embedded assessment, 13, 243

D

Debugging of computer programs, 95–96

Dental Interactive Simulation Corporation (DISC), 266–267, 271

Department of Education, 299

Design. See Assessment design

Designing and Conducting Investigations, sample performance map for, 125

Developers of assessment instruments. See also Large-scale assessment

focusing on cognition, observation, and interpretation, 13, 305–306

Developers of educational curricula, creating tools to facilitate modern assessment practices, 13, 306–307

Developmental Assessment program (Australia), 136–137, 190–192, 250

Developmental continua. See Progress maps

DIAGNOSER, 96, 203, 247–248, 273

Diagnostic arithmetic test, KeyMath, 139

Diagnostic indices

IEY diagnostic results, 146

incorporated into measurement models, 137, 142–147

KIDMAP, 144–145

DIF. See Differential item functioning

Differential item functioning (DIF), 148, 215

Differential perspective, on knowing and learning, 60–61

DISC. See Dental Interactive Simulation Corporation

Discrete latent variables, modeling of change in, 134

Discussion.

See also Multidisciplinary discourse communities

conceptual scheme and language to guide, 11

Distributed learning, 285

Domain-general knowledge, and problem-solving processes, 69–70

Dyslexia, neural bases of, 108–109

E

Economic change, and changing expectations for learning, 22–23

Educational assessment

defined, 20

opportunities for advancing, 9–10, 260–288

providing instruction in for teachers, 14, 309–310

purposes and contexts of use, 222–225

Educational curricula, developers of, creating tools to facilitate modern assessment practices, 13, 306–307

Educational decision making, studying how new forms of assessment affect, 12, 301–303

Educational reform

assessment as an instrument of, 24–25

facilitating, 34

Educational Testing Service, 271

Educators. See Teacher education

Environments.

See also Learning environments

enriched, and brain development, 105–107


Equity issues, in linkage of assessments, 286–287

Estimates, assessment results as, 42

Ethnographic analysis, 101

Evaluation. See Assessment

Evidence, reasoning from, 2–3, 36–54, 112–117

Exemplars, 116

Expanding views of knowing and learning, 59–65

the behaviorist perspective, 61–62

the cognitive perspective, 62–63

the differential perspective, 60–61

points of convergence among, 64–65

the situative perspective, 63–64

Expectations for learning

changing, 21–25, 33

in large-scale assessment, 249–250

Expertise, 79–92.

See also Novices and experts

impact of cultural norms and student beliefs, 90

implications for assessment, 90–92

multiple paths of learning, 81–83

practice and feedback, 84–87

predisposition to learn, 80

role of prior knowledge, 83–84

role of social context, 88–89

transfer of knowledge, 87–88

Eye-movement tracking, 96

F

Facet clusters, 187

Facets-based instruction and learning, 186–189, 202–206, 234, 273

separating medium effects from gravitational effects, 188–189

Facets DIAGNOSER, 96, 247–248

Facets of measurement, 121

Facets of student thinking, 187

Fairness in assessment, 7, 39, 214–218, 240–241

conditional versus unconditional inferences, 215–216

Federal agencies, establishing multidisciplinary discourse communities, 12–13, 304–305

Feedback in assessment, 4, 84–87

and expertise, 84–87

large-scale, 249–250

quality of, 234–236

Fluid intelligence, 66

fMRI. See Functional magnetic resonance imaging

Formal measurement models

as a form of reasoning from evidence, 112–117

reasoning principles and, 113–115

Formative assessment. See Classroom formative assessment

Foundations, defined, 20

Fraction items, skill requirements for, 158

Functional magnetic resonance imaging (fMRI), 68, 108

Funding, needed for research into improved assessment design, 11–12, 299–301

Future of assessment, 292–294

bridging research and practice, 294–297

G

Generalizability theory (G-theory), 121–122

multivariate, 128

with raters and item type, 122

GenScope™, 276, 278, 286–287

Goals, using large-scale assessment to signal, 248–249

GradeMap software, 119, 143

Growth, models of, 128–134

Guessing probability, 153

H

Hemispheric specialization, brain research into, 104–105

Hierarchical factor analysis, 149

Hierarchical linear modeling (HLM), 133

Hierarchical measurement models, 148–152

combining classes and continua, 151–152

High Stakes: Testing for Tracking, Promotion, and Graduation, 39, 253

High-stakes tests, and changing expectations for learning, 23–25

HLM. See Hierarchical linear modeling

How Far Does Light Go? project, 280–281

How People Learn, 59

HYDRIVE intelligent tutoring system, 164–165

inference networks for, 169–172


I

IEY. See Issues, Evidence and You curriculum

IEY curriculum, target performance map for, 146

IMMEX (Interactive Multimedia Exercises) program, 270, 273–274

Inclined-plane schemas, novices’ and experts’ network representations of, 78–79

Individual achievement, using assessment to determine, 38–39

Inference

conditional versus unconditional, 215–216

methods of, 97–102, 218

targets of, 45

Inference networks

for the HYDRIVE student model, 170

structure of, 160

Information. See Publicity

Information technology, 22

opportunities for advancing educational assessment, 9–10, 260–288

Instruction

assessment linked with, 51–53

formative assessment in, 228–229

integrating models of cognition and learning with, 92–97

in learning assessment, providing for teachers, 14, 309–310

Instructional guidance, accountability versus, 223–224

Intelligent tutoring systems, 68, 231–233

application of Bayes nets in, 169–172

cognitive models of learning and, 93

effects on mathematics learning, 232–233

Interpretation

analyzing in existing assessments, 12, 303–304

developers of assessment instruments focusing on, 13, 305–306

as part of the assessment triangle, 48–49

IRM. See Item response modeling

Issues, Evidence and You (IEY) curriculum, progress variables from, 116

Item generation, theory-based, 263–265

Item parameters, 113, 119, 154

Item response modeling (IRM), 123–126, 134

multidimensional, 128–129

with raters and item type, 126

J

James S. McDonnell Foundation, 304

K

KIDMAP, 142, 144–145

Knowing, expanding views of the nature of, 59–65

Knowledge

domain-general, and problem-solving processes, 69–70

role of prior, 83–84

synthesis of existing, 299

transfer of, 87–88

Knowledge base

expanding, 299–303

initial steps for building, 303–305

Knowledge-in-pieces perspective, versus theoretical perspective, 203–206

Knowledge Integration Environment (KIE), 278–281

Knowledge organization

expert-novice differences in, 72–77

and schemas, 70–71

Knowledge tracing, 186

L

Language, to guide thinking and discussion, 11

Large-scale assessment, 8–9, 241–251

and advances in cognition and measurement, 241–242

alternative approaches to, 242–248

AP Studio Art, 246–247

balancing with classroom assessment, 252–253

feedback and expectations for learning, 249–250

increasing spending on, 24

Maryland State Performance System, 245–246

National Assessment of Educational Progress, 244–245

New Standards Project, 250–251

sampling wider range of student competencies, 13, 307–308

using to signal worthy goals, 248–249


Large-scale contexts, making new forms of assessment practical in, 12, 301–303

Latent class models, 126–127

ordered, 127

Latent semantic analysis (LSA), 269, 274

Latent variables

continuous, modeling of change in, 133–134

discrete, modeling of change in, 134

stage-sequential dynamic, 135

LCAG software, 134

Learner’s role, in classroom assessment, 236–240

Learning.

See also Expectations for learning; Student competencies

advances in the science of, 3–5, 58–109

changing expectations for, 21–25

cognitive model of, 46–47

distinguished from development, 80

expanding views of the nature of, 59–65

impact of prior theories of, 25–30

impact of reflective inquiry on, 238–239

importance of a model of, 6–7, 178–192, 229–230

linkage of assessments for, 283–287

models of, integrating with instruction and assessment, 92–97

multiple paths of, 81–83

predisposition to, 80

principles for structuring, 87

problem-based, 276

publicizing the importance of assessment in improving, 14, 312–313

social context of, 89

using assessment to assist, 7–8, 37–38

Learning environments, technology-enhanced, 274–283

Learning Through Collaborative Visualization (CoVis) project, 279

Limitations of current assessment, 26–29

making policy makers aware of, 14, 310–312

Linear models, families of, 131–132

Link tests, 116

Linkage of assessments

for classroom learning and accountability, 283–287

equity issues in, 286–287

policy issues in, 285–286

pragmatic issues in, 286

privacy issues in, 287

LISP tutor, 164

Logistic regression, 148

LOGO language, 95

Long-term memory, 2, 67–68

production systems model of, 67–68

LSA. See Latent semantic analysis

M

“Magic Number Seven” argument, 66

Mandates, increasing emphasis on classroom formative assessment, 14, 310–312

Maryland State Performance Assessment Program (MSPAP), 245–246

MashpeeQuest performance task, 267–268

Mathematics

effects of an intelligent tutoring system on learning, 232–233

student beliefs about the nature of, 91

Mathematics Test Creation Assistant, 263–264

Matrix sampling, 13, 248

Meaningful units, as encoded by chess experts, 74–75

Measurement models, 5–6, 112

adding cognitive structure to, 147–152

addition of new parameters, 148

formal, 112–117

hierarchization of, 148–152

incorporation of cognitive elements in existing, 134–147

standard, 117–127

Measurement science

contributions to assessment, 5–6, 110–172

facets of, 121

impact of prior theories of, 25–30

making advances from available to educators, 11, 299

Media, publicizing the importance of assessment in improving learning, 14, 312–313

Mediated activity, 63

Memory

contents of, 69–71

domain-general knowledge and problem-solving processes, 69–70

long-term, 67–68

schemas and the organization of knowledge, 70–71

working, 65–67

Metacognitive skills, importance of, 4, 78–79, 281


Microgenetic analysis, 100–101

Middle School Mathematics Through Applications Project, 187–189

Mixed-number subtraction, Bayes net analysis of, 156–164

Mixture model approach, 151

Model tracing, 186

“Model tracing,” 167

Modeling

computational, 99

psychometric, of cognitive structures, 152–165

statistical, 5–6, 110–172

of strategy changes, 165–168

Models of change and growth, 128–134

modeling of change in continuous latent variables, 133–134

modeling of change in discrete latent variables, 134

true-score modeling of change, 130–133

Models of cognition and learning, 185–192

Australia’s Developmental Assessment program, 190–192

Facets-based instruction and learning, 186–189, 202–206

Middle School Mathematics Through Applications Project, 187–189

PAT algebra tutor, 185–186

Modern assessment practices, 30–32

developers of educational curricula creating tools to facilitate, 13, 306–307

publicizing, 14, 312–313

Monotonic development, cumulative, in a stage-sequential dynamic latent variable, 135

MOOSE Crossing, 279

“Mozart effect,” 105–107

M2RMCL, unified model and, 152–154

Multiattribute models, 128

Multidimensional item response model, 129

Multidisciplinary discourse communities, establishing, 11–13, 304–305

Multiple assessments

coordinated systems of, 252

developing new systems of, 14, 310–312

Multiple-choice questions, to test for theoretical versus knowledge-in-pieces perspective, 204–205

Multiple paths, of learning, 81–83

Multivariate G-theory, 128

N

NAEP. See National Assessment of Educational Progress

National Academy of Education, 298

National Assessment of Educational Progress (NAEP), 40, 90, 124, 200, 224, 244–245

National Board of Medical Examiners, 270

National Council of Teachers of Mathematics, 23, 275

National Education Research Policies and Priorities Board, 298

National Institute of Child Health and Human Development, 299

National Institute of Mental Health, 108

National Institute on Aging, 108

National Research Council (NRC), 17, 23, 39, 59

Committee on Learning Research and Educational Practice, 294

National Science Foundation (NSF), 1, 17, 291, 299, 304

Neural bases of dyslexia, 108–109

New Standards Project, 23, 250–251

Newell-Dennett framework, 153

Newtonian perspective, 203

Norm-referenced results, 213

Novices and experts

differences in, 4, 72–77

network representations of inclined-plane schemas, 78–79

NRC. See National Research Council

NSF. See National Science Foundation

Number Knowledge Test, 196–199

O

Object models, 271

Observation

analysis of protocols for, 99–100

analyzing in existing assessments, 12, 303–304

and computational modeling and simulation, 99

developers of assessment instruments focusing on, 13, 305–306

ethnographic analysis, 101

implications for assessment, 101–102

methods of, 97–102

microgenetic analysis, 100–101

as part of the assessment triangle, 47–48


of problem-solving rules in children, methods for, 49

and reaction-time studies, 98–99

of student performance, interpreting, 50

Observation-interpretation linkage, 51

analysis of complex solution strategies, 270

enhancing, 269–270

text analysis and scoring, 269

Observation models, 112

Office of Educational Research and Improvement, National Education Research Policies and Priorities Board, 298

OLEA tutor, 164

P

PANMARK software, 134

Parallel distributed processing (PDP) systems, model of long-term memory, 68

Pasteur’s Quadrant, 298

PAT algebra tutor, 185–186, 232

PDP. See Parallel distributed processing systems

Performance maps

for Designing and Conducting Investigations, 125

for the IEY curriculum, 146

PET. See Positron emission tomography

Physics examination, components of A-level, 254

Physics problems, sorting of, 76

Piagetian stages, of child development, 151

“Plan recognition,” 167

Policy issues, in linkage of assessments, 285–286

Policy makers

recognizing limitations of current assessments, 14, 310–312

recommendations for, 305–312

Positron emission tomography (PET), 68, 108

Power law of practice, 85

Practice, in expertise, 84–87

Pragmatic issues, in linkage of assessments, 286

Precision and imprecision, in assessment, 42

Predictive tests, 18

Predisposition to learn, 80

Prior knowledge, role in expertise, 83–84

Privacy issues, in linkage of assessments, 287

Private-sector organizations, establishing multidisciplinary discourse communities, 12–13, 304–305

Probabilities

base rate, 161

updated, 162

Problem-based learning, 276

Problem solving

assessing in children, 46–47

complex, 265–269

domain-general knowledge, 69–70

methods for observing in children, 49

weak methods versus strong, 69–70

Production systems, 99

model of long-term memory, 67–68

Professional development programs, providing instruction in learning assessment, 14, 309–310

Profile strands. See Progress maps

Progress maps, 117, 137, 190

BEAR assessment system, 119

cognitive elements in existing measurement models, 136–142

for counting and ordering, 191–193

KeyMath diagnostic arithmetic test, 139

of national writing achievement, 140–142

reporting individual achievement in spelling, 138

Progress variables, 115–117

from the Issues, Evidence and You curriculum, 116

Progressions of developing competence. See Progress maps

Protocols

analyzing, 207

concurrent verbal, 99

for observation and inference, 99–100

Psychological Tests and Personnel Decisions, 222

Psychometric modeling of cognitive structures, 152–165

Bayes nets, 154–165

unified model and M2RMCL, 152–154

Public opinion, recommendations regarding, 312–313

Publicity, on the importance of assessment in improving learning, 14, 312–313

Purposes of assessment, 37–42

to assist learning, 37–38

to determine individual achievement, 38–39


to evaluate programs, 40

multiple, 225

reflecting on, 40–42

Q

QUASAR Cognitive Assessment Instrument, 209, 211–212

R

Rasch models, 124

Raven’s Progressive Matrix Test, 66

Reaction-time studies, 98–99

Reasoning from evidence, 2–3, 36–54

assessment as a process of, 42–43

formal measurement models as, 112–117

Reasoning principles, and formal measurement models, 113–115

Recommendations, 10–14, 297–313

for policy and practice, 13–14, 305–313

for research, 11–13, 297–305

Recursive representations, 159

Reference Exam, 251

Reflective inquiry, impact on learning, 238–239

Reform. See Educational reform

Reliability, 39, 120

Research, for improved assessment design, need to fund, 11–12, 299–301

Resources, increasing emphasis on classroom formative assessment, 14, 308, 310–312

Revising tasks, 212–213

Role of prior knowledge, on expertise, 83–84

Rule assessment method, 48

Rule-space representation, 143

S

Schemas, and the organization of knowledge, 70–71

Science and mathematics, disparities in access to quality instruction in, 18

Science standards, 23–24

Scientific foundations, of assessment, 55–172

Scientists in Action video series, 275

Scoring guides, 116

SEM. See Structural equation modeling

Short-term memory. See Working memory

Simulation, computational, 99

Situative perspective, on knowing and learning, 63–64

Skill acquisition curves, 86

Skill requirements, for fraction items, 158

“Slip probability,” 153

SMART (Scientific and Mathematical Arenas for Refined Thinking) Model, 275–277

web-based resources for, 277

Social context, role in expertise, 88–89

Societal change, and changing expectations for learning, 22–23

Solution strategies, analysis of complex, 270

Sorting, of physics problems, 76

Space-splitting, 171–172

Spatial navigation, 105

Spelling, progress maps reporting individual achievement in, 138

SRI International, 267

Standard psychometric models, 117–128

classical test theory, 120–121

generalizability theory, 121–122

item response modeling, 123–126

latent class models, 126–127

multiattribute models, 128

Standards-based reform, 33

Standards for Educational and Psychological Testing, 177

Standards for learning, rising, and changing expectations, 23–25

Statistical modeling, contributions to assessment, 5–6, 110–172

Stones River Mystery, 275–276

Strategy changes, modeling of, 165–168

Strong methods, of problem solving, 69–70

Structural equation modeling (SEM), 133

Student beliefs

about the nature of mathematics, 91

impact on learning, 90

Student competencies

large-scale assessments sampling wider range of, 13, 307–308

rethinking ways to assess, 27–28

Student learning

accountability versus instructional guidance for, 223–224

studying how new forms of assessment affect, 12, 301–303

Student performance

evaluation of, 197, 200


interpreting observations of, 50

Subject-matter expertise, 72–79

expert-novice differences in knowledge organization, 72–77

implications for assessment, 79

importance of metacognition, 78–79

Subprocedure profile

base rate probabilities of, 161

updated probabilities of, 163

Subtraction, of mixed numbers, 156–164

Subtraction bugs, 183

using sets of items to diagnose, 201

Summative assessment, 38

Systems of multiple assessments, developing new, 14, 253–257, 310–312

T

Target performance map, for the IEY curriculum, 146

Targets of inference, 45

Task-centered approach, to assessment design, 194

Task design, guided by cognitive and measurement principles, 193–196

Task validation, 7, 206–213

approaches to, 207–209

QUASAR Cognitive Assessment Instrument, 209, 211–212

Teacher education, providing instruction in learning assessment, 11, 14, 299, 309–310

Teacher practice, studying how new forms of assessment affect, 12, 301–303

Teacher’s role, in classroom assessment, 234

Technological change.

See also Information technology

and changing expectations for learning, 22–23

Technology-enhanced learning environments, 274–283

assessment issues and challenges for, 279–283

CoVis (Learning Through Collaborative Visualization) project, 279

GenScope™, 276, 278, 286–287

Knowledge Integration Environment, 278–279

MOOSE Crossing, 279

SMART Model, 275–277

Test Creation Assistant, 263–264

Testing. See Assessment

Testing in American Schools, 25

Text analysis and scoring, 269

Theoretical perspectives, versus knowledge-in-pieces, 204–205

Theory-based item generation, 263–265

ThinkerTools Inquiry Project, 96, 237–240, 265, 273

Thinking

advances in the science of, 3–5, 58–109

conceptual scheme and language to guide, 11

Time-structured data, 133

Tools to facilitate modern assessment practices, 263–271.

See also individual programs and software packages

cognition-observation linkage, 51, 263–269

developers of educational curricula creating, 13, 306–307

observation-interpretation linkage, 51, 269–270

recommendations regarding, 305–308

supporting, 271

Trade-offs in assessment design

accountability versus instructional guidance for individual students, 223–224

inevitability of, 222–223

Transfer of knowledge, 87–88

Transforming classroom assessment, 226–228

Triangle. See Assessment triangle

True-score modeling of change, 120, 130–133

Tutoring. See Intelligent tutoring

U

Understanding. See Learning; Student competencies

Unidimensional-continuous constructs, 120

Unified model, 153

and M2RMCL, 152–154

Updated probabilities, 162

of subprocedure profile, 163


V

“V-known” option, 133

Validation. See Task validation

Validity, 39

Variables.

See also Control-of-variables strategy

construct, 112

latent, 133–134

progress, 115–117

Voluntary National Test, 208

W

Weak methods, of problem solving, 69–70

Whole number system, children coming to understand, 180–181

Working memory, 65–67

Worthy goals, using large-scale assessment to signal, 248–249

Writing achievement, progress maps of national, 140–142

