How People Learn: Brain, Mind, Experience, and School
well (see, e.g., Collins, 1990; Collins and Brown, 1988; Collins et al., 1989). By prompting learners to articulate the steps taken during their thinking processes, the software creates a record of thought that learners can use to reflect on their work and teachers can use to assess student progress. Several projects expressly include software designed to make learners’ thinking visible. In CSILE, for example, as students develop their communal hypermedia database with text and graphics, teachers can use the database as a record of students’ thoughts and electronic conversations over time. Teachers can browse the database to review both their students’ emerging understanding of key concepts and their interaction skills (Means and Olson, 1995b).
The CoVis Project developed a networked hypermedia database, the collaboratory notebook, for a similar purpose. The collaboratory notebook is divided into electronic workspaces, called notebooks, that can be used by students working together on a specific investigation (Edelson et al., 1995). The notebook provides options for making different kinds of pages—questions, conjectures, evidence for, evidence against, plans, steps in plans, information, and commentary. Using the hypermedia system, students can pose a question, then link it to competing conjectures about that question posed by different students (perhaps from different sites) and to a plan for investigating the question. Images and documents can be electronically “attached” to pages. Using the notebook shortened the time between students’ preparation of their laboratory notes and their receipt of feedback from their teachers (Edelson et al., 1995). Similar functions are provided by SpeakEasy, a software tool used to structure and support dialogues among engineering students and their instructors (Hoadley and Bell, 1996).
Sophisticated tutoring environments that pose problems and give students feedback on the basis of how experts reason and organize their knowledge are now available in physics, chemistry, algebra, computer programming, history, and economics (see Chapter 2). With this increased understanding has come an interest in testing theories of expert reasoning by translating them into computer programs and in using computer-based expert systems as part of a larger program to teach novices. Combining an expert model with a student model—the system’s representation of the student’s level of knowledge—and a pedagogical model that drives the system has produced intelligent tutoring systems, which seek to combine the advantages of customized one-on-one tutoring with insights from cognitive research about expert performance, learning processes, and naive reasoning (Lesgold et al., 1990; Merrill et al., 1992).
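The three-part architecture described above—an expert model, a student model, and a pedagogical model working together—can be illustrated with a minimal sketch. All class names, the toy linear-equation domain, and the skill-update rule below are illustrative assumptions, not features of any actual tutoring system cited in this chapter:

```python
# Hypothetical sketch of the intelligent-tutoring-system architecture:
# an expert model (how an expert solves the problem), a student model
# (the system's estimate of the student's knowledge), and a pedagogical
# model (how the system chooses its response).

class ExpertModel:
    """Encodes how an expert solves problems in a toy domain:
    linear equations of the form a*x + b = c."""
    def solve(self, problem):
        a, b, c = problem
        return (c - b) / a

class StudentModel:
    """Tracks the system's running estimate of student mastery."""
    def __init__(self):
        self.skill_estimate = 0.0  # 0.0 = novice, 1.0 = mastery

    def update(self, was_correct):
        # Simple exponential moving average of correctness (assumed rule).
        target = 1.0 if was_correct else 0.0
        self.skill_estimate += 0.3 * (target - self.skill_estimate)

class PedagogicalModel:
    """Chooses feedback based on correctness and estimated skill."""
    def feedback(self, was_correct, skill_estimate):
        if was_correct:
            return "Correct."
        if skill_estimate < 0.5:
            return "Hint: isolate x by subtracting b, then dividing by a."
        return "Not quite -- check your arithmetic and try again."

class IntelligentTutor:
    """Combines the three models, as in the text's description."""
    def __init__(self):
        self.expert = ExpertModel()
        self.student = StudentModel()
        self.pedagogy = PedagogicalModel()

    def respond(self, problem, student_answer):
        correct = abs(self.expert.solve(problem) - student_answer) < 1e-9
        self.student.update(correct)
        return self.pedagogy.feedback(correct, self.student.skill_estimate)
```

For example, for the problem 2x + 3 = 7, a correct answer of 2.0 draws confirmation, while a novice's wrong answer draws a hint rather than bare correction—the pedagogical model, not the expert model alone, shapes the response.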
A variety of computer-based cognitive tutors have been developed for algebra, geometry, and LISP programming (Anderson et al., 1995). These cognitive tutors have resulted in a complex profile of achievement gains for the students, depending on the nature of the tutor and the way it is inte-