Executive Summary

In a broad sense, technology is any modification of the natural world made to fulfill human needs or desires. Although people tend to focus on the most recent technological inventions, such as computers, cell phones, and the Internet, technology also includes automobiles, frozen food, irrigation systems, manufacturing robots, and a myriad of other devices and systems that profoundly affect everyone in a modern society.

Because of the pervasiveness of technology, an understanding of what technology is, how it works, how it is created, how it shapes society, and how society influences technological development is critical to informed citizenship. Technological choices influence our health and economic well-being, the types of jobs and recreation available, even our means of self-expression. How well citizens are prepared to make those choices depends in large part on their level of technological literacy.

The National Science Foundation (NSF) has been involved in raising public awareness of the need for an understanding of technology since the 1980s (Bloch, 1986). More recently, the American Association for the Advancement of Science (AAAS, 1990), the International Technology Education Association (ITEA, 1996), and other organizations have also called for Americans to become more savvy about technology. A case for technological literacy has been spelled out in Technically Speaking: Why All Americans Need to Know More About Technology (NAE and NRC, 2002) and in detailed requirements for the development of understanding and capabilities related to technology among K–12 students (ITEA, 2000).

No one really knows the level of technological literacy among people in this country—or for that matter, in other countries. Although








many concerns have been raised that Americans are not as technologically literate as they should be (e.g., Rutherford, 2004), these statements are based on general impressions, with little hard data to back them up. Therefore, the starting point for improving technological literacy must be to determine the current level of technological understanding and capability, which areas require improvement first, and how technological literacy varies among different populations—children and adults, for instance.

The goal of the Committee on Assessing Technological Literacy was “to determine the most viable approach or approaches for assessing technological literacy in three distinct populations in the United States: K–12 students, K–12 teachers, and out-of-school adults.”1 The committee was not asked to develop assessment tools but to point the way toward their development.

Assessing Technological Literacy

To assess technological literacy, one must have not only a clear idea of what it is, but also a good deal of knowledge about assessment. Basically, technological literacy is an understanding of technology at a level that enables effective functioning in a modern technological society. For the purposes of this report, the committee defined technological literacy as having three major components, or dimensions: knowledge, capabilities, and critical thinking and decision making (Figure ES-1). A similar three-part model of literacy has been proposed for information technology (IT) (NRC, 1999).

The “knowledge dimension” of technological literacy includes both factual knowledge and conceptual understanding. The “capabilities dimension” relates to how well a person can use technology (defined in its broadest sense) and carry out a design process to solve a problem.
A technologically literate person should, for example, be able to use an automobile, a VCR, a microwave, a computer, and other technologies commonly found in the home or office and should be able to do basic troubleshooting when necessary.

1 The original charge, which included K–16 students and teachers, was modified because the committee was unable to identify opportunities for assessing college students and faculty (with the exception of pre-service teachers).

The final dimension—the “critical thinking and decision-making dimension”—has to do with one’s approach to technological issues. For example, when a person with highly developed critical-thinking and decision-making skills is confronted with a new

technology, he or she asks questions about risks and benefits and can participate in discussions and debates about the uses of that technology.

FIGURE ES-1 The three dimensions of technological literacy. Source: Adapted from NAE and NRC, 2002.

The committee does not consider attitude to be a cognitive dimension in the same way knowledge, capability, and critical thinking and decision making are. However, a person’s attitude toward technology can provide a context for interpreting the results of an assessment. In other words, what a person knows—or does not know—about a subject can sometimes be correlated with his or her attitude toward that subject.

Although few assessments have been developed for technological literacy, many good assessment tools have been developed for other subjects, from reading and writing to science and mathematics. Indeed, the field of assessment is mature in many other domains.

Benefits

Of the many groups that would benefit from the development of assessments of technological literacy, the most obvious is the formal-education community. As more and more states move toward adopting technology-education standards for K–12 students (Meade and Dugger, 2004), schools will have to measure how well they are implementing those standards. Assessments will provide a gauge of how effectively schools promote technological literacy and an indication of where improvements can be made.
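The observation above—that what a person knows about a subject can sometimes be correlated with his or her attitude toward it—is typically checked in survey practice with a statistic such as Pearson’s r. The sketch below is illustrative only: the paired scores, the 0–100 knowledge scale, and the 1–5 attitude scale are invented for the example, not data or scales from any instrument discussed in this report.

```python
from math import sqrt

# Hypothetical paired scores for illustration only: each respondent's
# knowledge score (0-100) and mean attitude rating (1-5 Likert scale).
knowledge = [72, 85, 60, 90, 55, 78, 66, 82]
attitude = [3.8, 4.2, 3.1, 4.5, 2.9, 4.0, 3.4, 4.1]

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(knowledge, attitude)
print(f"knowledge-attitude correlation r = {r:.2f}")
```

A correlation computed this way only contextualizes assessment results; as the report notes, attitude is not itself one of the cognitive dimensions being assessed.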

For K–12 students to become technologically literate, their teachers must also become technologically literate. To this end, colleges of education will need assessment tools to gauge the level of technological literacy of teachers-in-training. Even teachers of nontechnical subjects must be technologically literate to make connections between their subject areas and technology. Many other institutions and organizations—such as media outlets, museums, government agencies, and associations that represent industries—would benefit from knowing the level of technological literacy of their customers, patrons, or target audiences.

Levels and types of technological literacy are bound to differ among people from different social, cultural, educational, and work backgrounds. To the extent that these differences put particular people or groups at a disadvantage (e.g., related to educational or employment opportunities), technological literacy can be considered a social-justice issue. Assessment can identify these differences, thus creating opportunities for lessening them.

However, to make a case for raising the level of technological literacy, one must first be able to show that the present level is low, which is difficult to do without a good measure of technological literacy. Until technological literacy is assessed in a rigorous, systematic way, it is not likely to be considered a priority by policy makers, educators, or average citizens.

Existing Assessment Instruments

As a context for discussion, the committee collected examples of assessments that can measure an aspect of technological literacy, even if they were not developed for that purpose. Altogether, the committee identified 28 such instruments, including several developed outside the United States.
About two-thirds target K–12 students, nearly one-third focus on out-of-school adults, and two are intended for teachers. Most existing assessments for out-of-school adults tend to focus on awareness, attitudes, and opinions, rather than on knowledge or capabilities.

The committee concluded that none of these instruments is completely adequate to the task of assessing technological literacy, because none of them fully covers the three dimensions spelled out in Technically Speaking. Most of them emphasize the knowledge dimension, although a number include items that explore technological capabilities, and a handful even focus solely on the capability dimension. But very few include the

critical-thinking and decision-making dimension. Assessing technology-related capability, which includes the ability to use a design process, is more difficult than gauging knowledge, and only a few methods have been tried for assessing it, partly because this tends to be very expensive, at least for large-scale application. Nevertheless, assessing the capability dimension is crucial. Only a few instruments encourage higher-order thinking (critical thinking and decision making), although a goal of all types of learning is to encourage thinking that considers uncertainty and requires nuanced judgment, rather than just factual recall.

Developing a Conceptual Framework

One step common to the design of assessments is the development of a framework that describes the cognitive and content components of the proposed assessment. The framework often suggests the relative emphasis on each area of content, depending on the age of the test population and other factors. The conceptual underpinnings of the framework can be represented visually as a two-dimensional matrix, which serves as a blueprint for the more detailed phases of assessment design, the development of test specifications, and, ultimately, the development of test items.

The committee developed a sample assessment matrix (Figure ES-2) modeled after conceptual frameworks developed for the National Assessment of Educational Progress (NAEP) for closely related subjects (e.g., science and mathematics) (NAGB, 2002, 2004).

FIGURE ES-2 Proposed assessment matrix for technological literacy.

With one modification, the matrix includes the three dimensions of technological literacy described in Technically Speaking—knowledge, capabilities, and

ways of thinking and acting (renamed “critical thinking and decision making”)—as the cognitive levels, that is, the three column heads. For each cognitive level, there are four content areas, the row heads: technology and society; design; products and systems; and characteristics, core concepts, and connections. The proposed matrix is intended to be a starting point for designers of assessment frameworks for technological literacy. The committee recognizes that a number of other arrangements of content are possible.

General Principles

After reviewing existing assessment instruments and the literature on assessment, cognition, and technological literacy; consulting with a variety of stakeholders; and drawing upon the expertise of committee members, the committee developed the following general principles to guide the development of assessments of technological literacy for students, teachers, and out-of-school adults:

- Assessments should be designed with a clear purpose in mind.
- Assessment developers should take into account research findings related to how children and adults learn, including how they learn about technology.
- The content of an assessment should be based on rigorously developed learning standards.
- Assessments should provide information about all three dimensions of technological literacy—knowledge, capabilities, and critical thinking and decision making.
- Assessments should not reflect gender, culture, or socioeconomic bias.
- Assessments should be accessible to people with mental or physical disabilities.

Findings and Recommendations

In addition to these general principles, the committee developed findings and 12 related recommendations that address five critical areas (Table ES-1): instrument development; research on learning; computer-based assessment methods; framework development; and public perceptions of technology.
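The assessment matrix described above can be read as a test blueprint: each cell pairs a cognitive dimension with a content area and carries some share of the test items. The sketch below writes out the twelve cells in code; the uniform item weights are invented placeholders, not allocations proposed by the committee.

```python
# The twelve cells of the proposed assessment matrix (Figure ES-2):
# three cognitive dimensions crossed with four content areas.
COGNITIVE_DIMENSIONS = [
    "knowledge",
    "capabilities",
    "critical thinking and decision making",
]
CONTENT_AREAS = [
    "technology and society",
    "design",
    "products and systems",
    "characteristics, core concepts, and connections",
]

# Placeholder weights: a real framework would tune the share of items
# per cell for the age of the test population and other factors.
blueprint = {
    (dim, area): 1 / (len(COGNITIVE_DIMENSIONS) * len(CONTENT_AREAS))
    for dim in COGNITIVE_DIMENSIONS
    for area in CONTENT_AREAS
}

# Weights across the whole matrix should account for every test item.
assert abs(sum(blueprint.values()) - 1.0) < 1e-9
print(f"{len(blueprint)} cells, e.g. {next(iter(blueprint))}")
```

Framework designers would then attach test specifications and, ultimately, individual items to each cell, which is how the matrix "serves as a blueprint" in the sense used above.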

TABLE ES-1 Recommendations by Category and Target Population
(recommendation numbers given in the order K–12 students / teachers / out-of-school adults)

- Opportunities for assessment, integrating items into existing instruments: 1, 2 / 4 / 6, 9
- Opportunities for assessment, developing new instruments: 3 / 5 / 6
- Leveraging research on learning: 7, 8 / 8 / 9
- Exploiting innovative measurement techniques: 10 / 10 / 10
- Developing frameworks: 11 / 11 / 11
- Broadening the definition of technology: 12 / 12 / 12

The committee’s overarching finding, based on the review of assessment instruments described above and the results of a committee-sponsored workshop, is that assessment of technological literacy in the United States is in its infancy. This is not surprising given that most students do not take (or have access to) courses in technology, the number of teachers involved in teaching about technology is relatively small, and little effort has been made to determine the nature and extent of adult knowledge of, or attitudes toward, technology.

On a more positive note, the committee finds no reason why valid, reliable assessments cannot be developed that address one or more of the cognitive dimensions and all of the content domains of technological literacy. Items related to critical thinking and decision making may be the most challenging for assessment developers, and items intended to measure design-related capability pose special challenges related to time and resource constraints. But both types of items can and should be developed.

Opportunities for Assessment

There are two significant opportunities for expanding and improving the assessment of technological literacy in all three populations of interest. The first is to integrate technology-related items into existing instruments focused on related topics; the second is to create new assessments specifically for the measurement of technological literacy.
These strategies are not mutually exclusive, and the committee believes that they should be pursued simultaneously.

K–12 Students

Technology-related items might be added to a handful of national and international assessments for K–12 students. These assessments are designed to measure levels of knowledge, capability, and reasoning related to mathematics, science, and history.

Recommendation 1. The National Assessment Governing Board, which oversees the National Assessment of Educational Progress (NAEP), should authorize special studies of the assessment of technological literacy as part of the 2009 NAEP mathematics and science assessments and the 2010 NAEP U.S. history assessment. The studies should explore the content connections between technology, science, mathematics, and U.S. history to determine the feasibility of adding technology-related items to future NAEP assessments in these subjects.

Recommendation 2. The U.S. Department of Education and National Science Foundation should send a recommendation to the International Association for the Evaluation of Educational Achievement and the Trends in International Mathematics and Science Study (TIMSS) governing board encouraging them to include technological literacy items in TIMSS assessments as a context for assessments of science and mathematics. The U.S. Department of Education and National Science Foundation should send a recommendation to the Organization for Economic Cooperation and Development and the governing board for the Programme for International Student Assessment (PISA) supporting the inclusion of technological literacy items as a cross-curricular competency.

The second area of opportunity for the K–12 population, the creation of new instruments for assessing technological literacy, would break new ground. The challenges to this ambitious approach would be great, but so would the potential benefits, especially the realization of a comprehensive picture of what young people know and can do in relation to technology.

Recommendation 3.
The National Science Foundation should fund a number of sample-based studies of technological literacy in K–12 students. The studies should have different assessment designs and should assess different population subsets, based on geography,

population density, socioeconomic status, and other factors. Decisions about the content of test items, the distribution of items among the three dimensions of technological literacy, and performance levels should be based on a detailed assessment framework.

K–12 Teachers

Although many students have sophisticated technological capabilities, they cannot be expected to be fully technologically literate unless their teachers are. Technology is integral to all educational disciplines, from history to art to science, and teachers should be able to discuss technology-related issues in one form or another. However, very little information is available on the technological literacy of teachers. Although teachers and teachers’ unions may resist the idea of assessing technological literacy, the teacher-quality provisions of the No Child Left Behind Act (NCLB) may provide an opportunity to introduce technology-related test items into existing test instruments. New, standalone assessments would require protections of teachers’ privacy and limited uses of test data to encourage participation.

Recommendation 4. When states determine whether teachers are “highly qualified” under the provisions of the No Child Left Behind Act (NCLB), they should ensure—to the extent possible—that assessments used for this purpose include items that measure technological literacy. This is especially important for science, mathematics, history, and social studies teachers, but it should also be considered for teachers of other subjects. In the review of state plans for compliance with NCLB, the U.S. Department of Education should consider the extent to which states have fulfilled this objective.

Recommendation 5. The National Science Foundation and U.S.
Department of Education should fund the development and pilot testing of sample-based assessments of technological literacy among pre-service and in-service teachers of science, technology, English, social studies, and mathematics. These assessments should be informed by carefully developed assessment frameworks. The results should be disseminated to schools of education, curriculum developers, state boards of education, and other groups involved in teacher preparation and teacher quality.

Out-of-School Adults

Very little is known about the technological literacy of out-of-school adults, although a few instruments, such as the 2001 and 2004 ITEA/Gallup polls (ITEA, 2001, 2004) and the NSF’s now-discontinued biennial surveys of public understanding of science and technology (e.g., NSB, 2004), have focused on the understanding, attitudes, and opinions of adults related to technology. Recently, the United States and several other countries have developed and administered a revamped international literacy assessment, the Adult Literacy and Lifeskills Survey (ALL), that focuses on prose and document literacy but redefines quantitative literacy as numeracy, implying a broad range of content items, some of which could be relevant to the assessment of technological literacy (Lemke, 2004). In addition, ALL measures a cross-curricular area of competency related to problem solving, which is a distinguishing feature of the technological design process.

Recommendation 6. The International Technology Education Association should continue to conduct a poll on technological literacy every several years, adding items that address the three dimensions of technological literacy, in order to build a database that reflects changes over time in adult knowledge of and attitudes toward technology. In addition, the U.S. Department of Education, working with its international partners, should expand the problem-solving component of the Adult Literacy and Lifeskills Survey to include items relevant to the assessment of technological literacy. These items should be designed to gauge participants’ general problem-solving capabilities in the context of familiar, relevant situations. Agencies that could benefit by knowing more about adult understanding of technology, such as the National Science Foundation, U.S. Department of Education, U.S.
Department of Defense, and National Institutes of Health, should consider funding projects to develop and conduct studies of technological literacy. Finally, opportunities for integrating relevant knowledge and attitude measures into existing studies, such as the General Social Survey, the National Household Education Survey, and Surveys of Consumers, should be pursued.

Research on Learning

Because the assessment of technological literacy is in its infancy, many questions related to the nature of technological learning remain unanswered—in some cases, unasked. Therefore, the first step must be to collect and analyze work that has already been done that might suggest promising avenues for further investigation. The committee commissioned two reviews of the literature—one on learning related to technology (Petrina et al., 2004) and one on learning related to engineering (Waller, 2004). The reviews provided background information on cognitive issues related to technological literacy. In retrospect, however, the committee—and those interested in assessment in the domain of technology—would also have benefited from an analysis of studies in other areas, such as learning in science and mathematics, spatial reasoning, design thinking, and problem solving.

Recommendation 7. The National Science Foundation or U.S. Department of Education should fund a synthesis study focused on how children learn technological concepts. The study should draw on the findings of multidisciplinary research in mathematics learning, spatial reasoning, design thinking, and problem solving. The study should provide guidance on pedagogical, assessment, teacher education, and curricular issues of interest to educators at all levels, teacher-education providers and licensing bodies, education researchers, and federal and state education agencies.

An understanding of how people learn is critical to designing valid, meaningful assessment instruments. However, the research base on how people learn about technology, engineering, design, and related ideas is relatively immature compared with the state of knowledge in the general science of learning.
Most of the information—particularly for engineering—is focused on what people know and how this varies by population, rather than on how information is acquired, processed, and represented. As the research base on learning about technology grows, assessments of technological literacy will also improve. However, real progress will require a decade or more of sustained effort, including the training of a cadre of researchers.

Recommendation 8. The National Science Foundation (NSF) and U.S. Department of Education should support a research-capacity-building initiative related to the assessment of technological literacy. The initiative should focus on supporting graduate and postgraduate research related to how students and teachers learn technology and engineering concepts. Funding should be directed to academic centers of excellence in education research—including, but not limited to, NSF-funded centers for learning and teaching—whose missions and capabilities are aligned with the goal of this recommendation.

To the committee’s knowledge, no rigorous efforts have been made to ascertain how adults acquire and use technological knowledge. School and work experience could affect their performance, but adults who are no longer in the formal education system are also influenced by a variety of free-choice learning opportunities, including popular culture, the news media, and museums and science centers.

Recommendation 9. The National Science Foundation should take the lead in organizing an interagency federal research initiative to investigate technological learning in adults. Because adult learning is continuous, longitudinal studies should be encouraged. Informal-learning institutions that engage broad populations, such as museums and science centers, should be considered important venues for research on adult learning, particularly related to technological capability. To ensure that the perspectives of adults from a variety of cultural and socioeconomic backgrounds are included, studies should also involve community colleges, nonprofit community outreach programs, and other programs that engage diverse populations.
Exploiting Innovative Measurement Techniques

The increasing speed, power, and ubiquity of computers in various configurations (e.g., desktops, laptops, personal digital assistants, e-tablets, and cell phones), combined with increasing access to the Internet, suggest a variety of innovative approaches to assessment in many domains, but particularly for assessment of technological literacy. Computer-adaptive testing, for example, has the potential to assess student knowledge of technology quickly, reliably, and inexpensively. Simulation could be a safe and economical approach to assessing procedural, analytical, and abstract capabilities and skills. Internet-based, massive,

multiplayer online games could be an inexpensive way of engaging very large numbers of individuals for extended periods of time. However, more research and development will be necessary before computer-based assessments of technological literacy can be used with full confidence. For one thing, the formal, psychometric properties of simulation must be better understood. For another, the costs of developing simulations de novo may be prohibitive.

Recommendation 10. The National Institute of Standards and Technology, which has a broad mandate to promote technology development and an extensive track record in organizing research conferences, should convene a major national meeting to explore the potential of innovative, computer-based techniques for assessing technological literacy in students, teachers, and out-of-school adults. The conference should be informed by research related to assessments of science inquiry and scientific reasoning and should consider how innovative assessment techniques compare with traditional methods.

Framework Development

A necessary first step in the development of assessments for technological literacy is the creation of a conceptual framework. Although a number of frameworks exist in other subjects, such as mathematics, science, and history, the committee found none in the domain of technology. The committee believes that existing content standards for K–12 students and, by inference, for pre-service teachers and out-of-school adults, are overly ambitious. Criteria similar to the ones used by AAAS Project 2061 (AAAS, 1990) to identify the most important ideas in science could be developed to help specify appropriate expectations in technology. In general, framework designers will have to narrow and prioritize the content to be assessed.

Recommendation 11.
Assessments of technological literacy in K–12 students, K–12 teachers, and out-of-school adults should be guided by rigorously developed assessment frameworks, as described in this report. For K–12 students, the National Assessment Governing Board, which has considerable experience in the development of

assessment frameworks in other subjects, should commission the development of a framework to guide the development of national and state-level assessments of technological literacy.

For K–12 teachers, the National Science Foundation and U.S. Department of Education, which both have programmatic interests in improving teacher quality, should fund research to develop a framework for an assessment of technological literacy in this population. The research should focus on (1) determining how the technological literacy needs of teachers differ from those of student populations and (2) strategies for implementing teacher assessments in a way that would provide useful information for both teachers and policy makers. The resulting framework would be a prerequisite for assessments of all teachers, including generalists and middle- and high-school subject-matter specialists.

For out-of-school adults, the National Science Foundation and U.S. Department of Education, which both have programmatic activities that address adult literacy, should fund research to develop a framework for the assessment of technological literacy in this population. The research should focus on determining thresholds of technological literacy necessary for adults to make informed, everyday, technology-related decisions.

Expanding the Definition of Technology

Based on data from ITEA’s two Gallup polls, the results of the committee-sponsored workshop, and informal discussions with a variety of individuals knowledgeable about technological literacy, the committee concluded that confusion about the word “technology” and the term “technological literacy” is a major challenge to improving technological literacy in the United States.
Although defining technology was not included in the statement of task for this study, the committee is aware that many people define technology as computers (and sometimes other electronic devices). The education community and other sectors have expressed a great deal of interest in measuring what people—adults and children—know about and can do with computer technology (e.g., NRC, 1999). Some states, testing companies (e.g., ETS), the federal government (through the No Child Left Behind Act), and others support the development of, or have developed, assessments for measuring computer-related literacy. The International Society for Technology in Education has developed performance standards for the use of IT by K–12 students that have been adopted or adapted by many states (ISTE, 2000).

Of course, children and adults in a modern nation like the United States benefit from being able to use computer technologies. Thus, assessments of computer or IT literacy focused on application skills will be important, particularly for students. But such assessments would be even more valuable if they also addressed other crucial aspects of technology, as discussed in Technically Speaking (NAE and NRC, 2002) and detailed in national educational standards for science (AAAS, 1993; NRC, 1996) and technology (ITEA, 2000). Policy makers would benefit from knowing not only how capable people are with computer technology, but also whether they can think critically and make sensible decisions about technological developments.

Recommendation 12. The U.S. Department of Education, state education departments, private educational testing companies, and education-related accreditation organizations should broaden the definition of “technological literacy” to include not only the use of educational technologies (computers) but also the study of technology, as described in the International Technology Education Association Standards for Technological Literacy and the National Academy of Engineering and National Research Council report Technically Speaking.

Conclusion

The committee’s recommendations are largely interdependent. For instance, all assessments of technological literacy will benefit from the development of detailed assessment frameworks (Recommendation 11), and frameworks and assessments will improve as more becomes known about how adults and children learn technology- and engineering-related concepts (Recommendations 7, 8, and 9).
This same research will also inform efforts to use new techniques, such as simulation and gaming, for assessing what people know and can do with respect to technology (Recommendation 10). These and other novel assessment tools have the potential to dramatically improve our ability to gauge technological literacy, particularly the capabilities dimension. As educators, policy makers, and the public at large adopt a broader view of technology (Recommendation 12), assessments of technological literacy will be considered not only important, but also necessary.

Although all of the recommendations are important and should be implemented, some recommended actions will be easier and less costly to implement, and more likely to have near-term results, than others. For example, integrating technology-related items into existing instruments (Recommendations 1, 2, 4, and 6), which would leverage already developed tests and an existing testing infrastructure, will be easier than creating de novo assessments (Recommendations 3 and 5).

Like traditional reading literacy, science literacy, civics, and numeracy, technological literacy is considered a public good. Hence, most of the entities addressed in the recommendations are federal and state government agencies, many of which have an interest or role in supporting science and engineering research, developing new technologies, maintaining and protecting the nation’s infrastructure, and training the technical workforce. However, many nongovernmental organizations will also benefit, directly or indirectly, from a more technologically literate public. The committee hopes that these organizations will become interested and involved in broad-based efforts to promote the assessment of technological literacy.

The Full Report

In the full report, the committee describes how the concept of technological design can be used to guide the development of assessments of technological literacy (Chapter 2). For readers not versed in the vocabulary of assessment and cognitive science, or unfamiliar with the state of research on technological learning, the report provides a primer on all three subjects (Chapter 3). The 28 assessment instruments collected during the course of the project are also analyzed in detail (Chapter 4 and Appendix E).
Concrete examples of the general principles of assessing technological literacy are provided in the case studies in Chapter 5, which range from a nationwide sample of 7th graders to assessments of visitors to a science museum. In addition to summaries of the assessment instruments collected by the committee, the report includes excerpts of K–12 learning goals related to the study of technology from three sets of content standards (Appendix B) and an annotated bibliography of some of the research on how people learn technology- and engineering-related concepts (Appendix D).

The Executive Summary can be read online and downloaded free of charge from the website of the National Academies Press (NAP), www.nap.edu. The full report and individual chapters can be downloaded as PDF files for a fee, and the entire report can be ordered in hard copy from NAP.

References

AAAS (American Association for the Advancement of Science). 1990. Science for All Americans. New York: Oxford University Press.

AAAS. 1993. Benchmarks for Science Literacy. New York: Oxford University Press.

Bloch, E. 1986. Scientific and technological literacy: the need and the challenge. Bulletin of Science, Technology and Society 6(2–3): 138–145.

ISTE (International Society for Technology in Education). 2000. National Educational Technology Standards for Students: Connecting Curriculum and Technology. Available online at: http://cnets.iste.org/students/s_book.html (June 29, 2005).

ITEA (International Technology Education Association). 1996. Technology for All Americans: A Rationale and Structure for the Study of Technology. Reston, Va.: ITEA.

ITEA. 2000. Standards for Technological Literacy: Content for the Study of Technology. Reston, Va.: ITEA.

ITEA. 2001. ITEA/Gallup Poll Reveals What Americans Think About Technology. A Report of the Survey Conducted by the Gallup Organization for the International Technology Education Association. Available online at: http://www.iteaconnect.org/TAA/PDFs/Gallupreport.pdf (October 5, 2005).

ITEA. 2004. The Second Installment of the ITEA/Gallup Poll and What It Reveals as to How Americans Think About Technology. A Report of the Second Survey Conducted by the Gallup Organization for the International Technology Education Association. Available online at: http://www.iteaconnect.org/TAA/PDFs/GallupPoll2004.pdf (October 5, 2005).

Lemke, M. 2004. Statement for the Assessing Technological Literacy Workshop, September 29, 2004. The National Academies, Washington, D.C. Unpublished.

Meade, S.D., and W.E. Dugger, Jr. 2004. Reporting on the status of technology education in the United States. Technology Teacher 63(October): 29–35.

NAE (National Academy of Engineering) and NRC (National Research Council). 2002. Technically Speaking: Why All Americans Need to Know More About Technology. Washington, D.C.: National Academy Press.

NAGB (National Assessment Governing Board). 2002. Mathematics Framework for the 2003 National Assessment of Educational Progress. Available online at: http://www.nagb.org/pubs/math_framework/toc.html (December 9, 2004).

NAGB. 2004. Science Framework for the 2005 National Assessment of Educational Progress. Available online at: http://www.nagb.org/pubs/s_framework_05/toc.html (October 21, 2005).

NRC (National Research Council). 1999. Being Fluent with Information Technology. Washington, D.C.: National Academy Press.

NSB (National Science Board). 2004. Science and Technology: Public Attitudes and Understanding in Science and Engineering Indicators, 2004. Available online at: http://www.nsf.gov/sbe/srs/seind04/c7/c7h.htm (May 25, 2005).

Petrina, S., F. Feng, and J. Kim. 2004. How We Learn (About, Through, and for Technology): A Review of Research. Paper commissioned by the National Research Council Committee on Assessing Technological Literacy. Unpublished.

Rutherford, J. 2004. Technology in the schools. Technology in Society 26(2–3): 149–160.

Waller, A. 2004. Final Report on a Literature Review of Research on How People Learn Engineering Concepts and Processes. Paper commissioned by the National Research Council Committee on Assessing Technological Literacy. Unpublished.