
8
Findings and Recommendations

The overarching goal of assessing technological literacy is to provide an accurate picture of what Americans of all ages know and can do with respect to technology. After reviewing the literature related to assessment, cognition, and technological literacy; receiving input from a variety of stakeholders; and drawing on its own experiences and judgment, the committee developed the following general principles to guide the development of assessments of technological literacy for students, teachers, and out-of-school adults:

  1. Assessments should be designed with a clear purpose in mind. The purpose must be clear to the developers of the assessment, as well as to test takers and the users of the results.

  2. Assessment developers should take into account research findings related to how children and adults learn, including how they learn about technology. Insights into how conceptual understanding of technology develops and the mental processes involved in solving technological problems can help assessment designers construct appropriate items and tasks.

  3. The content of an assessment should be based on rigorously developed learning standards. The knowledge and skills identified in learning standards reflect the judgments of technical experts and experienced educators about the development of technological literacy.

  4. Assessments should provide information about all three dimensions of technological literacy—knowledge, capabilities, and critical thinking and decision making. Meaningful conclusions about the state of technological literacy in the United States must reflect skills and knowledge in all three dimensions.

  5. Assessments should not reflect gender, culture, or socioeconomic bias. Because of the nature of technology, men and women, people from different cultures, and people from different economic backgrounds experience and value technology in different ways. Designers of assessments must take these differences into account to avoid including items and tasks that favor or disadvantage particular groups.

  6. Assessments should be accessible to people with mental or physical disabilities. In keeping with federal and state laws and regulations, assessments of technological literacy must be designed, as much as possible, to allow individuals with mental or physical disabilities to participate.

In addition to these general guidelines, the committee developed findings and related recommendations in five categories: opportunities for assessment; research on learning; the use of innovative measurement techniques; framework development; and broadening the definition of technology. The numbering of the recommendations does not indicate prioritization. Although some recommendations will be easier to implement than others, the recommendations are interdependent, and the committee believes that all of them are necessary.

Opportunities for Assessment

General Findings

Based on the review of assessment instruments (Chapter 4 and Appendix E) and input from participants in a committee-sponsored workshop, the committee finds that the assessment of technological literacy is in its infancy. This is not surprising, considering that most students still have no access to courses in school that are likely to encourage technological thinking. Although a majority of states have adopted the learning goals spelled out in the ITEA standards in one form or another, fewer than one-quarter require that students take coursework consistent with the standards in order to graduate (Meade and Dugger, 2004). With the notable exception of technology educators, few teachers currently have an incentive to learn about or demonstrate knowledge of technology as described in Technically Speaking. Finally, very little thought has been given to assessing the technological literacy or attitudes toward technology of out-of-school adults.

On a more positive note, the review of assessment instruments suggests that valid and reliable items can be developed that address one or more of the cognitive dimensions and all of the content domains of technological literacy. Items related to critical thinking and decision making may be the most challenging for assessment developers, and time and resource constraints will pose obstacles to the development of items to measure design-related capability. But both types of items can and should be developed.

The paucity of instruments for measuring technological literacy in informal-learning settings, such as museums, indicates a major area of opportunity. Adults and children learn about many things, including technology, through exposure to television, the Internet, movies, magazines, books, and other media, as well as through life experiences and self-study. Very few of the assessments seen by the committee attempt to document the effects of learning outside of a formal school structure or learning related to the use of specific technologies (Gee, 2003; Kasesniemi and Rautiainen, 2002; Valentine et al., 2000).

Until rigorously developed assessments of technological literacy become more prevalent in the United States, educators, policy makers, business leaders, and the public at large will be unable to gauge the ability of citizens to participate confidently and responsibly in our technology-dependent world.

The committee finds there are two main areas of opportunity for increasing the use of assessments of technological literacy: (1) the modification of existing assessments; and (2) the development of new assessments. Existing assessments in technology-related subject areas, particularly science, mathematics, and history (or social studies), could be modified by adding items, tasks, or survey questions for measuring technological literacy. The obvious benefit of this strategy is that it leverages validated assessment designs and existing implementation networks. This “plug-and-play” approach would also provide data about technological literacy relatively quickly.


The second area of opportunity is the development of new assessment instruments devoted entirely to technological literacy. This more ambitious course of action would require breaking new ground. The development of assessment instruments de novo, especially in an area like technology, which is largely outside the mainstream of formal education, would face significant hurdles, as noted in several case studies in Chapter 6. However, the potential benefits would also be significant, especially the prospect of realizing a comprehensive picture of Americans’ understanding of and engagement in our technological world.

The two areas of opportunity just described are not mutually exclusive, and the committee recommends that both approaches be pursued simultaneously. As a practical matter, data gathered from early, integrative attempts to assess aspects of technological literacy would provide valuable input for comprehensive, stand-alone assessments, whether for students, teachers, or out-of-school adults. No matter which approach is taken, assessment items should be designed to encourage both higher order and design-related thinking, because questions and tasks that require the analysis and synthesis of information are more useful for measuring technological literacy than items that require the recall of information. And because design processes are at the heart of technology development, it makes sense that assessments of technological literacy provide opportunities for people to demonstrate design capability. Of course, test instruments and the specification of test items in assessments for students must be aligned with content standards and curriculum.


The gaps in our understanding of attitudes toward technology are as wide as the gaps in our knowledge of what people know and can do with respect to technology. Attitudinal information for all three populations would benefit the developers of assessments and researchers. Assessments designed to measure attitudes toward technology must include all of the components of attitudes—cognition, affect, and a tendency toward action (cf. Box 2-3). Designers should attempt to address all aspects of these components and should specifically elicit positive/negative or favorable/unfavorable responses to particular aspects or objects of technology.
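
To make the three-component structure concrete, here is a minimal sketch of how a survey instrument might tag Likert items by attitude component and valence and compute per-component subscores. The item wording, the 5-point scale, and the component tags are invented for illustration; they are not drawn from this report or from any existing instrument.

```python
# Hypothetical sketch: tagging attitude items by component (cognition,
# affect, tendency toward action) and valence, then computing per-
# component subscores from 5-point Likert responses. All item wording
# is invented for illustration.
from dataclasses import dataclass
from statistics import mean

@dataclass
class AttitudeItem:
    text: str
    component: str   # "cognition", "affect", or "action"
    positive: bool   # True if agreement reflects a favorable attitude

ITEMS = [
    AttitudeItem("New technologies generally improve everyday life.", "cognition", True),
    AttitudeItem("I enjoy figuring out how devices work.", "affect", True),
    AttitudeItem("I avoid products that seem too technical.", "action", False),
]

def subscores(responses):
    """responses: 1-5 Likert answers aligned with ITEMS. Negatively
    keyed items are reverse-scored so higher always means more favorable."""
    scored = {}
    for item, r in zip(ITEMS, responses):
        value = r if item.positive else 6 - r
        scored.setdefault(item.component, []).append(value)
    return {c: mean(vals) for c, vals in scored.items()}

print(subscores([4, 5, 2]))  # e.g. {'cognition': 4, 'affect': 5, 'action': 4}
```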

Findings and Recommendations for K–12 Students

There are a handful of established, thoughtfully developed national and international assessments to which technology-related items might be added. The National Assessment of Educational Progress (NAEP) samples achievement in reading, mathematics, writing, and science, among other subjects, in U.S. 4th-, 8th-, and 12th-graders. All states that receive Title I funds must take part in the NAEP assessments of mathematics and reading; participation in the writing and science assessments is optional. Data from a subset of public schools participating in state NAEP assessments are combined to provide a representative national sample. NAEP results are reported at the national and state levels and by region of the country, but not by school district, school, or individual student. Group statistics are broken down by gender, race/ethnicity, and a host of other variables to shed light on students’ instructional experiences.

The most recent science-focused NAEP assessment was conducted in spring 2005. In late 2004, the National Assessment Governing Board hired WestEd, a regional education laboratory, to develop a new framework for a science assessment, which will be the basis of the next NAEP science test, in 2009. A draft of the framework was published for public comment in fall 2005 (NAGB, 2005). NAEP also occasionally conducts so-called special studies, usually small-scale projects to test new approaches, explore new content, or assess achievement in particular population groups.

Independent of NAEP, all states must assess the reading and mathematics achievement of all public-school students in grades 3 through 8, and at least once in grades 10 through 12, as part of the No Child Left Behind Act of 2001 (NCLB; P.L. 107-110). NCLB requires that individual and school-level results be reported. States are required to begin a similar testing regimen for science achievement in 2007.

NCLB heavily promotes the use of educational technology in schools. Among other provisions, the law requires that states make a determination of the “technology literacy” of all students by the end of grade 8, and language in the accompanying House report on the legislation notes the importance of students becoming “technologically literate.” However, it is clear from the context of these references that the concept of technological literacy in these documents differs substantially from the concept described in Technically Speaking (NAE and NRC, 2002).

In early 2005, the Educational Testing Service released a new Information and Communication Technology (ICT) Literacy Assessment for college-age students that focuses on how well they understand, use, and make decisions about information technology (ETS, 2005).

The United States participates in two large-scale international assessments of K–12 students. The Trends in International Mathematics and Science Study (TIMSS), based on science and mathematics curricula in participating countries, is given in grades 4 and 8 and in the last year of high school. TIMSS includes an analysis of frameworks and standards, a video analysis of teaching, and a review of textbooks.


The second international assessment, the Programme for International Student Assessment (PISA), is intended to gauge how well 15-year-old students apply and use what they have learned in and out of school in mathematics, reading, and science, as an indication of the quality of potential entrants to the workforce. PISA is administered every three years, with the emphasis in each cycle on one of the three subjects. The 2003 assessment, which focused on mathematics, included a one-time cross-curricular measure of problem solving (OECD, 2004). Among other competencies, test items addressed students’ ability to design solutions to practical problems under specified constraints and to troubleshoot everyday problems. The 2006 PISA assessment, which will focus on science, will include a section called “Science and Technology in Society.”


Recommendation 1. The National Assessment Governing Board, which oversees the National Assessment of Educational Progress (NAEP), should authorize special studies of the assessment of technological literacy as part of the 2009 NAEP mathematics and science assessments and the 2010 NAEP U.S. history assessment. The studies should explore the content connections between technology, science, mathematics, and U.S. history to determine the feasibility of adding technology-related items to future NAEP assessments in these subjects.


Recommendation 2. The U.S. Department of Education and National Science Foundation (NSF) should send a recommendation to the International Association for the Evaluation of Educational Achievement and the Trends in International Mathematics and Science Study (TIMSS) governing board encouraging them to include technological literacy items in TIMSS assessments as a context for assessments of science and mathematics. The U.S. Department of Education and NSF should send a recommendation to the Organization for Economic Cooperation and Development and the governing board for the Programme for International Student Assessment (PISA) supporting the inclusion of technological literacy items as a cross-curricular competency.

The second area of opportunity for assessing technological literacy in the K–12 population is to create de novo instruments. This more ambitious course of action would face significant challenges but would also have significant potential benefits, such as providing a comprehensive picture of young Americans’ understanding of and engagement in our technological world.

In spring 2005, the National Assessment Governing Board (NAGB), which oversees NAEP assessments, authorized a “probe study” of assessing technological literacy. Probe studies are small-scale research projects to determine the feasibility of developing new large-scale assessments under the NAEP umbrella. Among other things, the probe study of the assessment of technological literacy will look into the pros and cons of different assessment methods and collect considerable attitudinal data.

The NAGB probe study is a very encouraging development that suggests the possibility of collecting national sample-based data on technological literacy among U.S. K–12 students. However, the results of the study will not be known for many years. Because of the time required to develop a conceptual framework and conduct field tests of assessment items, the actual assessment will not take place until 2012. If NAGB then decides to add technological literacy to the subjects routinely assessed by NAEP, it could take another four years for a national test to be administered.

The committee believes that other efforts should be undertaken in the meantime to develop stand-alone assessments of technological literacy for K–12 students. For one thing, there is no guarantee that NAGB will ultimately decide to support a national test of technological literacy. Even if it does, however, the assessment methods, specific test items, and uses of assessment data are not likely to satisfy the needs or interests of all stakeholders.


Recommendation 3. The National Science Foundation should fund a number of sample-based studies of technological literacy in K–12 students. The studies should have different assessment designs and should assess different population subsets, based on geography, population density, socioeconomic status, and other factors. Decisions about the content of test items, the distribution of items among the three dimensions of technological literacy, and performance levels should be based on a detailed assessment framework.
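
Recommendation 3's call for assessing "different population subsets, based on geography, population density, socioeconomic status, and other factors" implies some form of stratified sampling design. The sketch below shows one common approach, proportional allocation of a fixed sample across strata; the strata and enrollment counts are invented for demonstration, not actual population data.

```python
# Illustrative sketch of proportional stratified allocation. The strata
# and enrollment figures are hypothetical, invented for demonstration.

def allocate(total_sample, strata):
    """Proportional allocation: stratum h gets roughly n * N_h / N,
    with largest-remainder rounding so allocations sum to total_sample."""
    population = sum(strata.values())
    raw = {h: total_sample * n_h / population for h, n_h in strata.items()}
    alloc = {h: int(r) for h, r in raw.items()}
    # Hand out any leftover units to the largest fractional remainders.
    leftover = total_sample - sum(alloc.values())
    for h in sorted(raw, key=lambda h: raw[h] - alloc[h], reverse=True)[:leftover]:
        alloc[h] += 1
    return alloc

strata = {"urban": 31_000, "suburban": 52_000, "rural": 17_000}  # hypothetical counts
print(allocate(2_000, strata))  # e.g. {'urban': 620, 'suburban': 1040, 'rural': 340}
```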

Findings and Recommendations for K–12 Teachers


Technological literacy is especially important for teachers. In various forms, technology is integral to virtually every aspect of society. Therefore, no matter what the academic discipline, from history to art to science, teachers should be able to discuss issues related to technology. For example, technology has been a critical factor in the economic and cultural development of the United States and is, therefore, critical to a comprehensive understanding of social and historical developments. Whereas science and mathematics are important in their own right, technology requires the application of scientific and mathematical principles to solve practical problems. Most discoveries in science would not have been possible without technological tools, such as microscopes, radio telescopes, and genetic engineering. This basic understanding should be an aspect of teaching in every subject, and both students and teachers should have hands-on exposure to design processes that involve the use of tools and materials.

Anecdotally, the committee found that many teachers, particularly science teachers, do introduce technology-related concepts in their classrooms. However, very little information is available about the technological literacy of teachers. An assessment for prospective teachers of technology is offered through the Educational Testing Service Praxis series, and technology-related items are included in the Praxis tests given to pre-service science teachers. However, the committee believes that neither assessment adequately measures technological literacy. In addition, because Praxis tests are designed to assess individual performance, the results are not aggregated or made public. Thus, they cannot easily inform policies related to teacher education or curriculum development.

The committee recognizes that it would be difficult to persuade teachers to take part in assessments. Teachers and teachers’ unions have traditionally opposed tests of knowledge or skills, except for the purposes of certification or licensing. However, since the passage of NCLB in 2001, the situation has changed somewhat. Provisions in NCLB related to teacher quality provide incentives to states to document that teachers are knowledgeable in the subjects they teach. By the end of the 2005–2006 school year, all states will be required to show that teachers of core subjects are “highly qualified” (DoEd, 2005),1 and future teachers will have to meet the same requirements. (Even if NCLB is modified or abandoned completely by a future administration or Congress, it is likely that teacher competency related to information technology will still be required.)

1 Teachers are deemed “highly qualified” if they have a bachelor’s degree, certification, or license from the state and can prove that they know the subject they teach. Teachers may satisfy the last requirement in several ways, including having majored or earned college credits equivalent to a major in the subject, having gained advanced certification from the state or a graduate degree, or having passed a state-developed test. Although the number of teachers who have opted for the testing option is not known, nearly every state offers tests in the core subjects (K. Walsh, president, National Council on Teacher Quality, personal communication, August 18, 2005).

Passing a competency test, such as a Praxis test or a state-developed test, is one of several ways teachers can meet the quality mandate. Thus, given the relevance of technological understanding to a broad range of academic subjects and the current emphasis on teacher quality, NCLB provides an important opportunity for assessment at the state level.


Recommendation 4. When states determine whether teachers are “highly qualified” under the provisions of the No Child Left Behind Act (NCLB), they should ensure—to the extent possible—that assessments used for this purpose include items that measure technological literacy. This is especially important for science, mathematics, history, and social studies teachers, but it should also be considered for teachers of other subjects. In the review of state plans for compliance with NCLB, the U.S. Department of Education should consider the extent to which states have fulfilled this objective.

A different approach will be necessary to assess technological literacy among teachers at the national level. Because of financial, time, and logistical constraints, it will not be possible to administer the same battery of test items to every teacher in the United States. Thus, some type of sampling will be necessary. NAGB, which oversees national student assessments, and other organizations use matrix sampling, in which each participating individual is presented with a subset of the total pool of test items. By combining the results for a number of subsets, it is possible to construct a complete picture of group performance.

One drawback of matrix sampling is that, because no individual answers all of the questions, individual scores cannot be reported. Thus, although matrix-sample results may reveal performance trends in certain subgroups—for example, all 3rd-grade teachers of science—they have no diagnostic value for individual teachers. In one sense, however, the absence of individual scores may be an advantage because it may alleviate fears that assessment results might be used by educational administrators to make pay or retention decisions.

Another form of sampling, census sampling, involves administering the same or a comparable set of questions to a sample of people in a single group. With census sampling, individual scoring can be done, which is a significant advantage if the goal is to diagnose teachers’ strengths and weaknesses. However, census sampling typically involves more limited coverage of subject matter than matrix sampling because fewer questions can be asked. In addition, because individual performance levels can be identified, some teachers may not want to participate. Their reluctance might be overcome by legal assurances that the results would not be used for determining individual rewards or punishments and/or by providing reasonable compensation for participation.
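
The contrast between the two designs can be made concrete with a minimal simulation. In the sketch below, the 30-item pool, the booklet assignment, and the simulated respondents are all invented; the point is only that both designs recover group-level item statistics, while census sampling alone yields a complete score for each individual at the cost of narrower content coverage.

```python
# Minimal sketch contrasting matrix and census sampling (invented item
# pool and simulated respondents). Both designs estimate per-item
# percent correct for the group; only census sampling produces a
# complete score for each individual.
import random

random.seed(0)
ITEM_POOL = [f"item_{i}" for i in range(30)]    # hypothetical item bank
BOOKLETS = [ITEM_POOL[i::3] for i in range(3)]  # three disjoint 10-item subsets

def administer(items, p_correct=0.6):
    """Simulate one respondent answering the given items."""
    return {item: random.random() < p_correct for item in items}

def matrix_sample(n_respondents):
    """Each respondent sees only one booklet; results are pooled by item."""
    responses = [administer(BOOKLETS[i % 3]) for i in range(n_respondents)]
    tallies = {}
    for r in responses:
        for item, correct in r.items():
            seen, right = tallies.get(item, (0, 0))
            tallies[item] = (seen + 1, right + correct)
    # Group-level estimate per item -- but no respondent has a full score.
    return {item: right / seen for item, (seen, right) in tallies.items()}

def census_sample(n_respondents):
    """Every respondent takes the same short form; individual scores exist."""
    form = BOOKLETS[0]  # narrower content coverage than the full pool
    responses = [administer(form) for _ in range(n_respondents)]
    individual_scores = [sum(r.values()) / len(form) for r in responses]
    item_stats = {item: sum(r[item] for r in responses) / len(responses)
                  for item in form}
    return item_stats, individual_scores

print(len(matrix_sample(300)))   # 30: estimates cover the whole pool
stats, scores = census_sample(300)
print(len(stats), len(scores))   # 10 items covered, 300 individual scores
```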


Recommendation 5. The National Science Foundation and U.S. Department of Education should fund the development and pilot testing of sample-based assessments of technological literacy among pre-service and in-service teachers of science, technology, English, social studies, and mathematics. These assessments should be informed by carefully developed assessment frameworks. The results should be disseminated to schools of education, curriculum developers, state boards of education, and other groups concerned with teacher preparation and teacher quality.

Findings and Recommendation for Out-of-School Adults


Very little information is available about technological literacy levels among American adults. As noted in the Introduction to this report, government, industry, the media, social science researchers, and other groups would all benefit from having more information about what out-of-school adults know and think about technology.

Some important data are provided by the 2001 and 2004 ITEA/Gallup polls, but they are limited in scope and treat the three dimensions of technological literacy unevenly, making it difficult to draw conclusions. NSF’s biennial reports on public understanding of science and technology (S&T) provide some useful information, but NSF’s efforts were focused on science and science literacy, rather than technology and technological literacy. In 2003, NSF discontinued funding for its longstanding survey of adult S&T literacy, which was published as part of the Indicators reports (see, for example, NSB, 2004). No other federal agencies with a role in education, technology research and development, or communicating with the public about technological issues have invested in efforts to document adults’ understanding of technology.


Some of the questions from the NSF survey are being added to the 2005/2006 General Social Survey (GSS), another longstanding project, now administered biennially by the National Opinion Research Center; other items from the NSF survey have been used by the Survey Research Center of the University of Michigan and other public opinion research groups (J. Miller, director, Center for Biomedical Communications, Northwestern University, personal communication, September 12, 2005). The GSS focuses mostly on national spending priorities, drinking behavior, marijuana use, crime and punishment, race relations, quality of life, confidence in institutions, and membership in voluntary associations (NORC, 2005). However, the survey periodically includes modules on new topics funded by outside sources, usually foundations or government agencies.

The National Household Education Surveys (NHES) Program, conducted by the National Center for Education Statistics, provides descriptive data on the educational activities of U.S. adults, children, and families. NHES, like GSS, also occasionally conducts one-time special studies. Since its inception in 1991, NHES has conducted three special studies, on civic involvement, household library use, and school safety and discipline (NCES, 2005).

Through the National Adult Literacy Survey, the United States assesses traditional literacy (i.e., written and spoken language competency) in adults. In the 1990s, the U.S. Department of Education, Educational Testing Service, and WestEd participated in two International Adult Literacy Surveys (IALS), which surveyed prose and document literacy, as well as quantitative literacy. More recently, the United States and several other countries developed and administered a revamped international literacy assessment called the Adult Literacy and Lifeskills (ALL) Survey. Like IALS, ALL focuses on prose and document literacy, but it redefines quantitative literacy as numeracy, implying a broader range of activities, some of which might be relevant to assessing technological literacy (Lemke, 2004).

In addition, ALL measures a cross-curricular area of competency related to problem solving (OECD and Statistics Canada, 2005). The ITEA Standards for Technological Literacy suggest that all students should be familiar with problem solving, a distinguishing feature of the technological design process. Technological problem solving in adults might be manifested in concrete ways (e.g., determining possible reasons a flashlight does not work, a car does not start, or a door sticks after heavy rain, and then figuring out how to correct the problem).

Another consideration in assessing or surveying adults is defining the target population (e.g., technology consumers, people attentive to public policy, or the general population) and defining an appropriate purpose for an assessment of each (see Case 4 in Chapter 6). Technology consumers, for instance, may include adults and adolescents, who are major purchasers of technological products and services. Periodic surveys of consumers conducted by the University of Michigan Survey Research Center measure consumer understanding of selected technologies.


Recommendation 6. The International Technology Education Association should continue to conduct a poll on technological literacy every several years, adding items that address the three dimensions of technological literacy, in order to build a database that reflects changes over time in adult knowledge of and attitudes toward technology. In addition, the U.S. Department of Education, working with its international partners, should expand the problem-solving component of the Adult Literacy and Lifeskills Survey to include items relevant to the assessment of technological literacy. These items should be designed to gauge participants’ general problem-solving capabilities in the context of familiar, relevant situations. Agencies that could benefit by knowing more about adult understanding of technology, such as the National Science Foundation, U.S. Department of Education, U.S. Department of Defense, and National Institutes of Health, should consider funding projects to develop and conduct studies of technological literacy. Finally, opportunities for integrating relevant knowledge and attitude measures into existing studies, such as the General Social Survey, the National Household Education Survey, and Surveys of Consumers, should be pursued.

Research on Learning


Based on a review of committee-commissioned surveys of the literature on learning related to technology (Petrina et al., 2004) and engineering (Waller, 2004), as well as the expert judgments of committee members, the committee finds that the research base on how people learn about technology, engineering, design, and related ideas is relatively immature. Most of the research—particularly the research related to engineering—concerns what people know or how knowledge varies by population, rather than how information is acquired, processed, and represented. Although researchers have turned up some important clues, the overall picture of how people come to understand and work with technology is far from clear. Judging by the small number of published studies in this area, only a few graduate programs in engineering and technology education support research on how people learn.

The committee also finds that research on learning has traditionally been considered a public good and, therefore, has been supported by government agencies whose missions include the improvement of the U.S. education enterprise, rather than by the private sector. This is a trend that can be expected to continue.

A number of study designs, including those that use surveys, interviews, focus groups, and hands-on activities, are suitable for assessing some aspects of adult learning related to technology. Places where adults congregate for social, educational, or other reasons present interesting opportunities for data gathering, especially for pilot studies and measurement test beds. Exploratory studies with volunteer test takers could be an important part of this research, although more definitive studies would certainly require larger probability-based samples.

For all three populations—students, teachers, and out-of-school adults—it is important for assessment designers to know how mental structures, or schemas, support problem solving in technology; the role of prior knowledge, including misconceptions, in understanding technology and design; how an understanding of core ideas in technology, such as systems and trade-offs, transfers to other knowledge domains; and how the social context (e.g., the classroom, home, or workplace) facilitates or hinders knowledge acquisition.

A number of specific research questions were suggested in the section on cognition in Chapter 4 of this report. Although that list is far from exhaustive, it provides a starting point for investigations in this area.


Recommendation 7. The National Science Foundation or U.S. Department of Education should fund a synthesis study focused on how children learn technological concepts. The study should draw on the findings of multidisciplinary research in mathematics learning, spatial reasoning, design thinking, and problem solving. The study should provide guidance on pedagogical, assessment, teacher-education, and curricular issues of interest to educators at all levels, teacher-education providers and licensing bodies, education researchers, and federal and state education agencies.


Recommendation 8. The National Science Foundation (NSF) and U.S. Department of Education should support a research-capacity-building initiative related to the assessment of technological literacy. The initiative should focus on supporting graduate and postgraduate research related to how students and teachers learn technology and engineering concepts. Funding should be directed to academic centers of excellence in education research—including, but not limited to, NSF-funded centers for learning and teaching—whose missions and capabilities are aligned with the goal of this recommendation.


Recommendation 9. The National Science Foundation should take the lead in organizing an interagency federal research initiative to investigate technological learning in adults. Because adult learning is continuous, longitudinal studies should be encouraged. Informal-learning institutions that engage broad populations, such as museums and science centers, should be considered important venues for research on adult learning, particularly related to technological capability. To ensure that the perspectives of adults from a variety of cultural and socioeconomic backgrounds are included, studies should also involve community colleges, nonprofit community outreach programs, and other programs that engage diverse populations.

Innovative Measurement Techniques

The increasing speed, power, and ubiquity of computers in various configurations (e.g., desktops, laptops, personal digital assistants, e-tablets, and cell phones), combined with increasing access to the Internet, could support a variety of innovative approaches to assessment. Considerable work is already being done to develop software applications, including simulation authoring tools, that could be used by assessment developers to save time and money in the test design process.


Computer-based testing is particularly appealing for the assessment of technological literacy. As detailed in Chapter 7, computer-adaptive testing has the potential to assess student knowledge of technology quickly, reliably, and inexpensively. Simulation could be used as a safe and economical means of assessing more procedural, analytical, and abstract capabilities and skills. The use of Internet-based, massively multiplayer online games to conduct assessments could be sufficiently motivating and inexpensive to engage very large numbers of individuals for extended periods of time.
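
As a rough illustration of the adaptive mechanism, and not of any operational testing system, the sketch below implements a bare-bones adaptive loop under a one-parameter (Rasch) item response model: pick the unused item whose difficulty is closest to the current ability estimate, then re-estimate ability from the responses so far. The item bank, difficulties, and simulated examinee are all invented; operational systems add calibrated banks, exposure control, and content balancing.

```python
# Bare-bones sketch of computer-adaptive testing under a Rasch (1PL)
# model. Item difficulties and the simulated examinee are invented.
import math, random

random.seed(1)
BANK = [(f"q{i}", d) for i, d in enumerate(
    [-2.0, -1.5, -1.0, -0.5, 0.0, 0.3, 0.7, 1.2, 1.8, 2.5])]

def p_correct(theta, b):
    """Rasch model: probability of a correct answer at ability theta."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_theta(responses, theta=0.0):
    """Maximum-likelihood ability estimate via a few Newton steps."""
    for _ in range(20):
        grad = sum(u - p_correct(theta, b) for b, u in responses)
        info = sum(p_correct(theta, b) * (1 - p_correct(theta, b))
                   for b, _ in responses)
        if info < 1e-9:
            break
        theta += grad / info
        theta = max(-4.0, min(4.0, theta))  # keep the estimate bounded
    return theta

def run_cat(true_theta, test_length=6):
    theta, used, responses = 0.0, set(), []
    for _ in range(test_length):
        # Select the unused item most informative at the current estimate
        # (for the Rasch model, the one with difficulty closest to theta).
        name, b = min((it for it in BANK if it[0] not in used),
                      key=lambda it: abs(it[1] - theta))
        used.add(name)
        answered_correctly = random.random() < p_correct(true_theta, b)
        responses.append((b, answered_correctly))
        theta = estimate_theta(responses, theta)
    return theta

print(round(run_cat(true_theta=1.0), 2))  # ability estimate after 6 items
```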

At the same time, it is clear that more research and development will be necessary before computer-based assessments can be used with full confidence—and affordability—to assess technological literacy. For one thing, the formal, psychometric properties of simulation must be better understood. In addition, the cost of developing simulations de novo may be prohibitive. Nevertheless, the possibilities are tantalizing, especially the prospect of providing children and adults with authentic problem-solving and design challenges that map to the dimensions of technological literacy.

A potentially large number of organizations and individuals have a direct or indirect interest in how computers might be used to measure design and problem-solving capabilities, a key aspect of technological literacy. They include federal agencies (e.g., National Science Foundation, U.S. Department of Defense, U.S. Department of Labor), museums and science centers, private assessment-development companies (e.g., Educational Testing Service, ACT, McGraw-Hill, Knowledge Networks, Harris Interactive), computer game and software firms, technology-intensive industries (e.g., computer hardware and software manufacturers, aerospace firms, makers of telemedicine and computer-assisted surgical systems), and university-based scientists and social scientists working in this area. The federal government could encourage research in this area (as it has in other areas of national interest) by bringing these organizations and individuals together.


Recommendation 10. The National Institute of Standards and Technology, which has a broad mandate to promote technology development and an extensive track record in organizing research conferences, should convene a major national meeting to explore the potential of innovative, computer-based techniques for assessing technological literacy in students, teachers, and out-of-school adults. The conference should be informed by research related to assessments of science inquiry and scientific reasoning and should consider how innovative assessment techniques compare with traditional methods.


Framework Development

An important and often necessary first step in the development of an assessment for technological literacy is the creation of a conceptual framework. Although a number of frameworks have been developed in other subjects, such as mathematics, science, and history, the committee found no frameworks for the domain of technology. Framework development requires resources and time but is essential for clarifying and organizing the content of an assessment. Ideally, rigorously developed frameworks should inform the development of both stand-alone assessments of technological literacy and assessments in other subjects that include technology-related questions. Even in the absence of a framework, however, the committee believes that the pursuit of integrative strategies for gaining information about technological literacy should continue.

The list of things one might know and be able to do with respect to technology is practically limitless. Even with the benefit of thoughtfully developed content standards, such as those produced by ITEA, creating a workable framework for the assessment of technological literacy will require narrowing the scope of the content. The authors of Science for All Americans (AAAS, 1990), whose argument for science literacy laid the groundwork for the AAAS and NRC science standards, addressed this problem directly. Their solution was to develop criteria (utility, social responsibility, intrinsic value of knowledge, philosophical value, and childhood enrichment) for determining the most important science content students should learn. Designers of a framework for an assessment of technological literacy will have to undertake a similar exercise to narrow and prioritize the content.


In Chapter 3, the committee proposed a matrix that could be helpful in designing a framework; however, the matrix is only one of a number of possible arrangements of content. No doubt the initial framework will require reworking as information is collected about what people know and can do with respect to technology. Reworking reflects a natural evolution and improvement in assessment design.
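
One concrete, hypothetical way to operationalize such a matrix is as a table of target item weights, one cell per pairing of a dimension with a content domain, together with a consistency check. In the sketch below, the dimension labels follow the three dimensions named in this report, but the content domains and percentages are placeholders, not the committee's actual matrix from Chapter 3.

```python
# Hypothetical sketch of an assessment-framework matrix expressed as
# target item weights (percent of the test) per dimension x content-
# domain cell. Domains and percentages are placeholders.
DIMENSIONS = ("knowledge", "capabilities",
              "critical thinking and decision making")
DOMAINS = ("products and systems", "design processes",
           "technology and society")  # invented for illustration

TARGET_WEIGHTS = {
    ("knowledge", "products and systems"): 15,
    ("knowledge", "design processes"): 10,
    ("knowledge", "technology and society"): 10,
    ("capabilities", "products and systems"): 10,
    ("capabilities", "design processes"): 20,
    ("capabilities", "technology and society"): 5,
    ("critical thinking and decision making", "products and systems"): 5,
    ("critical thinking and decision making", "design processes"): 10,
    ("critical thinking and decision making", "technology and society"): 15,
}

def validate(weights):
    """Every cell must be filled and the weights must total 100 percent."""
    missing = [(d, c) for d in DIMENSIONS for c in DOMAINS
               if (d, c) not in weights]
    assert not missing, f"unfilled cells: {missing}"
    assert sum(weights.values()) == 100, "cell weights must total 100"

validate(TARGET_WEIGHTS)  # a framework draft failing this check needs rework
```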

The committee’s matrix relies heavily on frameworks designed to support assessments in student populations, rather than teachers and out-of-school adults. However, the committee believes that the proposed matrix would also be useful for the development of assessment frameworks for the other two populations. For out-of-school adults, rather than content standards designed for use in formal education, expectations for technological literacy could be based on what an informed member of the public (someone who has had exposure to informal-learning opportunities, including news media and museums and science centers) might be expected to know and do. Framework designers might also take into consideration educational background and work experience, both of which could affect performance in one or more dimensions of technological literacy.


Recommendation 11. Assessments of technological literacy in K–12 students, K–12 teachers, and out-of-school adults should be guided by rigorously developed assessment frameworks.

  • For K–12 students, the National Assessment Governing Board, which has considerable experience in the development of assessment frameworks in other subjects, should commission the development of a framework to guide the development of national and state-level assessments of technological literacy.

  • For K–12 teachers, the National Science Foundation and U.S. Department of Education, which both have programmatic interests in improving teacher quality, should fund research to develop a framework for an assessment of technological literacy in this population. The research should focus on (1) determining how the technological literacy needs of teachers differ from those of student populations and (2) strategies for implementing teacher assessments in a way that would provide useful information for both teachers and policy makers. The resulting framework would be a prerequisite for assessments of all teachers, including generalists and middle- and high-school subject-matter specialists.

  • For out-of-school adults, the National Science Foundation and U.S. Department of Education, which both have programmatic activities that address adult literacy, should fund research to develop a framework for the assessment of technological literacy in this population. The research should focus on determining thresholds of technological literacy necessary for adults to make informed, everyday, technology-related decisions.


Definition of Technology

Based on data from ITEA’s two Gallup polls on technological literacy (ITEA, 2001, 2004), input from the participants in the committee-sponsored workshop, and informal discussions with a variety of individuals knowledgeable about technological literacy, the committee finds that confusion about the word “technology” and the term “technological literacy” is one of the most serious challenges to improving technological literacy in the United States. Although resolving the confusion was not an explicit requirement of the committee’s charge, the committee concluded that everyone interested in assessments of technological literacy should be sensitized to this issue.

The confusion is fundamentally about the role of computers in our lives. There is considerable interest in the United States in measuring what adults and children know about and can do with computer technology. Some states, testing companies (e.g., ETS), and the federal government (through NCLB), among others, support the development of, or have developed, assessments for measuring computer-related literacy. Standards for the use of information technology by K–12 students developed by the International Society for Technology in Education (ISTE, 1998) have been adopted or adapted by many states. In the K–12 arena, computer and other information technologies are now commonly referred to as “educational technologies,” tools to aid learning.


Undoubtedly, people who live in a modern nation like the United States benefit by being able to use computer technologies. Thus, assessments of computer or information-technology literacy focused on application skills are important, particularly for students. But these assessments would be even more useful if they were expanded to address more enduring conceptions of technology, as discussed in Technically Speaking (NAE and NRC, 2002) and detailed in national educational standards for science (AAAS, 1993; NRC, 1996) and technology (ITEA, 2000). Policy makers would benefit from knowing not only the capabilities of certain populations in using computer technology, but also the abilities of citizens to think critically and make sensible decisions about technological development in a much broader sense.


Recommendation 12. The U.S. Department of Education, state education departments, private educational testing companies, and education-related accreditation organizations should broaden the definition of “technological literacy” to include not only the use of educational technologies (computers) but also the study of technology, as described in the International Technology Education Association Standards for Technological Literacy and the National Academy of Engineering and National Research Council report, Technically Speaking.

Conclusion

Although all of the issues addressed in the recommendations are important, some recommended actions are easier and less costly to implement—and more likely to have near-term results—than others. For example, unlike the creation of de novo assessments (Recommendations 3 and 5), the integration of technology-related items into existing assessments (Recommendations 1, 2, 4, and 6) would take advantage of existing instruments and a testing infrastructure.

In addition, many of the recommendations are interdependent. For instance, all assessments for technological literacy would benefit from the development of detailed assessment frameworks (Recommendation 11), and frameworks and assessments would improve as more becomes known about how adults and children learn technology- and engineering-related concepts (Recommendations 7, 8, and 9). This research would also inform efforts to exploit new techniques, such as simulation and gaming, for measuring what people know and can do with respect to technology (Recommendation 10). And these novel assessment tools have the potential to improve dramatically our ability to gauge technological literacy, particularly the capabilities dimension. As educators, policy makers, and the public at large begin to adopt a broader view of technology (Recommendation 12), the assessment of technological literacy would be recognized as not only important, but necessary.

The recommendations are addressed to a large number of entities, most of them government agencies (Table 8-1). The focus on the public sector is deliberate, because technological literacy—like traditional literacy, science literacy, civics, and numeracy—is considered a public good. In addition, improving and expanding the assessment of technological literacy will require broad-based, coordinated efforts by federal and state agencies with an interest or role in supporting science and engineering research, developing new technologies, maintaining and protecting the infrastructure, and training the nation’s technical workforce. However, as noted in the Introduction, many nongovernmental organizations will also benefit, directly or indirectly, from a more technologically literate citizenry. The committee hopes that these organizations will also become involved in the overall effort to promote assessment of technological literacy.


TABLE 8-1 Recommendations, by Target Population, Type of Action, and Actors

Recommendation | Target Population | Type of Action | Actor(s)
1  | K–12 students | Integrate items into existing national assessment. | National Assessment Governing Board (NAGB)
2  | K–12 students | Integrate items into existing international assessments. | U.S. Department of Education (DoEd), National Science Foundation (NSF)
3  | K–12 students | Fund sample-based studies and pilot tests. | NSF
4  | K–12 teachers | Integrate items into existing assessments for teacher qualifications. | States, DoEd
5  | K–12 teachers | Fund development and pilot testing of sample-based assessments. | DoEd, NSF, states
6  | Out-of-school adults | Encourage or fund the integration of items into existing assessments. | International Technology Education Association (ITEA), DoEd, National Institutes of Health (NIH), NSF
7  | K–12 students | Fund a synthesis study on learning processes. | NSF, DoEd
8  | K–12 students, K–12 teachers | Support capacity-building efforts in learning research. | NSF, DoEd
9  | Out-of-school adults | Organize an interagency initiative in learning research. | NSF
10 | K–12 students, K–12 teachers, out-of-school adults | Convene a major national meeting to explore innovative assessment methods. | National Institute of Standards and Technology
11 | K–12 students, K–12 teachers, out-of-school adults | Develop frameworks for assessments in the three populations. | NAGB, NSF, DoEd
12 | K–12 students, K–12 teachers, out-of-school adults | Broaden the definitions of technology and technological literacy. | DoEd, state education departments, private educational testing companies, education-related accreditation organizations


The impetus for technological literacy is a desire that all citizens be empowered to function confidently and productively in our technology-dependent society. A technologically literate public could engage in more-informed public dialogue on the pros and cons of technology-related developments, would provide a talent pool of technologically educated workers, and would contribute to the national science and engineering enterprise.

If we could assess technological knowledge, capability, and thinking skills in a rigorous and systematic way, we could track trends among students, teachers, and out-of-school adults. Reliable information would enable policy makers, educators, the business community, and others to take steps to improve the situation, if necessary. As a result, movement toward a more technologically literate society would be directed and purposeful, governed by data rather than anecdotal evidence and educated guesses. Over a period of many years, with considerable investment of human and financial resources, the benefits of technological literacy would be realized.

References

AAAS (American Association for the Advancement of Science). 1990. Science for All Americans. New York: Oxford University Press.

AAAS. 1993. Benchmarks for Science Literacy. Project 2061. New York: Oxford University Press.

DoEd (U.S. Department of Education). 2005. Fact Sheet: New No Child Left Behind Flexibility: Highly Qualified Teachers. Available online at: http://www.ed.gov/nclb/methods/teachers/hqtflexibility.pdf (January 22, 2005).

ETS (Educational Testing Service). 2005. ICT Literacy Assessment. Available online at: http://www.ets.org/ictliteracy/ (September 9, 2005).

Gee, J.P. 2003. What Video Games Have to Teach Us About Learning and Literacy. New York: Palgrave.

ISTE (International Society for Technology in Education). 1998. National Education Technology Standards for Students. Eugene, Ore.: ISTE.

ITEA (International Technology Education Association). 2000. Standards for Technological Literacy: Content for the Study of Technology. Reston, Va.: ITEA.

ITEA. 2001. ITEA/Gallup Poll Reveals What Americans Think About Technology. A Report of the Survey Conducted by the Gallup Organization for the International Technology Education Association. Available online at: http://www.iteaconnect.org/TAA/PDFs/Gallupreport.pdf (October 5, 2005).


ITEA. 2004. The Second Installment of the ITEA/Gallup Poll and What It Reveals as to How Americans Think About Technology. A Report of the Second Survey Conducted by the Gallup Organization for the International Technology Education Association. Available online at: http://www.iteaconnect.org/TAA/PDFs/GallupPoll2004.pdf (October 5, 2005).

Kasesniemi, E., and P. Rautiainen. 2002. Mobile Culture of Children and Teenagers in Finland. Pp. 170–192 in Perpetual Contact: Mobile Communication, Private Talk, Public Performance, edited by J.E. Katz and M. Aakhus. Cambridge, U.K.: Cambridge University Press.

Lemke, M. 2004. Statement for the Assessing Technological Literacy Workshop, September 29, 2004. The National Academies, Washington, D.C. Unpublished.

Meade, S.D., and W.E. Dugger, Jr. 2004. Reporting on the status of technology education in the United States. Technology Teacher 63(October): 29–35.

NAE (National Academy of Engineering) and NRC (National Research Council). 2002. Technically Speaking: Why All Americans Need to Know More About Technology. Washington, D.C.: National Academy Press.

NAGB (National Assessment Governing Board). 2005. Science Framework for the 2009 National Assessment of Educational Progress. Draft, September 30, 2005. Available online at: http://www.nagb.org/pubs/2005science_framework_dr.doc (October 19, 2005).

NCES (National Center for Education Statistics). 2005. National Household Education Surveys Program. Survey Topics/Population. Available online at: http://nces.ed.gov/nhes/surveytopics_special.asp (January 24, 2005).

NORC (National Opinion Research Center). 2005. General Social Survey. Available online at: http://www.norc.uchicago.edu/projects/gensoc.asp (January 24, 2005).

NRC (National Research Council). 1996. National Science Education Standards. Washington, D.C.: National Academy Press.

NSB (National Science Board). 2004. Science and Technology: Public Attitudes and Understanding in Science and Engineering Indicators, 2004. Available online at: http://www.nsf.gov/sbe/srs/seind04/c7/c7h.htm (May 25, 2005).

OECD (Organisation for Economic Co-Operation and Development). 2004. Problem Solving for Tomorrow’s World: First Measures of Cross-Curricular Competencies from PISA 2003. Programme for International Student Assessment. Available online at: http://www.pisa.oecd.org/dataoecd/25/12/34009000.pdf (January 21, 2005).

OECD and Statistics Canada. 2005. Learning a Living: First Results of the Adult Literacy and Lifeskills Survey. Available online at: http://www.statcan.ca/english/freepub/89-603-XIE/2005001/pdf.htm (October 19, 2005).

Petrina, S., F. Feng, and J. Kim. 2004. How We Learn (About, Through, and for Technology): A Review of Research. Paper commissioned by the National Research Council Committee on Assessing Technological Literacy. Unpublished.

Valentine, G., S.L. Holloway, and N. Bingham. 2000. Transforming Cyberspace: Children’s Interventions in the New Public Sphere. Pp. 156–173 in Children’s Geographies: Playing, Living, Learning, edited by L. Holloway and G. Valentine. London and New York: Routledge.

Waller, A. 2004. Final Report on a Literature Review of Research on How People Learn Engineering Concepts and Processes. Paper commissioned by the Com-mittee on Assessing Technological Literacy. Unpublished.

Suggested Citation:"8 Findings and Recommendations." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 175
Suggested Citation:"8 Findings and Recommendations." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 176
Suggested Citation:"8 Findings and Recommendations." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 177
Suggested Citation:"8 Findings and Recommendations." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 178
Suggested Citation:"8 Findings and Recommendations." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 179
Suggested Citation:"8 Findings and Recommendations." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 180
Suggested Citation:"8 Findings and Recommendations." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 181
Suggested Citation:"8 Findings and Recommendations." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 182
Suggested Citation:"8 Findings and Recommendations." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 183
Suggested Citation:"8 Findings and Recommendations." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 184
Suggested Citation:"8 Findings and Recommendations." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 185
Suggested Citation:"8 Findings and Recommendations." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 186
Suggested Citation:"8 Findings and Recommendations." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 187
Suggested Citation:"8 Findings and Recommendations." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 188
Suggested Citation:"8 Findings and Recommendations." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 189
Suggested Citation:"8 Findings and Recommendations." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 190
Suggested Citation:"8 Findings and Recommendations." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 191
Suggested Citation:"8 Findings and Recommendations." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 192
Suggested Citation:"8 Findings and Recommendations." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 193
Suggested Citation:"8 Findings and Recommendations." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 194
Suggested Citation:"8 Findings and Recommendations." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 195
Suggested Citation:"8 Findings and Recommendations." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 196

In a broad sense, technology is any modification of the natural world made to fulfill human needs or desires. Although people tend to focus on the most recent technological inventions, technology includes a myriad of devices and systems that profoundly affect everyone in modern society. Technology is pervasive; an informed citizenry needs to know what technology is, how it works, how it is created, how it shapes our society, and how society influences technological development. This understanding depends in large part on an individual's level of technological literacy.

Tech Tally: Approaches to Assessing Technological Literacy identifies the most viable approaches to assessing technological literacy for students, teachers, and out-of-school adults. The book examines opportunities and obstacles to developing scientifically valid and broadly applicable assessment instruments for technological literacy in the three target populations. It offers findings and 12 related recommendations that address five critical areas: instrument development, research on learning, computer-based assessment methods, framework development, and public perceptions of technology.

This book will be of special interest to individuals and groups promoting technological literacy in the United States, to education and government policy makers in federal and state agencies, and to the education research community.
