5
Review of Instruments

To provide a basis for deliberations, the committee collected and analyzed assessments that have been used or might be used to measure an aspect of technological literacy, even if they were not designed explicitly for that purpose. In fact, only about one-third of the assessment “instruments” collected by the committee were explicitly designed to measure technological literacy. Of these, only a handful were based on a conceptual model of technological literacy like the one presented in Technically Speaking or Standards for Technological Literacy. Indeed, the universe of assessments of technological literacy is very small.

A combination of formal methods (e.g., database searches) and informal methods (e.g., inquiries to knowledgeable individuals and organizations) was used to collect assessment instruments. The committee believes that most of the relevant assessment instruments were evaluated, but, because the identification process was imperfect, the portfolio of instruments should not be considered comprehensive.

Altogether, the committee identified 28 assessment instruments of several types, including formal criterion- or norm-referenced tests, performance-based activities intended to measure an aspect of design or problem-solving ability, attitude or opinion surveys, and informal quizzes. Item formats ran the gamut from multiple-choice and short-answer questions to essays and performance tasks. About half the instruments had been used more than once; a very few had been administered many times over the course of a decade or more. The others, such as assessments developed as research for Ph.D. dissertations, had been used once, if at all.





The population of interest for most of the instruments was K–12 students. Teachers were the target population for two, the Praxis Technology Education Test (ETS, 2005) and the Engineering K–12 Center Teacher Survey (ASEE, 2005). The rest were designed to test out-of-school adults. Although the focus of this project is on assessment in the United States, the committee also studied instruments developed in Canada, England, and Taiwan. The approaches to assessment in non-U.S. settings provided useful data for the committee’s analysis.

The purposes of the assessment tools varied as much as the instruments themselves. They included diagnosis and certification of students, input for curriculum development, certification of teachers, resource allocation, program evaluation, guidance for public policy, suitability for employment, and research. The developers of these assessments could be divided into four categories: state or federal agencies, private educational organizations, academic researchers, and test-development or survey companies.

Table 5-1 provides basic information about the instruments, according to target population. More detailed information on each instrument, including sample items and committee observations, is provided in Appendix E.

The committee reviewed each instrument through critiques written by committee members, telephone conferences, and face-to-face discussions. In general, the reviews focused on two aspects of the assessments: (1) the type and quality of individual test items; and (2) the format or design of the assessment. The reviews provided an overview of current approaches to assessing technological understanding and capability and stimulated a discussion about the best way to conduct assessments in this area.

Although a number of the instruments reviewed were thoughtfully designed, no single instrument struck the committee as completely adequate to the task of assessing technological literacy. This is not surprising, considering the general challenge of developing high-quality assessments; the multifaceted nature of technological literacy; the characteristics of the three target populations; the relatively small number of individuals and organizations involved in designing assessments for technological literacy; and the absence of research literature in this area. And, as noted, only a few of the instruments under review were designed explicitly to assess technological literacy in the first place.

TABLE 5-1 Technological-Literacy-Related Assessment Instruments (each entry lists name; developer; primary purpose; frequency of administration)

K–12 Students

Assessment of Performance in Design and Technology; Schools Examinations and Assessment Council, London; curriculum development and research; once, in 1989.

Design Technology; International Baccalaureate Organization; student achievement (part of qualification for diploma); regularly since 2003.

Design-Based Science; David Fortus, University of Michigan; curriculum development and research; once, in 2001–2002.

Design Team Assessments for Engineering Students; Washington State University; assess students’ knowledge, performance, and evaluation of the design process, and evaluate student teamwork and communication skills; frequency unknown.

Future City Competition—Judges Manual; National Engineers Week; to help rate and rank design projects and essays submitted to the Future City Competition; annually since 1992.

ICT Literacy Assessment (a); Educational Testing Service; proficiency testing; launched in early 2005.

Illinois Standards Achievement Test—Science; Illinois State Board of Education; measure student achievement in five areas and monitor school performance; annually since 2000.

Industrial Technology Literacy Test (b); Michael Allen Hayden, Iowa State University; assess the level of industrial-technology literacy among high school students; once, in 1989 or 1990.

Infinity Project Pretest and Final Test; Geoffrey Orsak, Southern Methodist University; basic aptitude (pretest) and student performance; ongoing since 1999.

Information Technology in a Global Society; International Baccalaureate Organization; student evaluation; semiannually at the standard level since 2002 (higher-level exams available in 2006).

Massachusetts Comprehensive Assessment System—Science and Technology/Engineering; Massachusetts Department of Education; monitor individual student achievement, gauge school and district performance, satisfy requirements of the No Child Left Behind Act; annually since 1998.

Multiple Choice Instrument for Monitoring Views on Science-Technology-Society Topics; G.S. Aikenhead and A.G. Ryan, University of Saskatchewan; curriculum evaluation and research; once, between September 1987 and August 1989.

New York State Intermediate Assessment in Technology (b); State Education Department/State University of New York; curriculum improvement and student evaluation; frequency unknown.

Provincial Learning Assessment in Technological Literacy (b); Saskatchewan Education; analyze students’ technological literacy to improve their understanding of the relationship between technology and society; once, in 1999.

Pupils’ Attitudes Toward Technology (PATT-USA) (b); E. Allen Bame and William E. Dugger, Jr., Virginia Polytechnic Institute and State University, and Marc J. de Vries, Eindhoven University; assess student attitudes toward and knowledge of technology; dozens of times in many countries since 1988.

Student Individualized Performance Inventory; Rodney L. Custer, Brigitte G. Valesey, and Barry N. Burke, with funding from the Council on Technology Teacher Education, International Technology Education Association, and the Technical Foundation of America; develop a model to assess the problem-solving capabilities of students engaged in design activities; frequency unknown.

Survey of Technological Literacy of Elementary and Junior High School Students (b); Ta Wei Le et al., National Taiwan Normal University; curriculum development and planning; once, in March 1995.

Test of Technological Literacy (b); Abdul Hameed, Ohio State University; research; once, in April 1988.

TL50: Technological Literacy Instrument (b); Michael J. Dyrenfurth, Purdue University; gauge technological literacy; frequency unknown.

WorkKeys—Applied Technology (c); American College Testing Program; measure job skills and workplace readiness; multiple times since 1992.

K–12 Teachers

Engineering K–12 Center Teacher Survey; American Society for Engineering Education; inform outreach efforts to K–12 teachers; continuously available.

Praxis Specialty Area Test: Technology Education (a); Educational Testing Service; teacher licensing; regularly.

Out-of-School Adults

Armed Services Vocational Aptitude Battery; U.S. Department of Defense; assess potential of military recruits for job specialties in the armed forces and provide a standard for enlistment; ongoing in its present form since 1968.

Awareness Survey on Genetically Modified Foods; North Carolina Citizens’ Technology Forum Project Team; research on public involvement in decision making on science and technology issues; once, in 2001.

Eurobarometer: Europeans, Science and Technology; European Union Directorate General for Press and Communication; monitor changes in public views of science and technology to assist decision making by policy makers; surveys on various topics conducted regularly since 1973 (this poll was conducted in May/June 2001).

European Commission Candidate Countries Eurobarometer: Science and Technology; Gallup Organization of Hungary; monitor public opinion on science and technology issues of concern to policy makers; periodically since 1973 (this survey was administered in 2002).

Gallup Poll on What Americans Think About Technology (b); International Technology Education Association; determine public knowledge and perceptions of technology to inform efforts to change and shape public views; twice, in 2001 and 2004.

Science and Technology: Public Attitudes and Public Understanding; National Science Board; monitor public attitudes, knowledge, and interest in science and technology issues; biennially from 1979 to 2001.

(a) Also administered to community and four-year college students.
(b) Designed explicitly to measure some aspects of technological literacy.
(c) Also used in community college and workplace settings.

Mapping Existing Instruments to the Dimensions of Technological Literacy

Only about one-third of the instruments collected were developed with the explicit goal of measuring technological literacy, and only two or three of these were designed with the three dimensions of technological literacy spelled out in Technically Speaking in mind. Nevertheless, the committee found the three dimensions to be a useful lens through which to analyze all of the instruments. Viewed this way, some instruments and test items appeared to be more focused on teasing out the knowledge component than on testing capability. Others were more focused on capability or on critical thinking and decision making. In some cases, the instruments and items addressed aspects of two or even all three dimensions of technological literacy.

Knowledge Dimension

Every assessment instrument examined by the committee assumed some level of technological knowledge on the part of the person taking the test or participating in the poll or survey. Because the three dimensions are interwoven and overlapping (see Chapter 2), even assessments focused on capability or on ways of thinking and acting tap into technological knowledge. The committee did not undertake a precise count but estimated that one-half to three-quarters of the assessment instruments were mostly or entirely designed to measure knowledge.

The knowledge dimension is evident in the handful of state-developed assessments, which are designed to measure content standards or curriculum frameworks that spell out what students should know and be able to do at various points in their school careers. Massachusetts and Illinois, for example, have developed assessments that measure technological understanding as part of testing for science achievement. The Massachusetts Comprehensive Assessment System (MCAS) science assessment instrument (MDE, 2005a,b) reflects the addition in 2001 of “engineering” to the curriculum framework for science and technology (MDE, 2001). In the 2005 science assessment, 9 of the 39 5th-grade items and 10 of the 39 8th-grade items targeted the technology/engineering strand of the curriculum. In the 5th-grade test, 6 of the 9 questions were aligned with state standards for engineering design; the others were aligned with standards for tools and materials.

Questions in the 8th-grade exam were related to standards for transportation, construction, bioengineering, and manufacturing technologies; engineering design; and materials, tools, and machines.

Multiple-choice items that are well crafted can elicit higher-order thinking. The 2002 8th-grade MCAS, for example, included the following item:

An engineer designing a suspension bridge discovers it will need to carry twice the load that was initially estimated. One change the engineer must make to her original design to maintain safety is to increase the
A. length of wires in tension
B. diameter of wires in tension
C. height of support towers
D. length of the bridge

To arrive at the suggested correct answer (B), students must be able to define “load” and “tension” in an engineering context. But they must also make the connection between the diameter and strength of the load-bearing structure (the wire in this case). A student would be more likely to be able to answer this question if he or she had participated in design activities in the classroom, such as building a bridge and testing it for load strength.

Open-ended questions can also probe higher-order thinking skills. Although these kinds of questions are more time-consuming to respond to and more challenging to score, they can provide opportunities for test takers to demonstrate deeper conceptual understanding. To assess students’ understanding of systems, for instance, a question on the New York State Intermediate Assessment in Technology requires that students fill in a systems-model flow chart for one of four systems (a home heating system, an automotive cooling system, a residential electrical system, or a hydroponic growing system).

Recent versions of the Illinois science assessment (ISBE, 2003) were developed with the Illinois Learning Standards in mind (ISBE, 2001a,b). The standards spell out learning goals related to technological design and relationships among science, technology, and society (STS). Of the 70 multiple-choice items on the 2003 assessment, 14 were devoted to STS topics, and 14 were devoted to “science inquiry,” which includes technological design. As in the Massachusetts assessment, design-related items required that students demonstrate an understanding of the design process, although they were not asked to take part in an actual design task as part of the test.

Even when learning standards are the basis for the assessment design, the connection between the standards and individual test items is not always clear. A sample question for the 4th-grade Illinois assessment, for instance, asks students to compare the relative energy consumption of four electrical appliances. The exercise is intended to test a standard that suggests students should be able to “apply the concepts, principles, and processes of technological design.” However, the question can be answered without knowing the principles of technological design.

The Illinois State Board of Education has devised a Productive Thinking Scale (PTS) by which test developers can rate prospective test items according to the degree of conceptual skill required to answer them (Box 5-1). Similar in some ways to Bloom’s taxonomy (Bloom et al., 1964), the PTS is specifically intended to be used for developing multiple-choice items. The state tries to construct assessments with most questions at level 3 or level 4; level 1 items are omitted completely; level 2 questions are used only if they address central concepts; level 5 items are used sparingly; and level 6 items are not used because the answers are indeterminate. Level 4, 5, and 6 items seem likely to encourage higher-order thinking. Although the PTS is used for the development of science assessments, the same approach could be adapted to other subject areas, including technology. (A simple illustration of these selection rules appears after Box 5-1.)

BOX 5-1 Productive Thinking Scale

Content Knowledge
Level 1: Recall of conventional uses, such as names or vocabulary
Level 2: Reproduction of empirical facts or effects
Level 3: Reproduction of empirical theories or causes
Level 4: Production of one-step problem solving
Level 5: Production of multistep problem solving
Level 6: Creation of new theory

Process Knowledge
Level 1: Recall of conventional uses, such as norms or units
Level 2: Reproduction of research sequences or instruments
Level 3: Reproduction of methodological reasons
Level 4: Production of research designs for single-variable control
Level 5: Production of research designs for multivariable control
Level 6: Creation of new methods

Source: ISBE, 2003.
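For readers who find it helpful to see the selection rules stated precisely, the following minimal Python sketch screens a hypothetical pool of items tagged with PTS levels. The Item structure, its field names, and the cap of two level-5 items are assumptions of this example, not part of the ISBE procedure.

```python
from dataclasses import dataclass

@dataclass
class Item:
    text: str
    pts_level: int         # Productive Thinking Scale level, 1 through 6
    central_concept: bool  # whether the item addresses a central concept

def admissible(item: Item) -> bool:
    """Screen a candidate item against the composition rules described above."""
    if item.pts_level == 1:     # level 1 items are omitted completely
        return False
    if item.pts_level == 2:     # level 2 only if it addresses a central concept
        return item.central_concept
    if item.pts_level == 6:     # level 6 is not used; answers are indeterminate
        return False
    return True                 # levels 3, 4, and 5 pass the screen

def assemble(pool: list[Item], n_items: int, max_level_5: int = 2) -> list[Item]:
    """Build a form dominated by level 3-4 items, using level 5 sparingly."""
    screened = [i for i in pool if admissible(i)]
    core = [i for i in screened if i.pts_level in (3, 4)]
    sparing = [i for i in screened if i.pts_level == 5][:max_level_5]
    backup = [i for i in screened if i.pts_level == 2]
    return (core + sparing + backup)[:n_items]
```

The point of the sketch is simply that the PTS rules are mechanical enough to be applied systematically during test assembly, which is what makes the scale attractive for subjects beyond science.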

From 1977 through 1999, the federal National Assessment of Educational Progress (NAEP) periodically asked 13- and 17-year-olds the same set of science questions as part of an effort to gather long-term data on achievement. The committee commissioned an analysis of responses to the few questions from this instrument that measure technological understanding (Box 5-2).

BOX 5-2 Selected Data from the NAEP Long-Term Science Assessment, 1977–1999

In 1985, responses to technology-related questions in the 1976–1977 and 1981–1982 NAEP long-term science assessments were analyzed as part of a dissertation study (Hatch, 1985). The analysis included more than 50 questions common to assessments of 13- and 17-year-olds that met the author’s definition of technological literacy. In later editions of the test, which involved about 16,000 students, many of these questions were dropped. As part of its exploration of this “indirect” assessment of technological literacy, the committee asked Dr. Hatch to analyze data from all five times the test was administered, the most recent in 1999. Among the 12 questions common to both age groups, two were of particular interest to the committee:

Would installing storm windows and insulation in your home help to save resources? In 1977, 94 percent of 17-year-olds and 92 percent of 13-year-olds answered “yes.” In 1999, only 65 and 53 percent, respectively, answered “yes.”

What happens to the sulfur dioxide released by a factory’s smoke stack? In 1977, only 31 percent of 17-year-olds and 20 percent of 13-year-olds chose the correct answer, “The sulfur dioxide eventually falls back to Earth as acid rain.” By 1999, the percentage of correct answers had jumped to 68 percent for 17-year-olds and 54 percent for 13-year-olds.

It is impossible to state with confidence the reasons for the dramatic changes in students’ apparent understanding of the benefits and negative consequences of technology use. The differences undoubtedly have something to do with changes in government and private-sector concerns about energy use and air pollution over this span of time. This example illustrates why items that mention specific technologies must be reviewed periodically for currency. Because storm windows have largely been replaced by double- or triple-glazed windows, a student faced with this same question today might not be able to answer it, simply because she did not understand what was being asked. More important, from the committee’s perspective, this example illustrates the potential value of time-series data for tracking changes in technological literacy.

SOURCE: Hatch, 2004.

The Canadian Provincial Learning Assessment in Technological Literacy, an instrument administered in 1999 in Saskatchewan, includes a number of items intended to test 5th-, 8th-, and 11th-graders’ conceptions of technology and the effect of that understanding on responsible citizenship, among other issues (Saskatchewan Education, 2001).

Student achievement was measured at five increasingly sophisticated levels according to a rubric developed by a panel of teachers, business leaders, parents, students, and others. Students who cited computers as the only example of technology, for instance, were classified at the lowest level. Students who had a more comprehensive understanding of technology, as artifacts made by people to extend human capabilities, scored significantly higher (Figure 5-1).

FIGURE 5-1 Level 5 exemplar of eighth-grade student responses to a question about technology, Saskatchewan 1999 Provincial Learning Assessment in Technological Literacy. Source: Saskatchewan Education, 2001. (Image not reproduced.)

A great many of the knowledge-focused assessments reviewed by the committee relied heavily on items that required test takers to recall facts or define terms. Although knowledge of certain facts and terminology is essential to the mastery of any subject, this type of item has a major drawback: it does not tap into deeper, conceptual understanding. The following question from an assessment intended for high school students is illustrative (Hayden, 1989):

A compact disk can be used to store:
A. numbers
B. music
C. pictures
D. language
E. all of the above

In this question, the suggested correct answer is E, all of the above. One might disagree with the wording of the answers (e.g., is “language” or the printed word stored on a CD?). But the more significant issue is that the question focuses on the superficial aspects of CD technology rather than on underlying concepts, such as that information can take multiple forms and that digitization facilitates the storage, retrieval, and manipulation of data. A correct answer does little more than demonstrate a familiarity with some of the capabilities of one type of data-storage device. Although there is a place in assessments for testing factual knowledge, questions of this type could easily dominate an assessment, given the number of technologies about which one might reasonably ask questions. In addition, because of the pace of technological development, narrowly targeted items may quickly become obsolete as one technology replaces another.

Nearly one-third of the 100 items in the Pupils’ Attitudes Toward Technology instrument address the knowledge dimension of technological literacy. The assessment, developed in the 1980s by a Dutch group headed by Marc de Vries, has been used in many countries, including the United States (Bame et al., 1993). The test includes statements with which students are asked to indicate agreement or disagreement. The statements deal with basic and important ideas about the nature of technology, such as the relationship between technology and science, the influence of technology on daily life, and the role of hands-on work in technological development.

High school students and out-of-school adults considering entering the military can choose to take the Armed Services Vocational Aptitude Battery (ASVAB). The ASVAB has eight sections, including items on auto and shop knowledge, mechanics, and knowledge of electronics. Sample items in ASVAB test-preparation books require mostly technical rather than conceptual understanding (e.g., Kaplan, 2003). This reflects the major purpose of the test, which is to identify individuals suited for specialty jobs in the armed forces. The ASVAB is also notable because it offers an online, “adaptive” testing option for adult test takers. In adaptive testing, a right or wrong answer to a question determines the difficulty of the next question (a simple illustration of this mechanism is sketched below).

A group of engineering schools, the Transferable Integrated Design Engineering Education Consortium, has developed an instrument for testing knowledge of the design process (TIDEE, 2002). This is the only assessment in the committee’s analysis explicitly intended for college students.
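To make the adaptive mechanism concrete, here is a minimal sketch of an adaptive item-selection loop in Python. It is not the ASVAB’s algorithm; operational adaptive tests use item response theory to select items and estimate ability, and the simple difficulty ladder, the item_bank layout, and the ask callback below are all assumptions of this illustration.

```python
import random

def adaptive_test(item_bank, ask, n_questions=10):
    """Minimal computerized-adaptive loop: a correct answer raises the
    difficulty of the next question; a wrong answer lowers it.

    item_bank -- dict mapping a difficulty level (int) to a list of questions
    ask       -- callback that poses one question and returns True if correct
    """
    levels = sorted(item_bank)
    level = levels[len(levels) // 2]  # start at a middling difficulty
    n_correct = 0
    for _ in range(n_questions):
        question = random.choice(item_bank[level])
        if ask(question):
            n_correct += 1
            level = min(level + 1, levels[-1])  # step up after a right answer
        else:
            level = max(level - 1, levels[0])   # step down after a wrong answer
    # The level at which the test taker stabilizes serves as a rough
    # proficiency estimate alongside the raw score.
    return n_correct, level
```

The design rationale is efficiency: by steering each test taker toward items near his or her ability level, an adaptive test can reach a stable estimate with far fewer questions than a fixed-form test.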

A matrix crossing the three cognitive dimensions of technological literacy with four content areas yields 12 cells, and the potential of such a matrix can be demonstrated by filling the 12 cells with examples of assessment items that address the content and cognitive specifications of each cell. To this end, the committee reviewed the items in the portfolio of collected assessment instruments to identify those that might fit into the matrix (Table 5-2). For instance, the committee found questions that might be placed in the cell in the upper left-hand corner of the matrix, representing the cognitive dimension of knowledge and the content domain related to technology and society. Some of the items fit the cells better than others, and, because they come from different sources, the style of question and the target populations vary. Of course, one would not use this piecemeal approach to design an actual assessment, and the quality of most of the selected items is not nearly as high as that of items one would devise from scratch. Nevertheless, this exercise demonstrates the potential usefulness of a matrix and gives the reader a sense of the scope of the cognitive and content dimensions of an assessment of technological literacy.
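Because the matrix is just a 3-by-4 grid, its structure can also be shown in a few lines of code. The sketch below is illustrative only: the cell contents are paraphrases of items in Table 5-2, and the coverage helper is an assumption of this example, not something the committee used.

```python
# The three cognitive dimensions (rows) crossed with the four content areas
# (columns) yield the matrix's 12 cells.
COGNITIVE = ["knowledge", "capabilities", "critical thinking and decision making"]
CONTENT = ["technology and society", "design", "products and systems",
           "characteristics, core concepts, and connections"]

# Sparse map from (cognitive dimension, content area) to example items,
# paraphrased from Table 5-2.
matrix: dict[tuple[str, str], list[str]] = {
    ("knowledge", "technology and society"):
        ["Effects of the railroad on life in Saskatchewan (multiple choice)"],
    ("capabilities", "design"):
        ["Design a floor and wall plant holder (performance task)"],
    ("critical thinking and decision making", "design"):
        ["Why are tightly closed windows a good choice only in cold climates?"],
}

def coverage(cells: dict) -> float:
    """Fraction of the 12 cells holding at least one example item."""
    filled = sum(1 for c in COGNITIVE for a in CONTENT if cells.get((c, a)))
    return filled / (len(COGNITIVE) * len(CONTENT))

print(f"Cells filled: {coverage(matrix):.0%}")  # 25% for the three examples above
```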

TABLE 5-2 Assessment Matrix for Technological Literacy with Items from Selected Assessment Instruments

(Rows are the three cognitive dimensions; columns are the four content areas, presented here as subheadings. Lettered notes at the end of the table identify the source instruments.)

Knowledge

Technology and Society:
- In the late 1800s the railroad was built across Canada. What effects did this have on life in Saskatchewan? A. More people settled in Saskatchewan. B. The natural landscape of Saskatchewan changed significantly. C. The weather of Saskatchewan changed. D. A and B. E. All of the above. (a)
- What impact (or effects) do people living longer have? A. The development of new business enterprises to look after the wants and needs of the elderly. B. People are happier and there is no real impact on society. C. More taxes are collected to provide care for the elderly. D. A and C. E. All of the above. (b)
- Someone concerned about the environment would prefer to buy: A. drinks in glass bottles; B. eggs in a plastic carton; C. boxes covered with wax paper; D. books made of nonrecycled paper. (c)

Design:
- Marcus designed a television stand (shown in a drawing, not reproduced here) for his family. His father is worried that the stand could tip over. Look at the measurements in the drawing. How can Marcus improve the design so the stand would be less likely to tip over? A. Make the base wider. B. Make the stand taller. C. Make the top narrower. D. Use a different material. (d)
- If you were designing a product that has to be easily serviced, you would assemble it with: A. welded joints; B. epoxy resin; C. rivets; D. threaded screws. (e)

Products and Systems:
- Which of the following is a key factor that enables an airplane to lift? A. Air pressure beneath the wing is greater than that above the wing. B. Pressure within the airplane is greater than that of the outside. C. Engine power is greater than that of friction. D. The plane’s wing is lighter than air. (f)
- Using a portable phone while in the bathtub creates the possibility of being electrocuted. (True or False) (g)
- To find the depth of the ocean, some ships send a sound wave to the ocean floor and record the time it takes to return to the detector. The kind of wave used by this detector is the same as that used by: A. bats to detect objects in the dark; B. snakes to detect warm-blooded prey, such as mice; C. police cars to detect the speed of motorists; D. airports to detect the location of airplanes in the sky. (h)
- Which system would locate a lost person with the appropriate signal-sending devices? A. Geographical information system (GIS). B. Global positioning system (GPS). C. Computer simulation system. D. Robotics system. (i)

Characteristics, Core Concepts, and Connections:
- Indicate whether you believe each item to be “definitely technology,” “might be technology,” “not technology,” or “don’t know”: cup, telephone, airplane, book, bridge, computer, gun, jeans, TV commercial, clock, trombone, old stone axe, cheese, deer, cough medicine, microwave, flower, restaurant, river, house plan. (j)
- A bicycle is considered a complex machine because it: A. is used to perform a task; B. is made from natural materials; C. is made up of more than one simple machine; D. is complicated to build and repair. (k)
- Technical developments and scientific principles are related because: A. Science and technology have identical characteristics. B. Technological innovations always precede a scientific explanation. C. Scientific discoveries always precede a scientific explanation. D. Sometimes technical developments give scientists something to explain, and sometimes scientific discoveries lead to technical development. E. They are not related. (l)

Capabilities

Technology and Society:
- Business and industry use technology in a variety of ways. Sometimes the technology that is used has negative results. The automobile industry makes use of robotics for many reasons: to create products that are uniform, to ensure accuracy, to save time, and to perform dangerous tasks. Identify and describe one positive effect of robotics on manufacturing and one negative effect of robotics on the employees in the automobile industry. Why would a company use a technology even though it will have negative effects? (m)

Design:
- A group of young people have investigated the needs of people with small gardens and decided to make a floor and wall plant holder. They have decided that the plant holder must be able to do the four things shown here (illustration not reproduced). Your task today is to take this idea and develop it as far as you can in the time available. Design a floor and wall plant holder that: stands on the floor or can be fixed to a wall; stacks or links together so that the holders can be arranged in a variety of ways; has a rainwater drainage and storage system; has a self-watering system. (n)
- An igloo is a structure that is used for survival in extremely cold environments with snowstorms. The structure is typically made of blocks of ice laid one on another in order to form the shape of a dome. Describe how you would test this structure to evaluate its ability to withstand static and dynamic forces. Describe how you would test this structure to evaluate its thermal insulation. (o)

Products and Systems:
- (This assessment item requires a clock radio.) Set the time on the clock for 9:00 AM. If you hold the FWD down and count for 20 seconds, what is the new reading of the clock? How many minutes did the clock reading advance? Set the time of the clock for 9:00 AM. If you hold the FFWD down and count for 10 seconds, what is the reading of the clock? How many minutes did the clock reading advance? Approximately how many times faster is the FFWD than the FWD? You are preparing to go to bed and have decided you want to wake up at 6:30 AM and listen to the radio for 2 hours straight while lying in bed. Unfortunately, the clock radio is located across the room from your bed and you don’t want to get up to adjust it in the morning. How can you preset the clock radio the night before to meet your needs for the next morning? (p)

Characteristics, Core Concepts, and Connections:
- Explain the benefits of using standards for mobile phones. (v)

Critical Thinking and Decision Making

Technology and Society:
- “Technology makes the world a better place to live in!” Do you totally agree with the above statement? Discuss in full detail, using specific examples, your viewpoint on the impact and responsibilities involving technology and technological developments. (q)
- When a new technology is developed (for example, a better type of fertilizer), it may or may not be put into practice. The decision to use a new technology depends on whether the advantages to society outweigh the disadvantages to society. Your position basically: A. The decision to use a new technology depends mainly on the benefits to society, because if there are too many disadvantages, society won’t accept it and may discourage its further development. B. The decision depends on more than just the technology’s advantages and disadvantages. It depends on how well it works, its cost, and its efficiency. C. It depends on your point of view. What is an advantage to some people may be a disadvantage to others. D. Many new technologies have been put into practice to make money or gain power, even though their disadvantages were greater than their advantages. E. I don’t understand. F. I don’t know enough about this subject to make a choice. G. None of these choices fits my basic viewpoint. (r)
- What do you think are the 3 most important pieces of technology ever made? Why do you think the 3 items you listed above are the most important? Explain how the first item you picked has affected society. How did society influence the development of the first item you picked? (s)

Design:
- A new prototype for a refrigerator that contains a computer and has a computer display screen mounted in its door has been developed. The display screen uses touch-screen technology. As well as standard computer programs, the computer runs a database on which the user can maintain a record of the refrigerator contents, including sell-by dates. Explain three reasons for conducting market research before starting the design of the new refrigerator-computer for sale in the global marketplace. (t)
- Why would tightly closed windows be a good design choice in cold climates but not a good design choice in hot climates? (u)

Products and Systems:
- People have frequently noted that scientific research has created both beneficial and harmful consequences. Would you say that, on balance, the benefits of scientific research have outweighed the harmful results, or have the harmful results of scientific research been greater than its benefits? (y)
- On the Internet, people and organizations have no rules to follow and can say just about anything they want. Do you think this is good or bad? Explain. (w)

Characteristics, Core Concepts, and Connections:
- When you put food in a microwave oven, it heats up rapidly. On the other hand, when you hold an operating cell phone, it barely warms at all. Give two reasons for this. (x)
- We always have to make trade-offs (compromises) between the positive and negative effects of science. Your position basically: There ARE always trade-offs between benefits and negative effects: A. because every new development has at least one negative result. If we didn’t put up with the negative results, we would not progress to enjoy the benefits. B. because scientists cannot predict the long-term effects of new developments, in spite of careful planning and testing. We have to take the chance. C. because things that benefit some people will be negative for someone else. This depends on a person’s viewpoint. D. because you can’t get positive results without first trying a new idea and then working out its negative effects. E. but the trade-offs make no sense. (For example: Why invent labour-saving devices which cause more unemployment? Or why defend a country with nuclear weapons which threaten life on Earth?) There are NOT always trade-offs between benefits and negative effects: F. because some new developments benefit us without producing negative effects. G. because negative effects can be minimized through careful planning and testing. H. because negative effects can be eliminated through careful planning and testing; otherwise a new development is not used. I. I don’t understand. J. I don’t know enough about this subject to make a choice. K. None of these choices fits my basic viewpoint. (z)

NOTES:
(a) 1999 Provincial Learning Assessment in Technological Literacy, Saskatchewan Education. Student Test Booklet Day 1, Sample A, page 8, question 24. Intended level: 8th grade.
(b) 1999 Provincial Learning Assessment in Technological Literacy, Saskatchewan Education. Student Test Booklet Day 1, Sample A, page 4, question 8. Intended level: 8th grade.
(c) Illinois Standards Achievement Test—Science, page 37, question 59. Intended level: 4th grade.
(d) MCAS Spring 2003 Science and Technology/Engineering, page 235, question 30. Intended level: 5th grade.
(e) Test of Technological Literacy, dissertation by Abdul Hameed, Ohio State University, 1988, page 173, question 9. Intended level: 7th and 8th grade.
(f) Survey of Technological Literacy of Elementary and Junior High School Students (Taiwan), question 55.
(g) ITEA/Gallup Poll on Americans’ Level of Literacy Related to Technology, Table 11, page 5, March 2002. Intended level: adults.
(h) Illinois Standards Achievement Test—Science, page 19, question 4. Intended level: 4th grade.
(i) International Baccalaureate: Information Technology in a Global Society, Standard Level, Paper 1, N02/390/S(1), page 3, question 4. Intended level: high school.
(j) 1999 Provincial Learning Assessment in Technological Literacy, Saskatchewan Education. Student Test Booklet Day 1, Sample A, Part A, question 1. Intended level: 5th, 8th, and 11th grade.
(k) MCAS Spring 2003 Science and Technology/Engineering, page 233, question 26. Intended level: 5th grade.
(l) The Development and Validation of a Test of Industrial Technological Literacy (Hayden dissertation), page 178, question 33. Intended level: high school.
(m) New York State Intermediate Assessment in Technology, question 18. Intended level: 7th and 8th grade.
(n) Assessment of Performance in Design and Technology, The Final Report of the APU Design and Technology Project, 1985–1991, figure 7.5, page 104. Intended level: students aged 15 years.
(o) Design-Based Science (Fortus dissertation), Structures for Extreme Environments content test, question 19. Intended level: 9th and 10th grade.
(p) 1999 Provincial Learning Assessment in Technological Literacy, Saskatchewan Education. Performance Station Test, Sample B, Performance Station 3 (clock radio), Stage 4, page 9, questions 1–8. Intended level: 8th grade.
(q) 1999 Provincial Learning Assessment in Technological Literacy, Saskatchewan Education. Thread 2, page 33. Intended level: 11th grade.
(r) The Development of a Multiple Choice Instrument for Monitoring Views on Science-Technology-Society Topics, question 80133. Intended level: 12th grade.
(s) 1999 Provincial Learning Assessment in Technological Literacy, Saskatchewan Education. Student Test Booklet Day 2, Sample A, page 7, questions 4a–d. Intended level: 8th grade.
(t) International Baccalaureate: Design Technology, Fall 2004 exam N04/DESTE/HP3/ENG/TZ0/XX, Higher Level, Paper 3, page 12, question F4.
(u) Design-Based Science (Fortus dissertation), Structures for Extreme Environments content test, question 17. Intended level: 9th and 10th grade.
(v) International Baccalaureate: Design Technology, Fall 2004 exam N04/4/DESTE/HP3/ENG/TZ0/XX, Higher Level, Paper 3, page 20, question H4.
(w) 1999 Provincial Learning Assessment in Technological Literacy, Saskatchewan Education. Student Test Booklet Day 2, Sample A, page 4, question 3e. Intended level: 8th grade.
(x) Design-Based Science (Fortus dissertation), Safer Cell Phones content assessment, page 6, question 16. Intended level: 9th and 10th grade.
(y) NSF Indicators, Public Understanding of Science and Technology—2002, page 7-14, Appendix Table 7-18. Intended level: adults.
(z) The Development of a Multiple Choice Instrument for Monitoring Views on Science-Technology-Society Topics, question 40311. Intended level: 12th grade.

References

AAAS (American Association for the Advancement of Science). 1993. Benchmarks for Science Literacy. Project 2061. New York: Oxford University Press.
Aikenhead, G.S., and A.G. Ryan. 1992. The Development of a New Instrument: Monitoring Views on Science-Technology-Society (VOSTS). Available online at: http://www.usask.ca/education/people/aikenhead/vosts_2.pdf (May 20, 2005).
ASEE (American Society for Engineering Education). 2005. Teachers’ Survey Results. ASEE Engineering K–12 Center. Available online at: http://www.engineeringk12.org/educators/taking_a_closer_look/survey_results.htm (October 5, 2005).
Bame, E.A., M.J. de Vries, and W.E. Dugger, Jr. 1993. Pupils’ attitudes towards technology: PATT-USA. Journal of Technology Studies 19(1): 40–48.
Bloom, B.S., B.B. Mesia, and D.R. Krathwohl. 1964. Taxonomy of Educational Objectives. New York: David McKay.
Cobb, M.D., and J. Macoubrie. 2004. Public perceptions about nanotechnology: risks, benefits and trust. Journal of Nanoparticle Research 6(4): 395–405.
Custer, R.L., G. Valesey, and B.N. Burke. 2001. An assessment model for a design approach to technological problem solving. Journal of Technology Education 12(2): 5–20.
ETS (Educational Testing Service). 2005. Sample Test Questions. The Praxis Series, Specialty Area Tests, Technology Education (0050). Available online at: http://ftp.ets.org/pub/tandl/0050.pdf (May 27, 2005).
Finson, K.D. 2002. Drawing a scientist: what do we know and not know after 50 years of drawing. School Science and Mathematics 102(7): 335–345.
Hatch, L.O. 1985. Technological Literacy: A Secondary Analysis of the NAEP Science Data. Unpublished dissertation, University of Maryland.
Hatch, L.O. 2004. Technological Literacy: Trends in Academic Progress—A Secondary Analysis of the NAEP Long-term Science Data. Draft 6/16/04. Paper commissioned by the National Research Council Committee on Assessing Technological Literacy. Unpublished.
Hayden, M.A. 1989. The Development and Validation of a Test of Industrial Technological Literacy. Unpublished dissertation, Iowa State University.
Hill, R.B. 1997. The design of an instrument to assess problem solving activities in technology education. Journal of Technology Education 9(1): 31–46.
IBO (International Baccalaureate Organization). 2001. Design Technology. February 2001. Geneva, Switzerland: IBO.
ISBE (Illinois State Board of Education). 2001a. Science Performance Descriptors—Grades 1–5. Available online at: http://www.isbe.net/ils/science/word/descriptor_1-5.rtf (April 11, 2005).
ISBE. 2001b. Science Performance Descriptors—Grades 6–12. Available online at: http://www.isbe.net/ils/science/word/descriptor_6-12.rtf (April 11, 2005).
ISBE. 2003. Illinois Standards Achievement Test—Sample Test Items: Illinois Learning Standards for Science. Available online at: http://www.isbe.net/assessment/PDF/2003ScienceSample.pdf (August 31, 2005).
ITEA (International Technology Education Association). 2001. ITEA/Gallup Poll Reveals What Americans Think About Technology. A Report of the Survey Conducted by the Gallup Organization for the International Technology Education Association. Available online at: http://www.iteaconnect.org/TAA/PDFs/Gallupreport.pdf (October 5, 2005).
ITEA. 2004. The Second Installment of the ITEA/Gallup Poll and What It Reveals as to How Americans Think about Technology. A Report of the Second Survey Conducted by the Gallup Organization for the International Technology Education Association. Available online at: http://www.iteaconnect.org/TAA/PDFs/GallupPoll2004.pdf (October 5, 2005).
Kaplan. 2003. ASVAB—The Armed Services Vocational Aptitude Battery. 2004 Edition. New York: Simon and Schuster.
Kimbell, R., K. Stables, T. Wheeler, A. Wosniak, and V. Kelly. 1991. The Assessment of Performance in Design and Technology. The Final Report of the APU Design and Technology Project 1985–1991. London: School Examinations and Assessment Council.
MDE (Massachusetts Department of Education). 2001. Science and Technology/Engineering Curriculum Framework. Available online at: http://www.doe.mass.edu/frameworks/scitech/2001/0501.doc (April 11, 2005).
MDE. 2005a. Science and Technology/Engineering, Grade 5. Released test items. Available online at: http://www.doe.mass.edu/mcas/2005/release/g5sci.pdf (August 31, 2005).
MDE. 2005b. Science and Technology/Engineering, Grade 8. Released test items. Available online at: http://www.doe.mass.edu/mcas/2005/release/g8sci.pdf (August 31, 2005).
Miller, J.D. 1983a. The American People and Science Policy: The Role of Public Attitudes in the Policy Process. New York: Pergamon Press.
Miller, J.D. 1983b. Scientific literacy: a conceptual and empirical review. Daedalus 112(2): 29–48.
Miller, J.D. 1986. Reaching the Attentive and Interested Publics for Science. Pp. 55–69 in Scientists and Journalists: Reporting Science as News, edited by S. Friedman, S. Dunwoody, and C. Rogers. New York: Free Press.
Miller, J.D. 1987. Scientific Literacy in the United States. Pp. 19–40 in Communicating Science to the Public, edited by D. Evered and M. O’Connor. London: John Wiley and Sons.
Miller, J.D. 1992. From Town Meeting to Nuclear Power: The Changing Nature of Citizenship and Democracy in the United States. Pp. 327–328 in The United States Constitution: Roots, Rights, and Responsibilities, edited by A.E.D. Howard. Washington, D.C.: Smithsonian Institution Press.
Miller, J.D. 1995. Scientific Literacy for Effective Citizenship. Pp. 185–204 in Science/Technology/Society as Reform in Science Education, edited by R.E. Yager. New York: State University of New York Press.
Miller, J.D. 1998. The measurement of civic scientific literacy. Public Understanding of Science 7: 1–21.
Miller, J.D. 2000. The Development of Civic Scientific Literacy in the United States. Pp. 21–47 in Science, Technology, and Society: A Sourcebook on Research and Practice, edited by D.D. Kumar and D. Chubin. New York: Plenum Press.
Miller, J.D. 2001. The Acquisition and Retention of Scientific Information by American Adults. Pp. 93–114 in Free-Choice Science Education, edited by J.H. Falk. New York: Teachers College Press.
Miller, J.D. 2004. Public understanding of, and attitudes toward scientific research: what we know and what we need to know. Public Understanding of Science 13: 273–294.
Miller, J.D., and L. Kimmel. 2001. Biomedical Communications: Purposes, Audiences, and Strategies. New York: Academic Press.
Miller, J.D., and R. Pardo. 2000. Civic Scientific Literacy and Attitude to Science and Technology: A Comparative Analysis of the European Union, the United States, Japan, and Canada. Pp. 81–129 in Between Understanding and Trust: The Public, Science, and Technology, edited by M. Dierkes and C. von Grote. Amsterdam: Harwood Academic Publishers.
Miller, J.D., R. Pardo, and F. Niwa. 1997. Public Perceptions of Science and Technology: A Comparative Study of the European Union, the United States, Japan, and Canada. Madrid: BBV Foundation.
Morgan, G., B. Fischhoff, A. Bostrom, and C.J. Atman. 2002. Risk Communication: A Mental Models Approach. New York: Cambridge University Press.
NRC (National Research Council). 1999. How People Learn: Brain, Mind, Experience, and School. Edited by J.D. Bransford, A.L. Brown, and R.R. Cocking. Washington, D.C.: National Academy Press.
NSB (National Science Board). 1981. Science Indicators, 1980. Washington, D.C.: Government Printing Office.
NSB. 1983. Science Indicators, 1982. Washington, D.C.: Government Printing Office.
NSB. 1986. Science Indicators, 1985. Washington, D.C.: Government Printing Office.
NSB. 1988. Science and Engineering Indicators, 1987. Washington, D.C.: Government Printing Office.
NSB. 1990. Science and Engineering Indicators, 1989. Washington, D.C.: Government Printing Office.
NSB. 1992. Science and Engineering Indicators, 1991. Washington, D.C.: Government Printing Office.
NSB. 1994. Science and Engineering Indicators, 1993. Washington, D.C.: Government Printing Office.
NSB. 1996. Science and Engineering Indicators, 1996. Washington, D.C.: Government Printing Office.
NSB. 1998. Science and Engineering Indicators, 1998. Washington, D.C.: Government Printing Office.
NSB. 2000. Science and Engineering Indicators, 2000. Washington, D.C.: Government Printing Office.
NSB. 2004. Science and Engineering Indicators, 2004. Washington, D.C.: Government Printing Office.
Paul, R., and G.M. Nosich. 2004. A Model for the National Assessment of Higher-Order Thinking. Available online at: http://www.criticalthinking.org/resources/articles/a-model-nal-assessment-hot.shtml (January 19, 2006).
Saskatchewan Education. 2001. 1999 Provincial Learning Assessment in Technological Literacy. May 2001. Available online at: http://www.learning.gov.sk.ca/branches/cap_building_acct/afl/docs/plap/techlit/1999techlit.pdf.
TIDEE (Transferable Integrated Design Engineering Education). 2002. Assessments. Available online at: http://www.tidee.cea.wsu.edu/resources/assessments.html (August 31, 2005).