
Tech Tally: Approaches to Assessing Technological Literacy (2006)

Chapter: APPENDIX E Instrument Summaries

Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.

Armed Services Vocational Aptitude Battery

Background

Sponsor/Creator

U.S. Department of Defense

Purpose

Assess potential of military recruits for job specialties in the armed forces; provide a standard for enlistment

What is measured

Knowledge and reasoning skills in eight areas

Target population

Young Americans interested in military careers

Item format

Multiple choice

Sample size

More than 900,000 high school students annually

Frequency of administration

Ongoing in its present form since 1968

Availability

Sample items available from various test preparation books (e.g., Kaplan ASVAB 2004 edition, Simon and Schuster)

Scope

The U.S. Department of Defense maintains and administers the Armed Services Vocational Aptitude Battery (ASVAB)1 to assess the potential of military recruits for enlistment and various specialties. ASVAB is currently administered in three forms. High school students, the most common test takers, can take Form 18/19 of the ASVAB as early as 10th grade. Recruiters can also administer a paper version or a computer-adaptive exam to prospective recruits who are no longer in

1

ASVAB is a registered trademark of the U.S. Department of Defense.


school. ASVAB includes eight sections: general science, arithmetic reasoning, word knowledge, paragraph comprehension, auto and shop information, mathematics knowledge, mechanical comprehension, and electronics information.

Scores are reported in each area, and a simple equation is used to calculate a raw score, which is converted into a percentile score. Test takers also receive composite scores in verbal ability, math ability, and academic ability. Minimum percentile scores are required for enlistment; combinations of scores from the eight areas are used to qualify test takers for specialties in each branch of the military.
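The report does not reproduce the scoring equation itself. As an illustration only, the widely published AFQT composite (the score used to qualify for enlistment) double-weights the two verbal subtests; the raw-to-percentile table in the sketch below is entirely hypothetical:

```python
# Illustrative sketch only -- the report does not give the equation, and the
# percentile table here is hypothetical. The widely published AFQT composite
# double-weights the verbal subtests (word knowledge + paragraph
# comprehension) and adds arithmetic reasoning and mathematics knowledge.

def afqt_raw(word_knowledge, paragraph_comprehension,
             arithmetic_reasoning, mathematics_knowledge):
    """Raw AFQT composite from four subtest standard scores."""
    verbal = word_knowledge + paragraph_comprehension
    return 2 * verbal + arithmetic_reasoning + mathematics_knowledge

# Hypothetical raw-to-percentile table; real tables are normed against a
# national reference sample.
PERCENTILE_TABLE = {200: 50, 220: 65, 240: 80}

def afqt_percentile(raw):
    """Percentile for the highest table cutoff at or below the raw score."""
    best = 1
    for cutoff, pct in sorted(PERCENTILE_TABLE.items()):
        if raw >= cutoff:
            best = pct
    return best

raw = afqt_raw(50, 45, 55, 50)  # 2 * (50 + 45) + 55 + 50 = 295
print(raw, afqt_percentile(raw))
```

The service-specific job composites mentioned above work in the same additive fashion, each combining a different subset of the eight subtest scores.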

Sample Items

Readers wishing to get a sense of the types of items on ASVAB are encouraged to look at an ASVAB test-preparation book, such as ASVAB, The Armed Services Vocational Aptitude Battery, 2004 Edition (Simon and Schuster).

Committee Observations

The ASVAB exam is an appropriate instrument for the military to assess a broad range of knowledge and abilities among high school students and young adults. The sections on spatial reasoning, mechanical comprehension, and auto and shop information seem relevant to technological literacy. Despite the emphasis on technological topics, however, most of the items are very narrow in scope and require only factual recall or low-level application of knowledge. The auto and shop questions favor males, who tend to have more exposure in these areas.

Assessment of Performance in Design and Technology

Background

Sponsor/Creator

Richard Kimbell, et al. at the Technology Education Research Unit, Goldsmiths College, University of London, with funding from the U.K. Department of Education and Science


Purpose

Curriculum development, research

What is measured

Design capabilities

Target population

15-year-old students in the United Kingdom

Item format

90-minute open-ended design tasks, half-day modeling tasks, and a long-term project

Sample size

Approximately 10,000 students from more than 700 schools in the United Kingdom

Frequency of administration

Once, in 1989

Availability

Kimbell, R., K. Stables, T. Wheeler, A. Wozniak, and V. Kelly. 1991. The Assessment of Performance in Design and Technology: The Final Report of the APU Design and Technology Project (D/010/B/91). London: Schools Examinations and Assessment Council/Central Office of Information. 285 pp.

Scope

The Assessment of Performance Unit (APU) was established in 1975 to monitor student achievement in British schools. Over time, the focus of APU shifted from assessment to providing support for curriculum development. In 1985, APU commissioned an assessment of design and technology achievement to gauge how well students performed in design and technology activities. The assessment had three parts.

The first part, administered to approximately 9,000 students, was a 90-minute pencil-and-paper test on which students were asked to complete a structured activity. Twenty-one activities were created for the assessment, each involving one of three contexts: people, environment, or industry. Each activity had a specific focus: starting point, early idea, development of a solution, evaluation of a product, or modeling. A


“starting-point” activity might ask students to suggest new or improved products or systems that could be designed for the garden. A “developing-solutions” activity might ask students to design a self-watering plant pot that could be stacked and interlocked. The activities ranged from closed, well-defined questions to open, loosely defined tasks.

In the second part of the assessment, about 1,500 students who had completed the paper-and-pencil test took part in a half-day, team-based modeling activity in which they could use various soft and rigid modeling materials, such as rubber bands, beads, string, and fabric, to create a prototype design.

The final assessment component involved approximately 70 of the 1,500 students from the second test. The students participated in long-term (up to nine months) school projects. Students were regularly interviewed to develop a long-term history of individual performance.

In all three assessment components, researchers tried to determine how well students formed ideas, organized their time and resources, considered alternative solutions, and modeled solutions that could be evaluated against the user’s needs. Activities were evaluated in three areas: the processes in design and development, the quality of communication, and conceptual understanding. Holistic marks indicating a student’s overall capabilities were awarded based on pre-established characteristics of good and poor performance. Individual discriminators of capability (questions to ascertain whether a student’s responses included certain predetermined components) were also used to evaluate performance.

Sample Items

  • Developing-solutions, concept-model activity for the 90-minute paper-and-pencil test. (Concept-model activities were presented to students in physical form. Ready-made ideas presented in half-developed form allowed students to proceed quickly into the design stage of the project.)

When considering the needs of the elderly, a group of young people recognized the weakness of built-in cooker timers and decided to make one that was more suitable.

A member of the team came up with the idea that a portable timer could be designed that was set by a twisting action.

Design a timer for the elderly that:

  1. is portable

  2. is set with a twisting action

  3. will sound an alarm when the time runs out

  4. is a suitable size and shape

Your task today is to take this idea and develop it as far as you can in the time available.

(This activity included a drawing of a twist-type timer.)

  • Modeling activity for the half-day, team-based assessment

The team has decided to make a bird scarer for use in gardens and allotments.

A member of the team came up with the idea that “spinning in the wind” advertising could be developed for scaring birds.

Design a bird scarer that:

  1. Has sails or vanes that catch the slightest breeze

  2. Makes “bird scaring” movements

  3. Gives off “bird scaring” sounds from a sound box

  4. Fits into the environment

Your task today is to take this idea and develop it as far as you can in the time available.

(This activity included two drawings. The first was of a wind-sail mounted on top of a sound box. The second depicted a garden and was accompanied by a number of thought-provoking questions, such as “what about high winds?”, “what makes the sound?”, and “is it safe?”)

Committee Observations

This instrument reflects a curricular emphasis on “design and technology” in the U.K. educational system. Assessment activities seem to require higher order cognitive capabilities. The evaluation framework, which includes holistic, procedural, communication, and conceptual elements based on four domains (task clarification, investigation, solution generation, and appraisal), is conceptually robust.

This instrument is complex and would be difficult and expensive to administer, score, and report on a large scale. The task-centered


approach to assessment offers real insights into design competency but does not address technological literacy in the broad sense defined by the committee.

Awareness Survey on Genetically Modified Foods

Background

Sponsor/Creator

Jane Macoubrie, Patrick Hamlett, and Carolyn Miller, North Carolina State University, with funding from the National Science Foundation

Purpose

Research on public involvement in decision making on science and technology issues

What is measured

Knowledge and attitudes toward genetically modified foods

Target population

American adults

Item format

Multiple choice

Sample size

45 adults in North Carolina

Frequency of administration

Once, in 2001

Availability

Jane Macoubrie, Department of Communication, North Carolina State University

Scope

This project was inspired by the Danish practice of providing opportunities for citizens to participate in “consensus conferences” to discuss science and technology issues and make policy recommendations to the government. Conference participants are non-experts who are provided with extensive background information on a subject and then


convened to discuss the issue. Researchers at North Carolina State University conducted a Danish-style consensus conference in 2001 to assess the feasibility of consensus conferences in the United States. This survey, which was administered to participants prior to the conference, included 20 multiple-choice questions addressing ethical and scientific issues, as well as current practices in the farming of genetically modified crops.

Sample Items2

  • Can genes escape from genetically modified crops and jump to other plants?

    A. Yes and they often do

    B. Only to some crops, but those crops aren’t genetically modified

    C. Only during rare climatic conditions

    D. No, genes cannot move from species to species without human intervention

    E. I don’t know

(Suggested correct answer: A)

  • To keep genetically modified crops separate from traditional crops, farmers are currently required to do which of the following?

    A. Use different machines to harvest each field

    B. Use different storage bins and silos

    C. Transport separately to the production facility

    D. None of the above

    E. I don’t know

(Suggested correct answer: D)

  • Ethical arguments against the genetic modification of food products include:

    A. Genetically modified crops violate species integrity

    B. Biotechnology changes too fast to effectively understand and regulate it

    C. The belief that scientists should not “play God”

    D. All of the above

    E. I don’t know

(Suggested correct answer: D)

2

Reprinted with permission of the North Carolina Citizens’ Forum Project Team


Committee Observations

This content-specific survey does not require higher order thinking skills. In addition, the level of factual knowledge required to perform well is likely to be beyond the capability of most individuals in the target population. It would be interesting to administer a survey like this before and after participation in a consensus-type conference to determine what, if any, learning has taken place.

Design-Based Science

Background

Sponsor/Creator

David Fortus, University of Michigan

Purpose

Curriculum development, research

What is measured

Science and technology knowledge and transfer of design skills to new situations

Target population

9th- and 10th-grade students in the United States

Item format

Multiple-choice and open-ended questions and design skills projects

Sample size

92 students in 9th and 10th grade in one Michigan public high school

Frequency of administration

Once, in 2001–2002

Availability

Dissertation held at University of Michigan and an article describing the instrument (Journal of Research in Science Teaching 41(10): 1081–1110).


Scope

David Fortus developed the design-based science (DBS) curriculum as part of his Ph.D. dissertation. The DBS curriculum has three units: structures for extreme environments, environmentally safe batteries, and safer cell phones. The course instructor (not Fortus) started each unit by administering a pre-instruction content-knowledge test. The test was followed by several weeks of classroom teaching on the science and technology related to the unit, as well as instruction in the design process. At the end of each unit, students were given an exam that included 13–15 multiple-choice questions and 2–5 open-ended questions. Multiple-choice questions required low, medium, and high cognitive skills; open-ended questions required medium and high cognitive skills (as determined by Fortus).

To test the transfer of design skills, students in groups of four were asked to apply knowledge from each unit to a new situation. The structures for extreme environments unit was followed by a design project requiring the design of a kite that could fly a mile high. The environmentally safe batteries unit was followed by a project requiring the design of a battery for an artificial heart. The unit on safer cell phones was followed by a project requiring the design of a hearing protector for rock musicians. Groups were evaluated in five categories: design variables; gathering of information; comparison of options; model, drawing, or diagram; and design evaluation. All four students in each group earned the same grade on the project.

Sample Items3

  • Safer cell phones unit multiple-choice question

A cell phone is similar to a microwave oven because:

  A. Both have been proven to be dangerous to your health

  B. They both emit microwaves

  C. They operate on the same voltage

3

From Fortus, D., R.C. Dershimer, J.S. Krajcik, R.W. Marx, and R. Mamlok-Naaman. 2004. Design-based science (DBS) and student learning. Journal of Research in Science Teaching 41(10): 1081–1110.

  D. They both operate on the same frequency

  • Environmentally safe batteries unit open-ended question

A group of students builds a battery from two strips of aluminum metal immersed in a beaker of distilled water. They connect the battery to the voltmeter and are surprised that the voltmeter shows no reading. Explain what’s wrong with their battery and what they should change in order to measure a voltage reading. Assume that the voltmeter and connecting wires are not broken, and that you can add or change materials to the setup if necessary.

  • Structures for extreme environments unit assessment of design skills

Can you design a kite that will fly one mile high?

Students were evaluated based on their analysis in the following areas (criteria were given to students with the assignment):

  1. Why won’t a standard kite you can buy at any toy store be able to fly one mile high? If you understand this, you then will know what will have to be the special characteristics of your kite.

  2. Where did you gather the information you needed (encyclopedias, books in the library, the web, hobby shops, family and friends, magazines, and so on)? What was the information you gathered and what was its relevance to the kite you designed?

  3. Did you identify all the factors that needed to be considered in designing the kite?

  4. Did your group come up with a range of design options? What were they?

  5. Did you select a single option from this range? Did you justify your decision based on functional, scientific, aesthetic or other considerations?

  6. How did you describe your solution? Did you use technical and concept drawings? Did you build models?

  7. Did you develop a plan for testing the kite and its components?

Committee Observations

The knowledge-transfer aspect of this instrument is intriguing. Knowledge transfer in the context of design seems to require higher order


thinking, and requiring both written answers and an evaluation of the processes used to come to those answers further suggests a focus on higher order thinking. The curriculum and the assessment could be improved by placing more emphasis on technological capabilities.

Design Team Assessments for Engineering Students

Background

Sponsor/Creator

Transferable Integrated Design Engineering Education (TIDEE), a consortium of schools under the direction of Denny C. Davis, Washington State University

Purpose

Assess students’ knowledge, performance, and evaluation of the design process; evaluate student teamwork and communication skills

What is measured

Student knowledge and skills in engineering design

Target population

Baccalaureate engineering students

Item format

Constructed-response, team design exercise, reflective essay

Sample size

Unknown

Frequency of administration

Unknown

Availability

http://www.tidee.cea.wsu.edu/assessment-tools/

Scope

This three-part assessment was developed in 2002. The first component, intended for early-stage baccalaureate engineering students,


is a formative assessment of students’ knowledge of the engineering design process, teamwork, and design communication. It includes three constructed-response questions and requires 15 to 20 minutes to administer. A detailed evaluation rubric identifies seven criteria for scoring each question.

The second component addresses students’ ability to perform crucial engineering design processes and is intended for pre-capstone design-project students. The assessment is administered to teams of four students who are allowed 35 minutes for the design activity and 7 minutes to complete the associated worksheets. A three-part evaluation rubric focuses on the steps of the design process rather than on the final project.

The final component, which builds on the second, requires a reflective essay in which students are asked to explain and improve upon the design process, as well as to consider the role of teamwork and communication in their design effort. The essay rubric evaluates how well students reflect on their team design experience.

Sample Items4

  • Component 1: Knowledge of engineering design process, teamwork, and design communication

In general, a process is an ordered set of activities to accomplish a goal. In the space below, describe and/or diagram your understanding of the engineering design process.

(Suggested correct answer mentions gathering information, defining requirements, generating ideas, evaluating ideas, making decisions, implementing ideas, and developing a process.)

  • Component 2: Ability to perform engineering design processes (in groups of four)

4

From Davis, D., S. Beyerlein, K. Gentili, L. McKenzie, M. Trevisan, C. Atman, R. Adams, J. McCauley, P. Thompson, P. Daniels, R. Christianson, T. Rutar, and D. McLean. 2002. Design Team Knowledge Assessment, Part 1 of the Design Team Readiness Assessment developed by the Transferable Integrated Design Engineering Education (TIDEE) Consortium. Available online at: http://www.tidee.wsu.edu.


Your group is charged with developing a testing procedure to convincingly show how well an assigned hand tool (or other device) satisfies one key customer expectation. Your testing procedure should be described such that another engineer could independently implement your procedure and obtain the same results.

  A. Describe your team organization and member responsibilities assigned to ensure that your team can complete this activity effectively and in the 35 minutes allotted.

  B. Identify customer expectations of the tool (list and give brief explanation of each).

  C. What source or sources of information did you use to aid in identifying customer expectations?

  D. Identify the most essential customer expectation. Justify your selection.

  E. Describe a complete testing procedure for your one selected feature. Itemize steps. As appropriate, include sketches or specifics about data collection and analysis.

(Suggested correct answer: (A) Credit is awarded for leadership assignment, explanation of time/task management, and details of roles and responsibilities of team members. (B) Credit is awarded for identifying at least five customer needs and explaining three of them. (C) Credit is awarded for identifying at least two sources of information. (D) Credit is awarded for selecting only one customer expectation as the most important and providing a reasonable explanation of its importance. (E) Points are awarded for listing relevant ideas for testing, defining detailed steps of testing procedures, considering variability and replication of the results, defining a means of quantifying test results, and providing criteria for the tool to pass the test.)

  • Component 3: Understanding of the engineering design process and analysis of team design performance

Prepare a 2-page essay, double-spaced in 12-point font, demonstrating your understanding of team-based engineering design processes focused on meeting a customer’s needs. Reflecting on your recent team design experience


(Component 2), explain what you did as a team, why it worked or didn’t work, and how you could improve your team’s performance. Specifically address these issues with respect to (1) the engineering design process; (2) teamwork; and (3) design communication.

(Suggested correct answer: A correct answer will include a discussion of actions and occurrences in the group, explain why things were effective or not, and propose improvements in the team design process in six areas: customer focus, management of the design process, assignment of roles/responsibilities, management of task/time, oral/team dynamics, and writing the team log.)

Committee Observations

The multiple forms of assessment and open format provide a broad exploration of what students know about the design process. Although intended for a highly focused audience of baccalaureate engineering students, this assessment could also be used for teachers, high school students, and perhaps even middle school students. The reflective essay may be the most valuable part of the assessment, because it encourages metacognition but does not require specific jargon for a positive evaluation. However, this instrument does not require knowledge transfer, which seems to disconnect it from a real-world design situation. Students could perform well on this assessment without believing in any of the lessons of teamwork or the design process. That is, by memorizing jargon and the school-learned steps of the design process, a student could do well without demonstrating higher order thinking skills.

Design Technology (Higher Level)

Background

Sponsor/Creator

International Baccalaureate Organization, Geneva, Switzerland

Purpose

Student achievement (part of qualification for diploma)


What is measured

Knowledge and capability in technological design

Target population

Students in IB programs, ages 16 to 19

Item format

Multiple-choice, data-based, short-answer, and extended-response items

Sample size

IB students throughout the world following an “experimental sciences” curriculum

Frequency of administration

Regularly since 2003

Availability

IB North America, ibna@ibo.org

Scope

The International Baccalaureate (IB) Organization oversees the IB Diploma Program, which offers intensive pre-university courses and exams. Students in the program choose the focus of their intensive study while still pursuing a broad education in the sciences and humanities. The IB Diploma Program offers courses in six academic subjects: language A1, a second language, individuals and societies, experimental sciences, mathematics and computer sciences, and the arts. Students must take at least one course in each area. Diploma candidates must pursue at least three, but not more than four, subjects at the higher level (at least 240 teaching hours). All other courses are taken at the standard level (150 teaching hours).

Students who choose to focus their studies on the experimental sciences take courses in biology, chemistry, environmental systems, physics, and design technology. The syllabus for the standard-level design technology course stipulates that the curriculum must cover six areas of design technology: designers and the design cycle; the responsibility of the designer; materials; manufacturing processes and techniques; production systems; and clean technology and green design. The curriculum for higher level courses covers these additional topics: raw material to final product; microstructures and macrostructures; and appropriate technologies.

Items in the three-part assessment—called Papers 1, 2, and 3—


are grouped into three increasingly challenging objectives. At the Objective 1 level, students are required to define, list, or measure, among other tasks. At the Objective 2 level, students are required to compare, calculate, estimate, and outline. At the highest level, Objective 3, students are asked to deduce, predict, evaluate, and design. Student performance is also evaluated by a teacher-directed “internal assessment,” which includes a design project. The rubric for the internal assessment includes student planning, data collection, data processing and presentation, and manipulative and personal skills.

Sample Items5

  • Paper 1, higher level, multiple-choice (November 2004)

Which technique fuses solid particles with heat and pressure without completely liquefying them?

  A. Injection molding

  B. Casting

  C. Sintering

  D. Lamination

(Suggested correct answer: C)

  • Paper 2, higher level, data-based (November 2004)

Figure 1 shows the London Eye, which was designed as a landmark project for the millennium. It is like a giant bicycle wheel (circumference 424 m) with a central hub and spindle (330 tonnes) connected to outer and inner rims by a total of 64 cable spokes, each 75 m long. 32 passenger capsules are mounted around the rim with a maximum capacity of 25 people per capsule. The entire structure stands 135 m high and is supported from one side only (see Figure 2). The wheel turns continuously anti-clockwise, during operating hours, at 0.26 m/s, even when people are getting on and off. As passengers travel from X to Y in fine weather they can see over 40 km in each direction (see Figure 3).

(Figure 1 is an aerial photograph of the London Eye on the Thames River in London. Figure 2 is an engineering drawing of the London

5

Reprinted with permission of International Baccalaureate Organization.


Eye that depicts the A-frame design and how the structure is supported. Figure 3 shows a circle representing the London Eye and demonstrates the height at which optimum views are possible. When radii are drawn to points X and Y, they form a right angle. The exam question has 10 parts; 3 are reproduced below.)

  1. Calculate how long to the nearest minute passengers enjoy the optimum views as the capsule they are inside is rotated from X to Y, as shown in Figure 3.

  2. List two dominant considerations in the design of the London Eye.

  3. State the importance of tensile forces in relation to the design of the wheel.
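For part 1, the geometry stated in the question (radii to X and Y meeting at a right angle) implies the capsule traverses a quarter of the 424 m circumference between the two points. A quick arithmetic check, using only figures from the question text (this is an illustration, not the IB marking scheme):

```python
# Part 1 of the London Eye item: the radii to X and Y form a right
# angle, so the capsule covers a quarter of the rim between them.
circumference_m = 424.0      # rim circumference, from the question
rim_speed_m_s = 0.26         # continuous rotation speed, from the question

arc_m = circumference_m / 4          # quarter-circle arc from X to Y
seconds = arc_m / rim_speed_m_s      # time at constant rim speed
minutes = round(seconds / 60)        # "to the nearest minute"
print(minutes)  # → 7
```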

  • Paper 2, higher level, extended response (November 2004)

Figure 6 shows the Tizio lamp, a steel desk lamp using a low voltage/ low wattage light bulb designed by Richard Sapper in 1972. In this design there are two hollow beams connecting the electric cables which can be moved to adjust the angle and height of a light source over a working surface. Each beam has a counterbalanced weight at the end to keep the whole lamp in equilibrium.

(Figure 6 is a schematic drawing of the Tizio lamp, with diagrams showing the characteristics described in the question. There are 7 parts to this question; 2 are reproduced below.)

  1. Outline one suitable treatment or finish for the steel lamp.

  2. Suggest three ways in which the designer has balanced form with function in the design of the lamp.
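The equilibrium the question describes is a simple moment balance about each pivot: the counterweight's mass times its lever arm must equal the load's. A minimal sketch, with hypothetical masses and lengths (none are given in the exam text):

```python
# Moment balance for one counterweighted beam of a Tizio-style lamp.
# Equilibrium about the pivot requires:
#   counter_mass * counter_arm = load_mass * load_arm
# All values below are hypothetical, for illustration only.
load_mass_kg = 0.4       # lamp head plus far beam (assumed)
load_arm_m = 0.5         # pivot-to-load lever arm (assumed)
counter_arm_m = 0.1      # pivot-to-counterweight lever arm (assumed)

# Solve the balance for the counterweight mass.
counter_mass_kg = load_mass_kg * load_arm_m / counter_arm_m
print(counter_mass_kg)  # → 2.0
```

The short lever arm of the counterweight is why such designs need a relatively heavy weight near the pivot.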

  • Paper 3, higher level, extended response (November 2004)

Explain three problems associated with existing agricultural practices that have led to increased interest in organic agriculture.

Committee Observations

This instrument does a very good job of assessing knowledge related to the IB curriculum. Paper 1 tests basic knowledge at the


application level. Papers 2 and 3 require in-depth, higher order processing skills. The assessment seems particularly effective at teasing out knowledge of the design process. All three papers require technical knowledge beyond what might be considered basic technological literacy. Many of the items are difficult and may be appropriate only for 12th-grade or post-secondary students who have completed the appropriate coursework.

Engineering K–12 Center Teacher Survey

Background

Sponsor/Creator

American Society for Engineering Education (ASEE)

Purpose

Inform outreach efforts to K–12 teachers

What is measured

Attitudes toward, knowledge of, and interest in engineering

Target population

K–12 teachers

Item format

Survey

Sample size

Approximately 400 teachers

Frequency of administration

Continuously available

Availability

http://www.engineeringk12.org/educators/taking_a_closer_look/survey1.cfm

Scope

ASEE uses this instrument to help shape communications, products, and services for the K–12 community. The 44-question survey probes teachers’ perceptions of the accessibility of various careers, including engineering, to women and minorities. It also addresses teachers’ attitudes toward engineers, as well as the efficacy of using engineering to help teach other subjects. In addition to tapping attitudes, the survey


collects demographic data, including gender, ethnicity, age, type of school, years of teaching experience, and family and friendship connections to an engineer.

Sample Items

  • Indicate whether you strongly disagree, disagree, are neutral, agree, or strongly agree with the following statement:

Engineering can be a way to help teach students language arts.

(Strongly disagree: 1.4 percent, disagree: 8.1 percent, neutral: 28.7 percent, agree: 47.2 percent, strongly agree: 14.6 percent)

  • Indicate whether you strongly disagree, disagree, are neutral, agree, or strongly agree with the following statement:

Majoring in engineering is harder than majoring in English.

(Strongly disagree: 2.1 percent, disagree: 9.4 percent, neutral: 25.0 percent, agree: 30.9 percent, strongly agree: 32.5 percent)

  • For which of the following careers would an engineering degree prepare you? Please select all that apply.

  • NASCAR crew chief

  • Sneaker designer

  • Business consultant

  • Pop music producer

  • Perfume maker

  • None of the above

(NASCAR crew chief: 89.9 percent; sneaker designer: 96.3 percent; business consultant: 75.5 percent; pop music producer: 57.9 percent; perfume maker: 72.4 percent; none of the above: 2.4 percent)


Committee Observations

This instrument does not assess higher order thinking, nor is it intended to. However, by assessing teachers’ attitudes about engineering, the survey does convey a general sense of how effective teachers might be at encouraging student technological literacy. If teachers’ attitudes indicate inaccurate perceptions of engineering, they are unlikely to be able to teach technology-related concepts and skills effectively or to provide sound advice to students about opportunities for technology-related careers.

Eurobarometer: Europeans, Science and Technology

Background

Sponsor/Creator

European Union (EU) Directorate-General for Press and Communication

Purpose

Monitor changes in public views of science and technology to assist decision making by policy makers

What is measured

Opinions about science and technology

Target population

People 15 years and older in the EU

Item format

Survey

Sample size

16,029 people in all 15 EU member states

Frequency of administration

Surveys on various topics conducted regularly since 1973; this poll was taken in May/June of 2001

Availability

http://europa.eu.int/comm/research/press/2001/pr0612en-report.pdf


Scope

Participants (approximately 1,000 from each country) were asked to give their opinions on questions related to seven areas of science and technology: (1) information, interest, knowledge; (2) values, science, technology; (3) responsibilities and accountability of scientists; (4) genetically modified food; (5) levels of confidence; (6) young people and the scientific vocation crisis; and (7) European scientific research. Although the “information, interest, knowledge” section included some questions testing knowledge of general science and technology, most questions asked only for opinions. The option of “don’t know” was always available.

Sample Items

  • Information, interest, and knowledge question related to how people get scientific information

Participants were asked if they tended to agree or disagree with the following statement.

I prefer to watch television programs on science and technology rather than read articles on this subject.

(66 percent of participants agreed with this statement, 24 percent disagreed, and 10 percent said they did not know.)

  • Information, interest, and knowledge question related to knowledge and perception of topical scientific subjects

Participants were asked to indicate whether the following statement is true or false.

Mad cow disease (bovine spongiform encephalopathy) is due to the addition of hormones in cattle feed.

(Suggested correct answer: false. 49 percent of participants thought the statement was true, 32 percent thought it was false, and 19 percent did not know.)

  • Values, science, and technology question on optimism regarding science

Participants were asked whether or not they agreed with the following statement.

Thanks to scientific and technological progress, the earth’s natural resources will be inexhaustible.

(21 percent agreed with the statement, 61 percent disagreed, and 17 percent did not know.)

Committee Observations

This is predominantly an opinion survey. Thus, it does not provide a meaningful assessment of technological literacy. However, it does demonstrate the importance of measuring public perceptions of science and technology. Most questions are straightforward and focus on current issues, but a few questions in the “perceptions of scientific methods” section appear to require some higher order thinking. In addition, the poll has a much stronger emphasis on environmental and biorelated science and technology issues (e.g., mad cow disease and genetically modified food) than might be expected in a similar American survey.

European Commission Candidate Countries Eurobarometer: Science and Technology

Background

Sponsor/Creator

Gallup Organization of Hungary, with funding from the European Commission

Purpose

Monitor public opinion on science and technology issues of concern to policy makers

What is measured

Opinions about various science and technology issues


Target population

People 15 years and older

Item format

Survey

Sample size

12,247 adults in 13 EU candidate countries

Frequency of administration

Periodically since 1973; this survey was administered in November 2002

Availability

http://www.europa.eu.int/comm/public_opinion/archives/cceb/2002/2002.3_science_technology.pdf

Scope

The European Union (EU) regularly monitors the opinions of citizens of member states about issues of concern to policy makers. This poll extends that model to a group of countries seeking membership in the EU. Poll questions primarily solicit opinions about science and technology, but a few questions attempt to assess general knowledge of these subjects. The accompanying report presents findings in eight areas: (1) information, interest, knowledge; (2) values, science, and technology; (3) the morality of science; (4) the bovine spongiform encephalopathy (BSE) epidemic; (5) food based on genetically modified organisms; (6) the scientific profession; (7) the scientific vocational situation; and (8) European scientific research. Survey results are weighted by age, sex, region, profession, religion, size of locality, educational level, and marital status. The results of this survey were compared to those of a poll in 2001, Eurobarometer 55.2, Europeans, Science and Technology (also reviewed by the committee), that asked similar questions.
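The demographic weighting the report describes can be sketched as a simple post-stratification adjustment: each respondent is weighted by the ratio of a group's share in the population to its share in the achieved sample. The shares below are hypothetical, for illustration only:

```python
# Post-stratification weighting sketch. Each respondent in a group gets
# weight = population share / sample share, so the weighted sample
# matches the population's demographic mix. Both sets of shares below
# are hypothetical, not taken from the Eurobarometer report.
population_share = {"urban": 0.6, "rural": 0.4}   # known population mix (assumed)
sample_share = {"urban": 0.7, "rural": 0.3}       # achieved sample mix (assumed)

weights = {g: population_share[g] / sample_share[g] for g in population_share}
# Over-represented groups (urban here) receive weights below 1;
# under-represented groups (rural) receive weights above 1.
print(weights)
```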

Sample Items

  • Information, interest, knowledge related to fundamental scientific facts

Here is a little quiz. For each of the following statements, please tell me if you think it is true or false. If you don’t know, say so, and we will go on to the next one.


Antibiotics kill viruses as well as bacteria.

(Suggested correct answer: False. 23 percent in candidate countries and 40 percent in member countries answered correctly.)

  • Values, science, and technology question on superstition, ignorance about science and pre-modern nostalgia

I will now read out some statements about science, technology, or the environment. For each statement, please tell me if you tend to agree or tend to disagree.

In my daily life, it is not important to know about science.

(In candidate countries, 37 percent tended to agree and 52 percent tended to disagree. In member countries, 42 percent tended to agree and 49 percent tended to disagree.)

  • Lessons from the BSE epidemic

There has been much discussion about responsibilities for the “mad cow disease” problem. Could you please tell me if you tend to agree or disagree with the following statements?

The food industry carried a major part of the responsibility.

(51 percent of those polled in candidate countries tended to agree with this statement. 74 percent of those polled in EU member countries tended to agree.)

Committee Observations

Although this instrument mostly reflects opinions rather than knowledge or capabilities, some aspects of these polls are worth examining more closely. For example, correlating opinions and knowledge with religion and educational level, among other factors, may be useful for this type of assessment. In terms of technological literacy, however, the poll does not assess design or technology skills of any kind. Nor does it require higher order thinking.


Future City Competition—Judges Manual

Background

Sponsor/Creator

Engineers Week Committee, a consortium of professional and technical societies and U.S. corporations

Purpose

To help rate and rank design projects and essays submitted to the Future City Competition

What is measured

Design, writing, and presentation skills

Target population

7th- and 8th-grade American students

Item format

Scoring sheets with numerical scales (e.g., 0–5 and 0–10) to indicate performance on various parameters

Sample size

Approximately 30,000 students each year

Frequency of administration

Yearly since 1992

Availability

http://www.futurecity.org/docs/2004JudgesManual.pdf

Scope

In the Future City Competition, teams composed of three students, a teacher, and an engineer-mentor create a computer city design with SimCity software and a physical scale model of part of the city. At the competition, students deliver a short oral presentation to the judges in which they describe their model and computer simulation. The students also write an essay that describes how technology can meet an important social need. In 2005, the essay topic was “How can futuristic transportation systems effectively use aggregate materials—crushed stone, sand, and gravel—as a basic construction product?” Winners of regional competitions are invited to a national competition in Washington, D.C. The


instrument is part of the manual judges use to evaluate student submissions. It includes specific criteria for awarding points in five areas: computer evaluation of city (standard set of questions about the group’s SimCity model), computer city design, city model, presentation, and essay/abstract.

Sample Items

(The following are some of the criteria that the judges use to evaluate the students’ designs.)

  • Computer city design—transportation criteria (graded on a scale of 0 to 5 points)

    1. Does the public transportation system provide full mobility for the people? (rail, subway, and buses)

    2. Is there adequate mobility for the transport of goods and services? (rail and roads)

    3. Is there a seaport and an airport in the city?

  • City model—creativity criteria (graded on a scale of 0 to 10 points)

    1. Does the city illustrate futuristic concepts?

    2. Are there different sizes and shapes of buildings?

    3. Are different types of building materials used?

    4. Did any of the building components incorporate recycled materials?

  • Team presentation of city design and model—cooperation criteria (graded on a scale of 0 to 10 points)

    1. How well do the students work as a team during their presentation?

    2. How well do the students work as a team during the Question and Answer session by the judges?

    3. Are all the students able to answer questions about their city, or does only one student know all the answers?

Committee Observations

The Future City Competition allows students to combine an open-ended engineering design task with communication skills, use of technology, teamwork, and innovative, out-of-the-box thinking. These activities foster higher order thinking and allow students to be assessed in areas that have historically been extremely difficult to gauge with standard paper-and-pencil exams. This type of assessment can be very expensive to administer, although Future City relies on volunteer judges local to the competition venue. The instrument does not allow for individual assessment, which may present accountability problems. Despite the detailed judges manual, it may be extremely difficult to grade such projects consistently.

Gallup Poll on What Americans Think About Technology (2001, 2004)

Background

Sponsor/Creator

International Technology Education Association (ITEA), with funding from the National Science Foundation and National Aeronautics and Space Administration

Purpose

Determine public knowledge and perceptions of technology to inform efforts to change and shape public views

What is measured

Public understanding, opinions, and attitudes about technology and technological literacy

Target population

American adults

Item format

Survey

Sample size

1,000 people in 2001; 800 in 2004

Frequency of administration

Twice, in 2001 and 2004

Availability

Contact ITEA, which commissioned the poll, at http://www.iteawww.org


Scope

In 2001, ITEA contracted with the Gallup organization to conduct a survey of Americans’ understanding, attitudes, and beliefs about technology and technological literacy. ITEA was particularly interested in measuring public opinion about the importance of technological literacy. The 17-question telephone poll of 1,000 randomly selected Americans resulted in three major conclusions: (1) Americans believe technological literacy is important for everyone; (2) technology is understood very narrowly as being computers and the Internet; and (3) most people believe that schools should include the study of technology in their curricula.

Three years later, ITEA and Gallup conducted a follow-up poll, in which they repeated 5 questions from the first poll and introduced 11 new ones to build on and extend the earlier findings. The 2004 poll examined seven areas: (1) public concepts of technology; (2) the importance of being knowledgeable about technology; (3) the impact of technology on daily life and the world; (4) what people want to know and what they do know about technology; (5) decision making regarding technology and technological literacy; (6) differences based on gender; and (7) technology and education. For both polls, demographic information was collected, including age, gender, race, grade/educational level, and geographic location.

Sample Items

  • Public understanding of technology (2001 and 2004)

When you hear the word “technology,” what first comes to mind?

(In the 2001 survey, 67 percent of respondents answered computers; 4 percent electronics; 2 percent education; 2 percent new inventions; 1 percent or less all other answers. In the 2004 survey, 68 percent answered computers; 5 percent electronics; 2 percent advancement; 2 percent Internet; 1 percent or less all other answers.)

  • Knowledge of technology (2001)

Tell me if each of the following statements is true or false. How about:

  1. Using a portable phone while in the bathtub creates the possibility of being electrocuted.

  2. FM radios operate free of static.

  3. A car operates through a series of explosions.

  4. A microwave heats food from the outside to the inside.

(1. Correct answer is False. 46 percent of respondents thought this statement was true; 51 percent said it was false. 2. Correct answer is true. 26 percent answered true; 72 percent false. 3. Correct answer is true. 82 percent said true; 15 percent false. 4. Correct answer is false. 37 percent said true; 62 percent false.)

  • Influence on technology-related decision making (2004)

How much influence do you think people like yourself have on decisions about such things as the fuel efficiency of cars, the construction of roads in your community, and genetically modified foods? Would you say a great deal, some, very little, or no influence?

(9 percent of respondents said a great deal; 32 percent said some; 40 percent said very little; and 19 percent said no influence.)

Committee Observations

Both polls addressed aspects of the ITEA Standards for Technological Literacy related to the nature of technology, technology and society, and abilities for a technological world. Although the polls did not explicitly assess higher order thinking, some of the questions may have prompted participants to think deeply about certain issues, for example, how technology is defined. On the whole, the polls were well designed, and the questions were clear and unbiased. However, opinion polls do not always yield valid information. Responses may represent confidence rather than competence; that is, a self-assessment of a person’s knowledge or capability may not reflect reality.


ICT Literacy Assessment

Background

Sponsor/Creator

Educational Testing Service (ETS)

Purpose

Proficiency testing

What is measured

Ability to use digital technology, communication tools, and/or networks to solve information-related problems

Target population

High school students, community college students, and freshmen and sophomores in four-year colleges (core assessment); rising juniors at four-year colleges (advanced assessment)

Item format

Fourteen 4-minute and one 15-minute simulated, scenario-based tasks delivered via the Web

Sample size

Approximately 4,500 examinees at 31 campuses (January through April 2005 administration of advanced assessment)

Frequency of administration

Advanced assessment launched in January 2005 (2006 test window was January 23–April 3; continuous testing to begin in August 2006). Pilot testing of the core assessment was January 23–February 17 (2006 test window was April 5–May 5; continuous testing to begin in August 2006).

Availability

Test details and sample items available at http://www.ets.org/ictliteracy


Scope

The ICT Literacy Assessment was developed by ETS in collaboration with a consortium of seven institutions of higher education. The work of the consortium was guided by an International ICT Literacy Panel that published a framework document, “Digital Transformation: A Framework for ICT Literacy,” in 2002.6 The ETS proficiency model has seven elements:

Define—Use ICT tools to identify and appropriately represent an information need.

Access—Collect and/or retrieve information in digital environments.

Manage—Use ICT tools to apply an existing organizational or classification scheme for information.

Integrate—Interpret and represent information, such as by using ICT tools to synthesize, summarize, compare, and contrast information from multiple sources.

Evaluate—Judge the degree to which information satisfies the needs of the task in ICT environments (including determining authority, bias, and timeliness of materials).

Create—Adapt, apply, design, or invent information in ICT environments.

Communicate—Communicate information properly in its context (audience, media) in ICT environments.

According to ETS, academic institutions can use test results to make decisions about new course offerings, determine which courses need additional resources, and provide data for accreditation purposes. Students can use assessment results to help select courses and majors or to determine readiness for the workforce or graduate school. Tests cost $35 each, and initial orders must include a minimum of 100 tests.

Sample Items

Actual test items are not publicly available. The ETS website contains a demo with three sample tasks.

Display and Interpret Data. Examinees create a visual representation of data to answer two research questions.


Scenario: As part of a project for a cultural studies class, examine trends in the public’s taste in books and use a graph creator to show how the popularity of different types of books has varied since the advent of television.


Advanced Search. Examinees construct an advanced search based on a complex information need. Scenario: Search a university library database for information about plans that various California state or municipal governing bodies (excluding San Francisco) have made to protect the public in the event of an earthquake. The search strategy must include Boolean logic, quotation marks, and asterisks.
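The scoring rubric for this task is not public, but a query satisfying the three stated requirements (Boolean logic, quotation marks, and asterisks) might look like the string below; the terms and syntax are assumptions for illustration, not the scored answer:

```python
# One hypothetical search string for the earthquake task: Boolean
# operators combine terms, quotation marks bind exact phrases, and the
# asterisk truncates "plan" to match plan/plans/planning.
query = (
    '("earthquake preparedness" OR "earthquake plan*") '
    'AND California AND (state OR municipal) '
    'NOT "San Francisco"'
)
print(query)
```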


Comparing Information. Examinees summarize information from a variety of sources and draw conclusions from their summary. Scenario: Collect information from several sources about office products intended for use by left-handed persons, and rank the desirability of these products based on a set of features desired by the office manager of an architectural firm.

Committee Observations

The strength of this assessment instrument is the measurement of practical skills in narrow, but important, information-technology applications. Questions are posed in a real-world context, which gives meaning to the scenario-based tasks. Successful performance requires more than information recall and rote memorization. Because examinees can improve their responses based on feedback, the assessment might be used to ascertain not only what examinees know, but also how they go about learning. It is not evident, however, that the test is designed to capture data on the how of learning.

Although the committee examined only a handful of sample tasks, it was apparent that examinees who do not have regular access to the Internet, e-mail, electronic card catalogs, graphing software, and other technologies featured in the assessment would be at a disadvantage. Given the target population for the assessment, this may not be a significant worry. The assessment was not designed with the ITEA Standards for Technological Literacy in mind, but it addresses some of the benchmarks in ITEA Standard 17: Information and Communication Technology. The assessment would be more challenging and perhaps more revealing of test takers’ capabilities if some of the tasks included open-ended elements.


Illinois Standards Achievement Test—Science

Background

Sponsor/Creator

Illinois State Board of Education

Purpose

Measure student achievement in five areas and monitor school performance

What is measured

Science-related knowledge and capability

Target population

4th- and 7th-grade students in Illinois

Item format

Multiple choice

Sample size

All eligible public school students in 4th and 7th grade in Illinois

Frequency of administration

Annually in April since 2000

Availability

http://www.isbe.net/assessment/PDF/2003ScienceSample.pdf

Scope

This assessment is aligned with the Illinois Learning Standards, which were adopted in 1997. Standards 11B and 13B are related to technology. Standard 11B requires that students “know and apply the concepts, principles, and processes of technological design.” Standard 13B requires that students “know and apply concepts that describe the interaction between science, technology, and society.” Four questions for 4th-grade students and five questions for 7th-grade students in the 2003 sample assessment address Standard 11B. Seven questions on the 4th-grade exam and six on the 7th-grade exam address Standard 13B. The 70-question exam is administered in 80 minutes and covers science inquiry, life sciences, physical sciences, earth and space sciences, and science, technology, and society. The committee reviewed only sample test items because the Illinois Board of Education does not release actual test items.


Sample Items

  • 4th grade, Standard 11B

Which color of roofing material would be best to help keep a house cool?

  A. White

  B. Black

  C. Gray

  D. Green

(Suggested correct answer: A)

  • 7th grade, Standard 11B

What is the volume of this box when folded together?

  A. 17 cubic centimeters

  B. 42 cubic centimeters

  C. 66 cubic centimeters

  D. 144 cubic centimeters

(Suggested correct answer: D)

  • 7th grade, Standard 13B

One of the principal causes of acid rain is

  A. acid from chemical laboratories leaking into groundwater.

  B. gases from burning coal and oil released into the air.

  C. gases from air conditioners and refrigerators escaping into the atmosphere.

  D. waste acid from chemical factories pumped into rivers.

(Suggested correct answer: B)

Committee Observations

In general, the questions in this assessment address everyday topics that average citizens might be expected to encounter. However, although the educators who designed the assessment claim that it tests higher order thinking skills, many test items require only low-level cognitive skills, mostly at the knowledge level, and occasionally at the application level. The 4th-grade exam includes questions that assess student awareness of how common technological devices function in their environment; these questions do not require recall of specific technical knowledge or jargon. Many items on the 7th-grade exam, however, either require factual recall or rely on logical reasoning. On the whole, a number of questions could be answered with little or no technological knowledge or understanding.

Industrial Technology Literacy Test

Background

Sponsor/Creator

Michael Hayden, Iowa State University

Purpose

Assess the level of industrial-technology literacy among high school students

What is measured

Knowledge in systems, applications, and interpretations of industrial technology

Target population

American high school students

Item format

Multiple choice

Sample size

806 high school and 265 college students


Frequency of administration

Once in 1988 or 1990

Availability

Dissertation by Michael Allen Hayden held at Iowa State University

Scope

Michael Hayden, a Ph.D. candidate in industrial education and technology at Iowa State University, created the Industrial Technology Literacy Test as part of his dissertation. The questions in the instrument, which were generated by students in an advanced industrial education and technology course in the spring of 1988, were modified and evaluated to create the present exam. The 45-question exam was administered to a group of high school students in Iowa in 1988 or 1989. The questions are intended to show students’ knowledge of industrial systems, applications, and interpretation. The results of Hayden’s study were correlated with several factors, such as grade level, gender, mother/father’s contact with tools or machines, and previous courses in industry/technology.

Sample Items

  • Multiple choice

The space shuttle and the Alaskan pipeline have as their most common characteristic the fact that:

  A. they were both the center of accidents

  B. they were both invented in the USA

  C. they are both made of metal

  D. USA workers made both of them

  E. they are both transportation systems

(Suggested correct answer: E)

  • Multiple choice

A superconductor is:

  A. a material that has very little electrical resistance at a certain temperature

  B. a type of elevated train

  C. a machine that accelerates nuclear material

  D. the electronic component that makes compact discs possible

  E. a type of metal used in cookware

(Suggested correct answer: A)

  • Multiple choice

One of the fastest growing “high tech” firms is RBI Incorporated. During each of the past 5 years it has added another 50 R&D specialists to its staff. These and other technologists have consistently kept RBI at the leading edge of computer innovation. On the average, the speed and capacity of RBI’s CPUs has doubled every year. Their X1000 microcomputer can perform a calculation in a nanosecond. The chip that allows this speed stores a megabyte of information.

How many calculations can an X1000 computer perform in 1 second?

  A. 10³

  B. 10⁵

  C. 10⁹

  D. 10¹²

  E. 10¹⁵

(Suggested correct answer: C)
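The arithmetic behind the suggested answer is a unit conversion: one calculation per nanosecond is one calculation per billionth of a second. A one-line check (an illustration added here, not part of the instrument):

```python
# 1 second = 10**9 nanoseconds, so one calculation per nanosecond
# means 10**9 calculations per second.
ns_per_second = 10**9
calcs_per_second = ns_per_second * 1   # one calculation each nanosecond
print(calcs_per_second)                # 1000000000, matching answer C
```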

Committee Observations

Considering the broad range of questions about industrial technology in this instrument, it may be appropriate for measuring industrial-technology literacy among high school students who have taken courses in the field. A few of the questions require higher order thinking, such as interpreting a graph; however, the majority of questions require factual recall. The exam is also gender biased, as its author recognized. In addition, the answer choices for some questions (e.g., Sample Question 1) appear to be “incomparable alternatives” (i.e., some questions either have no clear answer or have more than one correct answer).


Infinity Project Pretest and Final Test

Background

Sponsor/Creator

Geoffrey Orsak, Southern Methodist University, with sponsorship from Texas Instruments and the Institute for Engineering Education

Purpose

Basic aptitude (pre-test) and student performance (end-of-year test)

What is measured

Cognitive skills and curriculum-related knowledge

Target population

American high school students

Item format

Open-ended and multiple-choice questions

Sample size

Thousands of students in 20 states

Frequency of administration

Ongoing since 1999

Availability

Samples are available at ftp://ftp.prenhall.com/pub/esm/sample_chapters/engineering_computer_science/orsak/index.html

Scope

Geoffrey Orsak, dean of the School of Engineering at Southern Methodist University, founded the Infinity Project in 1999 to interest more high school students in pursuing careers in engineering. The Infinity Project is a one-year high school curriculum designed for students who have taken algebra II and at least one course in a laboratory science. The curriculum, which focuses on information technology, includes textbooks, an Infinity Technology Kit for use in the classroom, and training for educators. The textbook, Engineering Our Digital Future (Prentice Hall, 2002), covers a variety of subjects in engineering and technology: the world of modern engineering, creating digital music, making digital images, math you can see, digitizing the world, coding information for storage and security, communicating with ones and zeros, networks from the telegraph to the Internet, and the big picture of engineering. According to the Infinity Project website, more than 65 percent of students who complete the course plan to study engineering in college.

The problem-solving pre-test has 10 questions to measure cognitive skills, such as recognition of discrete patterns from continuous patterns, proportional reasoning, and reverse implication. All questions are open ended and include at least one figure. The end-of-year basic test (from May 2003) consists of 12 multiple-choice knowledge-based questions that cover course content.

Sample Items7

  • Coding information—problem-solving pre-test

Compressing information without information loss:

Engineers compress information. In critical situations, they can retrieve all the information.

7. The Infinity Project™, Institute for Engineering Education, School of Engineering, Southern Methodist University.


On the grid below, each row has a pattern representing the number of alternating white and black squares. For example, the first row has 9 white squares. The second row has 1 white, 1 black, 4 white, 1 black, and 2 white squares. If the # sign represents a new row, then this image can be represented by the sequence

9#11412#12312#1111212#1121112#11322#11412#9

Using the blank grid above, determine the image for the following sequence.

9#333#21312#1151#21312#333#9

(Suggested correct answer: The letter “O”)
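The encoding in this item is run-length coding: each digit gives the length of the next run of squares, alternating white then black, and # marks a new row. A minimal decoder (an illustration added here, not part of the instrument) makes the scheme concrete. Note that the fourth group of the printed answer sequence, 1151, sums to only 8 squares; a trailing 1 is assumed (11511) so that every row fills the 9-square grid.

```python
def decode_row(runs: str) -> str:
    """Expand one run-length row; digits alternate white ('.') then black ('X')."""
    out, color = "", "."
    for digit in runs:
        out += color * int(digit)
        color = "X" if color == "." else "."
    return out

def decode_image(seq: str) -> list[str]:
    """'#' separates rows, exactly as in the test item."""
    return [decode_row(r) for r in seq.split("#")]

# '11511' assumed for the fourth row (printed as '1151', one square short).
for row in decode_image("9#333#21312#11511#21312#333#9"):
    print(row)
```

Run on this sequence, the black squares trace the letter “O”, matching the suggested answer.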

  • Signal analysis—end-of-year basic test (May 2003)

If the period of a sinusoidal signal is 0.4 ms, what is the frequency?

  A. 2500 Hz

  B. 2500 MHz

  C. 2.5 × 10⁴ Hz

  D. 2.5 × 10⁵ Hz

(Suggested correct answer: A)
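The suggested answer follows from the reciprocal relationship between period and frequency, f = 1/T, with the period converted from milliseconds to seconds. A worked check (an illustration added here, not part of the exam):

```python
# f = 1 / T, with T = 0.4 ms = 0.4e-3 s
period_s = 0.4e-3
frequency_hz = 1 / period_s
print(round(frequency_hz))  # 2500 Hz, matching answer A
```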

  • Communication technology—end-of-year basic test (May 2003)

For a touchtone telephone which uses two tones per character, how many tones are used to make the signals for the twelve buttons?

  A. 2

  B. 7

  C. 12

  D. 27

(Suggested correct answer: B)
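The suggested answer reflects how a standard touchtone (DTMF) keypad works: the 12 buttons form a 4-row by 3-column grid, and each button sounds one row tone plus one column tone, so only row tones plus column tones are needed in total. The counting argument, sketched:

```python
# Standard DTMF keypad: 4 rows x 3 columns of buttons.
rows, cols = 4, 3
buttons = rows * cols          # 12 buttons, as in the question
distinct_tones = rows + cols   # each button = 1 row tone + 1 column tone
print(buttons, distinct_tones) # 12 7, matching answer B
```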

Committee Observations

The pre-test, with its open-ended format, tests higher order thinking skills at the applications, analysis, and evaluation levels. It also assesses students’ common sense, intelligence, and creativity, but these are not closely related to knowledge of specific technological systems. The end-of-year basic test is fact oriented and specific to course content and, therefore, not appropriate for assessing technological literacy in a general sense. A list of 36 cognitive specifications that were used to formulate the pre-test could be used to design an assessment of technological literacy.

Information Technology in a Global Society

Background

Sponsor/Creator

International Baccalaureate Organization (IBO), Geneva, Switzerland

Purpose

Student evaluation

What is measured

Students’ knowledge of information technology terminology, concepts, developments, trends, social significance, and ethical issues

Target population

16–19-year-old high school students who have taken the Information Technology in a Global Society course


Item format

Multiple-choice and extended-answer questions, portfolio, and project

Sample size

The IB program is active in 1,468 schools in 119 countries; 56,284 diploma candidates took a total of 186,661 exams (in all subjects) in May of 2004

Frequency of administration

Semiannually at the standard level since 2002; higher level exams will be available in 2006

Availability

IBO in North America at ibna@ibo.org

Scope

The International Baccalaureate Organization (IBO) oversees the IB Diploma Program, which offers intensive pre-university courses and exams. Students in the IB program choose a focus of concentrated study while also pursuing a broad education in the sciences and humanities. The purpose of the Information Technology in a Global Society (ITGS) course is to give students a broad knowledge of information technology, skills to understand and explore new technologies, and an appreciation of the ethical and social effects of technology on the world.

The student assessment includes components designed by IBO (external), taken by every student enrolled in the course, and by the school (internal), taken only by the students in that school. The assessment requires that students demonstrate an ability to understand, apply, use, discuss, evaluate, explore, and construct information technology in four areas: terminology and concepts, developments and trends, social significance, and ethical considerations. The external review has two parts: a 40-question multiple-choice exam focused on tools and applications of information technology; and five extended-response questions that emphasize social and ethical considerations related to information technology. Extended-answer questions are graded according to a detailed rubric that awards points based on the mention of correct topics/concepts.

The internal assessment includes a portfolio with at least four pieces of written work and a project. The works in the portfolio, which focus on “the social significance of, and ethical considerations arising from, the widespread use of information technology in society,” are drawn from the following categories: abuse/security/crime; global society; the workplace; privacy; leisure, home, and travel; education; and networks/communication. The project involves solving a problem that requires extensive use of information technology hardware and software and the integration of technology tools.

Sample Items8

  • Multiple-choice question

What feature of a spreadsheet makes it ideally suited to the construction of financial models?

  A. Video clips can be incorporated into some cells.

  B. Data can be validated easily by the use of a customized macro.

  C. Formulae allow the effect of a change of variable to be seen easily.

  D. Three-dimensional images can be created from the model.

(Suggested correct answer: C)

  • Extended-response question

Some schools issue identification cards to each student. These cards are similar to a credit card and contain the student’s identification number and other personal information. When a student arrives at school in the morning, the student swipes the identification card through a card reader which records the date and time of arrival at the school. In the afternoon the card is again used to record when the student has left the school.


Students may also use their identification card to purchase small stationery items in the school store, and lunches in the cafeteria. These purchases will be billed at the end of the month to the parents. The student’s identification card is also used to sign out books in the school library. As an added benefit, some local stores give a 5% discount to

8. Information Technology in a Global Society, Standard Level Paper 1, November 2002 © International Baccalaureate Organization 2002. The assessment model for the subject was changed in May 2004, and this style of question is no longer used.


students who show their identification cards when making a purchase.

  a. State two pieces of student information, other than identification number, which could be contained in the card.

  b. State two methods for indicating the identification number on the card so that it could be read by the identification card reader.

  c. Describe how two items of information are obtained by the school about a student without the student knowing it.

  d. Discuss three social and/or ethical concerns which students may have about owning and using the identification card and weigh up the importance of your arguments.

(Suggested correct answers: (a) photograph of the student, name of the student, signature of the student, address, telephone number, etc.; (b) printed as letters and numbers and read by using OCR software, printed as a bar code, on a magnetic strip, on a chip; (c) pattern of attendance, accumulated number of absences and latenesses, eating habits and food preferences, type of library products signed out, etc.; (d) ease of duplication for misuse, access by the student to the data collected, number and identity of people with access, access of data to other organizations or institutions, etc.).

Committee Observations

This instrument assesses students’ knowledge of information technology in considerable depth. The format allows for a thorough analysis of student achievement using a variety of assessment tools. In the external assessment, most of the multiple-choice questions rely on recall of information. The open-ended questions require higher order thinking in cognitive and affective dimensions. It was not possible to judge the capabilities portion of the assessment because the external assessment did not include skills tasks. Although this instrument has a number of strong points, it assesses only knowledge of information technology specifically, as opposed to knowledge of technology in general.


Massachusetts Comprehensive Assessment System—Science and Technology/Engineering

Background

Sponsor/Creator

Massachusetts Department of Education

Purpose

Monitor individual student achievement, gauge school and district performance, satisfy requirements of No Child Left Behind Act

What is measured

Knowledge of technology and engineering

Target population

5th-, 8th-, and 10th-grade students in Massachusetts

Item format

Multiple-choice and open-response items

Sample size

74,605 5th-grade and 78,686 8th-grade students in Massachusetts

Frequency of administration

Annually since 1998

Availability

http://www.doe.mass.edu/mcas/testitems.html

Scope

The 5th- and 8th-grade assessment is part of a combined Science and Technology/Engineering Test that all students must take. Ten of the 39 questions are devoted to technology and engineering. The 10th-grade Technology/Engineering Test is one of four subject-area assessments (the others are biology, chemistry, and introductory physics) designed for students who have taken courses in these areas. It includes 20 multiple-choice questions and two open-response questions.

The questions become increasingly sophisticated and difficult through the grades. Fifth-grade students are tested on their knowledge of materials and tools and their understanding of the design process. The 8th-grade test requires deeper knowledge of the nature of technology, as well as specific domains of technology, such as construction, communication, manufacturing, transportation, and biotechnologies. The 10th-grade test is even more specific, with topics such as power and energy technologies in fluid, thermal, and electrical systems.

Sample Items

  • 5th-grade multiple-choice question (2003)

Which of the following tools would be most useful in determining the length and width of a school cafeteria?

  A. scale

  B. centimeter ruler

  C. tape measure

  D. thermometer

(Suggested correct answer: C)

  • 5th-grade open-ended question (2003)

The lever, pulley, inclined plane, wedge, wheel-and-axle, and the screw are simple machines.

  a. Identify and sketch four of these simple machines.

  b. For each of the four machines that you sketch, describe an example of how it is used.

  • 8th-grade multiple-choice question (2004)

Several students are entering a bridge-building contest that requires using ice cream sticks and glue to construct the strongest bridge possible. The bridges must be 5 in. wide and span a length of 18 in.

Which of the following tests is the most accurate way to determine the strongest span design for these bridges?

  A. roll toy cars across each bridge until it collapses

  B. place concrete construction blocks on top of each bridge until it collapses

  C. stack coins on both ends of each bridge until it collapses

  D. place D-cell batteries at the center of each bridge until it collapses

(Suggested correct answer: D)

  • 10th-grade multiple-choice question (2004)

In the first step of making some ceramic cups, the following manufacturing process is used. Liquid clay is poured into a mold, allowed to solidify, then removed from the mold.

What is the name of this manufacturing process?

  A. casting

  B. milling

  C. finishing

  D. refining

(Suggested correct answer: A)

Committee Observations

This instrument is well matched to the Massachusetts standards in science, technology, and engineering at the 5th-, 8th-, and 10th-grade levels. Most of the questions are multiple choice and focus on knowledge of course content, identification of terms, and rote memorization. A few open-ended questions require higher order thinking or design thinking to solve problems. For some questions, the distracters seemed just as plausible as the suggested correct answers.

Multiple-Choice Instrument for Monitoring Views on Science-Technology-Society Topics

Background

Sponsor/Creator

Glen Aikenhead and Alan Ryan, University of Saskatchewan, with funding from the Canadian Social Sciences and Humanities Research Council


Purpose

Curriculum evaluation, research

What is measured

Student attitudes and understanding of science, technology, and society (STS)

Target population

12th-grade high school students in Canada

Item format

Multiple choice

Sample size

5,250 English-speaking and 1,732 French-speaking students

Frequency of administration

Once, September 1987–August 1989

Availability

G.S. Aikenhead and A.G. Ryan, University of Saskatchewan

Scope

This instrument was developed in the late 1980s in response to changes in the science curriculum in secondary schools in Ontario that placed new emphasis on the social context of science and technology. Questions were derived from an eight-part conceptual scheme: science and technology, influence of society on science/technology, influence of science/technology on society, influence of school science on society, characteristics of scientists, social construction of scientific knowledge, social construction of technology, and nature of scientific knowledge. During the development of the instrument, high school seniors were asked to provide written responses to statements about science/technology/society (STS) topics; their views were then used to create answers for multiple-choice questions. The answers not only reflect students’ stated opinions but are written in their own words.

Sample Items

  • Science and technology, defining technology

Defining what technology is can cause difficulties because technology does many things in Canada. But MAINLY technology is:


Your position, basically: (please read from A to J, and then choose one.)

  A. very similar to science.

  B. the application of science.

  C. new processes, instruments, tools, machinery, appliances, gadgets, computers, or practical devices for everyday use.

  D. robotics, electronics, computers, communications systems, automation, etc.

  E. a technique for doing things, or a way of solving practical problems.

  F. inventing, designing, and testing things (for example, artificial hearts, computers, space vehicles).

  G. ideas and techniques for designing and manufacturing things, for organizing workers, business people and consumers, for the progress of society.

  H. I don’t understand.

  I. I don’t know enough about this subject to make a choice.

  J. None of these choices fits my basic viewpoint.

  • Influence of society on science/technology, government

Science would advance more efficiently in Canada if it were more clearly controlled by the government.

Your position, basically: (Please read from A to H, and then choose one.)

  A. Government should control science and make it more efficient by coordinating research work and by providing the money.

  B. The government’s control should depend on how useful the particular scientific research will be for Canadian society. Useful research should be more closely controlled and money should be provided.

  C. Government should NOT control science, but should give it money and leave the conduct of the science up to the scientists.

  D. Government should NOT control science but should leave the scientific research to private agencies or corporations; though government should provide the money for the scientific research.

  E. Government cannot make science more efficient because government is inefficient and cannot always be trusted.

  F. I don’t understand.

  G. I don’t know enough about this subject to make a choice.

  H. None of these choices fits my basic viewpoint.

  • Social construction of scientific knowledge, professional communication among scientists

When a research team makes a discovery, it is all right for them to announce it to the press before other scientists have discussed it.

Your position, basically: (Please read from A to H, and then choose one.)

The research team should announce it directly to the public:

  A. to get the credit for the discovery and prevent other scientists from stealing the idea.

  B. because the public has the right to know about a discovery as soon as it is made. Other scientists can discuss it later.

  C. the research team should be free to decide who hears about it first.

The research team should first present it to other scientists for discussion:

  D. to test and verify the discovery and prevent inaccurate stories from being published. This would ensure that harmful or embarrassing errors are worked out before it was made public.

  E. to improve the discovery before it is made public.

  F. I don’t understand.

  G. I don’t know enough about this subject to make a choice.

  H. None of these choices fits my basic viewpoint.

Committee Observations

Because the answer choices are generated by students, they provide a genuine reflection of how students feel about STS topics. By design, no attempt is made to gauge capabilities or knowledge of technology concepts, per se. Therefore, it would be difficult to assess technological literacy based on the results. In addition, because the authors make no judgments about the relative value of the answer choices, it is very difficult to determine students’ understanding of STS issues. A high level of reading competency is required to complete the test. Thus, students with reading problems may have difficulty with this format.

New York State Intermediate Assessment in Technology

Background

Sponsor/Creator

New York State Education Department and University of the State of New York (USNY)

Purpose

Student evaluation; curriculum improvement

What is measured

Students’ knowledge and skills in seven areas

Target population

7th- and 8th-grade students in New York who have taken a technology education course

Item format

Multiple choice and extended response

Sample size

Unknown

Frequency of administration

Unknown

Availability

http://www.emsc.nysed.gov/ciai/mst/pub/tqsample.pdf

Scope

In 1986, New York schools began offering an Introduction to Technology course for 7th- and 8th-grade students. The State Department of Education developed this instrument in 2001 to test their knowledge. The exam is administered in one 90-minute session and contains 40 multiple-choice and 10 extended-answer questions in order of difficulty. The sample assessment reviewed by the committee included just 14 multiple-choice and four extended-answer questions. The sample questions covered engineering design; tools, resources, and technological processes; computer technology; technological systems; history and evolution of technology; impacts of technology; and management of technology. The state of New York does not require that schools administer this test or report the results.

Sample Items

  • History and evolution of technology (multiple choice)

Eli Whitney’s invention of the cotton gin changed the production of cotton by

  1. creating lighter cotton

  2. saving labor costs at harvest time

  3. enabling the production of cloth

  4. proving that some processes could never be automated

(Suggested correct answer: 2)

  • Technological systems (multiple choice)

Which type of system is operated by liquid under pressure?

  1. mechanical

  2. steam

  3. pneumatic

  4. hydraulic

(Suggested correct answer: 4)

  • Tools, resources, and technological processes (multiple choice)

Which device produces power by means of a chemical reaction?

  1. generator

  2. alternator

  3. battery

  4. engine

(Suggested correct answer: 3)

  • Tools, resources, and technological processes (extended response)

Your class has been studying skyscraper design and the tremendous influence of skyscrapers on the landscape of cities. You are part of a group that has been assigned to build a model of the Empire State Building. You will be using balsa wood, construction paper, and acrylic plastic for your model.

Describe how each tool would be used:

  • Backsaw

  • Hot-melt glue and glue gun

  • Tape measure

  • Scissors or x-acto knife

  • Abrasive paper

  • Computer

(Students are awarded three points for identifying an appropriate use for at least five tools; two points for identifying an appropriate use for three or four tools; one point for one or two appropriate uses; and no points for no response or no identification of appropriate uses of tools.)

Committee Observations

This assessment includes a balance of multiple-choice questions on technological concepts and open-ended questions that require a reasonable level of higher order thinking. Although this is a paper-and-pencil assessment, some of the open-ended questions touch on the capabilities and ways of thinking and acting dimension of technological literacy. A few multiple-choice questions do not have clear answers (e.g., Sample Question 1), and some of the knowledge-based questions require only recall of definitions.


Praxis Specialty Area Test: Technology Education

Background

Sponsor/Creator

Educational Testing Service (ETS)

Purpose

Teacher licensing

What is measured

Pedagogical practices and knowledge in four areas of technology

Target population

College education majors who wish to teach technology education at the middle or high school level

Item format

Multiple choice

Sample size

Unknown

Frequency of administration

Regularly

Availability

http://ftp.ets.org/pub/tandl/0050.pdf

Scope

ETS offers a series of three tests, administered by the Praxis Service, to assess beginning teachers. Praxis I measures basic academic skills of students entering teacher-education programs. Praxis II tests mastery of particular subjects and is designed primarily to assist in licensing teachers. Praxis III assesses the classroom performance of first-year teachers and is also used in licensing decisions.

Of more than 100 Praxis II tests, the only one focused on technology education is designed for prospective technology education teachers at the middle and high school levels. In recent years, ETS has modified the test to bring it into alignment with the ITEA Standards for Technological Literacy. The test has 120 multiple-choice questions divided into five categories: pedagogical and professional studies, information and communication technologies, construction technologies, manufacturing technologies, and energy/power/transportation technologies. The committee reviewed a 12-question sample test, provided by ETS on the Web. Thirty percent of Praxis II is devoted to pedagogical issues, including program development, implementation, and evaluation. Questions covering the four types of technologies focused on design, systems, processes, outputs, resources, and managerial processes.

Sample Items9

  • Pedagogical and professional studies

A student in the process of solving a fabrication problem in the manufacturing laboratory asks the teacher what assembly procedures should be used. The teacher’s best response would be to

  A. give an opinion as to the best assembly procedure for the particular problem

  B. suggest two or three possible assembly procedures and have the student select one

  C. place the responsibility completely on the student for making the judgment

  D. use leading questions to help the student review and analyze the relative merits of several assembly procedures

  E. refer the student to a reference on assembly procedures

(Suggested correct answer: D)

  • Information and communications technologies

The most important consideration in designing successful messages to be transmitted through graphic communications is knowledge and understanding of

  A. current technologies

  B. the capabilities of the designer

  C. the estimated cost of the project

  D. the limitations of the printer

  E. the nature of the audience

(Suggested correct answer: E)

9

Materials were selected from Tests at a Glance, Educational Testing Service. Reprinted by permission of Educational Testing Service, the copyright owner, for limited use by the National Academy of Engineering.

  • Construction technologies

Which two of the following composite materials used in manufacturing would generally be classified as laminar composites in the United States?

  I. Particle board

  II. Plywood

  III. Fiberglass

  IV. Bimetal coins

  V. Concrete

  A. I and II

  B. I and III

  C. II and IV

  D. III and V

  E. IV and V

(Suggested correct answer: C)

Committee Observations

Although the items in the assessment are aligned with the ITEA Standards for Technological Literacy, they do not probe higher order thinking. Most questions focus on terminology, recall of definitions, and the identification of basic concepts. The test does not address the question of whether a teacher could put any of his or her knowledge related to technology into practice.

Provincial Learning Assessment in Technology Literacy

Background

Sponsor/Creator

Saskatchewan Education

Purpose

Analyze students’ technological literacy to improve their understanding of the relationship between technology and society

What is measured

Capabilities, knowledge, attitudes, and practices related to technological literacy

Target population

5th-, 8th-, and 11th-grade students in Saskatchewan, Canada

Item format

Multiple-choice, open-response, and hands-on computer and technology skills items

Sample size

Approximately 3,500 students from 182 schools

Frequency of administration

Once in 1999

Availability

http://www.sasked.gov.sk.ca/branches/cap_building_acct/afl/docs/plap/techlit/1999techlit.pdf

Scope

Saskatchewan Education created the Provincial Learning Assessment in Technology Literacy in 1999 to assess student skills, knowledge, attitudes, and practices in technological literacy and collect information on their home and school environments. The instrument focuses on four domains: (1) understanding, describing, and adapting technology; (2) accessing, processing, and communicating information; (3) responsible citizenship and technology; and (4) using technology, including computers.

Two different exams were administered. The performance exam consisted of five stations at which students carried out hands-on activities, such as word processing, using the Internet, using technology, and designing, planning, and building models of technology. Student performance was assessed on a scale of 1 to 5 with a well-defined rubric for each task (see sample question). The paper-and-pencil exam included open-format and multiple-choice items. Students were also required to submit a research project completed at school. Performance-related aspects of the exam were scored on a scale of 1 to 5 using a rubric for achievement in each domain.

Saskatchewan Education has not repeated this assessment and has no plans to do so in the future.

Sample Items

  • 5th-, 8th-, and 11th-grade multiple-choice question

The main reason special effects are used in a number of commercials is

  A. To entertain the television audience

  B. To allow producers to show their creativity

  C. To show how technology has advanced

  D. To sell more product

  E. None of the above

(Suggested correct answer: D)

  • 8th-grade performance-station activity

Hint the best SEARCH ENGINE for item #5 is YAHOO

Item #5 Use the internet to find an INTERNET address or INTERNET site name on the Saskatchewan Roughriders

Item #6 Use the INTERNET to find the population of New Zealand

  • 8th-grade open-response question

Technology means different things to different people. When you read the word “technology” what comes into your mind? Tell what technology means to you by drawing pictures and writing about it in the space below.

Please write your definition of technology:

(Suggested best answer: A level-5 answer includes a sophisticated definition that encompasses a full range of technologies and provides four or more examples, with two strong contrasts, such as simple vs. complex. A level 3 answer provides a general definition that includes one criterion of product, process, and reason, and three or more examples, with one contrast. A level 1 answer includes one or two examples of similar technologies.)
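As a rough illustration, the example and contrast counts in this rubric can be encoded as a simple function. This is a simplified sketch under stated assumptions: the actual rubric also weighs the sophistication of the written definition (a rater judgment not modeled here), and the even-numbered levels fall between these anchors:

```python
def definition_level(num_examples: int, num_contrasts: int) -> int:
    """Approximate the odd-numbered anchor levels of the definition rubric.

    Level 5: four or more examples with two strong contrasts.
    Level 3: three or more examples with at least one contrast.
    Level 1: one or two examples of similar technologies.
    """
    if num_examples >= 4 and num_contrasts >= 2:
        return 5
    if num_examples >= 3 and num_contrasts >= 1:
        return 3
    return 1
```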

Committee Observations

Many of the questions and skill assessments require that students engage in higher order thinking, as opposed to rote memorization. The rubrics seem flexible enough for educators to gauge multiple levels of student accomplishment. The assessment is computer intensive in both the performance-skills and knowledge sections.

Pupils’ Attitudes Toward Technology (PATT-USA)

Background

Sponsor/Creator

E. Allen Bame, Marc de Vries, and William E. Dugger, with funding from Virginia Polytechnic Institute and State University, the state of New Jersey, and the International Technology Education Association

Purpose

Assess student attitudes toward and knowledge of technology

What is measured

How gender, age, parents’ professions, technology in the home, and courses in technology influence students’ conceptions of and attitudes toward technology

Target population

American middle school students

Item format

Multiple-choice and open-ended questions

Sample size

10,349 students in seven states

Frequency of administration

Once in 1988

Availability

http://www.iteawww.org (under Conference Proceedings)

Scope

Pupils’ Attitudes Toward Technology (PATT), developed in the Netherlands in 1984 by Marc de Vries and colleagues, has been adapted and used in more than 20 countries around the world, making it particularly useful for cross-country comparisons. Nearly 77 percent of the students who took the PATT-USA survey were enrolled in or had previously taken a technology education or industrial arts class. The open-ended question asked students to describe what technology is. The 100 multiple-choice questions were divided into three sections. In one section, 11 questions asked for demographic information. In a second section, 58 questions were related to attitudes toward technology and used a Likert-scale response format (agree, tend to agree, neutral, tend to disagree, disagree). The questions in this section addressed interest in technology, beliefs about the consequences of technology, perceptions of the difficulty of technology, ideas about technological professions, gender stereotypes, and student ideas about technology as a subject in school. The final section included 31 questions that tested understanding and conceptions of technology. In this section, students were given three choices: agree, disagree, and don’t know. The questions concerned the relationship between technology and society, the relationship between technology and science, skills in technology, and the raw materials (or “pillars”) of technology.

Sample Items

  • Gender stereotypes

Boys know more about technology than girls do

(On the 5-point Likert scale, girls were more likely to consider technology an activity for both boys and girls [mean for girls = 1.66; mean for boys = 2.28]).

  • Consequences of technology

Because technology causes pollution, we should use less of it

(Students whose parents’ professions had “nothing” to do with technology had significantly more negative views toward the consequences of technology than students whose parents’ professions had at least “a little” to do with technology. If technical toys or a personal computer was present in the home, attitudes toward the consequences of technology were more positive. Students who had taken a technology course or were interested in a technical profession also had more positive attitudes toward the consequences of technology.)

  • Knowledge about the relationship between science and technology

In my opinion, I think technology is not very old

(35 percent of students agreed with this statement, 27 percent did not know if it was true or not, and 38 percent disagreed.)

Committee Observations

The PATT-USA focuses on assessing attitudes of students toward technology. The questions that test students’ understanding of technological concepts do not require higher order thinking. Nevertheless, questions eliciting student views about technology yield some interesting results, especially with regard to gender differences. The survey is long and so might be difficult for younger students to complete.

Science and Technology: Public Attitudes and Public Understanding

Background

Sponsor/Creator

National Science Board

Purpose

Monitor public attitudes, knowledge, and interest in science and technology issues

What is measured

Attitudes, opinions, and knowledge of science and technology

Target population

U.S. residents 18 and older

Item format

Survey

Sample size

Approximately 2,000 adults

Frequency of administration

Biennially from 1979 to 2001

Availability

http://www.nsf.gov/sbe/srs/seind02/c7/c7h.htm

Scope

The National Science Board (NSB), an independent body that oversees the National Science Foundation and provides policy advice to the president and Congress, has conducted biennial telephone surveys to assess public knowledge, attitudes, and opinions about science and technology since 1979. The 2001 survey questions were organized into the following categories: public interest and knowledge of science and technology (S&T); public attitudes toward S&T; public image of the science community; where Americans get information about S&T; science fiction and pseudoscience; and demographic questions (age, computer access, educational level, occupation, geographic location in the United States, race/ethnicity, sex). Based on this information, NSB reports on trends in public knowledge and interest in science correlated with the demographic data. NSB also compares American attitudes with attitudes on similar surveys in the European Union, Canada, and Japan.

The 2001 survey was the last in the series. Since 2004, the NSB report has relied on the 2001 survey, new Eurobarometer surveys on S&T, a number of Gallup polls, and other sources. Currently, NSB has no plans to resume the telephone surveys, and the 2006 report will also rely on data from other sources. A major conclusion of both the 2001 and 2004 reports was that, although Americans are interested in scientific discoveries and new technologies, they do not feel well informed or know a lot about technology-related issues.

Sample Items

(From the 2001 survey)

  • Where Americans get information about S&T

Now, I’d like to read you a short list of television shows and ask you to tell me whether you watch each show regularly, that is, most of the time, occasionally, or not at all.

Do you watch Nova regularly, occasionally, or not at all?

(8 percent answered regularly, 29 percent occasionally, and 63 percent not at all.)

  • Public interest and knowledge of S&T

Lasers work by focusing sound waves, true or false?

(45 percent of respondents knew that this statement was false. 61 percent of men, but only 30 percent of women, answered this question correctly.)

  • Public attitudes toward S&T

I’m going to name three types of biotechnology applications. I’d like you to tell me if you strongly support, moderately support, moderately oppose, or strongly oppose these uses of biotechnology.

Using genetic testing to detect diseases we might have inherited from our parents, such as cystic fibrosis. Overall would you say you strongly support, moderately support, moderately oppose, or strongly oppose this use of biotechnology?

(89 percent of survey participants answered either strongly support or moderately support genetic testing for inherited diseases. 9 percent were opposed.)

Committee Observations

(2001 survey only)

The NSF Indicators, Public Understanding of Science and Technology reports provide the only long-term data on trends in U.S. adult knowledge and attitudes toward science and, to a lesser extent, technology. Most of the questions focused on attitudes, and the knowledge-related questions did not require higher order thinking. Respondents with college degrees fared better on the limited number of knowledge questions than respondents who did not have college degrees. This may be attributable simply to exposure to more scientific information, or it may indicate a bias in the poll. As an assessment of technological literacy, the survey had limited value because very few questions focused on or emphasized technology. Science is related to technology, of course, but the connection is indirect. Therefore, this survey is not very useful for assessing many domains of technological literacy.

Student Individualized Performance Inventory

Background

Sponsor/Creator

Rodney L. Custer, Brigitte G. Valesey, and Barry N. Burke, with funding from the Council on Technology Teacher Education, International Technology Education Association, and the Technical Foundation of America

Purpose

Develop a model to assess the problem-solving capabilities of students engaged in design activities

What is measured

Student achievement in 12 areas of design and problem solving

Target population

American high school students

Item format

Rubric

Sample size

Two small high school classes of 12 and 15 students

Frequency of administration

Several times for research purposes

Availability

Rodney Custer, Department of Technology, Illinois State University

Scope

The Student Individualized Performance Inventory, developed by education researchers at three different institutions, is a model for assessing students’ problem-solving skills on a design activity. The model divides the design process into four dimensions: problem and design clarification; development of a plan; development of a model/prototype; and evaluation of the design solution. Each dimension is further categorized into three “strands.” Students are evaluated on each strand of the four dimensions by matching their performance with descriptions in a detailed rubric. A score of 1 indicates the novice level of proficiency, 2 beginner, 3 competent, 4 proficient, and 5 expert.
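The structure described above, four dimensions of three strands each, with every strand rated 1 (novice) to 5 (expert), can be represented as a simple data structure. A hypothetical sketch: the per-dimension averaging shown here is an illustration for summarizing ratings, not something the instrument itself prescribes:

```python
# One student's strand ratings (1 = novice ... 5 = expert).
# Dimension names follow the instrument; the three scores per dimension
# stand in for its three strands, which are not all named in the summary.
ratings = {
    "problem and design clarification": [4, 3, 5],
    "development of a plan": [3, 3, 4],
    "development of a model/prototype": [5, 4, 4],
    "evaluation of the design solution": [2, 3, 3],
}

def dimension_averages(scores: dict[str, list[int]]) -> dict[str, float]:
    """Average the three strand ratings within each dimension."""
    return {dim: sum(s) / len(s) for dim, s in scores.items()}
```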

The authors tested their model with two small groups of high school students. The students were asked to design the “school locker of the future” and given eight hours over a period of two days to complete the task. Student achievement was correlated with a number of factors, including geographic location, technology education experience, grade level, mathematics and science achievement score, personality type, problem-solving style, and gender.

Sample Items

(Because this instrument is a rubric, rather than an exam, there are no sample questions. The following examples illustrate the detailed descriptions of student performance in various dimensions and strands spelled out in the rubric.)

  • Dimension: Problem solving and design clarification

Strand: Examine content and define problem

(5) Expert: Poses pertinent questions for clarification; identifies and prioritizes sub-problems (within the larger problem); explores context.

(4) Proficient: Poses questions; identifies sub-problems but does not prioritize. Ignores context.

(3) Competent: Identifies key content; defines problem adequately. Asks some pertinent questions. Ignores context.

(2) Beginner: Expresses limited knowledge of context of problem area; problem is defined but needs clarification. Asks questions but not pertinent and too few. Ignores context. Exhibits some indifference or frustration.

(1) Novice: Tends to home in on wrong problem, isolated subset, or easiest part to solve. Begins to solve without clarification or questions. Doesn’t see context. Exhibits considerable indifference or frustration.

  • Dimension: Model/prototype

Strand: Produce model/prototype

(5) Expert: Is adept with tools and resources, making continual adjustments to “tweak” the model/prototype. Demonstrates persistence with minor problems. Enjoys the challenge of refinements.

(4) Proficient: Uses tools and resources without guidance. Refines model to enhance appearance and capabilities.

(3) Competent: Uses tools and resources with little or no guidance. May redo model/prototype parts to improve quality.

(2) Beginner: Uses tools and resources with some guidance. May have difficulty selecting the appropriate resource. Refines work, but may prefer to leave model as first produced.

(1) Novice: Needs guidance in order to use resources safely and appropriately. Model/prototype is crude, with little or no refinements made.

Committee Observations

The Student Individualized Performance Inventory is a well-considered tool for assessing design skills, and the rubric adheres to the ITEA Standards for Technological Literacy. Acceptable performance for particular scores in each dimension or strand is well defined. The most attractive feature of this instrument is that it is based on authentic responses of learners. The instrument genuinely provides data based on the processes students use in design, as well as the outcomes that result from their work. The expert-to-novice scoring scale, as opposed to an A-to-F scale, is another positive feature. Like all rubrics, this one raises questions about the reliability of the rater. Normative words, such as “pertinent” and “limited,” may contribute to these questions. The underlying assumption that successful, effective designers are always associated with the same qualities may not be correct.

Survey of Technological Literacy of Elementary and Junior High School Students

Background

Sponsor/Creator

Ta Wei Lee, Wei Lin, Kuo-Hung Tseng, and Kuang-Chao Yu at National Taiwan Normal University

Purpose

Curriculum development and planning

What is measured

Knowledge in 10 areas of technological literacy and 6 technology systems

Target population

Elementary and junior high school students in Taiwan

Item format

Multiple choice

Sample size

3,066 9th-grade and 3,420 6th-grade students

Frequency of administration

Once in March 1995

Availability

Not available in English. An article describing the assessment is available online at: http://nr.stic.gov.tw/ejournal/ProceedingD/v8n2/68-76.pdf

Scope

The Survey of Technological Literacy among Junior High School and Elementary Students was created to give educators a reference point for planning a new curriculum emphasizing the study of technology. According to the test developers, the 80-question exam includes questions in 10 areas of technological literacy abilities and six technology systems.

Abilities in Technological Literacy:

  1. understand the definition and content of technology

  2. understand the major domains of technology

  3. understand the evolution of technology

  4. understand and predict future trends of technological development

  5. understand the basic principles of technology

  6. understand and use effectively the tools, machines, materials, products, and operational procedures of technology systems

  7. use technological literacy in the cognitive, affective, and psychomotor domains for problem solving

  8. make proper judgments of technology and its products through data gathering, analysis, and induction

  9. understand the impacts of technology on the individual, society, culture, and environment

  10. adopt measures to adapt to changes brought on by technology

Technology Systems:

  1. construction technology

  2. manufacturing technology

  3. transportation technology

  4. communication technology

  5. energy and power technology

  6. biotechnology

This instrument was intended for both elementary and junior high students. The first 40 questions were considered “fundamental” and appropriate for both groups. The next 40 questions were considered “advanced” and were only administered to junior high students. In addition to the exam questions, students were asked to provide their gender and grade level, which were used to demonstrate correlations in the final analysis of the assessment. Between 1995 and 2000, this assessment was used in several local studies and a number of master’s theses. However, it is not used regularly or widely to assess technological literacy.

Sample Items

(This instrument was written in Chinese and translated into English by one of the authors for the committee’s review.)

  • Fundamental section

In general, which of the following is not the function of a reservoir?

  A. Flood prevention

  B. Water supply for farm fields

  C. Water power supply

  D. Ecological conservation

(Suggested correct answer: D)

  • Fundamental section

Place the following air transportation technologies in order of their invention

  A. Hot air balloon → glider → airplane

  B. Glider → hot air balloon → airplane

  C. Hot air balloon → airplane → glider

  D. Airplane → hot air balloon → glider

(Suggested correct answer: A)

  • Advanced section

How is the method of laser incision on hard materials different from the traditional method?

  A. The tool does not contact the item

  B. The tool requires electricity

  C. The size of incision is uncontrollable

  D. The tool has multiple shapes

(Suggested correct answer: A)

Committee Observations

For the most part, the items in this assessment appear to be appropriate for the target populations. With the multiple-choice format, knowledge of technology can be assessed, but not students’ capabilities. Many of the test questions require higher order thinking, but problems with the translation to English seem to reveal cultural bias in some items.

Test of Technological Literacy

Background

Sponsor/Creator

Abdul Hameed, Ohio State University

Purpose

Research

What is measured

Knowledge in four areas: construction, manufacturing, communication, and transportation technologies

Target population

7th- and 8th-grade American students

Item format

Multiple choice

Sample size

1,350 students from 20 schools

Frequency of administration

Once in April 1988

Availability

Dissertation by Abdul Hameed held at Ohio State University

Scope

This test was developed by Abdul Hameed in the late 1980s as part of his Ph.D. dissertation in technical education (industrial arts) at Ohio State University. The 64-question exam, which is intended to be completed in a single class period, tests students’ understanding of using, making, and controlling technology.

Sample Items

  • A manufacturing control question

Which of the following items needs the highest design safety factor?

  A. Airplane

  B. Gasoline Engine

  C. Radio

  D. Bicycle

(Suggested correct answer: A)

  • A construction making question

Steel reinforcement is placed in concrete in order to

  A. Help keep the concrete from breaking and separating

  B. Improve the appearance

  C. Provide holes for ventilation

  D. Increase the weight

(Suggested correct answer: A)

  • An understanding transportation question

The first manned rocket to enter space was launched in the

  A. late 1930s and early 1940s

  B. late 1940s and early 1950s

  C. late 1950s and early 1960s

  D. late 1960s and early 1970s

(Suggested correct answer: C)

Committee Observations

This assessment covers a broad range of general knowledge about technology, but few questions require that students do anything other than recall information. The test does not require problem-solving, decision-making, or technology-related skills. A number of test items refer to specific technologies that were state of the art in the early 1980s but would not be familiar to many students today.

TL50: Technological Literacy Instrument

Background

Sponsor/Creator

Michael Dyrenfurth, Purdue University

Purpose

Gauge technological literacy

What is measured

General knowledge of technology in eight areas

Target population

High school students, university students, and adults

Item format

Multiple choice

Sample size

Unknown

Frequency of administration

Unknown

Availability

Michael J. Dyrenfurth, College of Technology, Purdue University

Scope

This 50-question, multiple-choice instrument is designed to assess technological literacy in eight areas: (1) working with technology; (2) technological procedures; (3) overview of technology; (4) overview of industrial technology; (5) fundamentals of communications technology; (6) applications of energy and power technologies; (7) fundamentals of materials and processing technologies; and (8) impact of technologies on society. Slightly more than half of the items address technological procedures.

Sample Items

  • Technological procedures: systems analysis and synthesis questions

Consider a typical factory’s automated spray paint station that uses a robot to paint parts passing on a conveyor. Which of the answers contains the best list of subsystems of such a work station?

  A. Controlling computer, transfer robot, auto-conveyor, cell perimeter

  B. Instrumentation unit, auto-conveyor, warehouse unit, read-out and input unit

  C. Auto-conveyor, controlling computer, spray robot, read-out and input unit

  D. Vision system, auto-conveyor, light system, transfer robot

(Suggested correct answer: C)

  • Fundamentals of materials and processing technologies: materials technology basics question

The process of tempering material:

  1. Softens the metal and removes internal stresses

  2. Increases the metal’s resistance to scratching and abrasion

  3. Toughens the material

  4. Is not described

(Suggested correct answer: 3)

  • Technological procedures: technology assessment/evaluation (impacts) question

To properly judge the effects of a technological innovation, one should:

  1. Measure the dollar effects resulting from it

  2. Estimate the impacts of it on our society

  3. Identify its impact on the people using it

  4. All of the above

(Suggested correct answer: 4)

Committee Observations

Although this instrument includes questions that require interpretation of simple graphs and analog scales, the majority of items rely heavily on memorization and knowledge of terminology that may become outdated or may not transfer well among population groups. The instrument may not be appropriate for university students in most science or technology fields, because much of the content is basic and does not require higher education.


WorkKeys—Applied Technology

Background

Sponsor/Creator

ACT

Purpose

Determine workforce readiness; identify skills gaps in current and potential employees.

What is measured

Practical reasoning and problem-solving skills related to four applied-technology domains: electricity, mechanics, fluid dynamics, and thermodynamics

Target population

High school and community college students, adults transitioning to the workforce, current workers in technology-dependent businesses

Item format

32 items at four levels of difficulty administered over 55 minutes (online) or 45 minutes (paper and pencil)

Sample size

Since 1992, when ACT introduced the WorkKeys Program, some 9 million individuals have taken one or more of the program’s 10 assessments (M.J. Klemme, WorkKeys consultant, personal communication, December 20, 2005).

Frequency of administration

There is no fixed schedule of test administration. Assessments may be taken either through an employer licensed by ACT or through a licensed WorkKeys site, typically an educational institution.

Availability

Test details, sample items, and information about ordering a practice test are available at http://www.act.org/workkeys


Scope

The Applied Technology Assessment is one of 10 assessments offered by WorkKeys. The others are Reading for Information, Applied Mathematics, Business Writing, Writing, Locating Information, Teamwork, Observation, Listening, and Readiness. Although the assessments can be given individually, they were originally designed to be part of a larger ACT job-skills program. The program includes a component to help employers identify the skills necessary for specific jobs and a training element to close skill gaps revealed by the assessment.

Test items are grouped into four difficulty levels, 3 through 6, based on the number and complexity of the skills required to answer each item correctly. A Level 3 item, for instance, describes a simple system with three to five components, portrays a problem with one variable, and includes all the information necessary to solve it. A Level 6 item describes a complex system with 10 or more components, presents a variety of possible problem sources, and includes considerable extraneous information.

Assessment results can be presented as a level score, which ACT says should be used for employee selection, promotion, or other high-stakes purposes, or as a scale score, which can show individual improvement over time, provide for group comparisons, or indicate the likelihood of benefit from an educational opportunity.

ACT charges $4 per test for educational and government institutions that use the assessment with their own students; the rate is higher for businesses.
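ACT's actual scoring rules are not published alongside the sample items, so the toy sketch below is illustrative only; it exists to show the distinction drawn above between a level score (the highest difficulty band mastered, used for high-stakes decisions) and a scale score (a continuous total suited to tracking improvement). The 80 percent mastery cutoff and the weights are invented for the illustration.

```python
# Toy illustration only -- not ACT's actual WorkKeys scoring algorithm.
def level_score(correct_by_level, mastery=0.8):
    """Highest consecutive difficulty level (3-6) at which the mastery
    cutoff is met; reported no lower than Level 3 (hypothetical floor)."""
    earned = 3
    for level in sorted(correct_by_level):
        if correct_by_level[level] >= mastery:
            earned = level
        else:
            break
    return earned

def scale_score(correct_by_level):
    """A continuous total in which harder items contribute more, so small
    improvements show up even without crossing a level boundary."""
    return sum(level * frac * 10 for level, frac in correct_by_level.items())

# Fraction of items answered correctly at each difficulty level.
results = {3: 1.0, 4: 0.9, 5: 0.6, 6: 0.25}
print(level_score(results))  # 4 -- mastery stops at Level 4
print(scale_score(results))  # continuous value, comparable across attempts
```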

Sample Items[10]

Level 3

You are building a greenhouse like the one shown in Figure 1 for a local nursery. The owners specified that the greenhouse should have automatic vents, controlled by a thermostat, which will open when the temperature in the greenhouse gets too high for the plants. Figure 2 shows the floor plan of the greenhouse.

[10] These sample items appear on the WorkKeys website at http://www.act.org/workkeys/assess/tech and are reprinted with permission.


Figure 1

Figure 2

A thermostat will control the opening and closing of the automatic vents. It is a temperature-sensitive device that can be set to activate when the air around it reaches a certain temperature. The owners of the greenhouse want to have the vents open when the air around the majority of the plants reaches 90°F. At what height and location in Figure 2 should you install the thermostat so it gives the desired results?

  1. About 4 feet from the floor at location A

  2. About 4 feet from the floor at location B (suggested correct answer)

  3. About 8 feet from the floor at location C

  4. About 8 feet from the floor at location D

  5. Near the peak of the roof at location E

Level 4

Your industrial services company has been hired to deliver a small but heavy gearbox. The container is too small to justify renting a large truck and too heavy for the company’s pickup truck. You decide to rent a heavy-duty utility trailer and pull it with the pickup truck.

At which spot, labeled 1–5, on the trailer shown should you place the container to pull the load most easily and safely?

  1. 1

  2. 2

  3. 3 (suggested correct answer)

  4. 4

  5. 5
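The reasoning behind placing the container just ahead of the trailer axle can be sketched as a simple moment balance. All weights and distances below are assumptions for illustration; the item itself gives none.

```python
# A lever sketch of why the load belongs just ahead of the trailer axle.
# Treat the trailer bed as a beam pivoting on its axle: weight a short
# distance ahead of the axle presses the hitch down ("tongue weight"),
# which keeps towing stable; weight behind the axle lifts the hitch.
# Moment balance about the axle: W * d = T * L  =>  T = W * d / L.
def tongue_weight(load_lb, dist_ahead_of_axle_ft, hitch_to_axle_ft=10.0):
    return load_lb * dist_ahead_of_axle_ft / hitch_to_axle_ft

print(tongue_weight(2000, 1.0))   # 200.0 lb down on the hitch: stable
print(tongue_weight(2000, -2.0))  # -400.0: hitch forced upward, unsafe
```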

Level 5

The band saw where you work will not start. This saw uses 240 volts, draws 25 amps, and has 30-amp cartridge fuses. These fuses (see diagram shown) are designed to protect an electrical circuit. Their main component is a fuse wire made of a low-resistance, low-melting-point alloy. When a higher than tolerable current goes through such a fuse, this fuse wire melts. Your supervisor has told you to check the fuses in the band saw. By looking at the fuses, you cannot tell if they are good or bad.

You have turned off the power to the saw and removed one of the fuses. You check this fuse with a volt-ohmmeter (a device that measures resistance to the flow of electrical current). If the fuse is good, the resistance (measured in ohms) for the fuse will be:

  1. 0 ohms (suggested correct answer)

  2. 10 ohms

  3. 50 ohms

  4. 100 ohms

  5. infinite
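The logic behind the suggested answer can be sketched numerically: an intact fuse wire is a low-resistance alloy, so a good fuse reads near 0 ohms, while a melted fuse is an open circuit. The 1-ohm classification threshold below is an assumption for illustration, not part of the item.

```python
def fuse_status(measured_ohms, threshold=1.0):
    """Classify a cartridge fuse from a volt-ohmmeter reading.
    A good fuse wire reads near 0 ohms; a melted (blown) fuse is an
    open circuit and reads effectively infinite. The 1-ohm threshold
    is assumed for illustration."""
    return "good" if measured_ohms < threshold else "blown"

# Sanity check on the circuit itself: the saw draws 25 A behind 30 A
# fuses, so normal operation should never melt a good fuse.
normal_draw_amps = 25
fuse_rating_amps = 30
assert normal_draw_amps < fuse_rating_amps

print(fuse_status(0.05))          # near-zero reading -> good
print(fuse_status(float("inf")))  # open circuit -> blown
```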

Level 6

The garage where you work is equipped with a hydraulic lift, like the one shown, that you use to raise cars off the floor so it is easier to service them. An air compressor capable of generating pressures of 120 pounds per square inch (psi) powers the lift. The air regulator releases a steady amount of air pressure (usually 30 to 40 psi), and the control valve directs the flow of that air through the lines. Pushing the control valve forward (as shown in the figure) allows air into the lines, raising the lift. Moving the valve to the middle position seals the line so no air can escape, and pulling the valve back releases air from the line, lowering the lift. The air from the compressor exerts a force on a tank of hydraulic fluid, which, in turn, transmits this force to the bottom of the lifting piston.

*Figure adapted from Principles of Technology Teacher’s Guide, Year 1, Unit 7, Force Transformers (Waco, TX: Center for Occupational Research and Development, 1991), 94. Used with permission.

You have been working on a car up on the lift for about an hour. When you raised the car, the lift worked normally, but now the lifting piston has begun to creep down. You check the control valve and it is fine. Also, there is no hydraulic fluid on the garage floor or in the lift pit below the garage floor. The next thing you should check to determine the problem is the:

  1. air compressor

  2. air regulator

  3. air line between the compressor and the control valve

  4. air line between the control valve and the hydraulic fluid reservoir (suggested correct answer)

  5. line between the hydraulic fluid reservoir and the lifting piston
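The pneumatic-to-hydraulic force transmission described in this item follows Pascal's principle. A short illustrative calculation is below; the piston diameter is assumed, since the item does not give one.

```python
import math

# Pascal's principle: pressure applied to the confined hydraulic fluid is
# transmitted undiminished to the lifting piston, so a modest air pressure
# acting over a large piston area produces a large lifting force.
regulator_pressure_psi = 35.0  # midpoint of the 30-40 psi range in the item
piston_diameter_in = 10.0      # assumed for illustration; not given in the item
piston_area_sq_in = math.pi * (piston_diameter_in / 2) ** 2
lift_force_lb = regulator_pressure_psi * piston_area_sq_in
print(round(lift_force_lb))  # 2749 lb of lift from modest air pressure
```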

Committee Observations

This assessment is notable for its focus on problem solving and reasoning in technological systems. Although not designed with the ITEA Standards for Technological Literacy in mind, the sample items are consistent with benchmarks in the ITEA standards related to energy and power (Standard 16), using and maintaining technological products and systems (Standard 12), and problem solving and troubleshooting (part of Standard 10). The sample items suggest that the assessment requires examinees to have a basic knowledge of fundamental scientific concepts and of cause-and-effect relationships in technological systems. The items also require a fairly high degree of reading skill, which may pose challenges for examinees learning English. The scenarios presented

Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×

This page intially left blank

Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 265
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 266
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 267
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 268
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 269
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 270
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 271
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 272
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 273
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 274
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 275
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 276
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 277
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 278
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 279
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 280
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 281
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 282
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 283
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 284
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 285
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 286
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 287
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 288
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 289
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 290
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 291
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 292
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 293
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 294
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 295
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 296
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 297
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 298
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 299
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 300
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 301
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 302
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 303
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 304
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 305
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 306
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 307
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 308
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 309
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 310
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 311
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 312
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 313
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 314
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 315
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 316
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 317
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 318
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 319
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 320
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 321
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 322
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 323
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 324
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 325
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 326
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 327
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 328
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 329
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 330
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 331
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 332
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 333
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 334
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 335
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 336
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 337
Suggested Citation:"APPENDIX E Instrument Summaries." National Academy of Engineering and National Research Council. 2006. Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. doi: 10.17226/11691.
×
Page 338

In a broad sense, technology is any modification of the natural world made to fulfill human needs or desires. Although people tend to focus on the most recent technological inventions, technology includes a myriad of devices and systems that profoundly affect everyone in modern society. Technology is pervasive; an informed citizenry needs to know what technology is, how it works, how it is created, how it shapes our society, and how society influences technological development. This understanding depends in large part on an individual's level of technological literacy.

Tech Tally: Approaches to Assessing Technological Literacy determines the most viable approaches to assessing technological literacy for students, teachers, and out-of-school adults. The book examines opportunities and obstacles to developing scientifically valid and broadly applicable assessment instruments for technological literacy in the three target populations. The book offers findings and 12 related recommendations that address five critical areas: instrument development, research on learning, computer-based assessment methods, framework development, and public perceptions of technology.

This book will be of special interest to individuals and groups promoting technological literacy in the United States, education and government policy makers in federal and state agencies, and the education research community.
