Measuring Literacy: Performance Levels for Adults

7
Recommendations for Future Literacy Assessments

In today’s society, literacy is a critical skill, one that has the power to enhance the number and variety of opportunities available to individuals and that can enable them to lead productive lives and become informed community members and citizens. The 1992 assessment of adults’ literacy skills yielded a tremendous wealth of information on the literacy needs of adults living in the United States, information that served to strengthen and refocus efforts on new and existing programs. We expect that the results from the 2003 assessment will be equally useful.

Conducting regular and periodic large-scale assessments of adult literacy provides a means for determining what literacy hurdles this country has overcome and which hurdles still lie ahead. The committee understands that there are currently no plans to conduct a follow-up to the National Assessment of Adult Literacy (NAAL). We think, however, that ongoing assessment of the literacy skills of the nation’s adults is important, and that planning for a follow-up to NAAL should begin now. In this chapter, in an effort to be forward looking, we offer suggestions for ways to improve the assessment instrument and expand the literacy skills assessed.

Through our own research and analyses of the National Adult Literacy Survey (NALS) and NAAL and in listening to stakeholders and standard-setting participants—members of policy-making, curriculum-building, and practitioner communities—the committee came to realize that there may be ways the assessment could be altered and expanded to enable better understanding of literacy issues in the United States. Throughout this report, we have alluded to certain weaknesses in NALS and NAAL and have pointed







out areas of concern for users of NAAL results. In this chapter we restate some of the weaknesses and concerns with the intent that our recommendations will stimulate reflective and proactive thinking about the development of future literacy assessments. Some of our suggestions are speculative and far-reaching, but we raise these issues as a means of opening the dialogue with the Department of Education about what an assessment like NAAL could, and perhaps should, encompass.

The committee addresses four areas of concern. First, we revisit the issue of the type of inferences that policy makers, media, and the public wanted to make about NALS results when they were released in 1993. There are alternative approaches to test development, such as those used for licensing and certification tests, which produce assessment results that support standards-based inferences. We encourage exploration of the feasibility of these methods for future literacy assessments. We describe an approach that could be considered, which we refer to as a “demand-side analysis of critical skills,” and explain how it can be used to enhance and expand NAAL. Second, there are ways to improve on the information currently collected about adults’ quantitative skills, and we make suggestions for strengthening and expanding this portion of the assessment. Third, we provide a rationale for expanding the test development and administration processes to better evaluate the literacy needs of growing populations of nonnative English speakers. Finally, we propose ways to broaden the conception of literacy on which NALS and NAAL were based.

The definitions of literacy that guided test development for NALS and NAAL placed constraints on the inferences made about the results and the generalizability of the findings. Defining literacy in a meaningful way is perhaps the most fundamental aspect of constructing a literacy assessment. If the definition of literacy that underpins item development is narrow and limited, then the inferences based on the assessment results will likewise be narrow and limited. We suggest ways to broaden the conception of literacy that underlies NAAL.

DEMAND-SIDE ANALYSIS OF CRITICAL SKILLS

Formal large-scale assessment programs are designed to fulfill a variety of needs, ranging from a simple information-gathering survey used to evaluate program needs, to assessments used for more high-stakes purposes, such as grade promotion, high school graduation, and professional licensure. NALS and NAAL are examples of assessments used primarily to evaluate program needs. For instance, the findings from NALS generated intense interest and funding for literacy education programs for adults who spoke

English or who were trying to learn English. Similarly, the results of NAAL are likely to be used to shape and refocus existing programs as well as target additional needy populations. The approach to test development for NALS and NAAL reflected these intended purposes.

Many audiences for the assessment results sought to make standards-based inferences about NALS: they wanted to know how many adults were “illiterate” and how many had the skills needed to function adequately in society. As we have discussed, however, the test development process used for NALS and repeated for NAAL was not intended to support such claims. An alternative approach to test development, similar to that used for certification and licensing tests, would allow such inferences to be made about the results.

Test development for credentialing examinations typically begins with identification of the critical skills that an individual should master in order to obtain the specific credential. Often this is handled by gathering feedback (e.g., via surveys or focus groups) from the community of experts and practitioners who work in the specific domain. Experts and practitioners help to define the set of knowledge, skills, and competencies that individuals should be able to demonstrate; they also assist with the development and review of test items and actively participate in the determination of the cut score required to pass the exam.

NALS and NAAL currently draw test questions from six contexts in which adults utilize their literacy skills: work, health and safety, community and citizenship, home and family, consumer economics, and leisure and recreation. Our suggested approach to test development would involve a systematic review of each of these contexts to determine the critical literacy demands required to function adequately, which would then serve as the foundation for the test development process. Included in this approach would be a review of print materials in each context that adults are expected to read, understand, and use. This task could also include focus groups or other types of discussion with low-literate adults who could talk about what they see as the literacy skills they need in their home and work lives.

Standard setting, that is, determinations of the level of proficiency that adults need or should have, would be a natural outgrowth of such an approach, including, if desired, the setting of multiple standards. For instance, performance levels could be established that reflect judgments about the levels of proficiency adults need in order to excel, to function adequately, or simply to get by in this country. This approach to test development could produce assessment results intended to support claims about the levels of literacy judged to be adequate. The psychometric literature provides documentation of these procedures for professional licensing and

certification that could serve as a resource for this approach in the context of literacy assessment.1 We recognize that, in the case of literacy assessment, this is no easy task and that previous attempts to characterize adequate literacy skills for adults have not been entirely successful (e.g., the work conducted as part of the Adult Performance Level Study described in Adult Performance Level Project, 1975). Furthermore, the construct of literacy is much broader than the set of skills and competencies evaluated by credentialing tests. Nevertheless, we encourage further exploration of the feasibility of this strategy for test design. The work of Sticht (1975) would be relevant in this endeavor.

Providing Scores for the Context Areas

Systematic sampling of the literacy demands in the six contexts, via a demand-side analysis, could be used to support the existing prose, document, and quantitative scores, but it could also result in separate scores for the different contexts. For example, NAAL could be better designed to measure literacy skills that are directly relevant to citizenship. Prose, document, and quantitative literacy items, drawn from civic and politically relevant real-world examples, could be added to the assessment to inform the development of instructional materials for adult education and citizenship preparation classes. Prose items could measure understanding of a proposal on a ballot and in a voter information pamphlet, or they could measure skill in identifying political candidates’ perspectives about certain issues; quantitative items could measure understanding of the allocation of public funds. The addition of civic-related materials would enhance NAAL by lending much-needed guidance to those who are working to ensure access to the democratic process for all. Including a number of test items regarding literacy and citizenship in future generations of NAAL would offer the opportunity to evaluate the extent to which adults’ literacy skills are sufficient to make informed decisions regarding civic matters.

Development of the health literacy score could be used as a model for exploring the feasibility of reporting literacy scores in other NAAL content domains. The health literacy community was active in the design of new items for NAAL that would support a health literacy score. NAAL items drawn from the health and safety context were developed and included on the assessment in such a way that they contribute to prose, document, and quantitative literacy scores but also yield a separate health literacy score.

1. Procedures used to develop the National Board for Professional Teaching Standards’ advanced certification program for teachers provide one example (see http://www.nbpts.org).

Similar procedures could be used to provide literacy information for each of the specified contexts. We believe it is worthwhile considering the feasibility of this approach.

Expanding Information Collected on the Background Questionnaire

Feedback from experts in each of the contexts could also be used to expand and focus the information collected on the background questionnaire. As it currently exists, the background questionnaire is a tremendous resource, but there are ways in which it could be improved. As described in Chapters 4 and 5, we were not able to conduct some of the desired analyses, either because the data were not available from the background questionnaire or because the information collected did not allow for fine enough distinctions to use in setting standards. Changes were made to the 2003 background questionnaire as a result of the efforts to create a health literacy score, and questions were added to gather background information with regard to health and safety issues. Similar procedures could be used to link demand-side analyses with the construction of the background questionnaire items for the various contexts, with input and guidance provided by panels of domain-specific experts, stakeholders, and practitioners.

For example, with respect to the context of community and citizenship, NAAL currently includes measures of voting and volunteering in its background survey. Future surveys should draw more effectively on the existing literature to include a larger and more carefully designed battery of items measuring both attitudinal and behavioral dimensions of civic and political engagement. Doing so would allow for much richer and more definitive analyses of the relationship between literacy and effective democratic citizenship.

The following two recommendations convey the committee’s ideas with regard to the test development approach and revisions to the background questionnaire.

RECOMMENDATION 7-1: The Department of Education should work with relevant domain-specific experts, stakeholders, and practitioners to identify the critical literacy demands in at least six contexts: work, health and safety, community and citizenship, home and family, consumer economics, and leisure and recreation. Future generations of NAAL should be designed to measure these critical skills and should be developed from the outset to support standards-based inferences about the extent to which adults are able to perform these critical skills.

RECOMMENDATION 7-2: The background questionnaire included in NAAL should be updated and revised. The Department of Education should

work with relevant domain-specific experts, stakeholders, and practitioners to identify the key background information to collect with regard to at least six contexts: work, health and safety, community and citizenship, home and family, consumer economics, and leisure and recreation. Relevant stakeholders should be involved in reviewing and revising questions to be included on the background questionnaire.

Maintaining the Integrity of Trends

The validity of any assessment rests on the strengths of the item pool for that assessment. Although much time and many resources have been invested in developing and testing the current NAAL item pool, these items will eventually become obsolete. As with any large-scale assessment, review and revision of the item pool require continuous effort. Items need to incorporate current and future uses of texts, behaviors, and practices, as well as components that reflect current bodies of research in domain-specific areas.

We recognize that altering the test development approach or making changes in the item pool has the potential to interfere with efforts to monitor trends. We therefore suggest that while each new generation of NAAL should update the assessment items to reflect current literacy requirements and expectations in each context, some time-invariant items should also be retained to enable trend analysis. Accordingly, we recommend the following:

RECOMMENDATION 7-3: The Department of Education should work with relevant domain-specific experts, stakeholders, and practitioners to monitor literacy requirements in at least six contexts: work, health and safety, community and citizenship, home and family, consumer economics, and leisure and recreation. For every administration of the adult literacy assessment, the Department of Education should document changes in the literacy demands in these contexts. Each new instrument should update the assessment items to reflect current literacy requirements and expectations in each context but should also retain some time-invariant items to enable trend analysis.

Consideration of Written Expression and Computer Skills

During both standard settings, panelists raised questions about the role of written expression in NALS and NAAL. Many of the assessment questions require written responses, but the quality of the writing is not considered in the scoring process. For instance, some assessment questions require the test taker to write a brief letter, but it is the content of the response that is scored, not the writing: a one- or two-sentence response is accorded the

same weight as a one- or two-paragraph response. Although adding a full measure of written expression is not a simple endeavor, we think that writing is a critical aspect of literacy and of functioning in modern society. We suggest that writing be explored as part of a demand-side analysis by evaluating the extent to which written expressive skills are critical for functioning in the six contexts specified.

A demand-side analysis could also cover the need for computer and technological literacy skills in each of the contexts, such as using a computer to handle daily activities, accessing and navigating the Internet to research and locate information, and deciphering multimedia. These skills include the motor skills needed to manage a keyboard, a mouse, and menus, but they also go far beyond them to include the kinds of reading that are required to navigate in hypermedia. As government and private industry shift critical literacy tasks, such as interaction with forms and applications, to online media, assessing functional literacy without considering the role of computer usage will understate the complexity of such daily tasks and may tend to overestimate the functional literacy of the population. Furthermore, it will be impossible to assess computer-mediated communication skills without computer-mediated testing. Therefore, we suggest that computer skills be considered in a demand-side analysis.

We acknowledge that developing assessments of these skills introduces a host of complexities, not the least of which is defining the specific domain to be assessed and determining a means for reliably scoring responses to the tasks. We further recognize that such assessments are labor-intensive and may prove to be too expensive to be conducted on a large-scale basis. It may be possible, however, to implement assessments of these skills on a smaller scale, such as through a subsample of participants or focused special studies. Therefore, we encourage further exploration of the feasibility of assessing these skills on future generations of the assessment.

IMPROVING THE ASSESSMENT OF QUANTITATIVE SKILLS

The second area in which changes are warranted is the quantitative literacy scale. As described in Chapter 4, analyses of the dimensionality of NALS, conducted by the committee and others (e.g., Reder, 1998a, 1998b), revealed very high correlations among the three literacy scales. These factor analytic studies suggest that a single dimension, not three, underlies performance on the assessment. In part, this may be due to the fact that every item that measures quantitative literacy is embedded in a text-based or document-based stimulus. To perform the required mathematics, test takers must first be able to handle the reading tasks presented by the stimulus materials as well as the reading required in the instructions for the question. Thus, every item in the quantitative literacy scale confounds skill

in mathematics with factors associated with understanding text-based or document-based materials. Mathematical demands in society are not easily separated from the task of reading; hence, the overlapping nature of the stimuli used for NALS and NAAL mirrors tasks that occur in real-life situations. Nevertheless, the overlap presents problems when interpreting the results. A difficult quantitative literacy item may be so because it requires a good deal of text-based or document-based interpretation, while the mathematical skill required to complete the item may be as simple as adding two amounts of money.

This characteristic of the quantitative tasks was noted by participants in both of the committee’s standard-setting sessions. Some panelists commented that they were surprised by the extent of reading required for the questions that were intended to measure quantitative skills, cautioning that the NALS and NAAL quantitative scale should not be construed as a mathematics test. Panelists were also surprised at the level of mathematical skill evaluated on NALS and NAAL, observing that most questions required only very basic mathematics (e.g., addition, subtraction, simple multiplication, division).

Research has shown that skill in mathematics may correlate even more strongly with economic success than reading (Murnane, Willett, and Levy, 1995). We therefore think it is important to understand the mathematical skill level of the adult population. When NALS was first developed, scant attention was paid to mathematics in the adult basic education and literacy education system. Since then, the emphasis on numeracy—the mathematics needed to meet the demands of society, which differs somewhat from school or highly formal mathematics—has been increasing.

This emphasis on numeracy skills is reflected in decisions made about the Adult Literacy and Lifeskills Survey, the successor to the International Adult Literacy Survey. In 2002, Statistics Canada and other organizations that work on international literacy assessments reexamined the components of the International Adult Literacy Survey and recommended that the quantitative literacy scale of the Adult Literacy and Lifeskills Survey be replaced by a broader numeracy construct (Murray, 2003). The Organisation for Economic Co-Operation and Development Programme for International Student Assessment (http://www.pisa.oecd.org) and the Center for Literacy Studies’ Equipped for the Future program (http://www.eff.cls.utk.edu) are two other large-scale endeavors that include mathematics or numeracy as separate from skill in reading and writing.

Neither NALS nor NAAL was meant to be a formal test of mathematical proficiency in higher level domains, such as algebra, geometry, or calculus, and we are not suggesting that this should be the case. That said, it is the committee’s view that the mathematical demands in a technological

society require more than a basic grasp of whole numbers and money, as currently reflected in the NAAL. A fuller development of a quantitative literacy scale could include such skills as algebraic reasoning (with an emphasis on modeling rather than symbol manipulation), data analysis, geometric and measurement tasks, and the various forms and uses of rational numbers, in addition to the basic operations with time and money that are assessed in the NAAL. These arguments suggest that mathematical skill and literacy could be assessed more accurately as separate and more fully developed constructs, less tied to prose or document literacy, yet still reflective of the types of tasks encountered by adults in everyday situations.

In line with the demand-side analysis of critical skills discussed in the preceding section, the committee suggests that a reconceptualization of the quantitative literacy scale include an examination of the research into the mathematical and commingled mathematical and reading demands of society as well as the aspects that contribute to the complexity of a variety of mathematical tasks. NALS put mathematics on the map by including quantitative literacy, but it would be useful if future assessments of adult literacy were to go further. Expansion of the quantitative literacy construct would enable a more accurate assessment of those at higher levels of mathematical skill. NAAL results could be used to generate discussion about college remediation programs with the same vigor that energizes discussion of literacy skills at the lower end of the scale.

There is a significant body of international research on numeracy and cognitive assessments of adult problem-solving skills that could be used as a starting point for rethinking the quantitative literacy scale. Other entities available as resources for rethinking measurement of quantitative literacy include Adults Learning Mathematics—A Research Forum (http://www.alm-online.org/) and the Adult Numeracy Network (http://www.shell04.theworld.com/std/anpn/), an affiliate of the National Council of Teachers of Mathematics. We therefore recommend the following:

RECOMMENDATION 7-4: The Department of Education should consider revising the quantitative literacy component on future assessments of adult literacy to include a numeracy component assessed as a separate construct, less tied to prose or document literacy but still reflective of the types of tasks encountered by adults in everyday situations. The numeracy skills to include on the assessment should be identified as part of an analysis of critical literacy demands in six content areas. The types of numeracy skills assessed on the Adult Literacy and Lifeskills Survey could serve as a starting place for identifying critical skills.

IMPROVING THE INFORMATION COLLECTED ABOUT ADULT NON-ENGLISH SPEAKERS

The third area in which the committee thinks significant modifications of future NAAL instruments should be made is with regard to collecting information about the literacy skills of non-English-speaking adults. As described in Chapter 6, language-minority adults are an ever-increasing segment of the U.S. population. Since immigration to the United States is likely to continue and demand for services to non-English speakers is likely to remain high, much more needs to be known about the backgrounds and skills of this population. Data on the language skills and literacy profiles of non-English speakers are needed so that policy makers, program administrators, practitioners, and employers can make informed decisions about their education and training needs.

Immigrant adults make up a significant proportion of the working poor in the United States, and a high number of immigrants among this group are not fully literate in English. Limited English language and literacy skills of immigrants are seen as a significant threat to U.S. economic advancement (United Way, Literacy@Work: The L.A. Workforce Literacy Project, September 2004), yet analyses of the NALS data on Spanish speakers (Greenberg et al., 2001) show that bilingual adults have higher earnings as a group than those who are monolingual in either English or Spanish. Thus, social, political, and economic concerns warrant a more focused effort at gathering information about adults who speak English as a second language than NAAL administrative procedures allowed.

We addressed this issue in our letter report to the National Center for Education Statistics issued in June 2003,2 and we repeat our concerns here with the hope that future assessments of adult literacy will allow for expanded and more structured information to be collected about non-English speakers. We recognize that NAAL is intended to be an assessment of English literacy skills only, and we are not suggesting that it should be expanded to assess competence in other languages. We nevertheless maintain that it is important to enable the survey results to portray a nuanced picture of the backgrounds and skills of the entire population.

2. Available at http://www.nap.edu/catalog/10762.html.

Procedures for Collecting Background Data on NAAL

Currently NAAL collects background information only from those who speak sufficient English or Spanish to understand and respond to the initial screening and background questions. As described in Chapter 2, when an interviewer arrived at a sampled household, a screening device

was used to determine if there was an eligible person in the household to participate in the assessment. If the respondent could not understand the English or Spanish spoken by the interviewer (or vice versa), the interviewer could solicit translation assistance from another household member, a family friend, or a neighbor available at the time. If an interpreter was not available, the assessment would cease, and the case would be coded as a language problem. Therefore, unless an interpreter happened to be available, no information was collected from those who did not speak English or Spanish. Furthermore, if translation assistance was available, it was only for the initial screening questions that requested information about age, race/ethnicity, and gender. The background questionnaire was available only in English and Spanish, and translation assistance was not allowed.

The consequence of these administrative decisions is that an opportunity was missed to gather additional information about individuals in this country who speak languages other than English or Spanish. The information that was obtained about this group relied primarily on happenstance (e.g., whether an interpreter happened to be available). A more structured, more in-depth approach might have been used to better capitalize on these important data collection opportunities.

Proposed Changes to Procedures

Much useful information can be gathered through NAAL by allowing speakers of other languages to demonstrate the English literacy skills they do possess while providing information about their capabilities in other languages, capabilities that are likely to influence the acquisition of literacy skills in English. While translating NAAL’s background questionnaire into multiple languages may be infeasible, there are alternative ways to collect information about non-English speakers.

Language-minority groups often cluster in particular geographic areas. Often, translators are available to assist adults with understanding community information, such as school enrollment procedures, health and safety information, voter information, and the like. We think this resource could be tapped for future administrations of NAAL, and a more structured approach taken to ensure that either bilingual assessors or trained translators are available during interviews with individuals who speak languages other than English or Spanish. With translators available, more in-depth information could be obtained from individuals who do not speak English or have only minimal English skills than is allowed through the current version of the initial screening device, information that could be used for programmatic purposes. For instance, it would be useful to gather information from this group about their formal education, participation in English language

courses, training and work experience in other countries as well as in the United States, and self-perceptions about their oral and written proficiency in English and in other languages (e.g., using questions like the self-report questions currently included on the background questionnaire). We are not proposing that the entire background questionnaire be translated into multiple languages, simply that additional information be collected about non-English speakers. It may also be useful to explore oversampling or special focused studies of language-minority regions so that samples yield sufficient numbers to allow for detailed analyses and provide information for policy makers and practitioners serving those language communities. Finally, we suggest that non-English speakers be considered in each population group (e.g., the incarcerated) and as part of each focus area (e.g., health and safety), that background data on non-English speakers be included in all major reports, and that a separate report be prepared on the language and literacy skills of all adults who speak a language other than English at home.

RECOMMENDATION 7-5: The Department of Education should seek to expand the information obtained about non-English speakers in future assessments of adult literacy, including, for example, background information about formal education, participation in English language courses, training and work experience in other countries as well as in the United States, and self-reports about use of print materials in languages other than English. Efforts should also be made to be more structured in the collection of background information about individuals who speak languages other than English or Spanish.

RETHINKING AND BROADENING THE DEFINITION OF LITERACY

We conclude this chapter by proposing ways to broaden the conception of literacy on which NALS and NAAL were based.
For these two assessments, literacy has been construed as an ability, as skills, and as a possession. As a concept, literacy provides a canvas that encompasses practices, behaviors, beliefs, and activities that range from basic reading and writing to the less well-defined notion of higher order problem solving. Literacy has multiple conceptions, which range from a focus on the most fundamental survival skills to more complex definitions that encompass the skills needed to thrive in a variety of contexts, such as the home, the workplace, and the community. The ways in which literacy specialists talk about literacy typically attempt to take into consideration a broad spectrum of knowledge and skills.

Literacy changes over time as expectations for knowledge and skill levels increase, and it changes with the advent of new mediating technologies. While a signature once served as a demonstration of literacy, no one would argue that signing one's name signifies being literate today. Pen and pencil, the typewriter, and the keyboard were key mediating tools in the past, but to separate literacy from its most powerful purveyor, digital technology and the Internet, is to lose much of what counts as literacy in this age. Once again, we encourage discussion and reconsideration of the literacy demands in the tasks of daily living. Inclusion of stakeholders, practitioners, and members of the media in these discussions will not only contribute to an improved test design for assessing critical literacy skills in existing and new domains beyond the six specified previously, but will also contribute to a higher level of reflection on rethinking and broadening the existing definition of literacy.

With these comments in mind, we make two additional suggestions for rethinking and retooling how literacy is defined for the NAAL. A significant departure from the existing NAAL paradigm is to consider future assessments as measuring functional literacy in a wider set of contexts. Although the types of literacy prevalent in the world of work would be important to sample, individuals use literacy for many personal purposes as well, including literacy practices connected to religion or their children's schooling. Use of focus groups and a panel of experts for guidance in demand-side analyses would be extremely beneficial in probing the measurement boundaries of future NAAL assessments.

Currently, examinees are allowed assistance only for completion of the background questionnaire. However, literacy is a social practice, and, in the real world, literacy tasks are often done collaboratively.
For example, when faced with a literacy task, people often speak with each other, consult resources, and rely on background experiences to solve problems. Low-literate adults in particular have developed compensatory strategies that allow them to get by in daily life despite weak reading skills. We suggest that the Department of Education explore the feasibility of providing assistance as needed for the completion of some proportion of the items in the main assessment as well as in the background questionnaire. This could include, for example, asking for help reading a word or two in a question or asking for clarification about the meaning of a phrase in a document or quantitative task. When a test taker is not able to answer a question, the interviewer could gather additional information about how he or she would approach the problem if it arose in real life. This type of information may provide especially valuable insight into the actual skills of low-literate adults and into effective compensatory strategies.

CONCLUSION

Ultimately, as the literacy assessment instrument evolves, so will the processes by which standards are set and performance levels described. The committee has made some far-reaching recommendations for future development of a literacy assessment, some of which will require significant rethinking of test development processes. Most notably, there is a lingering question about the adequacy and completeness of the existing prose, document, and quantitative literacy scales, both in their content coverage and in the adequacy of measurement at the upper and lower ends of the score distribution. We recommend an alternative approach to test development, one that considers the tasks of daily living to identify the critical literacy demands that will guide development of the item pool. These procedures could change the nature of the assessment, the test administration processes, and the meaning of the scores that are reported. We recognize that such extensive modifications of the assessment make it difficult to measure trends in adult literacy, which is also an important goal. These competing goals must be carefully weighed in the design of future assessments. In all cases, however, regardless of whether any of the proposed changes are implemented, the committee recommends that the process of determining performance levels be carried out concurrently with the process of designing the assessment and constructing the items.