6
Communicating and Using the Results of Literacy Assessments

Experience with the initial release and subsequent media coverage of the results of the 1992 National Adult Literacy Survey (NALS) highlighted the critical importance of clearly communicating the assessment results so they are interpreted correctly, inform the public, and are useful to the various audiences concerned about adult literacy in the United States. In particular, because media coverage is such a central factor in how the performance levels and associated findings are understood and used, the committee included this important issue as one of the topics discussed at its public forum in February 2004. We draw on the advice provided by the stakeholders and journalists in attendance at the forum to make recommendations about reporting and communicating the results of the National Assessment of Adult Literacy (NAAL).

In this chapter, we first discuss strategies for ensuring that appropriate information and accurate messages reach the various audiences for the NAAL results. We then discuss strategies and methods for communicating the results. The chapter concludes with examples of how the NAAL results can be used by different fields.

COMMUNICATING RESULTS

Types of Information That Need to Be Communicated

In reporting NAAL results, the Department of Education should strive to communicate succinct and accurate messages that address two basic questions:



• To what extent have adults’ literacy skills changed between 1992 and 2003?
• What is the status of adults’ literacy skills now?

In communicating these messages, however, the department needs to be keenly aware of the audiences for the information and what they seek to learn from the reports. One of the recurring points that participants made during the public forum was that the results of the adult literacy assessments are not reported in a vacuum. Journalists, the public, and even specialized audiences have preexisting views about literacy that affect how they interpret the assessment results. For example, many journalists and members of the public think of literacy as a dichotomy (literate versus illiterate), rather than as a continuum of skills. Those with more sophisticated notions of the concept may wonder how literacy applies in different real-world contexts: What does a particular performance level mean in terms of how those falling within the category function as citizens, workers, family members, or consumers? If these preexisting frames are not adequately considered in communicating the assessment results, the kinds of distortions that occurred in 1993 could happen again. Mindful of that possibility, two public forum participants, one representing journalists and another representing policy makers, suggested ways to prevent this, saying:

Journalists only need two levels—literate and not literate. If you don’t tell us where the break point is, we’ll make one ourselves. [But] if literacy is truly more complex, if you truly need more than two levels and a more nuanced discussion, then one way to do it is to talk about contexts. For example, here’s what people need in the work environment, in the home environment, in the school environment, in order to obtain additional training. (Richard Colvin, Hechinger Institute, Columbia University)

The same is true for state legislators and state policy makers—if you don’t tell them what the message is, they’ll [create] the message. (Milton Goldberg, Education Commission of the States)

This advice suggests how the department might think about the substance of its message and its dissemination strategies. The substantive challenge will be to convey the message that literacy is not a unidimensional concept or an all-or-nothing state, and that NAAL provides a nuanced portrait of adult literacy in the United States at the beginning of the 21st century. That message will be most understandable to the public and useful to policy makers if it is anchored in the competencies and life circumstances associated with each performance level and each of the three types of literacy. So, for example, in describing the distribution of survey respondents across performance levels, the department should identify concrete tasks (drawn from the survey) that adults in each category are likely to be able to do and ones that they have a low probability of accomplishing. Illustrative examples should be directly tied to their roles as citizens, workers, family members, and consumers. Equally important is for the department to provide a picture of the life circumstances associated with those scoring at each performance level, for example, the proportion earning a middle-class wage, the likelihood of voting, and the likelihood of pursuing postsecondary education.

Because policy interest and needs are greatest for those with the lowest literacy skills, it is especially critical that policy makers and the press be given a full and useful representation of this group—one that will aid in crafting policy strategies of benefit to them. While it is clear that the public wants information about the percentage of adults in the country who truly have substantial difficulties with reading and that policy interventions are needed for this group, other audiences are more concerned about policy interventions at higher points on the continuum of literacy skills. For example, Beth Beuhlmann, with the Center for Workforce Preparation at the U.S. Chamber of Commerce, pointed out that about 80 percent of jobs currently require some form of postsecondary education. Christopher Mazzeo, with the National Governors’ Association, noted that preparedness for work and for higher education is “of paramount concern for every state and for adults at all ages, not just for the 18-24 age group.” Audiences concerned about workforce issues will look to NAAL for information about the extent to which the adult population is ready to meet the demands of the workplace in the 21st century. As forum participants pointed out, employers using the 1992 NALS results were concerned about increasing the numbers of adults with skills described by Levels 4 and 5; that is, moving more of the adult population from Level 3 to the higher levels. Thus, it is likely that employers and those involved with postsecondary education will be most interested in the percentage of adults in the committee’s recommended category of intermediate literacy and will focus on interventions that increase the numbers of adults in the committee’s category of advanced literacy.

Meeting the needs of these varied audiences will require careful thought about the formats and types of information included in NAAL reports, and we encourage the department to adapt versions of the reports to meet the needs of the various audiences. The alternate versions of the performance-level descriptions included in this report could provide a basis for these efforts. Appendix A includes the sample performance-level descriptions that were used to elicit comments during the public forum. Chapter 5 presents the performance-level descriptions used in the committee’s standard-setting sessions as well as revisions developed after the standard settings (see Tables 5-2, 5-3, and 5-4); these versions provided overall descriptions as well as subject-specific descriptions. We suggest that the department consider these alternate versions and formulate performance-level descriptions tailored to the needs of specific audiences. Our suggestions concerning the development of reports of NAAL results that are appropriate and useful to interested stakeholders are encapsulated in the following two recommendations:

RECOMMENDATION 6-1: NAAL results should be presented with implications of their relevance for different contexts in which adults function, such as employment and the workplace, health and safety, home and family, community and citizenship, consumer economics, and leisure and recreation, as well as the different aspects of life affected by literacy.

RECOMMENDATION 6-2: The Department of Education should prepare different versions of the performance-level descriptions that are tailored to meet the needs of various audiences. Simple descriptions of the performance levels should be prepared for general audiences to enhance public understanding. More technical and more detailed descriptions should be developed to be responsive to the needs of other users.

Policy Interventions for Low-Literate Adults

With the development of NAAL and the Adult Literacy Supplemental Assessment, the department focused specific attention on gathering information about low-literate adults. It is hoped that there will be renewed interest among policy makers and educators in meeting the needs of this group. It is therefore critical that the department report results in ways that allow for the identification and design of appropriate services for low-literate adults and for recruiting new populations of adults who could benefit from these services. The nonliterate in English and below basic categories are likely to be heterogeneous, encompassing English speakers who have weak literacy skills, non-English speakers who are highly literate in their native languages but not literate in English, and non-English speakers who are not literate in any language. Distinctly different services and strategies will be needed for these groups. To allow these distinctions to be made and valid conclusions to be drawn, we make the following recommendation about reporting results for those with low levels of literacy:

RECOMMENDATION 6-3: Reports of the percentages of adults in the nonliterate in English and below basic categories should distinguish among native English speakers and non-English speakers. This will allow for more appropriate conclusions to be drawn about (1) the extent of literacy problems among native English-speaking adults in the United States and (2) the share of adults in the United States who are still learning English and therefore cannot handle literacy tasks in English.

In addition, we note that attention should be focused on determining the skill levels and needs of non-English speakers in the United States. Over the past decade, there has been a significant increase in the number of non-English speakers residing in this country. In 1990, 13.8 percent of the U.S. population over age 5 spoke a language other than English at home; in 2000, this figure increased to 17.9 percent. The percentage of U.S. residents who speak Spanish at home is now reported to be 10.7 percent of the population, compared to 7.5 percent in 1990, and 2.7 percent of the U.S. population speak languages associated with Asia and the Pacific Islands, compared to 1.9 percent in 1990 (http://www.censusscope.org/us/chart_language.html). English for speakers of other languages (ESOL) is the fastest growing segment of the adult literacy education system (http://www.ed.gov/about/offices/list/ovae/pi/AdultEd/aefacts.html). ESOL currently constitutes 43 percent of the overall system, with the proportion of English language learners in the system being as high as 73 percent in California (http://www.ed.gov/about/offices/list/ovae/pi/AdultEd/datatables/20022003enroll.xls). Being able to read, write, and speak in English accrues significant benefits in this society. NAAL results can do much to clarify the needs of this group and inform policy interventions aimed at providing services appropriate for different groups of immigrants and refugees. We therefore encourage the department to conduct analyses that will answer such questions as:

• What are the literacy levels of the immigrant population in this country, both in the native languages and in English?
• What languages are spoken by the immigrant population, and how did individuals with different language backgrounds perform on NAAL?
• What is the relationship between earnings and various levels of literacy, including English literacy, native language literacy, and biliteracy?
• What is the relationship between education level and English literacy for immigrants, and does it matter if their education was obtained in the United States or in their native country?

The Department of Education commissioned a special study on the literacy levels of non-English speakers who participated in the 1992 NALS (see Greenberg et al., 2001), and we encourage the department to conduct a similar study again. At a time when legalization of undocumented immigrants as well as a revised citizenship examination are part of the U.S. policy agenda, information on the language and literacy abilities of immigrants and refugees can provide valuable insights. If data on those who speak a language other than English at home are not disaggregated and analyzed separately, the true nature of the literacy problem is likely to be obscured. For example, early reports of the 1992 NALS merely noted that 25 percent of those scoring in the lowest level were foreign-born and that the majority of immigrants fell into Levels 1 and 2. Early reports did not provide information on whether the difficulties encountered by individuals in Levels 1 and 2 required remedial services (i.e., services for adults who went to U.S. schools but never attained functional literacy skills) or developmental services (services for those who are not yet proficient in English). Not providing a separate description of the skills and backgrounds of those who speak a language other than English at home is likely to confuse both the nature and the extent of the literacy problem in the United States and fails to provide policy makers with the data they need to make informed decisions about various groups of adult learners.

RECOMMENDATION 6-4: The Department of Education should commission a special study on the literacy levels of non-English speakers who participated in NAAL. The study report should be given the same prominence as other reports and should be published and disseminated in a timely manner after the main release of NAAL results and in similar ways as the main report. In addition, federal and state agencies that provide substantial services for immigrants (the departments of Labor, Education, and Health and Human Services, along with the Department of Homeland Security and its citizenship services) should be given briefings that outline the profiles of the non-English-speaking population. Information should also be marketed through channels that are commonly used by those serving immigrants and refugees.

Exemplifying the Performance Levels

Presentations of the 1992 results included samples of released NALS items that illustrated the skills represented by each of the performance levels. According to participants in the committee’s public forum, this was a particularly useful feature of the reports. For instance, Tony Sarmiento, with Senior Service America, said that whenever he has made presentations of NALS results, he has relied more on the sample items than on the performance-level descriptions. In addition, participants in the two bookmark standard-setting sessions conducted by the committee reported using sample items as instructional guides in their adult education classrooms. We encourage the department to again use released items to exemplify the levels. However, we suggest a change in procedures, as described below.

In 1992, released NALS items were “mapped” to performance levels using the same item response theory methods that provided estimates of response probabilities (see Chapter 3 for details). Items were mapped to the performance level at which there was an 80 percent probability of an examinee’s responding correctly. Item mapping is a useful tool for communicating about test performance, but we think that more than one response probability should be considered for each item. That is, an item may map to one level based on a response probability of 80 percent; however, it will map to another, lower level based on a lower response probability (e.g., 67 percent).

Many of the NALS results that were publicly reported in 1993 displayed items mapped only to a single performance level, the level associated with a response probability of 80 percent. This type of mapping procedure tends to lead to a misperception that individuals who score at the specific level will respond correctly to the item and those at lower levels will respond incorrectly. This all-or-nothing focus on the mapped items ignores the continuous nature of response probabilities. That is, for any given item, individuals at every score point have some probability of responding correctly.

We encourage the use of exemplar items to illustrate what adults who score at each of the performance levels are likely to be able to do. For some audiences, it may be sufficient to simply report the percentage of adults at the given performance level who responded correctly to the item. For other audiences, item mapping procedures will be more appropriate. When item mapping procedures are used, we encourage use of displays that emphasize the continuous nature of response probabilities. Items should be mapped using different response probability values so as to communicate about the types of things that adults at each performance level would be likely to do at different levels of accuracy (e.g., 50, 67, and 80 percent of the time). Displays showing what adults would be likely to do 80 percent of the time will be important to maintain consistency with item mapping procedures used in 1992, and we note that those in the health literacy field specifically requested this during our public forum. Displays showing other levels of accuracy (e.g., other response probability values) will provide additional information about adults’ literacy skills. Mapping items to more than one level will stimulate understanding of the strengths and weaknesses of those scoring at each level. We therefore make the following recommendation:

RECOMMENDATION 6-5: The Department of Education should carefully consider the ways in which released items are used to illustrate the skills represented by the performance levels. For the simplest displays, the department should avoid the use of response probabilities and just indicate the proportion of people in a given level (e.g., basic) who can do the item. If the department decides to use an item mapping procedure to illustrate performance on NAAL, items should be mapped to more than one performance level. The displays should demonstrate that individuals at each performance level have some likelihood of responding correctly to each item. Such displays will allow interpretations about what individuals at each level are and are not likely to be able to do.

Simplifying Presentations of NAAL Results with Composite Scores

Many reports of the NALS results displayed graphs and tables separately for each of the literacy scales. However, due to the high intercorrelations among the types of literacy, the relationships portrayed and the conclusions drawn about the information in the displays tended to be similar regardless of the literacy scale (e.g., the relationships between self-perception of reading skills and NALS literacy scores were nearly identical for prose, document, and quantitative literacy). We think that this redundancy deserves some attention as the department and others plan their reports of NAAL results. Some of the participants in the focus group discussions sponsored by the department (see U.S. Department of Education, 1998) commented that having results reported as five performance levels for each of three types of literacy was too much information to present during discussions with policy makers and other audiences. They commented that they would often select results for a single literacy area (i.e., prose) to use in their discussions. We therefore suggest that the department find ways to simplify and reduce the amount of information in NAAL results. One way to do this is to form a composite of the three literacy scores that can be used for presentations to certain audiences, and we explored several ways to accomplish this, as described below.

As detailed in Chapter 5, we initially planned to conduct our standard settings by combining all of the items into a single ordered item booklet, which would have yielded a single set of cut scores. This was not possible, however, since the items were not scaled together. This made it impossible to compare the difficulty levels of the items across literacy areas; for example, it was not possible to determine whether a prose item was more or less difficult than a document or quantitative item. Another way to accomplish this would be to have identical cut scores for the three types of literacy. In this case, a simple average of the three scores could be formed and the cut scores applied to this composite score. This would have been a feasible alternative if our standard-setting procedures had yielded identical (or at least similar) cut scores for each of the three types of literacy. This was not the case, however; the cut scores were quite different for the three types.
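The identical-cut-score approach just described can be sketched in a few lines. The cut score values below are hypothetical placeholders, not the committee's figures; the level labels follow the committee's recommended categories, and as noted the actual standard settings produced different cut scores for each of the three scales, which is exactly why this approach could not be adopted:

```python
from bisect import bisect_right
from statistics import mean

# Hypothetical shared cut scores on a 0-500 reporting scale. The
# committee's standard settings yielded different cut scores for the
# prose, document, and quantitative scales, so these values are
# placeholders for illustration only.
CUT_SCORES = [170, 210, 265, 335]
LEVELS = ["nonliterate in English", "below basic", "basic",
          "intermediate", "advanced"]

def classify_composite(prose, document, quantitative):
    """Form the simple average of the three scale scores and classify
    it against a single, shared set of cut scores."""
    composite = mean([prose, document, quantitative])
    return composite, LEVELS[bisect_right(CUT_SCORES, composite)]

composite, level = classify_composite(288, 270, 301)
print(f"composite = {composite:.1f} -> {level}")
```

The classification step is only defensible if the same cut scores are valid for all three scales; because they were not, the committee reserves the simple-average composite for scale-score reporting rather than for assigning performance levels.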

OCR for page 167
Measuring Literacy: Performance Levels for Adults We were not able to devise a means for combining prose, document, and quantitative scores in order to report NAAL results according to a single set of performance levels. To compensate for this, we developed a set of overall performance-level descriptions (see Table 5-3). These overall descriptions combine the features included in the subject-specific levels, but they differ in that they do not include cut scores. We suggest that these overall performance-level descriptions be used as one way to reduce the amount of information to present to more general audiences. Not all audiences will be interested in NAAL results grouped into the performance level categories. For some purposes, reporting at the scale score level using descriptive statistics (e.g., means and standard deviations) will be more appropriate. In these circumstances, when the focus of an analysis or display is scale scores, we urge the department and others to develop a composite score that is the simple average of prose, document, and quantitative scores. This will serve to reduce the number of displays required and will simplify the information for the user. We therefore recommend the following: RECOMMENDATION 6-6: The Department of Education and others reporting and using NAAL results should consider the purposes of and audiences for their reports as well as the messages they seek to convey. Whereas performance levels will be most appropriate for some purposes, scale scores will be most appropriate for others. When scale scores are used, a composite score that is a simple average of the prose, document, and quantitative scores should be used. COMMUNICATION STRATEGIES Ensuring that an accurate, nuanced message is effectively conveyed is a difficult task, particularly given the broad range of media likely to report on the NAAL results and the varying interests of policy makers and other stakeholders. 
Consequently, the Department of Education will need to consider a variety of dissemination strategies, beyond publication of the results, press releases, and news conferences. Participants in the committee’s public forum proposed a number of dissemination and communication strategies that the department could consider. For instance, in Canada, informational packets about literacy results are tailored specifically for different constituencies. Other proposed strategies include prerelease briefings for Washington-based media, web-based presentations with the Department of Education staff available to answer questions and provide information in response to online inquiries, collaboration with organizations providing professional development to journalists specializing in education and business, and opinion pieces prepared for agency officials that highlight key findings and that can be distributed to a

OCR for page 167
Measuring Literacy: Performance Levels for Adults variety of media outlets. Similarly, in-person, hard-copy, and virtual presentations can be made to organizations, such as the National Governors’ Association and the National Conference of State Legislatures, whose members decide the priority and level of resources allocated to adult literacy efforts. Attempts should also be made to communicate with community-based organizations and other nongovernmental agencies (e.g., National Council of La Raza, the Center for Law and Social Policy) who both seek and provide information on populations in need. Whatever communication strategies the Department of Education decides to employ, there should be a well-crafted plan designed to convey a clear, accurate, and consistent message or “story” about the NAAL results and their meaning. An effective way to proceed would be to consult with communication professionals to develop materials tailored to specific audiences and then to identify vehicles for disseminating them. The ways in which different audiences are likely to interpret and understand these materials can be tested using focus groups representing stakeholders, media, policy makers, and members of the public. The materials and dissemination strategies can then be revised before their release. Key actors in any communication strategy are the secretary of education and other agency officials. They need to be thoroughly briefed about the results, their meaning, and implications. Agency officials should be aware that they will be asked simplistic questions, such as “How many Americans are illiterate?” Consequently, it is especially important that they have sufficient information to allow them to give nuanced responses that are both faithful to the survey results and understandable to the public. 
The Department of Education should involve leaders in the literacy field as much as possible, so they can reinforce the major messages and provide a consistent picture of the results. In addition, there needs to be a strategy for communicating with members of Congress. Education committee members and staff in both the House of Representatives and the Senate should be included in prebriefing sessions and perhaps could be involved in developing dissemination strategies. Larger House and Senate briefings should be arranged soon after the main release of NAAL results and scheduled at times when staff members can attend. There should be bipartisan involvement in making these invitations. Our recommendations with regard to communication and dissemination strategies are summarized in the following recommendation. RECOMMENDATION 6-7: Before releasing results from the 2003 NAAL, the Department of Education should enlist the services of communication professionals to develop materials that present a clear, accurate, and consistent message. It should then pilot test the interpretation of those materials with focus groups including stakeholders, media, and members of the pub-

OCR for page 167
Measuring Literacy: Performance Levels for Adults lic, and revise them as appropriate before release. A briefing strategy should be developed that includes prebriefing sessions for department policy makers and congressional staff members. These groups should be briefed in detail on the supportable inferences from the findings before the official release of NAAL results. The Department of Education can also do much to enhance understanding of adult literacy in this country by making NAAL data public and encouraging research on the results. After the initial release of NALS in 1992, the department funded a number of special studies on specific population groups (e.g., language minorities, prisoners, older adults, workers, low-literate adults). These studies offered significant insight into the literacy levels of these populations and have been widely cited and used over the past decade. We encourage the department to again commission extensive follow-up studies on NAAL results. The department can also provide support for smaller scale research studies on NAAL data in the same way that it does through the secondary analysis grant program of the National Assessment of Educational Progress (NAEP) (http://www.nces.ed.gov/nationsreportcard/researchcenter/funding.asp). In order to encourage researchers to use NAAL data, the data have to be publicly available in a format that is accessible by the most commonly used statistical software. We note specifically that the NAAL data files are currently designed so that they can be analyzed only with software, called “AM,” developed by the contractor. Having the data files accessible only by proprietary software will severely limit the extent to which researchers can make use of NAAL results. In addition, the software is designed to produce plausible values that are conditioned on the set of background variables specified for a given analysis. 
As a consequence, the plausible values that are generated for one analysis (using one set of background variables) differ from those generated for another that uses another set of background variables. This feature of the AM software has the potential to cause confusion among researchers accessing NAAL data. We suggest that the department determine a means for developing a set of plausible values in the publicly available data file so that all researchers will work with a common data file. EXAMPLES OF WAYS NAAL RESULTS MAY BE USED NAAL results will be used by a variety of audiences and in a variety of ways. For some of these audiences, the report of NAAL findings will be a one-day story. The findings are likely to create a public stir on the day they are released, but public attention will probably be brief. For other audiences, however, NAAL results will receive long-term attention. For instance, NALS results, reported over a decade ago, are still frequently cited
by the adult literacy and adult education fields (e.g., Sticht, 2004). Similarly, the health literacy field has relied on NALS results for a wealth of information and as the basis for numerous studies, some conducted only recently (e.g., Rudd, Kirsch, and Yamamoto, 2004). Below we present two examples of long-term uses of adult literacy results, one drawn from the health literacy field and one from the civic literacy field. We include these examples not only to consider the types of information that should be reported to enable such uses but also to prompt the various literacy fields to consider ways in which they can use NAAL results.

Health Literacy

Findings from the 1992 NALS sparked interest in literacy and its possible links to health outcomes among researchers in health education, public health, medicine, and dentistry (American Medical Association, 1999; Berkman et al., 2004; Institute of Medicine, 2004; Rudd, Moeykens, and Colton, 2000). As was true in the education sector, some initial misinterpretations of the findings fueled headlines and misuse of the term "illiteracy." The body of literature developed since the publication of NALS, however, has established the field known as health literacy, which is now on the national agenda. The report Communicating Health recommends actions to support the education sector and to improve the communication skills of health professionals (U.S. Department of Health and Human Services, Office of Disease Prevention and Health Promotion, 2003).
The National Institutes of Health have called for research examining the pathways between education and health (RFA OB-03-001, Pathways Linking Education to Health; see http://www.grants.nih.gov/grants/guide/rfa-files/RFA-OB-03-001.html) and, through calls for proposals supported by several institutes, for research examining health literacy (PAR-04-117, Understanding and Promoting Health Literacy; see http://www.grants1.nih.gov/grants/guide/pa-files/PAR04-117.html).

Findings from well over 400 published studies of health materials indicate a mismatch between average high school reading skills in the United States and the reading-level demands of materials across a broad spectrum of health topic areas. Many of these studies measured the ability of patients to read and comprehend materials developed and designed to offer them key information and directions. The documented mismatch between the demands of the materials and the reading skills of the intended audience speaks to poor communication on the part of health professionals as well as to the limited abilities of patients to use health materials (Rudd, Moeykens, and Colton, 2000).

Over 50 studies have linked untoward health outcomes among patients with limited reading skills (as measured by instruments that correlate well
with reading assessments such as the Wide Range Achievement Test) compared with outcomes among those with stronger reading skills (Berkman et al., 2004). Measures of health outcomes included general public health information, such as knowledge about the effects of smoking, knowledge of HIV transmission risk, and the use of screening services. Medical outcomes included such measures as patients' knowledge of their disease, the risk of hospitalization, and glycemic control in diabetes care (Berkman et al., 2004; Institute of Medicine, 2004).

A recent Institute of Medicine report, Health Literacy: A Prescription to End Confusion, cites the finding that 90 million adults have difficulty understanding and acting on health information, based on findings from health studies and an understanding of the implications of the NALS results (Institute of Medicine, 2004; Rudd, Kirsch, and Yamamoto, 2004). Among the recommendations of the Institute of Medicine report are the following:

The U.S. Department of Health and Human Services and other government and private funders should support research leading to the development of causal models explaining the relationships among health literacy, the education system, the health system, and relevant social and cultural systems.

Federal agencies responsible for addressing disparities should support the development of conceptual frameworks on the intersection of culture and health literacy to direct in-depth theoretical explorations and formulate the conceptual underpinnings that can guide interventions.

Professional schools and professional continuing education programs in health and related fields should incorporate health literacy into their curricula and areas of competence.

Measures of functional literacy skills fueled this development.
Many health-related items and associated tasks were included in the 1992 NALS, covering a wide spectrum of health activities: health promotion, health protection, disease prevention, health care and maintenance, and access and systems navigation. For example, items included a food label, an article on air quality, an advertisement for sunscreen, a medicine box dosage chart, and information from a benefits package. Because the tasks associated with these items were developed and rated in the same way as non-health-related tasks, the findings for literacy skills in health contexts are directly comparable to the prose, document, and quantitative scores for applications of literacy skills in other everyday contexts (Rudd, Kirsch, and Yamamoto, 2004).

Diffusion of information across fields is slow, and the findings from the 1992 NALS are still new to many in the health fields. With NAAL, which augments existing materials with additional items and tasks, health literacy
findings will be reported. Having NAAL results focused on health contexts will garner significant attention among health researchers, practitioners, and policy makers. The approximately 300 studies published between 1970 and 1999, and an additional 300 studies published between 2000 and 2004, attest to rapidly increasing interest in the relationship between literacy and health outcomes. The field of inquiry is expanding to new health topics and new health disciplines, including pediatrics, oral health, mental health, environmental health, and public health. Although some health researchers continue to develop literacy-related assessments suitable for use in health settings, the health sector continues to rely on the education field to measure literacy skills, monitor change, and inform researchers and practitioners in related fields. Furthermore, the health field has an interest in making comparisons over time and in examining trend data. Consequently, careful attention will need to be given to the measurement and reporting changes enacted with NAAL so as not to confuse users and hamper progress in this nascent field of inquiry.

Civic Literacy

Participation in civic and political life, a crucial aspect of a democratic society, is contingent on many of the fundamental literacy skills presumed to be measured by NALS and NAAL. These skills include such tasks as reading the news sections or opinion pieces in a newspaper, deciphering documents like an election ballot, and understanding numbers associated with public issues like the allocation of local government resources. Studies of the 1992 NALS results (Smith, 2002; Venezky and Kaplan, 1998; Venezky, Kaplan, and Yu, 1998) reported that the likelihood of voting increased as literacy increased, even when controlling for other factors, such as age, educational attainment, and income.
Newspaper readership also increased as literacy increased and was positively associated with voting behavior. More generally, research on civic and political engagement suggests that characteristics known or believed to be related to literacy (e.g., education, following the news, knowledge of politics, being in a white-collar occupation) are direct and indirect precursors of a variety of types of civic and political participation (Brady, 1996; Delli Carpini and Keeter, 1996; Verba, Lehman Schlozman, and Brady, 1995).

NAAL results may enhance understanding of the extent to which adults have the fundamental literacy skills needed for participating in civic affairs and carrying out their civic responsibilities. According to the frameworks the National Assessment Governing Board developed for the civic assessment of NAEP (National Assessment Governing Board, 1998), the fundamental skills required for civic functioning include
both content knowledge about civics and intellectual skills that can be applied to that content, such as knowing how to identify, describe, explain, and analyze information and arguments and how to evaluate positions on public issues. While NAAL does not contain any questions that specifically address civic content knowledge, the assessment does evaluate some of the intellectual skills described by NAEP's frameworks. For instance, the committee's advanced category for prose literacy encompasses the skills of making complex inferences, comparing and contrasting viewpoints, and identifying an author's argument (e.g., in a newspaper). Likewise, the prose intermediate category includes being able to recognize an author's purpose and to locate information in a government form. Using the performance-level descriptions and samples of released NAAL items, it may be possible to glean information relevant for evaluating adults' skills in areas fundamental to civic engagement and participation.

Knowledge about adults' proficiency in this important area can be used both to formulate methods for improving their skills and to evaluate the extent to which civic and political materials are accessible to adults. The changes implemented by those working in the health literacy area provide a model. For example, after the release of NALS results in 1992, the health literacy field sought to evaluate the level of reading needed to understand health and safety information and to enact changes to make the information more accessible. Similar strategies could be used in the civic literacy field to evaluate the match (or mismatch) between individuals' literacy skills and the level of reading and vocabulary required by such documents as election ballots, pamphlets explaining rights and responsibilities, and flyers stating candidates' stands on political issues.
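One common, if rough, way to operationalize such a match is a readability formula applied to the document's text. The Python sketch below applies the standard Flesch-Kincaid grade-level formula to a hypothetical ballot question; the syllable counter is a crude heuristic, and the example is purely illustrative, not part of NAAL or NALS methodology.

```python
import re

def count_syllables(word: str) -> int:
    """Rough heuristic: count groups of consecutive vowels in the word."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_kincaid_grade(text: str) -> float:
    """Flesch-Kincaid grade level:
    0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words)
            - 15.59)

# Hypothetical ballot language, typical of bond-measure wording.
ballot = ("Shall the county issue general obligation bonds "
          "to finance construction of public facilities?")
print(round(flesch_kincaid_grade(ballot), 1))
```

Even this crude estimate places the sample ballot wording well above a middle-school reading level, which is the kind of mismatch the health literacy field documented for patient materials and the civic literacy field could document for electoral documents.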
Such strategies can lead to increased awareness of adults' understanding of civic and political issues and to the development of more accessible materials.

A negative example may serve to reinforce the importance of discussing the implications of literacy data. The report on NALS results for non-English speakers was not published until 2001, eight years after the main Department of Education report, Adult Literacy in America. Neither the analyses, findings, nor implications for this group of adults were widely discussed or disseminated. As a result, the realization that a significant portion of the population of greatest need consisted of immigrants who had not yet learned English was slow to enter the policy debate and the public consciousness. If future reports are published in a timely manner, interested audiences will be able to gain a more nuanced picture of the literacy abilities and needs of the U.S. population, and policy makers will find it easier to make informed decisions.