
NAEP Reporting Practices: Investigating District-Level and Market-Basket Reporting (2001)

Chapter: Implications of District-Level and Market-Basket Reporting

Suggested Citation: "Implications of District-Level and Market-Basket Reporting." National Research Council. 2001. NAEP Reporting Practices: Investigating District-Level and Market-Basket Reporting. Washington, DC: The National Academies Press. doi: 10.17226/10049.

7

Implications of District-Level and Market-Basket Reporting

The two reporting practices that are the subject of this study represent more than extensions of current NAEP programs and procedures: they are essentially new programs that would result in new NAEP products. Both reporting methods would present new information that would draw attention from new audiences—audiences that, in the past, may have paid little attention to NAEP results. Implementation of either reporting method would pose challenges for NAEP's existing procedures. District-level reporting would affect sampling procedures. Creation of a short form of NAEP has implications for test construction procedures. Both market-basket and district-level reporting would alter analytic and scoring methodologies as well as the number and types of reports to be prepared. Given these factors, implementation of either reporting practice can be expected to have a significant impact on the internal configuration of the NAEP program. Furthermore, the use of data resulting from these reporting methods by policy makers, state and local departments of education, the press, and the lay public could carry consequences for state and local assessment, curriculum, and instruction.

In this chapter, we address questions about the consequences that the two reporting practices might have, specifically: (1) Would either district-level or market-basket reporting pose any threats to the validity of inferences from national and state NAEP? and (2) What are the implications of district-level and market-basket reporting for other state and local assessment programs? In the first section of this chapter, we explore the likely implications of district-level and market-basket reporting for the NAEP program. In the second section, we discuss the impact of the reporting practices on state and local educational systems.

IMPLICATIONS FOR THE NAEP PROGRAM

NAEP comprises many interrelated components that together form a complex system. A change to any one piece of this system may have consequences for the others. Implementation of either district-level or market-basket reporting would require numerous changes.

First, the type and nature of reported data will influence NAEP's sampling and analytic methodologies. Different sampling procedures would be needed to allow reporting of district-level data. Different analytic procedures would be needed to condition on district characteristics rather than state characteristics.

Second, the types and numbers of reports required will affect the complexity and duration of report production. Under district-level reporting, the number of reports produced could increase significantly. Preparation of market-basket results based on synthetic forms would introduce significant complexity.

Third, the uses made of reported data will affect the relative importance of the assessment in schools and the ways schools and students prepare for the assessment. Such changes suggest the need for additional user support and interpretive guidance. Policy would need to be formulated to guide preparation activities.

Hence, changes cannot be enacted capriciously but must be considered in relation to their potential effects on other pieces of the system. In the text below, we expand on this by exploring some of the effects the proposed reporting practices might have on the validity of inferences drawn from NAEP results as well as on NAEP's procedures, policies, and program costs.

Increasing the Stakes

Traditionally, NAEP has been a low-stakes assessment, since decisions about schools, teachers, and individuals have not been based on test results. The move to reporting data for school districts—either via current NAEP or through the short form—brings the level of reporting much closer to those responsible for instruction. As the level of reporting moves to these smaller units, the assessment stakes will likely become even higher for schools and teachers. Increasing the stakes can have a myriad of effects.

First and foremost, increasing the stakes would require immediate attention to security issues. If high-stakes consequences were attached to district-level performance on current NAEP or on the short form, the likelihood of security breaches would increase. Security breaches could compromise NAEP items as well as the items that make up the short form. In anticipation of this increased potential for breaches, item development would need to be stepped up. Furthermore, with higher stakes, test preparation activities would become more of a concern, since inappropriate test preparation practices could unfairly advantage some districts and could affect the validity and integrity of test results. As suggested by Roeber (1994), NAEP's sponsors would need to lay out appropriate and inappropriate test preparation procedures.

Higher stakes also increase the motivation to perform well. Currently, students have little incentive to do well on NAEP beyond their own personal pride and exhortations to honor their state. But if districts were able to obtain results (either as part of current NAEP or via the short form), schools and students might demonstrate greater motivation to perform well on the assessment. Previous research examining the effects of motivation on NAEP performance suggested that changes in motivation may be associated with increased performance (Linn, Koretz, & Baker, 1996). For example, Kiplinger and Linn (1992; 1995/1996) studied changes in performance on NAEP items when a block of NAEP mathematics items was embedded in a state assessment used for state and local school accountability purposes; presumably, schools and students are more motivated to perform well on a test used for accountability. Their studies found a small but statistically significant effect, suggesting that students performed better on the NAEP items administered as part of the state assessment than on the same items administered as part of NAEP.

If motivation to do well can affect students' performance, then a number of issues may arise. Performance on NAEP may increase—perhaps not as a result of increased skill levels but as a result of increased motivation to demonstrate skill levels. This can degrade the integrity of NAEP as a monitor of educational progress. For example, under district-level reporting for current NAEP, performance gains could be seen in districts that receive results, thereby improving performance for the state. States that have no districts qualifying to receive results may not realize similar gains. It would be impossible to discern whether performance increases represent real skill-level changes or are only an artifact of changes in motivation.

If plans for the short form were implemented, changes in motivation could further affect the comparability of short-form results to regular NAEP results. Depending on the ways schools and districts decide to use short-form results, motivation to do well may increase. Such changes in motivation would undermine hopes that short-form results could be compared with results from main NAEP.

Interpreting Reported Data

Although these reporting approaches have been suggested as ways of making NAEP reports simpler and more interpretable, they may add complexities that require additional clarification. Below-state reporting may attract new audiences unfamiliar with the goals, purposes, and limitations of NAEP. Such audiences would require assistance in understanding the meanings and implications of NAEP results. NAEP's sponsors could find themselves obliged to provide support materials to new and different users to ensure appropriate interpretations of results.

Use of a percent correct metric for market-basket reporting would require considerable support to prevent misinterpretation, even for experienced users of NAEP results. For instance, during the committee's workshop on market-basket reporting, several speakers cautioned that the percent correct scale proposed for use with the market basket (see Table 6-1) differs from the way the public generally views percent correct scores. A number of speakers commented that people typically regard 70 percent as a passing score, scores around 80 percent as indicating proficiency, and scores of 90 percent and above as advanced. What would members of the general public think when they saw that the average American student scored less than 50 percent on the test? Or that the proficient student answered only 55 percent of the questions correctly? According to one assessment director, “Most test directors [know enough about NAEP to] understand why this might be, but no teacher, parent, or member of the public would consider 55 percent proficient. They would consider that score as representing ‘clueless,' perhaps, and would think even less of the test and the educators that would purport to pass off 55 percent as proficient” (National Research Council, 2000). NAEP's sponsors may find that explaining percent correct scores to their various audiences requires substantial interpretive support.
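The mismatch described by these speakers can be made concrete with a small illustration. The sketch below (in Python) contrasts the lay grading conventions cited above (70 percent passing, 80 percent proficient, 90 percent advanced) with a hypothetical market-basket scale on which 55 percent correct counts as proficient. All cutoffs in the sketch are invented for illustration; none are taken from Table 6-1 or from any actual NAEP scale.

    # Hypothetical illustration: how the same percent correct score reads
    # under lay grading conventions versus a market-basket-style scale.
    # All cutoffs are invented for this sketch; none are actual NAEP values.

    def lay_reading(percent_correct):
        """Label a score the way the public typically grades tests."""
        if percent_correct >= 90:
            return "advanced"
        if percent_correct >= 80:
            return "proficient"
        if percent_correct >= 70:
            return "passing"
        return "failing"

    def market_basket_reading(percent_correct):
        """Label a score against hypothetical market-basket cutoffs,
        where 'proficient' sits near 55 percent correct."""
        if percent_correct >= 75:
            return "advanced"
        if percent_correct >= 55:
            return "proficient"
        if percent_correct >= 40:
            return "basic"
        return "below basic"

    for score in (48, 55, 80):
        print(f"{score}% correct -> lay: {lay_reading(score):10} | "
              f"market basket: {market_basket_reading(score)}")

    # 55 percent correct reads as 'proficient' on the hypothetical
    # market-basket scale but as 'failing' by lay grading conventions.

As the assessment director's comment suggests, the interpretive problem lies in this collision between two readings of the same number, and that is what any supporting materials would need to address.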


Demand for School and Individual Results

Availability of the short form would fuel the demand for official individual scores, since released short forms would likely be posted on websites and audiences encouraged to “take the test” and get a score (Colvin, 2000). Because the short form could be administered to all children in a given grade in a manner closely resembling other testing in schools—testing that results in individual score reports—maintaining the prohibition against individual results would be difficult.

District-level reporting may increase the expectation for school- and student-level results as well. Instructionally useful information about content areas within a subject—for example, geometry and algebra scores, rather than simply overall mathematics scores—is typically available to districts as part of other testing programs and may also become an expectation for NAEP.

Participation in State or Main NAEP

Participation in NAEP may be affected both positively and negatively by the proposed new reporting practices. Assuming resolution of the many technical and logistical issues related to district-level reporting, and assuming that few negative consequences are attached to performance, participation in state or main NAEP may increase. Districts may be willing to invest student and teacher time in return for data they consider useful.

For market-basket reporting via the short form, the impact may be the opposite. If districts are able to receive information more quickly with less testing time, they may opt for the use of the short form in place of participating in state or main NAEP.

Increased Program Costs

Moving to either of the proposed reporting methods would have significant cost implications. Increased item development would be needed—due to the security considerations associated with district-level reporting, the number of items released as part of the market basket, and the items needed to construct short forms. Larger numbers of students would be tested to accommodate reporting district-level results, which could substantially increase test administration costs. Scoring procedures for both reporting practices could also introduce additional complexities, which would increase costs associated with data analyses. Increased numbers of reports would be required, since separate reports would be prepared for each participating district and to provide market-basket results. NAEP's sponsors would need to provide interpretive support to assist users of the new products. Thorough evaluation of the costs associated with the reporting methods is essential. And, if these costs are to be passed on to users (either the state or the district), they need to be known and specified prior to considering districts' and states' interest in either program.

IMPLICATIONS FOR STATE AND LOCAL EDUCATIONAL SYSTEMS

States' and districts' educational systems vary widely, making it impossible to characterize in a simple way the role of assessment or the relationships among assessment, curriculum, and instruction. Traditionally, however, assessment serves either an accountability function or an integral role in the larger instructional system. Since assessment is one aspect of a system with interrelated parts, changes in assessment systems affect curriculum and instruction, as well as what we know about student learning. Likewise, changes in curriculum or instruction affect assessment.

Instructional systems are often initially developed from expectations for student learning. These expectations are structured by curricula that map essential steps in the development of that learning. Schools implement instructional strategies that enable students to reach the identified curricular milestones and expectations. Assessment occurs at appropriate points in the instructional process to inform decision makers about the status of student learning and to provide information for further instructional planning.

In the ideal, each of these components connects integrally to the others. However, a myriad of factors and influences can negatively affect the symbiotic relationships among the components. Any resulting disconnect can derail student learning, the reporting of learning progress, or the instructional planning essential to continued learning. To avoid these disruptions, recent educational reforms have focused on the alignment of expectations (often called standards), curriculum, instruction, and assessment.

This idealized system is subject to influences by public policy, public relations, community pressure, and other forces outside of the learning system. These additional forces can produce disconnects among components of the system and can result in inefficiencies that can hamper students' opportunities to attain the desired expectations. Thus, it is important to consider the possible effects of district-level and market-basket reporting on state and local curricula and assessment systems. As with any change, the potential implications for local systems of implementing district-level NAEP or market-basket reporting are many and varied.

For local educational systems, the implications of district-level reporting and market-basket or short-form reporting may parallel those anticipated with the implementation of state NAEP (see discussion in Chapter 3), as well as include implications specific to district-level instructional systems. The text below discusses the likely effects of the two proposed reporting practices on local curricula and assessments.

Assessment Areas, Content, Schedules, and Methodology

Currently, many state assessments are administered at about the same time of year as national and state NAEP. Schedule conflicts have put many districts in the position of having to choose between NAEP and state or local assessments. When faced with such conflicts, districts have tended to withdraw from NAEP participation in order to accommodate the schedule for mandated state and local assessments. But if NAEP results were reported at the district level, those results would likely receive more attention, which could lead districts or states to favor NAEP participation over their local assessment programs. Attempts to ensure that students are not overtested or weary at the time of NAEP testing could lead to changes in current assessment schedules as well as modification of current assessment systems.

Data from a high-visibility national assessment may receive more attention than local assessment results. Generally, local curricula and expectations are closely tied to local and state assessments—but not necessarily to NAEP. Comparisons of the two sets of results may portray different pictures of students' accomplishments, differences that may be primarily attributable to the degree of alignment between local assessments and instructional programs. As a result, there might be a push to align instruction more closely with what NAEP tests. Or current assessment systems might be replaced by the short form, given the desires and pressures for comparisons with national benchmarks. Such changes could disrupt the instructional and learning systems currently in place.


Based on information gathered during the committee-sponsored workshops, it might be expected that local assessments would be influenced by the kinds and formats of items used on NAEP. Workshop participants commented that states have found the release of NAEP items useful in guiding item development for state assessments. For example, the use of performance assessments and constructed-response questions in NAEP has led to the inclusion of similarly formatted questions in state instruments. Since the research involved in developing NAEP items is often much more extensive than is possible within state research divisions, states feel quite comfortable using the NAEP design as a model in developing their tests. If district-level reporting were implemented, these changes would also be likely for local assessments. The influence of NAEP formats on local assessments may be even more pronounced given the number of items released in connection with the market basket. This could benefit local systems, but only to the degree that the content to be assessed, the testing purposes, and other important characteristics of the test design dictate the use of such item types. A significant disconnect within the local system of curriculum, instruction, and assessment could be created if there is insufficient alignment between NAEP and local instructional programs.

Approaches to Reporting Results

District-level NAEP reports might also have an effect on the type of information districts report about their own assessments. To reduce confusion for the public, districts might choose a single form of reporting. Most likely, the approaches used for the higher-visibility (perceived as “higher priority”) assessment would prevail. Thus, districts may adopt NAEP-like achievement levels, scaled scores that appear consistent with NAEP results, and certain statistical and other processes.

This pattern has been seen in statewide assessments. During the committee's workshops, representatives from state assessment offices commented that NAEP's use of achievement levels to summarize performance has been highly influential.1 Many states have moved to achievement-level reporting, and some use the same achievement-level descriptors as NAEP. This emulation of NAEP may increase confusion. For example, some misinterpretation has been associated with the achievement levels. One workshop participant noted that results from a recent NAEP administration revealed that 60 percent of their students performed below the proficient level in reading. State legislators interpreted this finding to mean that their students lacked essential reading skills (an interpretation not necessarily justified by the NAEP results) and advocated for revisions in the state reading instruction and assessment program. Under the amended system, students take an oral reading test in second grade, which allows for early identification and remediation of reading problems. Low-performing students then receive an individualized reading program designed to improve their reading mastery (National Research Council, 1999c). While the ultimate result may have benefited low-performing students, the original interpretation of NAEP results may not have been appropriate.

1 It should also be pointed out that the NAEP achievement levels have been the subject of considerable research and debate. Details can be found in National Research Council (1999b) and Hambleton, Brennan, Brown, Dodd, Forsyth, Mehrens, Nelhaus, Reckase, Rindone, van der Linden, & Zwick (2000).

There are marked disadvantages associated with percent correct reporting. Percent correct scores may appear simple to understand, but they are subject to misinterpretation (see Chapter 4). If NAEP moved to reporting percent correct scores on market-basket sets of items, states and districts might be expected to consider following suit. Attempting to borrow the credibility of NAEP by applying such reporting approaches to local assessments would undermine the effectiveness and appropriateness of current approaches to reporting results for many local assessments.

These and other approaches used by NAEP might initially appear appropriate for local assessment systems. However, attempts to emulate the national assessment in these areas are fraught with obstacles. NAEP's matrix sampling approach, for example, is not appropriate for producing individual student results. The sophistication and complexity of the processes that underlie NAEP development, scoring, and reporting would likely be inappropriate or unachievable for many local assessments. Contributing factors include sample size, expertise, and resources at the district level, as well as fundamental issues related to the comparability of score scales, the comparability of achievement levels determined with differing groups on differing content using differing procedures, and other technical matters.
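A brief simulation can illustrate why matrix sampling serves group reporting but not individual reporting. In the hypothetical sketch below, a 60-item pool is divided into six blocks and each simulated student answers only one block; the design and all numbers are invented and greatly simplify NAEP's actual booklet spiraling. The point is only that aggregate estimates stabilize while any individual's score rests on a handful of items covering just part of the framework.

    # Hypothetical simulation of matrix sampling: each student answers only
    # one small block of a larger item pool. All numbers are invented and
    # the design is far simpler than NAEP's actual booklet design.
    import random

    random.seed(1)

    POOL_ITEMS = 60            # items in the hypothetical pool
    NUM_BLOCKS = 6             # pool divided into six blocks of ten
    BLOCK_SIZE = POOL_ITEMS // NUM_BLOCKS
    NUM_STUDENTS = 3000
    TRUE_PROFICIENCY = 0.55    # chance a student answers any item correctly

    scores = []
    for _ in range(NUM_STUDENTS):
        # Each student sees only BLOCK_SIZE of the POOL_ITEMS items,
        # and different students see different content.
        correct = sum(random.random() < TRUE_PROFICIENCY
                      for _ in range(BLOCK_SIZE))
        scores.append(correct / BLOCK_SIZE)

    # The aggregate estimate recovers the underlying proficiency well...
    group_mean = sum(scores) / len(scores)
    print(f"group mean percent correct: {100 * group_mean:.1f}%")

    # ...but each individual score rests on only ten items and swings
    # widely around the same true proficiency.
    print("five individual scores:", [f"{100 * s:.0f}%" for s in scores[:5]])

Even in this oversimplified model, where every item is equally difficult, a ten-item score is too coarse and too variable to report for an individual; with real blocks that differ in content and difficulty, the comparability problems only compound.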

Impact on Curriculum

The use of district-level and market-basket reports may also have an impact on the curricular content taught in schools. With highly visible NAEP results being reported at the district level, via either full NAEP or through the short form, there would be some pressure for curriculum to become more aligned with those assessment results. By definition, the market-basket concept implies a domain being assessed and reported that is narrower than the entire NAEP framework, and the short form would be an even smaller sample of that domain. The impact on curriculum of reporting at the district level is likely to be significant, due to this narrowed focus. The limited set of items would likely reduce the scope of curricular expectations, especially in the context of strong public scrutiny.

Moreover, the market basket might supplant local standards because of its perceived priority. Because the market basket is smaller than the full NAEP framework, it may appear to some to represent a carefully reasoned set of priorities for learning. And because it was developed nationally, the market basket might appear to represent a more general consensus about what students should know and be able to do than a locally generated set of content and standards.

Linking Local Results to NAEP

There might also be attempts to link local assessment results to NAEP's district-level results, again for purposes of reducing confusion in interpreting results or of “improving” the comparability between results from differing assessments. Workshop participants observed that an appealing feature of district-level reporting for NAEP would be the presumed ability to compare district assessment results with stable external measures of achievement. There are, however, several problems with attempts to link to NAEP. Earlier reports published by the National Research Council have documented the problematic nature of attempting or touting such connections (National Research Council, 1999a; National Research Council, 1999d).

CONCLUSIONS AND RECOMMENDATIONS

Many of the concerns expressed in this chapter parallel those expressed when state NAEP was first implemented. Although not all the dire predictions for state NAEP came true, there is considerable concern over the potential uses of district-level and market-basket results. Will district-level results be used to rank order districts within the state or across the country?2 Will districts be punished and rewarded for their performance? Will district-level NAEP results become part of schools' accountability systems? If so, what impact will this have on NAEP? Will NAEP's function as a monitor of change be fundamentally altered by below-state reporting? What effect will the release of market-basket sets of items have on state and local instructional systems? Given the potential for varied effects, program evaluation would call for the same level of effort as was devoted to the Trial State Assessment. In addition, support systems will be needed to assist states and districts in appropriate uses and interpretations of the new products and reports.

2 This presupposes that the sampling design and interest levels result in sufficient numbers of participating districts to produce a “cross-district data compendium” like the cross-state data compendia.

RECOMMENDATION 7-1: If the decision is made to proceed with district-level reporting, NAEP's sponsors should develop and implement a plan for program evaluation, similar to the research conducted during the initial years of the Trial State Assessment, that would investigate the quality and utility of district-level NAEP data.

RECOMMENDATION 7-2: The potential is high for significant impact on curriculum and/or assessment at the local level. If either district-level reporting or market-basket reporting, with or without a short form, is planned for implementation, the program sponsors should develop and implement intensive support systems to assist districts and states in appropriate uses and interpretations of any such NAEP results reported.
