Implementation is sometimes compared to an engineering process because it requires a combination of methods and tools that are based in research but intended to guide a complex, real-world activity with many moving parts. As one expert told the committee, implementation is “about making things work, not discovering whether they could work.”1 The purpose of careful implementation is to take initiatives from the research stage into widespread practice in a way that ensures fidelity to the original concept and achieves the desired outcome (Fixsen et al., 2009). Effectively implementing a program on a broad scale is a process that takes time and requires ongoing evaluation and adaptation to local circumstances (Aarons, Hurlburt, and Horwitz, 2011; Metz and Bartley, 2012; Meyers, Durlak, and Wandersman, 2012).
There are several prerequisites to effective implementation. The program needs to be based on a sound theoretical model that characterizes precisely how it can be expected to bring about a desired change. Evaluation then produces data that can be used to refine the program’s design and establish the parameters for delivering it with fidelity; Box 8-1 defines the dimensions of fidelity that need to be considered.
During the past decade, implementation research has focused both on the foundations that support the process—the development of a design that highlights essential components needed for effectiveness while allowing for adaptation to suit diverse populations—and on the process itself—what it takes to deliver an intervention at a scale that can benefit broad populations. Thus, the term scale-up refers to systematic ways of increasing the coverage, range, and sustainability of an intervention, such as by taking a tested, effective local program to the regional, national, or international level (Ilott et al., 2013). This chapter focuses on three elements of sound program design that support effective implementation:

- identification of the core components that make an intervention effective and monitoring of the fidelity with which those components are implemented;
- adaptation of interventions to suit the needs and characteristics of diverse communities, especially at broader scales; and
- the implementation strategies—building blocks—used in mental, emotional, and behavioral (MEB) health–related programs.

1 C. Hendricks Brown, personal communication.
Effective implementation of an intervention starts with identifying its core components and the logic model or theory of how those components are intended to bring about the desired outcome. Also sometimes referred to as the active ingredients, essential elements, or mechanisms of change, core components are those variables that are essential if a program is to function as designed. Examples include the development of particular skills, such as self-management, decision making, drug resistance, or coping with stress and anxiety (Botvin and Griffin, 2015). Identifying those components that are truly essential makes it possible to then adapt nonessential elements to meet local needs and preferences (Fixsen et al., 2013). Most important, once a program’s core components have been clearly defined, it is possible to implement the program with fidelity, which has important implications for achieving and evaluating its intended outcomes.
Ideally, a program’s developers will begin identifying its core components early on while they are working out the program design, and then monitor the role played by these components as they proceed through efficacy and effectiveness trials, so as to identify the most potent ingredients or mechanisms for change. This process is supported by mediation research, which entails searching for mediating factors—those that explain how the core components actually operate—as well as other factors such as sex, class, or race that moderate those relationships. Mediation studies can also support dissemination by suggesting how interventions might be made more efficient without sacrificing impact. The sections below describe this process in greater detail.
Researchers studying mediating factors in family- and school-based interventions have explored a variety of child and adolescent intervention outcomes, including effects on child conduct problems and externalizing behaviors, school engagement and achievement, depression and anxiety symptoms, initiation of and growth in substance use, and delinquency and arrests (Carreras et al., 2016). Others have explored core skills and intervention targets, such as positive parenting (Bjørknes et al., 2012; Gardner, Burton, and Klimes, 2006; Tein et al., 2006), youth social-emotional character development (Bavarian et al., 2016), and peer refusal skills (Glassman et al., 2014). Studies typically examine the role of one or two of many possible mediating mechanisms by which an intervention is thought to work.
For example, Sandler and colleagues (2011) examined the mechanisms through which parenting interventions affect child outcomes. Their work suggests that long-term intervention effects are best understood in the context of changes in social, cognitive, behavioral, and biological processes in parents and their children, as well as transactions between youth and their social contexts. Other researchers have used longitudinal studies to examine mediating factors, capitalizing on research designs that can potentially support causal statements by collecting data in three or more waves—examining direct intervention effects first, then targeting mediating constructs, and then measuring longer-term outcomes (Fairchild and MacKinnon, 2014; Gonzales et al., 2014; Leve and Chamberlain, 2007; Lewis, Bailey, and Galbally, 2012; Stigler et al., 2011; Van Ryzin and Dishion, 2012). Several examples illustrate the types of mediators being examined in intervention studies; research of this kind provides clues to the potential core components of interventions.
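The core logic of these mediation analyses can be illustrated with a minimal sketch. The example below uses hypothetical simulated data (a randomized intervention indicator, a wave-2 mediator such as parent–adolescent communication, and a wave-3 outcome such as internalizing symptoms; all variable names and effect sizes are invented for illustration, not drawn from any study cited here). It estimates the indirect effect as the product of the two mediation paths and bootstraps a confidence interval:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical three-wave data: randomized assignment at wave 1,
# a mediator measured at wave 2, an outcome measured at wave 3.
n = 2000
treat = rng.integers(0, 2, n)                    # intervention indicator
mediator = 0.5 * treat + rng.normal(0, 1, n)     # path a: treatment -> mediator
outcome = -0.4 * mediator + rng.normal(0, 1, n)  # path b: mediator -> outcome

def ols(y, X):
    """Least-squares slope coefficients for y ~ X (intercept added, then dropped)."""
    X = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

a = ols(mediator, treat.reshape(-1, 1))[0]                     # treatment -> mediator
b, c_prime = ols(outcome, np.column_stack([mediator, treat]))  # mediator -> outcome; direct effect
indirect = a * b                                               # product-of-coefficients estimate

# Nonparametric bootstrap for a confidence interval on the indirect effect.
boot = []
for _ in range(500):
    idx = rng.integers(0, n, n)
    a_s = ols(mediator[idx], treat[idx].reshape(-1, 1))[0]
    b_s = ols(outcome[idx], np.column_stack([mediator[idx], treat[idx]]))[0]
    boot.append(a_s * b_s)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {indirect:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

A confidence interval that excludes zero is taken as evidence that the mediator carries part of the intervention's effect; real mediation studies add covariates, multiple mediators, and model-based handling of measurement error, which this sketch omits.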
Example: Preventing Adolescent Depression
Perrino and colleagues (2014) integrated data from three trials of the family intervention program Familias Unidas to examine mechanisms by which the intervention prevented adolescent depression and internalizing symptoms (negative emotions that are directed inward, such as anxiety or withdrawal). Familias Unidas is a family-based intervention serving primarily Hispanic families that focuses on increasing positive parenting, family support, and parental involvement and improving parent–adolescent communication. Results of previous tests have found modest to large effect sizes depending on the outcome and trial (Prado and Pantin, 2011).2 Perrino and colleagues sought to understand both how and for whom a specific intervention component might improve adolescent depression and internalizing symptoms, as well as externalizing behaviors (such as aggression or bullying).
The researchers focused on the mediating role of one proximal intervention target—parent–adolescent communication—and three moderators—baseline levels of internalizing symptoms, externalizing symptoms, and parent–adolescent communication. They found that the communication component fully mediated intervention effects on levels of internalizing symptoms, particularly among families with lower levels of parent–adolescent communication skills at baseline, providing support for the hypothesis that parent–adolescent communication is an effective component of the Familias Unidas intervention curriculum.
Example: Preventing Substance Use in Adolescents
DeGarmo and colleagues (2009) sought to test mediators in their study of a universal school-based intervention for early adolescents—Linking the Interests of Families and Teachers (LIFT)—aimed at preventing antisocial behavior, including youth substance use. The intervention focused on proximal intervention targets—strengthening positive relationships between young people and their parents and peers—because of evidence that these relationships have a protective effect with respect to antisocial behavior and early substance use. Intervention components included parent training, training for children in social and problem-solving skills, a recess intervention game (the Good Behavior Game3), and encouraging communication between parents and teachers.
The researchers hoped to understand the possible mediating effects of family problem solving and reduction in peer playground aggression on long-term outcomes of the LIFT intervention with respect to initiation and growth of substance use through grade 12. They found that intervention-related effects on reducing average tobacco use were mediated by improvements in family problem solving, while effects on growth in substance use were mediated by both family problem solving and reductions in playground aggression. This work highlights that mechanisms and associated program components may play different roles for different outcomes: family training in problem-solving skills and the Good Behavior Game may be critical components for substance use, but family problem solving alone may be sufficient for tobacco use.
Example: A Blended Strategy to Address Substance Use and Antisocial Behavior
Communities That Care (CTC) is a strategy developed by researchers at the University of Washington for providing workshops, instructional materials, and other resources to communities and states over the Internet.4 CTC uses a blended implementation strategy aimed at developing community-based prevention systems that take advantage of multiple evidence-based programs. Brown and colleagues (2014) assessed the influence of five possible community-level mediators of the effects of CTC on youth substance use and antisocial behavior. The five mediators they examined are core components of CTC’s theory of change with respect to reductions in risk and problem behaviors: (1) community adoption of science-based approaches to prevention, (2) collaboration on prevention initiatives, (3) widespread support for prevention, (4) community norms against drug use and antisocial behavior, and (5) use of the social development strategy in everyday interactions. The researchers found that CTC’s impacts on youth problem behaviors in grade 8 were fully mediated by changes in community adoption of a science-based approach to prevention; none of the other putative mediators had a significant impact. This study is notable for having identified a single core component as the active ingredient in mediating change in CTC’s overall objective of preventing problem behaviors in youth. Another study examined community adoption of the program by surveying community members about their awareness and use of prevention science concepts, use of epidemiological data, and system monitoring (Cambron et al., 2019).
The studies of mediation described above show how the operation of core theoretical constructs can be established and how specific components of an intervention influence long-term outcomes. However, such studies do not provide strong evidence about which components of interventions can be dropped to make the program more efficient. With CTC, for example, the benefits likely occur not solely as a result of half-day orientation sessions for leaders; rather, the training of a well-functioning community coalition to use tools and decision-making processes in selecting evidence-based programs and implementing them with fidelity is likely extremely important.
Methods for isolating the core components of an intervention more precisely have been proposed. Dismantling or factorial designs (methods for disentangling potentially influential factors) could provide robust evidence about which intervention components are necessary to produce desired effects on youth outcomes and which can be removed to streamline or adapt the intervention (Collins, 2014; Collins, Murphy, and Strecher, 2007; Danaher and Seeley, 2009; Lindquist et al., 2007). One such approach, the Multiphase Optimization Strategy, uses a three-step process based on engineering principles (Collins and Kugler, 2018; Collins et al., 2005). In the first step, a variety of experimental methods are used to assess an array of intervention and delivery components; in the second, a further set of experiments confirms the identification of essential components; and in the third, efficacy and effectiveness are confirmed in randomized controlled trials. Other work has likewise used trials of a range of intervention components to identify those that are essential, beginning with a thorough evaluation of a single multicomponent intervention (Collins, 2014; Collins, Murphy, and Strecher, 2007; Danaher and Seeley, 2009; Lindquist et al., 2007; Mohr et al., 2015).
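The dismantling logic can be made concrete with a small simulation. The sketch below sets up a hypothetical 2×2×2 factorial experiment in which each of three candidate components is crossed as present or absent; estimating each component's main effect indicates which components drive the outcome and which might be dropped. The component names and true effect sizes are invented for illustration and do not correspond to any study cited here:

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 2x2x2 factorial experiment: each of three candidate components
# is either included (1) or omitted (0). True effects are assumptions chosen
# so that one component contributes nothing to the outcome.
true_effects = {"parent_training": 0.6, "youth_skills": 0.4, "classroom_game": 0.0}
conditions = list(itertools.product([0, 1], repeat=3))  # 8 experimental cells

per_cell = 250
rows, outcomes = [], []
for cell in conditions:
    effect = sum(on * e for on, e in zip(cell, true_effects.values()))
    y = effect + rng.normal(0, 1, per_cell)  # noisy individual-level outcomes
    rows.extend([cell] * per_cell)
    outcomes.append(y)

# Regress outcomes on component indicators to recover main effects.
X = np.column_stack([np.ones(len(rows)), np.array(rows)])
y = np.concatenate(outcomes)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Components with near-zero main effects are candidates for removal
# when streamlining the intervention.
for name, est in zip(true_effects, beta[1:]):
    print(f"{name}: estimated main effect = {est:.2f}")
```

Because all eight combinations are observed, the same sample supports estimates for every component, which is what makes factorial designs an efficient screening step before a confirmatory randomized controlled trial.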
Mediating constructs, such as community adoption of particular prevention strategies, may be complex, so disentangling them into more discrete elements may further illuminate causal mechanisms. Moreover, because many preventive interventions are intended to influence multiple outcomes, potentially through multiple proximal intervention targets, holistic analyses are often desirable. Another approach, the sequential mediation study, explores the operation of an intervention across time to examine possible reactions and complex pathways involving relationships among cognitive, biological, social, and/or behavioral mediators (Deković et al., 2012; Sandler et al., 2011).
Once the core components of a program have been established, it is critical to monitor the fidelity with which they are implemented as the program is designed and tested. Yet both fidelity monitoring and the role of core components have received too little systematic attention in the past, and they still do (Dane and Schneider, 1998; Jensen et al., 2005; Matthias and John, 2010; Moncher and Prinz, 1991; O’Shea et al., 2016; Prowse and Nagel, 2015). Moreover, while fidelity monitoring is important in efficacy trials, it becomes even more important—and more complicated—once an intervention has been put into practice in real-world environments (Crosse et al., 2011). Although the value of consistent quality monitoring has long been recognized, securing the necessary resources and managing the increased burden such monitoring can place on service systems, practitioners, and consumers can be significant challenges (Aarons, Fettes, et al., 2009; Aarons, Sommerfeld, et al., 2009). Regardless, without data to suggest whether a program has been implemented with fidelity, it is unclear whether program outcomes can accurately be attributed to the program itself or whether poor implementation was a culprit in unexpected outcomes (Fixsen et al., 2013).
Researchers have suggested ways to support fidelity assessment in both the research and scale-up stages of an intervention (refer to Box 8-2). Increasing the connections among clearly identified core intervention components, fidelity assessments, and intended intervention outcomes, as well as ensuring much more rigorous monitoring of core intervention components in both research and practice, may be essential to achieving desired outcomes in everyday practice more consistently. These recommendations also make clear that fidelity monitoring is a responsibility shared by implementing organizations and evaluators, supported and reinforced by stakeholders.
Although identifying the core components of a program is critical, a growing body of research emphasizes that programs are more effective and sustainable if they are responsive to local needs, preferences, and capacities (Horner, Blitz, and Ross, 2014; Walker, Bumbarger, and Phillippi, 2015). The diversity of the U.S. population and its communities highlights the importance of careful attention to the distinctive cultural characteristics of communities in implementing interventions related to MEB health at scale (Bernal, Jimenez-Chafey, and Domenech Rodriguez, 2009), particularly given the disparities in both access to care and outcomes for minority populations and those who live in underresourced communities (Alegria, Vallas, and Pumariega, 2010; Alegria et al., 2015; Coker et al., 2009).
Adapting programs to suit the local context requires care, however. The 2009 National Research Council and Institute of Medicine report notes a “longstanding consensus that health promotion and prevention programs should be culturally sensitive” (National Research Council and Institute of Medicine, 2009, p. 302), and that adapting programs for cultural groups while maintaining their core components has yielded significant benefits. The report’s authors observe that a culturally sensitive intervention has content that is welcoming to the target culture, is not offensive, and comes across as familiar to the people involved. However, they also describe the tension between making adaptations and preserving the elements essential for effectiveness, and note the limited research on cultural, racial, and ethnic issues related to adaptation of interventions. In general, assuming that the selected program accords with the needs and values of the local context in which it is to be implemented and that its core components have been identified, adaptations are most likely to have sustainable impact when they are based on evidence showing that they align with the program’s goals and theory (Aarons et al., 2012; Castro and Yasui, 2017; Chambers, Glasgow, and Stange, 2013; Durlak and DuPre, 2008).5
Over the past two decades, researchers have developed several frameworks for cultural adaptation. For example, the Ecological Validity Model describes eight dimensions to be considered: language, persons, metaphors, content, concepts, goals, methods, and context (Bernal, Bonilla, and Bellido, 1995), and ADAPT-ITT provides a process framework for steps in adaptation, such as assessment to understand the target population, pretesting, consultation with topical experts, and pilot testing (Wingood and DiClemente, 2008).6 Other authors have distinguished between “surface” adaptations, which involve superficial aspects of an intervention, such as activities or materials, and “deep” structural adaptations that relate to content and affect outcomes of interest more directly (Resnicow et al., 2000).
Despite this thinking about what is important in adapting programs to meet the needs of diverse communities, a recent meta-analysis of studies of the effects of cultural adaptations on treatment outcomes suggests that results thus far have been mixed (Gonzales, 2017). A prior meta-analysis of studies of ethnic minority children and adolescents who had received evidence-based interventions showed no difference in outcomes between culturally adapted and nonadapted interventions (Huey and Polo, 2008), while other meta-analyses combining child and adult studies have found that culturally adapted interventions had modestly better effects, particularly for adults compared with children (Benish, Quintana, and Wampold, 2011; Griner and Smith, 2006). A recent review of parent training interventions found that parents’ ethnicity did not appear to moderate the effects of the interventions, and that cultural adaptation did not appear to improve outcomes compared with nonadapted programs (Ortiz and Del Vecchio, 2013). In their systematic review of four widely disseminated evidence-based parent training interventions, Baumann and colleagues (2015) found only 8 of 610 published studies that met their strict criteria for cultural adaptation; they advocate documenting explicitly how, why, and for whom adaptations were made (see, e.g., Chowdhary et al., 2014; Le et al., 2010).

5 In practice, interventions that address MEB health and other objectives are frequently adapted without careful attention to fidelity. For example, in a survey of Pennsylvania program grantees, 44 percent of respondents reported making adaptations to program procedures, dosage, and content, both intentional and not (Moore, Bumbarger, and Cooper, 2013). Reasons cited were primarily logistical in nature: lack of time, limited resources, and difficulty with participant engagement.
These mixed findings suggest that, although surface adaptations may be necessary to ensure that interventions are culturally sensitive to and engage the populations being served, more rigorous research is needed to determine whether deep adaptations are warranted. Some researchers have noted the time and cost of evaluating these adaptations and the resulting potential delay in dissemination of effective treatments to those most in need (Domenech Rodriguez, Baumann, and Schwartz, 2011). Further, it may be best if intervention developers examine broad issues of culture and context in the development process, while gathering input from multiple constituencies.
Research on how to adapt programs effectively to serve diverse populations increasingly highlights the importance of engaging directly with communities. For MEB interventions, one way to accomplish such engagement and to respond directly to community needs is to engage community health workers in delivering the interventions. Such workers are often from the same community as the clients being served, and evidence of the effectiveness of this approach has been found in a variety of settings and for a variety of targeted problems in research conducted both in the United States and in other countries (Barnett et al., 2018). This research has addressed topics ranging from the delivery of a parenting intervention to Native American families to treatment for traumatic stress and mental disorders (Barlow et al., 2015; Chibanda et al., 2016; Murray et al., 2015; Nadkarni et al., 2015; Patel et al., 2016; Walkup et al., 2009). However, such barriers as rules and policies related to reimbursement (e.g., difficulty securing Medicaid payments for some types of interventions) have hampered the use of this approach in the United States.
Looking more broadly, multiple studies have shown community partnership and consultation in adaptation and implementation to be an effective approach (Barrera, Castro, and Steiker, 2011; Baumann et al., 2015; Goodkind et al., 2012; Guttmannova et al., 2017). An example of this approach is community-based participatory research (CBPR), which emphasizes reciprocal knowledge exchange and mutual benefit among partners (Minkler and Wallerstein, 2011; Wallerstein and Duran, 2010). A recent meta-analysis of eight programs using this approach found improvements in both health outcomes for individuals and measures of health in the community (Salimi et al., 2012). Another study, in the context of targeting depression, compared the results for a CBPR-based approach with those for standard strategies for delivering depression care. The authors found that the results for some indicators were equivalent, while others were better with the CBPR-based approach (Wells et al., 2013). Positive results have been documented for the CBPR approach in child and youth mental health interventions as well (Betancourt et al., 2015; Mance et al., 2010; Stacciarini et al., 2011).
The CBPR approach has shown particularly strong results in programs designed to promote MEB health in Native American populations. Native American youth have long been at particularly high risk for dropping out of high school, substance use disorders, teen pregnancy, and suicide. These high risks have been attributed to multiple factors, including poverty, historical and acute trauma, and lack of access to evidence-based prevention interventions (Brockie et al., 2015; Goodkind, Lanoue, and Milford, 2010; Ohannessian et al., 2015; Thayer et al., 2017; Whitesell et al., 2009). Studies using a CBPR approach to promote mental health among Native American youth have also demonstrated positive effects (Goodkind et al., 2012; Mullany et al., 2012). A study of two interventions for prevention of alcohol abuse among youth in a rural Native American community that were adapted for the local culture—Communities Mobilizing for Change on Alcohol (CMCA), a community organizing intervention, and CONNECT, a school-based universal screening and brief intervention—showed that both effectively decreased individual-level alcohol use and heavy episodic drinking (Komro et al., 2015). The CMCA intervention also had community effects on reducing overall access to alcohol among underage youth (Komro et al., 2017).
An alternative to adapting an existing evidence-based intervention is for local practitioners to develop interventions based on the real-world needs and cultures of specific communities (Marsiglia and Kulis, 2009). This approach, often referred to as the use of practice-based evidence, highlights culturally specific interventions and healing practices that are used in ethnic minority communities and reflect the beliefs and values of the local community (Isaacs et al., 2005). Initiatives developed in this way may be well accepted as effective by the local community. However, one recent examination of interventions in use in a statewide setting showed that few practices, regardless of whether they were based on research or practice-based evidence, were culturally specific (Lyon et al., 2017). Moreover, most of the cultural features noted reflected only surface-level program characteristics, such as providing services in languages other than English or provider–recipient matching, rather than deep content characteristics that attend to cultural values and other central cultural components (Lyon et al., 2017).
Implementation strategies are methods and tools used to change policies, administrative procedures, and environments; they are the how of implementation, the means through which core components are put into practice. Such strategies might include, for example, engaging program consumers, providing training and technical support to staff delivering the program, or garnering stakeholder support for the program. The evolution of implementation science has included a focus on identifying, classifying, and studying these basic elements of the implementation process (Proctor, Powell, and McMillen, 2013). This section reviews in turn discrete and blended implementation strategies, providing three examples of the latter, and then examines the evidence on ways of supporting implementation efforts.
Some implementation strategies are discrete—single actions or processes (e.g., reminders, educational meetings) that are part of an effort to implement a new practice or program. Researchers have examined discrete strategies and identified an array of purposes they serve, including engaging consumers of the program being implemented, developing relationships with other stakeholders, supporting practitioners, and providing interactive assistance or training (Powell et al., 2015; Waltz et al., 2015). Such actions are the building blocks of more complex strategies, and researchers have analyzed them in seeking to identify core implementation components and track and assess fidelity. Their analyses have also helped support the development of a common language for strategies and highlight those that have not been adequately studied, facilitating efforts to select and tailor strategies for different contexts (Powell et al., 2017).
Still, the evidence base on discrete strategies and ways of combining them to form multifaceted strategies remains nascent. Only a handful of strategies have been studied in detail. For example, strategies designed to change the behavior of health care professionals (such as giving them printed materials, conducting audits and providing feedback, and influencing them through local opinion leaders) have demonstrated some effectiveness (Grimshaw et al., 2012). But few strategies have been tested for their individual contributions to effectiveness, and researchers are increasingly turning their attention from questions about whether strategies work to how, why, where, and for whom they work. This shift is a useful contribution to the field, but more research is needed to improve understanding of the potential impact of tailoring implementation strategies, the barriers to implementation in particular contexts, and optimal ways of selecting strategies (Baker et al., 2015).
Blended implementation strategies combine several discrete strategies to address broad implementation challenges (Powell et al., 2012; Spoth, Redmond et al., 2013). Those challenges are complex and can include, for instance, increasing local readiness and enhancing program quality (Chinman et al., 2015; Hawkins, Catalano, and Arthur, 2002). This section explores three examples of blended strategies that have been developed and tested to assist agencies, communities, and states in sustainably implementing MEB health programs: Communities That Care (CTC), Promoting School-Community-University Partnerships to Enhance Resilience (PROSPER), and Getting to Outcomes (GTO).
Communities That Care (CTC)
CTC is a web-assisted system, first developed in the 1990s, designed to support communities in planning and capacity building aimed at promoting youth development through effective local coalition action. It provides such resources as instructional videos and other materials, research summaries, live training in the use of the materials for community members, and web-based consulting and coaching for communities and states.7 Its focus is on making evidence about prevention available so that communities can promote healthy development and outcomes and reduce problem behaviors in young people. The developers used a CBPR process (see above) to continuously improve the design (which has been tested in a randomized controlled trial involving 24 communities), and CTC has now been implemented in several hundred communities in the United States (Chilenski et al., 2019; Fagan et al., 2019) and in Europe, South America, and Australia (Fagan et al., 2019; Jonkman et al., 2009; Pérez-Gómez et al., 2016; Toumbourou et al., 2019). Box 8-3 describes CTC’s implementation process.
The central idea of CTC is that the training and technical support it provides serve as a catalyst for the development of a well-functioning community coalition of diverse local stakeholders. This coalition develops the skills needed to assess the highest-priority risk and protective factors in the community, and thus to select effective programs that can address community needs, implement those programs faithfully, and monitor the results (Hawkins, Catalano et al., 2008; Rhew et al., 2013). Reducing targeted risk factors and strengthening targeted protective factors is expected to lead to lower rates of problem behavior and more favorable behavioral health outcomes for local youth. The CTC process emphasizes community collaboration in prevention activities and the strengthening of community norms against adolescent drug use, and has defined a social development strategy to protect youth beginning at birth; refer to Box 8-4 (see also Brown et al., 2014).
CTC has been evaluated in both quasi-experimental (Chilenski et al., 2019; Feinberg et al., 2007, 2010) and experimental studies (Brown et al., 2014; Hawkins, Catalano et al., 2008; Oesterle et al., 2018). These studies have provided strong support for CTC’s theory of change and its impacts on positive youth development. Among the effects that have been documented are improved coalition functioning (Shapiro, Hawkins, and Oesterle, 2015; Shapiro, Oesterle, and Hawkins, 2015); sustained improvements in the adoption of a science-based approach to prevention (Gloppen et al., 2012, 2016); implementation of a greater number of effective programs relative to control communities (Fagan et al., 2011, 2012); high levels of fidelity to the program and CTC prevention system (Arthur et al., 2010; Fagan et al., 2009; Quinby et al., 2008); and improvements in risk and protection in middle school (Hawkins, Brown, et al., 2008; Kim et al., 2015).
Researchers have also generally documented sustained reductions in health risk behaviors among study participants, including reductions in the prevalence of current substance use, delinquency, and violence through grade 10 (Hawkins et al., 2009), and greater abstinence from gateway drug use and antisocial behavior and lower incidence of violence through age 21 (Feinberg et al., 2010; Oesterle et al., 2018). A quasi-experimental trial across Pennsylvania school districts that implemented CTC showed small to moderate improvements in delinquency compared with students from non-CTC districts (Feinberg et al., 2010) and long-term (16-year) sustained reductions in substance use after statewide adoption of the program (Chilenski et al., 2019). Research to refine understanding of how the elements of the program function in different contexts is ongoing.
Promoting School-Community-University Partnerships to Enhance Resilience (PROSPER)
PROSPER is a system for linking university researchers with state and community teams to support the delivery of effective programs for preventing risky behaviors, promoting youth development, and strengthening families.8 Its focus is on taking advantage of existing public education infrastructure as a foundation for building the capacity needed to implement and sustain effective programs, engaging teams that understand local concerns and culture. PROSPER provides technical assistance so that programs are delivered as intended, as well as supported and sustained by the community.
In the PROSPER model, extension agents from land grant universities serve as prevention coordinators on local community prevention teams, and representatives from public schools co-lead the teams; a state management team consisting of university officials and prevention researchers provides oversight and supports evaluation activities. The extension agent offers community knowledge and experience in disseminating educational programs, while the public school offers access to youth and educators in the community. PROSPER builds on this initial partnership by adding other community providers of services to youth and families to form small strategic teams.
Once a strategic team has been formed, its members select one family-based and one school-based prevention program from a menu of evidence-based programs PROSPER has identified (an approach that contrasts with that of CTC, which gives communities more latitude in selecting programs). Team participants complete a three-unit training program (see Box 8-5). Program area specialists, prevention scientists, and evaluation experts support teams throughout the implementation process, and PROSPER's National Network further supports the partnership by providing training and ongoing technical assistance, as well as expertise in prevention science (Partnerships in Prevention Science Institute, 2019).
PROSPER has been evaluated in a number of studies, including a longitudinal cluster-randomized trial that began in 2002 (Redmond et al., 2009; Spoth and Greenberg, 2011; Spoth, Clair et al., 2007; Spoth, Redmond et al., 2007; Spoth, Trudeau et al., 2013; Spoth et al., 2015). Among the program's documented benefits are small to moderate increases in rates of family recruitment into prevention programs; improvement in child and family risk and protective factors that predict adolescent substance use; reduced rates of prescription drug and opioid misuse; and reductions in youth misconduct, such as stealing, skipping school, and carrying a weapon (Spoth et al., 2015). Implementation studies of PROSPER have documented the role of poverty, attitudes about prevention, substance use norms, and prior experience with collaboration in predicting team functioning (Feinberg et al., 2007; Greenberg et al., 2007). These studies have also shown that providing technical assistance to a community team was associated with improved team functioning (Chilenski et al., 2016). Additionally, teams that functioned well early on were better able to address challenges related to long-term sustainability (Perkins et al., 2011). Teams also demonstrated multiple pathways to financial viability, with some communities generating more in external resources and others more in in-kind contributions (Welsh et al., 2016).
Getting to Outcomes (GTO)
GTO is a toolkit developed by researchers to help communities implement and evaluate programs that target risk behaviors in young people and, like CTC and PROSPER, is designed to link research on implementation with practice in
communities.9 Rather than identifying programs that researchers recommend to communities, GTO offers support to community leaders in following 10 steps (see Box 8-6) to identify the best program for meeting the community's needs and to adapt and implement that program effectively to achieve the desired results. Steps 1 through 6 are planning activities, steps 7 and 8 detail implementation processes and evaluation, and steps 9 and 10 focus on the use of data to improve and maintain programs.
GTO is both a model and a support intervention. The 10 steps are intended to serve as a guide for communities in identifying and implementing prevention programs on their own. Unlike PROSPER and CTC, which emphasize reliance on evidence in the selection of programs, GTO is intended to strengthen agencies’ and organizations’ use of prevention programs regardless of prior evidence of effectiveness. GTO also provides support, including manuals, in-person training, and onsite technical assistance (Wandersman, Chien, and Katz, 2012; Wandersman et al., 2016), designed to build practitioners’ knowledge and strengthen their skills in such areas as goal setting, planning, and evaluation. This increased capacity, in turn, is expected to improve the fidelity of the selected prevention program and increase the likelihood that desired prevention outcomes will be achieved (Chinman et al., 2016).
9 For more information, see https://www.rand.org/health-care/projects/getting-to-outcomes.html (see Chinman et al., 2018; Chinman, Imm, and Wandersman, 2004).
GTO has been tested in a number of studies, including those with both quasi-experimental and randomized designs. In a recent cluster-randomized trial, researchers documented improvements in process outcomes and in youth attitudes related to sexual risk behaviors, such as intentions with respect to condom use (Acosta et al., 2013; Chinman et al., 2009); however, youth behaviors, such as frequency of sex and condom use, did not change (Chinman et al., 2018).
External providers of implementation support work directly within organizational and system environments to ensure the success and sustainability of program implementation and scale-up. Implementation support is sometimes provided by the technology-transfer companies established to disseminate well-established programs, although it is most commonly a function of intermediary organizations established to support the implementation or scale-up of a number of effective programs within a region or state (Franks and Bory, 2015; McWilliam et al., 2016; Mettrick et al., 2015). Such support may be paid for by program adopters, but is more commonly paid for by program funders (e.g., state and federal service administrators, private foundations) to support their investments and increase the likelihood of success.
The three examples described above illustrate how implementation support may enhance the capacity of agencies, coalitions, and communities to carry out prevention programming. Each of these blended implementation strategies includes the provision of external support in the form of training and technical assistance for facilitators and change agents. This type of support is found in most implementation frameworks; it is generally both proactive and responsive in nature and usually involves a combination of implementation science and skills training, facilitation, and supportive behavioral coaching for individuals, groups, and organizations (Meyers, Durlak, and Wandersman, 2012).
Other work has clearly indicated that external support plays a primary role in optimizing local implementation outcomes (Berta et al., 2015; Blasé, 2009; Chinman et al., 2016; Katz and Wandersman, 2016; Rushovich et al., 2015; Spoth and Greenberg, 2011; West et al., 2012). Studies of the PROSPER model, for example, showed that collaboration with providers of technical assistance (e.g., cooperation, responsiveness) was associated with the achievement of local implementation goals, such as higher participant recruitment rates and stronger functioning of community prevention teams (Chilenski et al., 2016; Spoth, Clair, et al., 2007). The provision of external support early in local implementation processes demonstrated particular benefits in studies of CTC and GTO (Chinman et al., 2016; Feinberg, Ridenour, and Greenberg, 2008). Similar work in other contexts reinforces these findings (Fagan and Mihalic, 2003; Leeman et al., 2015; Romney, Israel, and Zlatevski, 2014; West et al., 2012), although research to determine the necessary dosage of implementation support has thus far been inconclusive (Beam et al., 2012; Chinman et al., 2016; Feinberg, Ridenour, and Greenberg, 2008; Spoth, Clair, et al., 2007).
Research conducted in the past decade has shed additional light on aspects of implementation that are key foundations for successful scale-up of effective approaches.
Aarons, G.A., Fettes, D.L., Flores, L.E., Jr., and Sommerfeld, D.H. (2009). Evidence-based practice implementation and staff emotional exhaustion in children's services. Behaviour Research and Therapy, 47(11), 954–960.
Aarons, G.A., Green, A.E., Palinkas, L.A., Self-Brown, S., Whitaker, D.J., Lutzker, J.R., Silovsky, J.F., Hecht, D.B., and Chaffin, M.J. (2012). Dynamic adaptation process to implement an evidence-based child maltreatment intervention. Implementation Science, 7, 32.
Aarons, G.A., Hurlburt, M., and Horwitz, S.M. (2011). Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research, 38(1), 4–23.
Aarons, G.A., Sommerfeld, D.H., Hecht, D.B., Silovsky, J.F., and Chaffin, M.J. (2009). The impact of evidence-based practice implementation and fidelity monitoring on staff turnover: Evidence for a protective effect. Journal of Consulting and Clinical Psychology, 77(2), 270–280.
Acosta, J., Chinman, M., Ebener, P., Malone, P.S., Paddock, S., Phillips, A., Scales, P., and Slaughter, M.E. (2013). An intervention to improve program implementation: Findings from a two-year cluster randomized trial of Assets-Getting To Outcomes. Implementation Science, 8, 87.
Alegria, M., Green, J.G., McLaughlin, K., and Loder, S. (2015). Disparities in child and adolescent mental health and mental health services in the U.S. Available: https://philanthropynewyork.org/sites/default/files/resources/Disparities_in_child_and_adolescent_health.pdf.
Alegria, M., Vallas, M., and Pumariega, A.J. (2010). Racial and ethnic disparities in pediatric mental health. Child and Adolescent Psychiatric Clinics of North America, 19(4), 759–774.
Arthur, M.W., Hawkins, J.D., Brown, E.C., Briney, J.S., Oesterle, S., and Abbott, R.D. (2010). Implementation of the communities that care prevention system by coalitions in the community youth development study. Journal of Community Psychology, 38(2), 245–258.
Baker, R., Camosso-Stefinovic, J., Gillies, C., Shaw, E.J., Cheater, F., Flottorp, S., Robertson, N., Wensing, M., Fiander, M., Eccles, M.P., Godycki-Cwirko, M., van Lieshout, J., and Jager, C. (2015). Tailored interventions to address determinants of practice. Cochrane Database of Systematic Reviews, (4), CD005470.
Barlow, A., Mullany, B., Neault, N., Goklish, N., Billy, T., Hastings, R., Lorenzo, S., Kee, C., Lake, K., Redmond, C., Carter, A., and Walkup, J.T. (2015). Paraprofessional-delivered home-visiting intervention for American Indian teen mothers and children: 3-year outcomes from a randomized controlled trial. American Journal of Psychiatry, 172(2), 154–162.
Barnett, M.L., Gonzalez, A., Miranda, J., Chavira, D.A., and Lau, A.S. (2018). Mobilizing community health workers to address mental health disparities for underserved populations: A systematic review. Administration and Policy in Mental Health and Mental Health Services Research, 45(2), 195–211.
Barrera, M., Jr., Castro, F.G., and Steiker, L.K.H. (2011). A critical analysis of approaches to the development of preventive interventions for subcultural groups. American Journal of Community Psychology, 48(3–4), 439–454.
Baumann, A.A., Powell, B.J., Kohl, P.L., Tabak, R.G., Penalba, V., Proctor, E.E., Domenech-Rodriguez, M.M., and Cabassa, L.J. (2015). Cultural adaptation and implementation of evidence-based parent-training: A systematic review
and critique of guiding evidence. Children and Youth Services Review, 53, 113–120.
Bavarian, N., Lewis, K.M., Acock, A., DuBois, D.L., Yan, Z., Vuchinich, S., Silverthorn, N., Day, J., and Flay, B.R. (2016). Effects of a school-based social-emotional and character development program on health behaviors: A matched-pair, cluster-randomized controlled trial. Journal of Primary Prevention, 37(1), 87–105.
Beam, M., Ehrlich, G., Black, J.D., Block, A., and Leviton, L.C. (2012). Evaluation of the healthy schools program: Part II. The role of technical assistance. Preventing Chronic Disease, 9, E64.
Benish, S.G., Quintana, S., and Wampold, B.E. (2011). Culturally adapted psychotherapy and the legitimacy of myth: A direct-comparison meta-analysis. Journal of Counseling Psychology, 58(3), 279–289.
Bernal, G., Bonilla, J., and Bellido, C. (1995). Ecological validity and cultural sensitivity for outcome research: Issues for the cultural adaptation and development of psychosocial treatments with Hispanics. Journal of Abnormal Child Psychology, 23(1), 67–82.
Bernal, G., Jimenez-Chafey, M.I., and Domenech Rodriguez, M.M. (2009). Cultural adaptation of treatments: A resource for considering culture in evidence-based practice. Professional Psychology: Research and Practice, 40(4), 361–368.
Berta, W., Cranley, L., Dearing, J., Dogherty, E., Squires, J., and Estabrooks, C. (2015). Why (we think) facilitation works: Insights from organizational learning theory. Implementation Science, 10, 141. Available: http://www.implementationscience.com/content/10/1/141.
Betancourt, T.S., Frounfelker, R., Mishra, T., Hussein, A., and Falzarano, R. (2015). Addressing health disparities in the mental health of refugee children and adolescents through community-based participatory research: A study in 2 communities. American Journal of Public Health, 105(Suppl. 3), 475–482.
Bjørknes, R., Kjøbli, J., Manger, T., and Jakobsen, R. (2012). Parent training among ethnic minorities: Parenting practices as mediators of change in child conduct problems. Family Relations, 61(1), 101–114.
Blasé, K., and Fixsen, D. (2013). Core intervention components identifying and operationalizing what makes programs work. ASPE research brief. Available: https://aspe.hhs.gov/report/core-intervention-components-identifying-and-operationalizing-what-makes-programs-work.
Blasé, K.A. (2009). Technical assistance to promote service and system change. Roadmap to effective intervention practices #4. Tampa: University of South Florida, Technical Assistance Center on Social Emotional Intervention for Young Children.
Botvin, G.J., and Kantor, L.W. (2015). Preventing tobacco, alcohol, and drug abuse through life skills training. In Life skills training: Handbook of drug abuse prevention research, intervention strategies, and practice (pp. 177–196). Washington, DC: American Psychological Association.
Brockie, T.N., Dana-Sacco, G., Wallen, G.R., Wilcox, H.C., and Campbell, J.C. (2015). The relationship of adverse childhood experiences to PTSD, depression, poly-drug use and suicide attempt in reservation-based Native American adolescents and young adults. American Journal of Community Psychology, 55(3–4), 3–4.
Brown, C.H., Chamberlain, P., Saldana, L., Padgett, C., Wang, W., and Cruden, G. (2014). Evaluation of two implementation strategies in 51 child county public service systems in two states: Results of a cluster randomized head-to-head implementation trial. Implementation Science, 9(1).
Cambron, C., Catalano, R.F., and Hawkins, J.D. (2019). The social development model. In D.P. Farrington, L. Kazemian, and A.R. Piquero (Eds.), The Oxford handbook of developmental and life-course criminology (pp. 224–247). New York: Oxford University Press.
Carreras, G., Bosi, S., Angelini, P., and Gorini, G. (2016). Mediating factors of a school-based multi-component smoking prevention intervention: The LDP cluster randomized controlled trial. Health Education Research, 31(4), 439–449.
Castro, F.G., and Yasui, M. (2017). Advances in EBI development for diverse populations: Towards a science of intervention adaptation. Prevention Science, 18(6), 623–629.
Chambers, D.A., Glasgow, R.E., and Stange, K.C. (2013). The dynamic sustainability framework: Addressing the paradox of sustainment amid ongoing change. Implementation Science, 8, 117.
Chibanda, D., Weiss, H.A., Verhey, R., Simms, V., Munjoma, R., Rusakaniko, S., Chingono, A., Munetsi, E., Bere, T., Manda, E., Abas, M., and Araya, R. (2016). Effect of a primary care-based psychological intervention on symptoms of common mental disorders in Zimbabwe: A randomized clinical trial. Journal of the American Medical Association, 316(24), 2618–2626.
Chilenski, S.M., Frank, J., Summers, N., and Lew, D. (2019). Public health benefits 16 years after a statewide policy change: Communities that care in Pennsylvania. Prevention Science, 20(6), 947–958.
Chilenski, S.M., Perkins, D.F., Olson, J., Hoffman, L., Feinberg, M.E., Greenberg, M., Welsh, J., Crowley, D.M., and Spoth, R. (2016). The power of a collaborative relationship between technical assistance providers and community prevention teams: A correlational and longitudinal study. Evaluation and Program Planning, 54, 19–29.
Chinman, M., Acosta, J., Ebener, P., Malone, P.S., and Slaughter, M. (2015). A novel test of the GTO implementation support intervention in low resource settings: Year 1 findings and challenges. Implementation Science, 10(S1).
Chinman, M., Acosta, J., Ebener, P., Malone, P.S., and Slaughter, M.E. (2018). A cluster-randomized trial of Getting To Outcomes’ impact on sexual health outcomes in community-based settings. Prevention Science, 19(4), 437–448.
Chinman, M., Acosta, J.D., Ebener, P.A., Sigel, C., and Keith, J. (2016). Getting To Outcomes® guide for teen pregnancy prevention. Available: https://www.rand.org/content/dam/rand/pubs/tools/TL100/TL199/RAND_TL199.pdf.
Chinman, M., Imm, P., and Wandersman, A. (2004). Getting To Outcomes® 2004 promoting accountability through methods and tools for planning, implementation, and evaluation. Santa Monica, CA; Arlington, VA; Pittsburgh, PA: RAND Corporation. Available: https://www.rand.org/pubs/technical_reports/TR101.html.
Chinman, M., Tremain, B., Imm, P., and Wandersman, A. (2009). Strengthening prevention performance using technology: A formative evaluation of interactive Getting To Outcomes®. American Journal of Orthopsychiatry, 79(4), 469–481.
Chowdhary, N., Sikander, S., Atif, N., Singh, N., Ahmad, I., Fuhr, D.C., Rahman, A., and Patel, V. (2014). The content and delivery of psychological interventions for perinatal depression by non-specialist health workers in low and middle income countries: A systematic review. Best Practice & Research: Clinical Obstetrics & Gynaecology, 28(1), 113–133.
Coker, T.R., Elliott, M.N., Kataoka, S., Schwebel, D.C., Mrug, S., Grunbaum, J.A., Cuccaro, P., Peskin, M.F., and Schuster, M.A. (2009). Racial/ethnic disparities in the mental health care utilization of fifth grade children. Academic Pediatrics, 9(2), 89–96.
Collins, L.M., and Kugler, K.C. (2018). Optimization of behavioral, biobehavioral, and biomedical interventions: Advanced topics. University Park, PA: The Pennsylvania State University.
Collins, L.M., Murphy, S.A., Nair, V.N., and Strecher, V.J. (2005). A strategy for optimizing and evaluating behavioral interventions. Annals of Behavioral Medicine, 30(1), 65–73.
Collins, L.M., Murphy, S.A., and Strecher, V. (2007). The multiphase optimization strategy (MOST) and the sequential multiple assignment randomized trial (SMART): New methods for more potent ehealth interventions. American Journal of Preventive Medicine, 32(5 Suppl.), 112–118.
Collins, L.M. (2014). Optimizing family intervention programs: The multiphase optimization strategy (MOST). In Emerging methods in family research (pp. 231–244). Cham, Switzerland: Springer International Publishing.
Crosse, S., Williams, B., Hagen, C.A., Harmon, M., Ristow, L., DiGaetano, R., Broene, P., Alexander, D., Tseng, M., and Derzon, J.H. (2011). Prevalence and implementation fidelity of research-based prevention programs in public schools. Final report. Alexandria, VA: Office of Planning, Evaluation and Policy Development, U.S. Department of Education. Available: https://www2.ed.gov/rschstat/eval/other/research-based-prevention.pdf.
Danaher, B.G., and Seeley, J.R. (2009). Methodological issues in research on web-based behavioral interventions. Annals of Behavioral Medicine, 38(1), 28–39.
Dane, A.V., and Schneider, B.H. (1998). Program integrity in primary and early secondary prevention: Are implementation effects out of control? Clinical Psychology Review, 18(1), 23–45.
DeGarmo, D.S., Eddy, J.M., Reid, J.B., and Fetrow, R.A. (2009). Evaluating mediators of the impact of the Linking the Interests of Families and Teachers (LIFT) multimodal preventive intervention on substance use initiation and growth across adolescence. Prevention Science, 10(3), 208–220.
Deković, M., Asscher, J.J., Manders, W.A., Prins, P.J., and van der Laan, P. (2012). Within-intervention change: Mediators of intervention effects during multisystemic therapy. Journal of Consulting and Clinical Psychology, 80(4), 574–587.
Domenech Rodriguez, M.M., Baumann, A.A., and Schwartz, A.L. (2011). Cultural adaptation of an evidence based intervention: From theory to practice in a Latino/a community context. American Journal of Community Psychology, 47(1–2), 170–186.
Durlak, J.A., and DuPre, E.P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41(3–4), 327–350.
Fagan, A.A., and Mihalic, S. (2003). Strategies for enhancing the adoption of school-based prevention programs: Lessons learned from the blueprints for violence prevention replications of the Life Skills Training Program. Journal of Community Psychology, 31(3), 235–253.
Fagan, A.A., Arthur, M.W., Briney, J.S., and Hawkins, J.D. (2011). Effects of Communities That Care on the adoption and implementation fidelity of evidence-based prevention programs in communities: Results from a randomized controlled trial. Prevention Science, 12(3), 223–234.
Fagan, A.A., Hanson, K., Briney, J.S., and Hawkins, J.D. (2012). Sustaining the utilization and high quality implementation of tested and effective prevention programs using the Communities That Care prevention system. American Journal of Community Psychology, 49(3–4), 365–377.
Fagan, A.A., Hanson, K., Hawkins, J.D., and Arthur, M.W. (2009). Translational research in action: Implementation of the Communities That Care prevention system in 12 communities. Journal of Community Psychology, 37(7), 809–829.
Fagan, A.A., Hawkins, J.D., Catalano, R.F., and Farrington, D.P. (2019). Communities That Care: Building community engagement and capacity to prevent youth behavior problems. New York: Oxford University Press.
Fairchild, A.J., and MacKinnon, D.P. (2014). Using mediation and moderation analyses to enhance prevention research. In Defining prevention science (pp. 537–555).
Feinberg, M.E., Chilenski, S.M., Greenberg, M.T., Spoth, R.L., and Redmond, C. (2007). Community and team member factors that influence the operations phase of local prevention teams: The PROSPER project. Prevention Science, 8(3), 214–226.
Feinberg, M.E., Jones, D., Greenberg, M.T., Osgood, D.W., and Bontempo, D. (2010). Effects of the Communities That Care model in Pennsylvania on change in adolescent risk and problem behaviors. Prevention Science, 11(2), 163–171.
Feinberg, M.E., Ridenour, T.A., and Greenberg, M.T. (2008). The longitudinal effect of technical assistance dosage on the functioning of communities that care prevention boards in Pennsylvania. Journal of Primary Prevention, 29(2), 145–165.
Fixsen, D.L., Blase, K., Metz, A., and van Dyke, M. (2013). Statewide implementation of evidence-based programs. Exceptional Children, 79(2), 213–230.
Fixsen, D.L., Blase, K.A., Naoom, S.F., and Wallace, F. (2009). Core implementation components. Research on Social Work Practice, 19(5), 531–540.
Franks, R.P., and Bory, C.T. (2015). Who supports the successful implementation and sustainability of evidence-based practices? Defining and understanding the roles of intermediary and purveyor organizations. New Directions for Child and Adolescent Development, 2015(149), 41–56. doi:10.1002/cad.20112.
Gardner, F., Burton, J., and Klimes, I. (2006). Randomised controlled trial of a parenting intervention in the voluntary sector for reducing child conduct problems: Outcomes and mechanisms of change. Journal of Child Psychology and Psychiatry, 47(11), 1123–1132.
Glassman, J.R., Franks, H.M., Baumler, E.R., and Coyle, K.K. (2014). Mediation analysis of an adolescent HIV/STI/pregnancy prevention intervention. Sex Education, 14(5), 497–509.
Gloppen, K.M., Arthur, M.W., Hawkins, J.D., and Shapiro, V.B. (2012). Sustainability of the Communities That Care prevention system by coalitions participating in the community youth development study. Journal of Adolescent Health, 51(3), 259–264.
Gloppen, K.M., Brown, E.C., Wagenaar, B.H., Hawkins, J.D., Rhew, I.C., and Oesterle, S. (2016). Sustaining adoption of science-based prevention through Communities That Care. Journal of Community Psychology, 44(1), 78–89.
Gonzales, N.A. (2017). Expanding the cultural adaptation framework for population-level impact. Prevention Science, 18(6), 689–693.
Gonzales, N.A., Wong, J.J., Toomey, R.B., Millsap, R., Dumka, L.E., and Mauricio, A.M. (2014). School engagement mediates long-term prevention effects for Mexican American adolescents. Prevention Science, 15(6), 929–939.
Goodkind, J., LaNoue, M., Lee, C., Freeland, L., and Freund, R. (2012). Feasibility, acceptability, and initial findings from a community-based cultural mental health intervention for American Indian youth and their families. Journal of Community Psychology, 40(4), 381–405.
Goodkind, J.R., Lanoue, M.D., and Milford, J. (2010). Adaptation and implementation of cognitive behavioral intervention for trauma in schools with American Indian youth. Journal of Clinical Child & Adolescent Psychology, 39(6), 858–872.
Greenberg, M.T., Feinberg, M.E., Meyer-Chilenski, S., Spoth, R.L., and Redmond, C. (2007). Community and team member factors that influence the early phase functioning of community prevention teams: The PROSPER project. Journal of Primary Prevention, 28(6), 485–504.
Grimshaw, J.M., Eccles, M.P., Lavis, J.N., Hill, S.J., and Squires, J.E. (2012). Knowledge translation of research findings. Implementation Science, 7(50).
Griner, D., and Smith, T.B. (2006). Culturally adapted mental health intervention: A meta-analytic review. Psychotherapy (Chicago, Illinois), 43(4), 531–548.
Guttmannova, K., Wheeler, M.J., Hill, K.G., Evans-Campbell, T.A., Hartigan, L.A., Jones, T.M., Hawkins, J.D., and Catalano, R.F. (2017). Assessment of risk and protection in Native American youth: Steps toward conducting culturally relevant, sustainable prevention in Indian country. Journal of Community Psychology, 45(3), 346–362.
Hawkins, J.D., Brown, E.C., Oesterle, S., Arthur, M.W., Abbott, R.D., and Catalano, R.F. (2008). Early effects of Communities That Care on targeted risks and initiation of delinquent behavior and substance use. Journal of Adolescent Health, 43(1), 15–22.
Hawkins, J.D., Catalano, R.F., and Arthur, M.W. (2002). Promoting science-based prevention in communities. Addictive Behaviors, 27(6), 951–976.
Hawkins, J.D., Catalano, R.F., Arthur, M.W., Egan, E., Brown, E.C., Abbott, R.D., and Murray, D.M. (2008). Testing Communities That Care: The rationale, design and behavioral baseline equivalence of the community youth development study. Prevention Science, 9(3), 178–190.
Hawkins, J.D., Oesterle, S., Brown, E.C., Arthur, M.W., Abbott, R.D., Fagan, A.A., and Catalano, R.F. (2009). Results of a type 2 translational research trial to prevent adolescent drug use and delinquency: A test of Communities That Care. Archives of Pediatrics & Adolescent Medicine, 163(9), 789–798.
Horner, R.H., Blitz, C., and Ross, S.W. (2014). The importance of contextual fit when implementing evidence-based interventions. Washington, DC: Office of the Assistant Secretary for Planning and Evaluation, Office of Human Services Policy, U.S. Department of Health and Human Services.
Huey, S.J., Jr., and Polo, A.J. (2008). Evidence-based psychosocial treatments for ethnic minority youth. Journal of Clinical Child & Adolescent Psychology, 37(1), 262–301.
Ilott, I., Gerrish, K., Pownall, S., Eltringham, S., and Booth, A. (2013). Exploring scale-up, spread, and sustainability: An instrumental case study tracing an innovation to enhance dysphagia care. Implementation Science, 8(1), 1–7.
Isaacs, M.R., Huang, L., Hernandez, M., and Echo-Hawk, H. (2005). The road to evidence: The intersection of evidence-based practices and cultural competence in children's mental health. Available: https://pdfs.semanticscholar.org/4b1c/cbbda9c76b8006b95de5ec03ee2c5938e847.pdf?_ga=2.19669438.782938303.1564325298-559765494.1564325298.
Jensen, P.S., Weersing, R., Hoagwood, K.E., and Goldman, E. (2005). What is the evidence for evidence-based treatments? A hard look at our soft underbelly. Mental Health Services Research, 7(1), 53–74.
Jonkman, H.B., Haggerty, K.P., Steketee, M., Fagan, A., Hanson, K., and Hawkins, J.D. (2009). Communities That Care, core elements and context: Research of implementation in two countries. Social Development Issues, 30(3), 42–57.
Katz, J., and Wandersman, A. (2016). Technical assistance to enhance prevention capacity: A research synthesis of the evidence base. Prevention Science, 17(4), 417–428.
Kim, B.K.E., Gloppen, K.M., Rhew, I.C., Oesterle, S., and Hawkins, J.D. (2015). Effects of the communities that care prevention system on youth reports of protective factors. Prevention Science, 16(5), 652–662.
Komro, K.A., Livingston, M.D., Wagenaar, A.C., Kominsky, T.K., Pettigrew, D.W., and Garrett, B.A. (2017). Multilevel prevention trial of alcohol use among American Indian and white high school students in the Cherokee nation. American Journal of Public Health, 107(3), 453–459.
Komro, K.A., Wagenaar, A.C., Boyd, M., Boyd, B.J., Kominsky, T., Pettigrew, D., Tobler, A.L., Lynne-Landsman, S.D., Livingston, M.D., Livingston, B., and Molina, M.M. (2015). Prevention trial in the Cherokee nation: Design of a randomized community trial. Prevention Science, 16(2), 291–300.
Lau, A.S. (2006). Making the case for selective and directed cultural adaptations of evidence-based treatments: Examples from parent training. Clinical Psychology: Science and Practice, 13(4), 295–310.
Le, H.N., Zmuda, J., Perry, D.F., and Munoz, R.F. (2010). Transforming an evidence-based intervention to prevent perinatal depression for low-income Latina immigrants. American Journal of Orthopsychiatry, 80(1), 34–45.
Leeman, J., Calancie, L., Hartman, M.A., Escoffery, C.T., Herrmann, A.K., Tague, L.E., Moore, A.A., Wilson, K.M., Schreiner, M., and Samuel-Hodge, C. (2015). What strategies are used to build practitioners’ capacity to implement community-based interventions and are they effective?: A systematic review. Implementation Science, 10(1).
Leve, L.D., and Chamberlain, P. (2007). A randomized evaluation of multidimensional treatment foster care: Effects on school attendance and homework completion in juvenile justice girls. Research on Social Work Practice, 17(6), 657–663.
Lewis, A.J., Bailey, C., and Galbally, M. (2012). Anti-depressant use during pregnancy in Australia: Findings from the longitudinal study of Australian children. Australian and New Zealand Journal of Public Health, 36(5), 487–488.
Lindquist, R., Wyman, J.F., Talley, K.M., Findorff, M.J., and Gross, C.R. (2007). Design of control-group conditions in clinical trials of behavioral interventions. Journal of Nursing Scholarship, 39(3), 214–221.
Lyon, A.R., Pullmann, M.D., Walker, S.C., and D’Angelo, G. (2017). Community-sourced intervention programs: Review of submissions in response to a statewide call for “promising practices.” Administration and Policy in Mental Health, 44(1), 16–28.
Mance, G.A., Mendelson, T., Byrd, B., III, Jones, J., and Tandon, D. (2010). Utilizing community-based participatory research to adapt a mental health intervention for African American emerging adults. Progress in Community Health Partnerships, 4(2), 131–140.
Marsiglia, F.F., and Kulis, S.S. (2009). Diversity, oppression, and change: Culturally grounded social work. Chicago, IL: Lyceum Books.
Matthias, J.N., and John, G.C. (2010). Treatment fidelity in social work intervention research: A review of published studies. Research on Social Work Practice, 20(6), 674–681.
McWilliam, J., Brown, J., Sanders, M.R., and Jones, L. (2016). The Triple P Implementation Framework: The role of purveyors in the implementation and sustainability of evidence-based programs. Prevention Science, 17(5), 636–645. doi:10.1007/s11121-016-0661-4.
Mettrick, J., Harburger, D.S., Kanary, P.J., Lieman, R. B., and Zabel, M. (2015). Building cross-system implementation centers: A roadmap for state and local child serving agencies in developing Centers of Excellence (COE). Baltimore: The Institute for Innovation and Implementation, University of Maryland.
Metz, A., and Bartley, L. (2012). Active implementation frameworks for program success: How to use implementation science to improve outcomes for children. ZERO TO THREE, 32(4), 11–18.
Meyers, D.C., Durlak, J.A., and Wandersman, A. (2012). The quality implementation framework: A synthesis of critical steps in the implementation process. American Journal of Community Psychology, 50(3–4), 462–480.
Mihalic, S. (2004). The importance of implementation fidelity. Emotional and Behavioral Disorders in Youth, 4, 83–86.
Minkler, M., and Wallerstein, N. (2011). Community-based participatory research for health: From process to outcomes. Hoboken, NJ: Jossey-Bass.
Mohr, D.C., Schueller, S.M., Riley, W.T., Brown, C.H., Cuijpers, P., Duan, N., Kwasny, M.J., Stiles-Shields, C., and Cheung, K. (2015). Trials of intervention principles: Evaluation methods for evolving behavioral intervention technologies. Journal of Medical Internet Research, 17(7), e166.
Moncher, F.J., and Prinz, R.J. (1991). Treatment fidelity in outcome studies. Clinical Psychology Review, 11(3), 247–266.
Moore, J.E., Bumbarger, B.K., and Cooper, B.R. (2013). Examining adaptations of evidence-based programs in natural contexts. Journal of Primary Prevention, 34(3), 147–161.
Mullany, B., Barlow, A., Neault, N., Billy, T., Jones, T., Tortice, I., Lorenzo, S., Powers, J., Lake, K., Reid, R., and Walkup, J. (2012). The family spirit trial for American Indian teen mothers and their children: CBPR rationale, design, methods and baseline characteristics. Prevention Science, 13(5), 504–518.
Murray, L.K., Skavenski, S., Kane, J.C., Mayeya, J., Dorsey, S., Cohen, J.A., Michalopoulos, L.T., Imasiku, M., and Bolton, P.A. (2015). Effectiveness of trauma-focused cognitive behavioral therapy among trauma-affected children in Lusaka, Zambia: A randomized clinical trial. JAMA Pediatrics, 169(8), 761–769.
Nadkarni, A., Velleman, R., Dabholkar, H., Shinde, S., Bhat, B., McCambridge, J., Murthy, P., Wilson, T., Weobong, B., and Patel, V. (2015). The systematic development and pilot randomized evaluation of counselling for alcohol problems, a lay counselor-delivered psychological treatment for harmful drinking in primary care in India: The premium study. Alcoholism: Clinical and Experimental Research, 39(3), 522–531.
National Research Council and Institute of Medicine. (2009). Preventing mental, emotional, and behavioral disorders among young people: Progress and possibilities. Washington, DC: The National Academies Press.
O’Shea, O., McCormick, R., Bradley, J.M., and O’Neill, B. (2016). Fidelity review: A scoping review of the methods used to evaluate treatment fidelity in behavioural change interventions. Physical Therapy Reviews, 21(3–6), 207–214.
Oesterle, S., Kuklinski, M.R., Hawkins, J.D., Skinner, M.L., Guttmannova, K., and Rhew, I.C. (2018). Long-term effects of the Communities That Care trial on substance use, antisocial behavior, and violence through age 21 years. American Journal of Public Health, 108(5), 659–665.
Ohannessian, C.M., Finan, L.J., Schulz, J., and Hesselbrock, V. (2015). A long-term longitudinal examination of the effect of early onset of alcohol and drug use on later alcohol abuse. Substance Abuse, 36(4), 440–444. doi:10.1080/08897077.2014.989353.
Ortiz, C., and Del Vecchio, T. (2013). Cultural diversity: Do we need a new wake-up call for parent training? Behavior Therapy, 44(3), 443–458. doi:10.1016/j.beth.2013.03.009.
Partnerships in Prevention Science Institute. (2019). PROSPER partnerships. Available: https://prosper-ppsi.sws.iastate.edu/how-it-works/state-partnership-training?_ga=2.224854418.964852718.1553619796-683765434.1551211193.
Patel, V., Xiao, S., Chen, H., Hanna, F., Jotheeswaran, A.T., Luo, D., Parikh, R., Sharma, E., Usmani, S., Yu, Y., Druss, B.G., and Saxena, S. (2016). The magnitude of and health system responses to the mental health treatment gap in adults in India and China. The Lancet, 388(10063), 3074–3084.
Pérez-Gómez, A., Mejía-Trujillo, J., Brown, E.C., and Eisenberg, N. (2016). Adaptation and implementation of a science-based prevention system in Colombia: Challenges and achievements. Journal of Community Psychology, 44(4), 538–545.
Perkins, D.F., Feinberg, M.E., Greenberg, M.T., Johnson, L.E., Chilenski, S.M., Mincemoyer, C.C., and Spoth, R.L. (2011). Team factors that predict to sustainability indicators for community-based prevention teams. Evaluation and Program Planning, 34(3), 283–291.
Perrino, T., Pantin, H., Prado, G., Huang, S., Brincks, A., Howe, G., Beardslee, W., Sandler, I., and Brown, C.H. (2014). Preventing internalizing symptoms among Hispanic adolescents: A synthesis across Familias Unidas trials. Prevention Science, 15(6), 917–928.
Powell, B.J., Beidas, R.S., Lewis, C.C., Aarons, G.A., McMillen, J.C., Proctor, E.K., and Mandell, D.S. (2017). Methods to improve the selection and tailoring of implementation strategies. The Journal of Behavioral Health Services & Research, 44(2), 177–194.
Powell, B.J., McMillen, J.C., Proctor, E.K., Carpenter, C.R., Griffey, R.T., Bunger, A.C., Glass, J.E., and York, J.L. (2012). A compilation of strategies for implementing clinical innovations in health and mental health. Medical Care Research and Review, 69(2), 123–157.
Powell, B.J., Waltz, T.J., Chinman, M.J., Damschroder, L.J., Smith, J.L., Matthieu, M.M., Proctor, E.K., and Kirchner, J.E. (2015). A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implementation Science, 10(21). Available: http://www.implementationscience.com/content/10/1/21.
Prado, G., and Pantin, H. (2011). Reducing substance use and HIV health disparities among Hispanic youth in the USA: The Familias Unidas program of research. Psychosocial Intervention, 20(1), 63–73.
Proctor, E.K., Powell, B.J., and McMillen, J.C. (2013). Implementation strategies: Recommendations for specifying and reporting. Implementation Science, 8(139).
Prowse, P.T., and Nagel, T. (2015). A meta-evaluation: The role of treatment fidelity within psychosocial interventions during the last decade. African Journal of Psychiatry (South Africa), 18(2).
Quinby, R.K., Hanson, K., Brooke-Weiss, B., Arthur, M.W., Hawkins, J.D., and Fagan, A.A. (2008). Installing the Communities That Care prevention system: Implementation progress and fidelity in a randomized controlled trial. Journal of Community Psychology, 36(3), 313–332.
RAND Corporation. (2019). Getting To Outcomes®: Improving community-based prevention. Available: https://www.rand.org/health-care/projects/gettingto-outcomes.html.
Redmond, C., Spoth, R.L., Shin, C., Schainker, L.M., Greenberg, M.T., and Feinberg, M. (2009). Long-term protective factor outcomes of evidence-based interventions implemented by community teams through a community–university partnership. Journal of Primary Prevention, 30(5), 513–530.
Resnicow, K., Soler, R., Braithwaite, R.L., Ahluwalia, J.S., and Butler, J. (2000). Cultural sensitivity in substance use prevention. Journal of Community Psychology, 28(3), 271–290.
Rhew, I.C., Brown, E.C., Hawkins, J.D., and Briney, J.S. (2013). Sustained effects of the Communities That Care system on prevention service system transformation. American Journal of Public Health, 103(3), 529–535.
Romney, S., Israel, N., and Zlatevski, D. (2014). Exploration-stage implementation variation: Its effect on the cost-effectiveness of an evidence-based parenting program. Journal of Psychology, 222(1), 37–48.
Rushovich, B.R., Bartley, L.H., Steward, R.K., and Bright, C.L. (2015). Technical assistance: A comparison between providers and recipients. Human Service Organizations: Management, Leadership & Governance, 39(4), 362–379.
Salimi, Y., Shahandeh, K., Malekafzali, H., Loori, N., Kheiltash, A., Jamshidi, E., Frouzan, A.S., and Majdzadeh, R. (2012). Is community-based participatory research (CBPR) useful? A systematic review on papers in a decade. International Journal of Preventive Medicine, 3(6), 386–393.
Sandler, I.N., Schoenfelder, E.N., Wolchik, S.A., and MacKinnon, D.P. (2011). Long-term impact of prevention programs to promote effective parenting: Lasting effects but uncertain processes. Annual Review of Psychology, 62(1), 299–329.
Shapiro, V.B., Hawkins, J.D., and Oesterle, S. (2015). Building local infrastructure for community adoption of science-based prevention: The role of coalition functioning. Prevention Science, 16(8), 1136–1146.
Shapiro, V.B., Oesterle, S., and Hawkins, J.D. (2015). Relating coalition capacity to the adoption of science-based prevention in communities: Evidence from a randomized trial of Communities That Care. American Journal of Community Psychology, 55(1–2), 1–12.
Spoth, R., and Greenberg, M. (2011). Impact challenges in community science-with-practice: Lessons from prosper on transformative practitioner–scientist partnerships and prevention infrastructure development. American Journal of Community Psychology, 48(1–2), 106–119.
Spoth, R., Clair, S., Greenberg, M., Redmond, C., and Shin, C. (2007). Toward dissemination of evidence-based family interventions: Maintenance of community-based partnership recruitment results and associated factors. Journal of Family Psychology, 21(2), 137–146.
Spoth, R., Greenberg, M., Bierman, K., and Redmond, C. (2004). PROSPER community–university partnership model for public education systems: Capacity-building for evidence-based, competence-building prevention. Prevention Science, 5(1), 31–39.
Spoth, R., Redmond, C., Shin, C., Greenberg, M., Clair, S., and Feinberg, M. (2007). Substance-use outcomes at 18 months past baseline: The PROSPER community–university partnership trial. American Journal of Preventive Medicine, 32(5), 395–402.
Spoth, R., Redmond, C., Shin, C., Greenberg, M., Feinberg, M., and Schainker, L. (2013). PROSPER community–university partnership delivery system effects on substance misuse through 6 1/2 years past baseline from a cluster randomized controlled intervention trial. Preventive Medicine, 56(3–4), 190–196.
Spoth, R., Trudeau, L., Shin, C., Ralston, E., Redmond, C., Greenberg, M., and Feinberg, M. (2013). Longitudinal effects of universal preventive intervention on prescription drug misuse: Three randomized controlled trials with late adolescents and young adults. American Journal of Public Health, 103(4), 665–672.
Spoth, R.L., Trudeau, L.S., Redmond, C., Shin, C., Greenberg, M.T., Feinberg, M.E., and Hyun, G.-H. (2015). PROSPER partnership delivery system: Effects on adolescent conduct problem behavior outcomes through 6.5 years past baseline. Journal of Adolescence, 45, 44–55. doi:10.1016/j.adolescence.2015.08.008.
Stacciarini, J.M., Shattell, M.M., Coady, M., and Wiens, B. (2011). Review: Community-based participatory research approach to address mental health in minority populations. Community Mental Health Journal, 47(5), 489–497.
Stigler, M.H., Perry, C.L., Smolenski, D., Arora, M., and Reddy, K.S. (2011). A mediation analysis of a tobacco prevention program for adolescents in India: How did Project MYTRI work? Health Education & Behavior, 38(3), 231–240.
Tein, J.Y., Sandler, I.N., Ayers, T.S., and Wolchik, S.A. (2006). Mediation of the effects of the family bereavement program on mental health problems of bereaved children and adolescents. Prevention Science, 7(2), 179–195.
Thayer, Z., Barbosa-Leiker, C., McDonell, M., Nelson, L., Buchwald, D., and Manson, S. (2017). Early life trauma, post-traumatic stress disorder, and allostatic load in a sample of American Indian adults. American Journal of Human Biology, 29(3). doi:10.1002/ajhb.22943.
Toumbourou, J.W., Rowland, B., Williams, J., Smith, R., and Patton, G.C. (2019). Community intervention to prevent adolescent health behavior problems: Evaluation of Communities That Care in Australia. Health Psychology, 38(6), 536–544.
Van Ryzin, M.J., and Dishion, T.J. (2012). The impact of a family-centered intervention on the ecology of adolescent antisocial behavior: Modeling developmental sequelae and trajectories during adolescence. Development and Psychopathology, 24(3), 1139–1155.
Walker, S.C., Bumbarger, B.K., and Phillippi, S.W. (2015). Achieving successful evidence-based practice implementation in juvenile justice: The importance of diagnostic and evaluative capacity. Evaluation and Program Planning, 52, 189–197. doi:10.1016/j.evalprogplan.2015.05.001.
Walkup, J.T., Barlow, A., Mullany, B.C., Pan, W., Goklish, N., Hasting, R., Cowboy, B., Fields, P., Baker, E.V., Speakman, K., Ginsburg, G., and Reid, R. (2009). Randomized controlled trial of a paraprofessional-delivered in-home intervention for young reservation-based American Indian mothers. Journal of the American Academy of Child & Adolescent Psychiatry, 48(6), 591–601.
Wallerstein, N., and Duran, B. (2010). Community-based participatory research contributions to intervention research: The intersection of science and practice to improve health equity. American Journal of Public Health, 100(Suppl. 1), 40–46.
Waltz, T., Powell, B., Matthieu, M., Damschroder, L., Chinman, M., Smith, J., Proctor, E., and Kirchner, J. (2015). Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: Results from the Expert Recommendations for Implementing Change (ERIC) study. Implementation Science, 10(109). Available: http://www.implementationscience.com/content/10/1/109.
Wandersman, A., Alia, K., Cook, B.S., Hsu, L.L., and Ramaswamy, R. (2016). Evidence-based interventions are necessary but not sufficient for achieving outcomes in each setting in a complex world: Empowerment evaluation, getting to outcomes, and demonstrating accountability. American Journal of Evaluation, 37(4), 544–561.
Wandersman, A., Chien, V.H., and Katz, J. (2012). Toward an evidence-based system for innovation support for implementing innovations with quality: Tools, training, technical assistance, and quality assurance/quality improvement. American Journal of Community Psychology, 50(3–4), 445–459.
Wells, K.B., Jones, L., Chung, B., Dixon, E.L., Tang, L., Gilmore, J., Sherbourne, C., Ngo, V.K., Ong, M.K., Stockdale, S., Ramos, E., Belin, T.R., and Miranda, J. (2013). Community-partnered cluster-randomized comparative effectiveness trial of community engagement and planning or resources for services to address depression disparities. Journal of General Internal Medicine, 28(10), 1268–1278.
Welsh, J.A., Chilenski, S.M., Johnson, L., Greenberg, M.T., and Spoth, R.L. (2016). Pathways to sustainability: 8-year follow-up from the PROSPER project. Journal of Primary Prevention, 37(3), 263–286.
West, G.R., Clapp, S.P., Averill, E.M., and Cates, W., Jr. (2012). Defining and assessing evidence for the effectiveness of technical assistance in furthering global health. Global Public Health, 7(9), 915–930.
Whitesell, N.R., Beals, J., Mitchell, C.M., Manson, S.M., and Turner, R.J. (2009). Childhood exposure to adversity and risk of substance-use disorder in two American Indian populations: The meditational role of early substance-use initiation. Journal of Studies on Alcohol and Drugs, 70(6), 971–981.
Wingood, G.M., and DiClemente, R.J. (2008). The ADAPT-ITT model: A novel method of adapting evidence-based HIV interventions. Journal of Acquired Immune Deficiency Syndromes, 47(Suppl. 1), S40–S46.