Prior chapters laid out the committee’s perspective on learning, described general processes of learning as well as learning outcomes in science, and offered examples of science learning in citizen science contexts. This chapter will approach the question of how design can amplify opportunities for learning in citizen science. There are a few ideas that frame the committee’s investigation into design, learning, and citizen science. First, design and design for learning are fields with evolving bodies of knowledge and practice that can be applied to citizen science. While there are very few explicit studies of the design process in citizen science, there is a wealth of scholarship more generally about design for learning that can be reasonably extrapolated to citizen science learning. Second, design for learning as a field has advanced, and researchers and practitioners now know more about how to enable learning for more learners than they did even 10 years ago. Third, as with many fields, design for learning has grown because of and in response to what researchers and practitioners have learned about the benefits of broad participation. Designing to engage the skills and contributions of diverse learners, especially learners whose insights may have been previously neglected or even rejected, maximizes learning for all learners.
Design for learning is also a practical set of evidence-based strategies for applying ideas and theories in connection with the environment, participants, and context, and for dealing with constraints while maximizing opportunities. Designing for learning is the application of learning theory to citizen science contexts, in both formal and informal settings, and for a variety of participants. In this way, this chapter can offer guidance to people who
seek to maximize learning from citizen science, and especially to those who are designing the project. As we will see, however, an evolved design strategy that has proven very effective for maximizing learning opportunities intentionally blurs the boundary between project designer and project participant.
Design for learning is something that can happen at any point in a project’s lifetime. For example, many citizen science projects are adapted to promote learning, and even citizen science projects designed for learning can be redesigned for new contexts. In fact, as we will also see later in the chapter, best practices in design suggest a process of iteration and constant refinement—design as an ongoing process.
Design for learning as a process implies intentionality. It is worth noting that design decisions are made whether or not they are consciously attended to: regardless of whether project designers expressly intend to pursue a specific learning outcome, their decisions about how to implement a project have implications for what participants will experience. Some citizen science projects are created with learning in mind, whereas for others learning is not an express goal. While learning can happen either way, it is more likely when projects are designed (either created or adapted) with learning in mind.
Taking a design lens to citizen science means seeing program engagement as an opportunity to learn. Understanding how learning opportunities emerge in citizen science efforts requires making explicit the critical aspects of program design that can engage cognitive, affective, and social outcomes. In this chapter, the committee aims to suggest how learning opportunities can be realized in citizen science efforts, especially those in which creating opportunities to learn was not the primary design intent. Doing so requires focusing on the inherent variability of these efforts. From the previous chapters, we can understand that:
- There is a wide range of organizers and of potential participants.
- The needs of both the organizers and the intended participants will differ.
- The opportunity to make meaningful contributions to science and communities (i.e., the authenticity of citizen science) creates unique learning opportunities.
- Planning for meeting the needs of the intended participants is vital.
- Attending to learning advances broader citizen science project goals.
- Designing for diversity maximizes learning opportunities for all participants. Conversely, not designing a program for diverse audiences means that the program is designed for the default audience, which usually matches the demographics of the designer or dominant group.
In our review of the design and planning literature (see Appendix A), we identified nine considerations for designing to achieve learning objectives. Among the citizen science projects we surveyed, those that attended to these considerations were able to capitalize on the learning opportunities associated with citizen science and maximize learning outcomes for all participants. Further, based on theories of learning and state-of-the-art design theory, we expect that projects that attend to these considerations will maximize learning for all participants. To be clear, the committee is not making new recommendations here; instead, we are summarizing guidance found in the research literature on design for learning and mapping that guidance onto citizen science with illustrative examples. The recommendations themselves are general, but when they are implemented in the context of citizen science, we believe they have a novel utility. The committee did not find evidence for design strategies that are unique to citizen science, though we would encourage that line of research. Instead, we found evidence for how well-known, time-tested, and fairly ubiquitous design recommendations can be applied effectively to citizen science. The guiding considerations are:
- Know the Audience
- Adopt an Asset-Based Perspective
- Intentionally Design for Diversity
- Engage Stakeholders in Design
- Capitalize on Unique Learning Opportunities Associated with Citizen Science
- Support Multiple Kinds of Participant Engagement
- Encourage Social Interaction
- Build Learning Supports into the Project
- Evaluate and Refine
For any project being designed for learning, the design literature makes clear that it is essential to have some frame of understanding of the intended audience. To communicate and educate effectively, an individual, such as a project manager or the scientist designing the citizen science project, begins with an implicit or explicit definition of the audience they are designing for (Slater, 1996). In designing for learning, defining the audience as explicitly and accurately as possible is an important step.
The challenge and the goal of knowing the audience is to understand and grow a pool of participants/volunteers that is not homogeneous. It is important to design messages and programming based on the underlying needs and preferences that drive individuals to engage in the natural exchange of benefits between the program planner/educator/program lead and the desired audience (Garver, Divine, and Spralls, 2009). Program planning decisions need to come from a clear understanding of the audience's wants and needs (Grier and Bryant, 2005). One perspective on this interchange comes from exchange theory, which suggests that people have resources, such as time, that they exchange for perceived program benefits, such as learning from the tasks and protocols in a citizen science project (Lefebvre and Flora, 1988).
Segmentation studies can provide insight into those who volunteer in citizen science projects. To retain volunteers, it is useful to first understand why they volunteer and then why they continue to volunteer or not (Asah, Lenentine, and Blahna, 2014). As described in Chapter 5, several citizen science projects attempt to capitalize on motivation and interest. For example, volunteers in an urban landscape restoration project listed various sources of motivation, including helping the environment, getting outdoors, emotional well-being, community, socializing, meaningful action, values, learning, career, health, personal growth, protection, and use of the landscape (Asah, Lenentine, and Blahna, 2014).
The DEVISE scale on motivation to participate in citizen science measures motivations ranging from interest and enjoyment to factors such as worry or guilt (Phillips et al., in press). Motivations can be examined at levels ranging from a simple, surface-level "why" question to a complex psychological framework. For example, in a community health context, motivations included a need to be useful/productive, a need for affiliation, a need to help others, a need for status, a need to make oneself more marketable, and a strong personal concern for a cause (Garver, Divine, and Spralls, 2009). By understanding why people participate, designers can find overlap between participants' goals for participation and projects' scientific goals.
When considering the audience, it is important to approach it with an asset-based frame of reference. In previous chapters, we discuss how people learn more when learning is connected to their previous experiences and draws on all their cultural and intellectual capacity—and that is simply harder to do if project designers think only in terms of deficits or gaps. Thinking in terms of what individuals need to know but do not, and designing around community disadvantages, invites deficit framing. It can be tempting to think in terms of naïve conceptions that distort new information and experiences, but sociocultural learning theory suggests that
simply dismissing the conceptions that learners bring as wrong or naïve can hinder learning.
Instead, design practice and learning theory suggest welcoming the views and conceptions that participants bring into the project, offering participants the chance to connect their experience in the project to existing knowledge and experience, and supporting engagement. People come with prior knowledge and experience, and any experience becomes more powerful when participants can see how the present learning builds on what they already know (Knowles, 1970, 1984). It is particularly important to make sure these connections are available and respected for participants whose knowledge systems or traditions have been treated dismissively in the past—for example, Indigenous knowledge systems.
A good way to ensure that projects offer the opportunity to connect with participants' existing knowledge is for project designers to invite people who hold that knowledge into the design of the project. If a project is already designed, it can be helpful for designers to invite participants into the design of learning experiences that support their goals for the project. For projects that do not have face-to-face engagement, written questions can allow individuals to reflect on prior experience and knowledge and share those reflections with others. Creating space—either in person or online—for sharing culturally, communally, or individually held perspectives helps people integrate prior knowledge and perceptions with project learning. Learning progressions (see Chapter 4 of this report), which are empirically based maps of how people can develop understanding over time, can be useful frameworks for anticipating and building on the conceptual models people bring as they enter projects.
As an example, many participants in the Community Collaborative Rain, Hail, and Snow Network (CoCoRaHS) join the project because of their strong interest in weather (CoCoRaHS, 2018). When asked what sources they used for learning during participation, more than 60 percent relied on pre-existing knowledge (Phillips, 2017). Support for sharing pre-existing knowledge with other participants can lead to role expansion or mentorship opportunities for individuals, which also can deepen learning.
Research suggests that in the absence of explicit consideration of diversity, design will default to meeting the needs and expectations of members of dominant or majority groups (e.g., Henderson, 1996). This can be true even if the design team includes members of nondominant groups. Even ideas such as universal design that are meant to promote inclusivity (discussed later in this section) can be applied in ways that reinforce narrow aspects of identity and perpetuate marginalization (Edyburn, 2010). This is visible in consumer product design, urban planning and architecture, and, most importantly for this report, education (Valencia, 2012).
Designing for diversity, across all learning environments, means avoiding deficit framings, which are especially likely to be applied to members of historically underserved and underrepresented communities. In simple terms, designers should avoid ascribing negative characteristics to any identity. Proactively, designers should design with participants so that projects can engage the multiple, sometimes intersecting, identities that participants hold and that change over time, and should invite input across those identities (Settles, 2004). Designers should consider issues of power, and should recognize and design to minimize differences in power (Elmesky and Tobin, 2005). Designing for diversity means designing projects that allow and value contributions and connections from multiple experiences, bodies of knowledge, and even epistemological frameworks (Bang and Medin, 2010). It means avoiding assumptions about what participants will or will not be capable of (Burgstahler, 2009).
There is a small body of research exploring how to design for diversity in citizen science, which mostly borrows strategies that have proven effective in other education settings. This literature points to focusing projects on key concerns or priorities of underserved and underrepresented communities, using participatory program design, applying principles of Community-Based Participatory Research or community science, grounding projects in place, designing for accessibility across multiple languages, considering structural barriers to participation (e.g., transportation, language), and linking projects to culturally relevant reference points (Pandya, 2012).
As the limited research on broadening participation in citizen science suggests, there are effective and proven practices for promoting diversity and equity in science education and career development that can be adapted for citizen science. For example, research on interventions to support students of color in science has shown that careful mentoring (Haring, 1999; Pfund et al., 2016), strong supportive social networks (Stolle-McAllister, 2011), visible affirmation of the importance of diversity from institutional leaders (Best and Thomas, 2004; Tsui, 2007), and positive experiences of research (Russell, Hancock, and McCollough, 2007) all support continued participation of learners from underrepresented groups. Experience with research is an inherent part of citizen science, and project leads can intentionally foster a positive experience. The other practices can also easily be incorporated: project leads can talk about the contribution diverse perspectives make toward project outcomes, social networks can be nurtured, and mentoring can be built into participant roles.
A useful frame for designers is the notion of universal design for learning (Rose, 2000), which holds that disability is not a quality of an individual but the result of curricula too inflexible to provide pathways for all learners. The same idea can be applied to citizen science: designers can develop multiple pathways for engagement and, especially, design around factors that could be barriers to engagement. For instance, if project materials are available only in English, that will be a barrier for people who do not speak English, but one that translation can readily overcome. GLOBE and eBird have developed supports in several languages (eBird, 2018; GLOBE, 2018). As another example, projects that assume access to natural settings might be inaccessible for people who do not have ready access to parks, but partnering with an urban garden or botanical center can remove that barrier, as through the partnership between Project Budburst and the Chicago Botanic Garden (Meymaris et al., 2008; Project Budburst, 2018).
Project design often makes assumptions about the people who will participate, and our committee recommends that designers interrogate those assumptions and, especially, question the extent to which those assumptions are informed by systemic and structural inequities or personal biases. As we stated in earlier sections of this report, all participants require support and scaffolding to participate in projects, and all designers make choices about what scaffolds and supports to provide. These choices are necessarily informed by the context in which they are made, but when designers are explicit about why, how, and for whom they are designing, they are better poised to address the needs of all learners.
All of these considerations point to the fundamental question of who is designing for whom, and point toward the next recommendation as a way to uncover the design assumptions and decisions—sometimes unconscious—that limit access. The next design consideration looks at how the concerns listed above can often be addressed by including a diverse cross-section of stakeholders in the co-design process and ensuring that they have voice and agency in design decisions.
Design thinking has evolved toward an emphasis on design that is more user centered (Norman, 2013) and learner centered (Soloway, Guzdial, and Hay, 1994). Current design thinking emphasizes making the needs and aspirations of users paramount, and suggests a process of rapid prototyping to arrive at useful and usable services. An audience is easier to understand from within, which means the idea of knowing your audience is a strong additional argument for engaging stakeholders in the design process. When done well, design with stakeholders should foreground an asset-based
model and attend to diversity, as co-design draws on diverse expertise and perspectives to create a better product for all.
The Thriving Earth Exchange (TEX) worked with local community organizations to co-design a project to collect indoor air quality data in Colorado (Thriving Earth Exchange, 2018). The partnership involved the community during every phase of the project, starting with understanding what the local priorities were and providing workshops to determine how to address air, soil, and water pollution. Additionally, these conversations introduced scientists to community-based participatory research methods.
In the following sections, we consider how the specific learning opportunities associated with citizen science—mentioned in Chapter 3 of this report—can provide specific, desirable leverage points for project designers looking to support science learning. In each, we discuss how this unique feature of citizen science, when harnessed appropriately, can make citizen science a particularly useful learning context.
As discussed in Chapters 3 and 4, the centrality of data in citizen science, both collecting data and working with them, creates a unique opportunity to introduce participants to developing data knowledge. Again, this outcome does not happen without intention. Like other scientific reasoning, learning about data requires a combination of content background or disciplinary grounding and facilitation in developing, testing, and refining data-based concepts. This facilitation can take the form of software tools, prompts, curriculum, explicit instruction, and so on. For instance, one form of facilitation involves building, or accelerating, data visualization literacy. Shared data visualizations can be constructed from simple x/y plots by overlaying additional information while holding constant a data point provided by participants, which adds to the relevance of the exercise.
eBird documents the presence, absence, and abundance of bird species using checklist data submitted by volunteers from around the world (eBird, 2018). eBird provides a simple Web-based interface to view project results via interactive queries of the database. The interface enables visualization of data in the form of maps, graphs, and bar charts. Moreover, the data are provided in raw form so they can be downloaded, analyzed, and used by anyone for numerous purposes including baseline monitoring, natural resource management, education and outreach, and policy formulation (Sullivan et al., 2014).
One key feature common across many citizen science projects is the authenticity of citizen science: the fact that individual participation contributes to something bigger, such as research or conservation. As discussed in Chapter 4, this is a learning opportunity that is intimately tied to a learning outcome. It can motivate other strands of learning, and it can be leveraged to amplify participants' identities as contributors to science or to support the application of science to a community issue. Knowing their participation is leading to important, usable data reinforces the value of participation for citizen scientists.
Highlighting the authenticity of participation can be as simple as facilitating frequent feedback from scientists who use the collected data, or as complex as using data to advocate for new policies. Feedback can take many forms including written documentation shared with participants about how scientists have used the data in the past; regular updates about potential future uses of the data; lists of publications; online databases for broader use; education about using results in ways that support civic decision making; and discussions about how project results can be used to inform policy.
Nearly all citizen science projects that gather and use data can leverage the authenticity of participants' activity. The Galaxy Zoo project provides regular and timely feedback regarding data contribution milestones and an extensive, up-to-date list of publications that use the data (Galaxy Zoo, 2018). In a further example of learning and role expansion in the science process, certain "super users" who were integral to particular discoveries or manuscript development have been included as authors on peer-reviewed Galaxy Zoo publications.
Community science literacy (as discussed earlier) is science knowledge distributed across a community, together with the ability to use that knowledge, in connection with a broad suite of community knowledge and capabilities, in pursuit of community goals. Citizen science projects that explicitly offer different roles and make clear how those roles contribute to the common goal can encourage community science literacy. Some projects, such as community-based participatory research projects, obviously benefit from thinking about community science literacy, but other types of projects can also consider ways in which the whole is greater than the sum of its parts. For example, community-based participatory citizen science projects bring together people with a broadly distributed knowledge of the community,
the issue being addressed, and the systems in which the project will unfold. This collective sharing allows all participants to be both learners and educators, giving every voice an opportunity to bring its knowledge to the discussion in search of the best solution for the community.
Silver City Watershed Keepers of New Mexico emphasize the role that citizen scientists can play in monitoring watersheds and collecting water quality data, taking personal responsibility to be an informed citizen, and sharing information to build community knowledge (Silver City Watershed Keepers, 2018). Based on the knowledge they gather, they also provide “rural mining communities with the skills and capacities they need to make their neighborhoods/watersheds better places to live and work.”
In citizen science projects, there are dabblers and divers, both within and across projects (Eveleigh et al., 2014). It is well established that repeated engagement enhances learning of facts and concepts. For many citizen science projects, a basic consideration for participant engagement over time is that regularity matters: more frequent, regular participation in short activities (e.g., data collection and reporting) has better potential for enhancing learning than less regular participation. For instance, projects that require daily observations, even if the observations are relatively easy and quick, are good for learning the process, knowledge, and concepts associated with those observations. Projects can encourage this mode of learning by providing pathways, in the form of levels of engagement that deepen over time.
Project FeederWatch has been operating for more than 30 years using the same protocol where participants watch their bird feeders every 2 weeks for 2 consecutive days throughout the winter months (Project FeederWatch, 2018). The structured engagement allows for repeated practice to hone skills of identification and improve confidence. In turn, this results in increased participant retention and higher quality, long-term data.
Repeated engagement can also take place across projects, and stronger science and conservation outcomes occur when volunteers participate in multiple, varied projects, as shown in an analysis of participants in Audubon's 116th Christmas Bird Count (Cooper et al., 2016). All of the 3,000 people surveyed reported participation in at least one other birding citizen science project, and over one-third also participated in nonbirding projects. Approximately 15–20 percent of respondents said this participation influenced their donation of conservation funds, voting for habitat conservation, and creation of wildlife habitat at home. Those who did multiple types of projects were more likely to contribute to
science outcomes and grassroots conservation actions than those who only did bird projects.
Collectively encouraging participants to do multiple projects may also make recruiting, supporting, and retaining participants, as well as undertaking scholarly studies of participation, more effective and efficient. To foster participation in multiple projects across topics and activities, and to surface and support synergies that might occur as individuals engage with multiple projects, SciStarter 2.0 offers citizen scientists tools to find and join multiple projects, manage and display their citizen science activities, record their progress, network, and consent to have their online participation studied across citizen science projects. It also offers project organizers tools to connect with people who are interested in or experienced with similar projects, as well as analytics to understand the movement and interests of their own participants (SciStarter, 2018).
Learning is a sociocultural activity, so anything that encourages interaction provides the opportunity for learning. There are many different approaches to what is seen as “social.” Activities as diverse as online fora for participants, data collection in teams, in-person meetings, and having people verify others’ classifications all can be designed to provide opportunities for interaction that can enhance learning.
We know some participants in citizen science projects desire and/or benefit from engaging in science as a social activity. Even individual data collection projects can be structured to communicate to individual citizen scientists that they are part of a larger endeavor that has social implications.
In designing for social interactions that promote learning, it is important to consider the comfort of all participants. Projects that support learning offer participants a place of "comfort," trust in the environment, and social engagement around learning (Kop, Fournier, and Sui Fai Mak, 2011). Kop and colleagues suggest that under these conditions, it is possible to create a pedagogy that supports people in their learning through the active creation of resources and learning places by participants and educators/facilitators. Such a pedagogy would be based on building connections, collaborations, and the exchange of resources between people; the building of a community of learners; and the harnessing of information flows on networks.
The Hudson River American Eel Research Project mobilizes community members of all ages to count, weigh, and release juvenile American eels along the tributaries of the Hudson River (Hudson River American Eel Research Project, 2018). In addition to learning through the data collection experience, and through training and project materials, participants report
that in-person interactions are a major source of learning (Phillips, 2017). Learning is likely heightened by the diverse and intergenerational nature of these social interactions, often involving inner-city high school students and local retirees. The project also boasts an end-of-year “Eelebration” where data from each of the sites are presented to all the volunteers in a jovial and fun setting.
Knowing what both designers and participants want to learn and then creating supports for that learning is important. People will learn without supports, but they will learn faster with supports (Rogoff, 1995). Once you have decided on learning goals, build supports that help people achieve these learning goals. Think of those supports in terms of tools people can use, interactions they can have with each other, and guidance they can get from project leaders.
Some supports the committee has seen work effectively include tutorials, mentoring of new participants by more advanced participants, curricula, newsletters, personalized communications, in-person and online training, and interactive multimedia tools such as quizzes and peer-to-peer communication forums. This consideration also relates to making the purpose of the learning visible. For example, curriculum-based projects such as BirdSleuth (2018) and GLOBE (2018) provide robust classroom activities and lesson plans aligned with the Next Generation Science Standards. Other projects such as BeeSpotter (2018) and Nature's Notebook (2018) provide comprehensive training materials, such as identification keys, presentations, lecture notes, and quizzes.
Because the field is young, there is relatively little research on the kinds of supports that are inherent in or well-matched to citizen science. However, as summarized in the previous three chapters, there is a large and robust body of education research on learning outcomes and how to work with stakeholders to achieve those outcomes. Some high-level strategies that can help guide the design of learning supports are described below.
Communication often prompts reflection about a participant’s learning, which can reveal gaps, guide future learning, and aid in long-term retention. Application allows participants to extend their knowledge to new domains, which is, itself, a kind of learning, and helps with retention. Some ways to do this could include project-related discussions, an asynchronous online
discussion, or a reflective prompt for the individual to consider. Other approaches include using results in civic processes or inviting participants to help describe project findings and results in both scientific and nonscientific fora. These opportunities to communicate are closely related to the notion of authenticity in citizen science activity discussed above.
The Alliance for Aquatic Resource Monitoring (ALLARM), originally developed to help communities deal with acid rain deposition, provides training and technical support for mitigating point source pollution in local watersheds (ALLARM, 2018). ALLARM was designed to engage individuals in all aspects of project design and to specifically empower them to participate in community efforts that led to solutions. Currently, ALLARM provides a decision tree for guiding action. Through letters to the editor, discussions with government representatives, and presentations at community events, participants used their newly acquired knowledge to describe the acid deposition problem.
Where enhanced perceptual learning is a goal of participation, project designers can support this objective by offering participants a chance to interact with many examples of the kind of data that participants will be collecting, classifying, or analyzing, coupled with feedback about that interaction. Galaxy Zoo is a good example of a project that provides many examples for participants alongside frequent feedback: participants learn to classify by making classifications and comparing them against those of others. Thinking of creative ways to provide feedback directly or indirectly might lead some citizen science projects to richer designs and more committed participants. For projects that tend to attract one-time engagement, building feedback into the experience may increase individuals’ satisfaction, which can help lead to interest in additional engagement in citizen science.
Projects will have an easier time reaching their nonlearning-related goals if the participants are motivated, have agency, identify with the project, and have relevant domain knowledge and perceptual knowledge. For many participants, learning is a benefit of participation; for many project owners, participant learning may be critical for their science. Either way, for the learning to have meaning to the project, the goals for learning must connect to the desired outcomes for the project (Bennett, 1978). The mission of the Monarch Larva Monitoring Project (MLMP), for example, is to “better understand the distribution and abundance of breeding monarchs and to
use that knowledge to inform and inspire monarch conservation” (Monarch Larva Monitoring Project, 2018). By linking science and conservation goals directly to the design of the project, MLMP participants not only learn a great deal about monarchs, they are also motivated to actively enhance habitats for monarchs, which in turn supports the overall project goals.
The West Oakland Environmental Indicators Project is a resident-led, community-based environmental justice organization that operates the Community Leadership Academy (West Oakland Environmental Indicators Project, 2018). The Academy is a formal training program offering 12 hours of leadership training on topics specific to the development of the Port of Oakland, including the impact of the freight transportation industry on local development, the health impacts of diesel exhaust and air pollution, technological solutions to air pollution, how air quality regulation works, and how to advocate successfully for social justice and community health. This is a clear example of relating learning to overall project goals.
Research makes it clear that learning to engage in scientific practices—such as constructing and testing scientific arguments and evaluating scientific evidence—is facilitated by simultaneously learning disciplinary concepts and facts, and vice versa. In fact, it is often easier for people to learn processes of science by applying the process directly to a specific problem, rather than as an abstract exercise. However, many projects wrongly assume that an emphasis on the content will result in learning about the process. For this reason, projects that explicitly design for both content and process are more likely to advance learning. Projects can do this by being explicit about the nature of science (such as the aims of and claims made with data), providing models of reasoning from the participants’ work, and encouraging participants to propose and discuss their own claims and perhaps compare their claims against those of the scientists. Projects that are explicit in this way can also contribute to scientists’ learning, as they provide other perspectives. The protocol for COASST was developed specifically to train participants to evaluate scientific evidence (in this case, seabird carcasses) in order to accurately identify the species (COASST, 2018). COASST also provides feedback to participants on the accuracy of each data point, thereby enhancing both content knowledge and the development of science process skills. In a pre-post evaluation of COASST, Haywood (2014) showed an increase in volunteers’ ability to correctly weigh evidence to determine whether sufficient information existed for accurate species identification.
Letting people interact with open-ended problems is a good way to accelerate this learning progression. Citizen science projects are sometimes emergent, exploratory, or descriptive. When appropriate, it can be valuable to design the program so that participants can engage in data analysis, have access to the tools necessary to discuss the evidence, and be part of, or at least follow along with, the evolving scientific understanding that emerges from project data. Projects that provide the most opportunities to engage participants in co-constructing knowledge typically involve participants in using and sharing project results to effect change, helping to determine the intended uses of data, and collaborating on the design of the project. Projects emphasizing a collective effort to develop understanding, with opportunities for participants to propose, critique, and refine ideas with their peers, are best at this strand of learning.
Good design for learning is an iterative process, and it is necessary to design evaluation, reflection, and revision into the design process. Again, there are relatively few tools for evaluation and iteration that are specific to citizen science or that work across all citizen science projects, so the committee urges the citizen science community to borrow, adapt, refine, and share. Remember that good evaluation is always answering the question: How do I improve this effort? Involving participants in an evaluation program fosters evaluative thinking, which often strengthens the evaluation far more than the evaluation activity itself (Patton, 1997). The values that are part of the culture of evaluation are demonstrated through evaluative thinking and include clarity; specificity and focusing; being systematic and making assumptions explicit; operationalizing program concepts, ideas, and goals; distinguishing inputs and processes from outcomes; valuing empirical evidence; and separating statements of fact from interpretation and judgments (Patton, 2002).
The fundamental message of this chapter is that citizen science projects can be designed in ways that enhance learning for all participants. The evolution of design shows that more involvement of diverse stakeholders, especially when they are welcomed for the contributions they can bring to the project, improves project outcomes. This is true for all outcomes, including the learning outcomes explored in previous chapters. Further, there are actionable strategies that can be used to promote learning in the
context of citizen science and take advantage of the unique learning opportunities associated with citizen science. If the committee had to sum it up in a sentence, albeit a long one, we would suggest that iterative, cooperative engagement in design and implementation, with a diversity of stakeholders who are respected for the knowledge they bring to the design process, results in more learning for all participants, and that this learning can support other project goals.
ALLARM (Alliance for Aquatic Resource Monitoring). (2018). Available: https://www.dickinson.edu/allarm [May 2018].
Asah, S.T., Lenentine, M.M., and Blahna, D.J. (2014). Benefits of urban landscape eco-volunteerism: Mixed methods segmentation analysis and implications for volunteer retention. Landscape and Urban Planning, 123, 108-113.
Bang, M., and Medin, D. (2010). Cultural processes in science education: Supporting the navigation of multiple epistemologies. Science Education, 94(6), 1008-1026.
BeeSpotter. (2018). Available: https://beespotter.org [September 2018].
Bennett, S.N. (1978). Recent research on teaching: A dream, a belief, and a model. British Journal of Educational Psychology, 48(2), 127-147.
Best, D.L., and Thomas, J.J. (2004). Cultural diversity and cross-cultural perspectives. In A.H. Eagly, A.E. Beall, and R.J. Sternberg (Eds.), The Psychology of Gender (pp. 296-327). New York: Guilford Press.
BirdSleuth. (2018). Cornell Lab of Ornithology. Available: http://www.birdsleuth.org [September 2018].
Burgstahler, S. (2009). Universal Design of Instruction (UDI): Definition, Principles, Guidelines, and Examples. Seattle, WA: DO-IT. Available: https://www.washington.edu/doit/sites/default/files/atoms/files/UD_Instruction_05_26_15.pdf [October 2018].
COASST (Coastal Observation and Seabird Survey Team). (2018). Available: https://depts.washington.edu/coasst [September 2018].
CoCoRaHS (Community Collaborative Rain, Hail, and Snow Network). (2018). Available: https://www.cocorahs.org [September 2018].
Cooper, C., Larson, L., Shipley, N., Dayer, A., Dale, K., LeBaron, G., and Takekawa, J. (2016). Divers and Dabblers: Which Types of Birdwatchers Are Most Valuable to Citizen Science Research and Grassroots Conservation? Paper presented at the 2016 American Ornithology Society Conference.
eBird. (2018). The Cornell Lab of Ornithology. Available: https://ebird.org/home [September 2018].
Edyburn, D.L. (2010). Would you recognize universal design for learning if you saw it? Ten propositions for new directions for the second decade of UDL. Learning Disability Quarterly, 33(1), 33-41.
Elmesky, R., and Tobin, K. (2005). Expanding our understandings of urban science education by expanding the roles of students as researchers. Journal of Research in Science Teaching, 42(7), 807-828.
Eveleigh, A., Jennett, C., Blandford, A., Brohan, P., and Cox, A.L. (2014, April). Designing for dabblers and deterring drop-outs in citizen science. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 2985-2994). New York: Association for Computing Machinery.
Galaxy Zoo. (2018). Available: https://www.zooniverse.org/publications [May 2018].
Garver, M.S., Divine, R.L., and Spralls, S.A. (2009). Segmentation analysis of the volunteering preferences of university students. Journal of Nonprofit and Public Sector Marketing, 21(1), 1-2e.
GLOBE (Global Learning and Observations to Benefit the Environment Program). (2018). Available: https://observer.globe.gov [September 2018].
Grier, S., and Bryant, C.A. (2005). Social marketing in public health. Annual Review of Public Health, 26, 319-339.
Haring, M.J. (1999). The case for a conceptual base for minority mentoring programs. Peabody Journal of Education, 74(2), 5-14.
Haywood, B.K. (2014). Birds and Beaches: The Affective Geographies and Sense of Place of Participants in the COASST Citizen Science Program. (Doctoral dissertation). University of South Carolina, Columbia. Available: http://scholarcommons.sc.edu/etd/2748 [September 2018].
Henderson, L. (1996). Instructional design of interactive multimedia: A cultural critique. Educational Technology Research and Development, 44(4), 85-104.
Hudson River American Eel Research Project. (2018). New York Department of Environmental Conservation. Available: https://www.dec.ny.gov/lands/49580.html [September 2018].
Knowles, M.S. (1970). The Modern Practice of Adult Education: Andragogy versus Pedagogy. Oxford, UK: Association Press.
Knowles, M.S. (1984). Andragogy in Action: Applying Modern Principles of Adult Education. San Francisco, CA: Jossey Bass.
Kop, R., Fournier, H., and Mak, J.S.F. (2011). A pedagogy of abundance or a pedagogy to support human beings? Participant support on massive open online courses. The International Review of Research in Open and Distributed Learning, 12(7), 74-93.
Lefebvre, R.C., and Flora, J.A. (1988). Social marketing and public health intervention. Health Education Quarterly, 15(3), 299-315.
Meymaris, K., Henderson, S., Alaback, P., and Havens, K. (2008, December). Project Budburst: Citizen science for all seasons. In American Geophysical Union Fall Meeting Abstracts.
Monarch Larva Monitoring Project. (2018). Available: https://monarchlab.org/mlmp [September 2018].
Nature’s Notebook. (2018). Available: https://www.usanpn.org/natures_notebook [September 2018].
Norman, D. (2013). The Design of Everyday Things: Revised and Expanded Edition. New York: Basic Books.
Pandya, R.E. (2012). A framework for engaging diverse communities in citizen science in the US. Frontiers in Ecology and the Environment, 10(6), 314-317.
Patton, M.Q. (1997). Toward distinguishing empowerment evaluation and placing it in a larger context. Evaluation Practice, 18(1), 147-163.
Patton, M.Q. (2002). A vision of evaluation that strengthens democracy. Evaluation, 8(1), 125-139.
Pfund, C., Byars-Winston, A., Branchaw, J., Hurtado, S., and Eagan, K. (2016). Defining attributes and metrics of effective research mentoring relationships. AIDS and Behavior, 20(2), 238-248.
Phillips, C.B. (2017). Engagement and Learning in Environmentally Based Citizen Science: A Mixed Methods Comparative Case Study. (Doctoral dissertation). Cornell University, Ithaca, NY.
Phillips, T.B., Porticella, N., Constas, M., and Bonney, R.E. (In press). A framework for articulating and measuring individual learning outcomes from citizen science. Citizen Science: Theory and Practice, 3(2), 3. doi: http://doi.org/10.5334/cstp.126.
Project Budburst. (2018). Available: http://www.budburst.org/ [September 2018].
Project Feederwatch. (2018). Available: https://feederwatch.org/ [September 2018].
Rogoff, B. (1995). Observing sociocultural activity on three planes: Participatory appropriation, guided participation, and apprenticeship. In J.V. Wertsch, P. del Rio, and A. Alvarez (Eds.), Sociocultural Studies of Mind (pp. 139-164). Cambridge, UK: Cambridge University Press. Reprinted (2008) in K. Hall and P. Murphy (Eds.), Pedagogy and Practice: Culture and Identities. London: Sage.
Rose, D. (2000). Universal design for learning. Journal of Special Education Technology, 15(3), 45-49.
Russell, S.H., Hancock, M.P., and McCullough, J. (2007). Benefits of undergraduate research experiences. Science, 316(5824), 548-549.
SciStarter. (2018). Available: https://scistarter.com [September 2018].
Settles, I.H. (2004). When multiple identities interfere: The role of identity centrality. Personality and Social Psychology Bulletin, 30(4), 487-500.
Silver City Watershed Keepers. (2018). Available: http://silvercitywatershedkeepers.org/index.html [September 2018].
Slater, T.F. (1996). Portfolio assessment strategies for grading first-year university physics students in the USA. Physics Education, 31(5), 329.
Soloway, E., Guzdial, M., and Hay, K.E. (1994). Learner-centered design: The challenge for HCI in the 21st century. Interactions, 1(2), 36-48.
Stolle-McAllister, K. (2011). The case for summer bridge: Building social and cultural capital for talented black STEM students. Science Educator, 20(2), 12-22.
Sullivan, B.L., Aycrigg, J.L., Barry, J., Bonney, R.E., Bruns, N.E., Dhondt, A.A., Farnsworth, A., Fitzpatrick, J.W., Fredericks, T., Gerbracht, J., Gomes, C., Iliff, M.J., Lagoze, C., La Sorte, F.A., Merrifield, M., Reynolds, M., Rodewald, A.D., Rosenberg, K.V., Trautmann, N.M., Winkler, D.W., Wong, W-K., Yu, J.L., and Kelling, S. (2014). The eBird enterprise: An integrated approach to development and application of citizen science. Biological Conservation, 169, 31-40.
Thriving Earth Exchange. (2018). Available: https://thrivingearthexchange.org [September 2018].
Tsui, A.S. (2007). From homogenization to pluralism: International management research in the academy and beyond. Academy of Management Journal, 50(6), 1353-1364.
Valencia, R.R. (Ed.). (2012). The Evolution of Deficit Thinking: Educational Thought and Practice. London: Falmer.
WOEIP (West Oakland Environmental Indicators Project). (2018). Available: http://www.woeip.org [May 2018].