Science investigation and engineering design rely upon effective instructional resources to guide teachers and facilitate student experiences. Instructional resources are key to providing coherence by presenting and revisiting phenomena throughout the year and making connections to these phenomena as instruction progresses to new topics, phenomena, or challenges. They help students see how they can use science and engineering to make sense of subsequent phenomena and their everyday world. Phenomena and design challenges should be at the center of all instructional resources, and they should provide opportunities for students to apply the science and engineering ideas and concepts learned in one investigation to help make sense of similar, but novel phenomena in and beyond the classroom.
Many types of tools and resources can support teacher instruction during science investigation and engineering design. This chapter discusses the role of instructional resources from the perspectives of the features presented in Chapters 4 and 5: making sense of phenomena; gathering and analyzing data; constructing explanations; communicating reasoning to self and others; fostering an inclusive learning environment; connecting learning through multiple contexts; and fostering coherence in student experiences. It also discusses technology as an instructional resource for investigation and design, the inclusion of teachers in the teams that develop instructional
resources, and the connection between instructional resources and professional learning.1
Ultimately, the purpose of instructional resources is to support teaching practices that help students either develop evidence to support explanatory models of phenomena through scientific investigations, or design and test solutions to real-world challenges. Instructional resources can help students make personal connections and see science and engineering as more relevant to their lives by providing information on challenges they can directly identify with. As discussed in Chapter 3, learning theories support the idea that connecting science learning to students’ experiences is essential for retaining knowledge. Applying the science learned in school beyond the classroom supports meaningful science learning, and applying knowledge to new phenomena gives students a fuller understanding of how science and engineering are used in life. Science is a practice that requires application for students to internalize it and make it a way of knowing. The application of science knowledge beyond the classroom requires making sense of novel phenomena and engineering solutions to new challenges. Instructional materials can facilitate science instruction that links multiple phenomena with multiple core ideas, providing sufficient opportunities for students to apply learning to new contexts and to conceptualize the science core ideas and crosscutting concepts beyond the classroom.
As discussed in Chapter 4, one of the primary ways to establish relevance is to use questions that students find meaningful (Krajcik and Mamlok-Naaman, 2006). It is a challenge to design instructional resources that help teachers engage all students. It is important both to include phenomena and design challenges likely to interest a wide variety of students and to highlight ways teachers can adapt and modify those phenomena and challenges for their own settings; such adaptations and modifications by teachers can be beneficial (Lee and Buxton, 2008; Suriel and Atwater, 2012). However, not all teachers will have the time and resources to make such adaptations. They can benefit from the support of instructional resources that point out options for modifications and from colleagues who share their own approaches to finding and creating relevant materials.
The building of students’ understanding across time differs from common instructional practices in several key ways. A key motivation for A Framework for K–12 Science Education (hereafter referred to as the Framework;
1 This chapter includes content drawn from papers commissioned by the committee: Designing NGSS-Aligned Curriculum Materials by Brian Reiser and Bill Penuel and Data Use by Middle and Secondary Students in the Digital Age: A Status Report and Future Prospects by Victor Lee and Michelle Wilkerson. The commissioned papers are available at http://www.nas.edu/Science-Investigation-and-Design [January 2019].
National Research Council, 2012) and the Next Generation Science Standards (NGSS) was the “growing national consensus around the need for greater coherence—that is, a sense of unity—in K–12 science education. Too often, standards are long lists of detailed and disconnected facts, reinforcing the criticism that science curricula in the United States tend to be ‘a mile wide and an inch deep’” (Schmidt, McKnight, and Raizen, 1997). The goal of the Framework is to organize standards so they reflect sensible learning sequences that support students in systematically building and connecting ideas across time. Analyses of standards and instructional resources reveal that traditional resources jump from topic to topic, without helping students build ideas piece by piece, putting them together over time, and making connections to other relevant ideas, and to their own experiences (BSCS, 2017; Kesidou and Roseman, 2002; Roseman, Stern, and Koppal, 2010; Schmidt, Wang, and McKnight, 2005; Stern and Roseman, 2004). Indeed, it is common today for teachers to adopt the strategy of assembling individual lessons on a topic from colleagues or downloading individual lesson plans from social networking sites (Greene, 2016; Hunter and Hall, 2018). While sharing instructional resources could be a valuable way to support professional learning, the types of individual lesson plans found in these venues may not reflect high-quality, independently evaluated material. Furthermore, cobbling together individual lesson plans is unlikely to result in supporting students in incrementally developing, extending, and refining their explanatory models.
The traditional paradigm of having textbooks or instructional resources simply present the central parts of disciplinary core ideas, and then having students explain them back or use them to achieve particular tasks, fails to reflect the three-dimensional nature of lessons. While obtaining information (one of the science and engineering practices) may include reading a textbook or other resource to find out what experts know about a topic, this should be part of a larger meaningful “ensemble of activity” in which students engage in practices such as argumentation from evidence or constructing explanations to put the pieces together and develop an explanation or model of the phenomenon being investigated that incorporates or applies that knowledge, rather than simply taking in a pre-packaged articulation of the concept. At the other end of the spectrum, inquiry activities in which students empirically explore relationships between variables but do not explain why those relationships hold also reflect a partial view of three-dimensional learning, since such activity leaves out the knowledge-building focus of the practices. Similarly, while science practices such as designing and conducting investigations may require instrumental skills, such as using a microscope or making a graph, simply learning these skills, isolated from an effort to make progress on making sense of the phenomena or design challenge and building knowledge of the three dimensions, would not reflect the intention of the Framework or what is meant by investigation and design. While a
range of different pedagogical approaches may be possible to achieve three-dimensional learning, it is clear that certain pedagogical approaches leave little room for meaningful integration of the three dimensions to make sense of phenomena or solve challenges. The key features of instructional resources that support the Framework and NGSS are compared to features of prior instructional resources in Table 6-1.
As described in Chapter 5, the classroom envisioned by the Framework differs significantly from most current middle and high school science classrooms. Instructional resources adopted by districts, principals, and individual teachers are a primary driver of what goes on in the classroom, which means that changing the culture of the classroom requires changing instructional resources and supporting instructional resource developers in creating and maintaining excellent resources. Studies show that instructional resources make a difference in supporting students in developing the type of learning called for in the Framework and the NGSS (Harris et al., 2014).
A key aspect of instructional resources for investigation and design is the selection of relevant phenomena. Explaining a phenomenon or solving a problem must require developing or applying key elements of disciplinary core ideas and crosscutting concepts. As discussed in Chapter 3, it is advantageous to connect phenomena and challenges to students’ interests and everyday experiences where possible. Interest is a key catalyst for science learning in both the short and long term (Bathgate and Schunn, 2017; Bricker and Bell, 2014; Crowley et al., 2015). Presenting familiar phenomena in ways that pique student curiosity captures students’ attention and allows them to make connections to everyday experiences and develop a sense of wonder. Utilizing multiple, but related, phenomena helps address the diversity of student interests and experiences. Evidence related to the interest and personal relevance of phenomena can be used to select phenomena and design challenges, so as to facilitate broad student engagement (Penuel et al., 2017). Using place-based learning can be especially powerful when it is student driven: that is, the students identify the challenges (e.g., poor drinking water quality, human impacts that decrease local biodiversity) or phenomena (e.g., a change in water depth in a local aquifer, invasive species populations increasing as native species populations decrease in a local ecosystem) to investigate. Place-based learning has had a positive influence on learning and motivation when collaborating with the surrounding community on environmental issues such as local air quality (Powers, 2004; Senechal, 2007; effect sizes not available).
To support three-dimensional learning, the phenomenon should provide a context in which students can gather and apply relevant science ideas and
TABLE 6-1 Shifting from Instructional Strategies Common in Prior Instructional Resources to Principles of Instructional Resources to Support Science Investigation and Engineering Design
• Make sense of phenomena and design challenges.
• Gather and analyze data and information.
• Communicate reasoning to self and others.
• Apply learning beyond the classroom.
crosscutting concepts to construct explanations for the causes of changes in systems, not simply as a context for teachers or instructional resources to demonstrate those ideas or explain them to students. A phenomenon that can be explained without reference to targeted core ideas or crosscutting concepts will not provide an adequate context for three-dimensional learning (Achieve, 2016). Similarly, a phenomenon that could be explained, in principle, by disciplinary ideas but does not engage students in applying these ideas to support an explanation of the causes of the phenomenon does not meet the criteria for instruction. For example, a teacher could show a 7th-grade classroom dry ice turning to gaseous carbon dioxide (which technically illustrates a phase change, the process of sublimation), but it is difficult to see how investigating this phenomenon will help the students develop the target ideas of the nature of matter, properties of solids and liquids, and the forces that hold them together.
Instructional resources can provide the links between phenomena and help teachers with how to sequence and draw out connections between the experiences. Science and engineering instruction that establishes the expectation that students apply what they learn deepens understanding by building on prior knowledge. Well-articulated instructional materials provide opportunities for students to apply what they have learned within the instructional sequence as well as beyond the classroom. The model of
students making sense of one phenomenon and then transferring that same set of core ideas and crosscutting concepts to make sense of analogous phenomena is effective for science learning. Instructional materials can provide intentional instructional sequences that allow opportunities for applying students’ three-dimensional learning to new contexts; more importantly, application of science knowledge to contexts beyond the classroom is an essential goal of science education. Instructional resources such as 5E or Gather, Reason, Communicate2 provide instructional sequences, but many others, such as Ambitious Science Teaching (Windschitl, Thompson, and Braaten, 2018), provide resources and frameworks that are also useful for thinking about sequencing. When considering sequencing and the choice of phenomena or design challenges, teachers need to consider in advance whether the examples chosen have enough depth to connect the multiple learning goals or performance expectations to be met in the unit. Each investigation or design challenge requires extended classroom time, so each should be chosen judiciously to help students meet learning goals, build understanding incrementally, and see how ideas connect and relate to one another (Krajcik et al., 2014). Instructional resources should also provide a coherent structure and clear expectations for students to apply their science and engineering learning beyond the classroom.
Instructional resources can also help teachers to increase student interest by exposing students to concrete examples of the variety of work that real scientists and engineers do. This type of intervention challenges some of the stereotypical images of professionals in these fields, and students may then have a more concrete and complex picture of science work to relate to. Wyss, Heulskamp, and Siebert (2012) used this type of intervention in STEM learning by having students view video interviews with scientists about their careers; they found a positive influence on increasing interest in pursuing STEM careers for middle school children (d = 0.52), but no learning gains were measured. Another approach is instructional resources that provide strategies for adapting resources through place-based learning (Sobel, 2005). This method, often used in environmental science, focuses science and engineering investigations on challenges and phenomena that exist in the local community.
Instructional resource designers can attend to the ways phenomena and questions can support students in building deeper knowledge of the scientific and engineering practices, crosscutting concepts, and disciplinary
2 The 5Es are engagement, exploration, explanation, elaboration, and evaluation. More information about the 5E instructional model can be found at https://bscs.org/bscs-5e-instructional-model [October 2018]. More information about the Gather, Reason, Communicate approach can be found in Moulding, Bybee, and Paulson (2015).
core ideas. Instructional resources support learners exploring solutions to questions (National Research Council, 2012) that are meaningful and relevant to their lives. Engineering design challenges and solutions are still novel to many teachers and many students. Developers of instructional resources can help teachers by anticipating the student questions that will arise and demonstrating a sequence for exploring those questions that can help students build and test explanatory models or design and test solutions progressively over time. Research on design learning provides insight about cognition and can provide a framework for engaging students; a review by Crismond and Adams (2012) brings together information from many sources to inform engineering design approaches in the classroom.
A central activity of science is to verify claims based upon evidence. In addition to the data that students gather themselves, they can now use data freely accessible on the web to answer important questions that will help them build usable knowledge aligned to performance expectations (standards). For example, they can access climate data, such as temperature and precipitation across years, from the National Oceanic and Atmospheric Administration (NOAA); data about ocean acidification on such variables as dissolved carbon dioxide, pH, and oxygen (to name a few) from NOAA; water quality data from the United States Geological Survey (USGS); and astronomy data from Web sources such as Astronomical Data Sources on the Web. While a plethora of datasets exist for students to analyze, data are not evidence unless they can be used to provide support for scientific ideas or to support design decisions. Instructional resources can provide support for these efforts. As with any learning experience, the use and analysis of data must support students’ three-dimensional learning and help them build usable knowledge. Instructional resources can also provide structure for students to collect and analyze data, driven by finding an answer to a question, and to use their findings as evidence to support claims related to those questions.
Online data analysis tools, like Concord Consortium’s Common Online Data Analysis Platform (CODAP), can allow students to analyze data that they collect as well as datasets that they might import from online sources.
3 This section draws on the paper commissioned by the committee Data Use by Middle and Secondary Students in the Digital Age: A Status Report and Future Prospects by Victor Lee and Michelle Wilkerson. Available: http://www.nas.edu/Science-Investigation-and-Design [October 2018].
The Concord Consortium’s Energy3D tool could be utilized for engineering design (Xie et al., 2018). Such tools allow students to find patterns in data and test predictions. More advanced students can use various statistical calculations to find a best-fit line or to test whether one set of data differs from another. While these online tools provide unprecedented power for students to analyze data, the tools must be used as a component of students’ engagement in three-dimensional learning (i.e., a scientific investigation) and not just in isolation to carry out an activity.
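The two statistical moves mentioned above, fitting a best-fit line and testing whether two sets of data differ, can be sketched in a few lines of Python using only the standard library. All years, site names, and temperature values below are invented for illustration; a class would substitute data downloaded from a source such as NOAA or USGS.

```python
from statistics import mean, stdev

def best_fit_line(xs, ys):
    """Least-squares slope and intercept: the 'line of best fit'."""
    mx, my = mean(xs), mean(ys)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return slope, my - slope * mx

def welch_t(a, b):
    """Welch's t statistic: how far apart are two samples' means,
    relative to the spread within each sample?"""
    va, vb = stdev(a) ** 2 / len(a), stdev(b) ** 2 / len(b)
    return (mean(a) - mean(b)) / (va + vb) ** 0.5

# Invented data: mean July temperature (deg C) across years at two sites
years = [2010, 2011, 2012, 2013, 2014, 2015]
site_a = [21.1, 21.4, 21.3, 21.8, 22.0, 22.1]
site_b = [19.9, 20.2, 20.1, 20.0, 20.4, 20.3]

slope, intercept = best_fit_line(years, site_a)
print(f"trend at site A: {slope:.2f} deg C per year")  # about 0.21
print(f"t statistic comparing sites: {welch_t(site_a, site_b):.1f}")
```

A large t statistic (here well above 2) suggests the two sites genuinely differ; the point, as the text emphasizes, is that such calculations serve an investigation's question rather than standing alone as an exercise.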
Publicly accessible datasets and data visualizations are likely to become more commonly used in science classes in coming years and may affect the nature and use of data in classrooms. These datasets and visualizations are not necessarily constructed with pedagogical purposes in mind, and students do not have access to or full knowledge of how they were constructed. Complex, second-hand data are an increasingly common feature of science communication and practice and could be more explicitly integrated into middle and high school science. While public datasets have existed for years, their accessibility and visibility have exploded in the past decade. There are also a growing number of initiatives to make public data available for educational use (e.g., NOAA’s Data in the Classroom initiative at dataintheclassroom.noaa.gov or NASA’s MyNASAData project at mynasadata.larc.nasa.gov). While some of these efforts come with accompanying instructional resources and simplified data, early research suggests that students can benefit from interacting with complex, “messy” public data, perhaps even more than from textbook-like second-hand data. For example, Kerlin and colleagues (2010) found that students exploring earthquakes were more likely to engage in a full breadth of discourse related to data—including early theorizing, questioning the data collection process, exploring patterns, and predicting and evaluating—when working with “raw” data from the USGS, rather than when working with clean textbook data.
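Working with raw public data typically involves a few simple wrangling steps before any analysis: merging records on a shared key, recoding values into categories tied to the driving question, and subsetting to the relevant rows. A minimal Python sketch, with all site IDs and measurements invented for illustration:

```python
# Invented mini-datasets a class might combine: stream pH readings and
# land-use descriptions, both keyed by a shared site ID.
ph_readings = {"S1": 6.8, "S2": 5.9, "S3": 7.2, "S4": 6.1}
land_use = {"S1": "forest", "S2": "farm", "S3": "forest", "S4": "farm"}

# Merge on the shared key, keeping only sites present in both datasets
merged = [
    {"site": s, "ph": ph_readings[s], "use": land_use[s]}
    for s in sorted(ph_readings.keys() & land_use.keys())
]

# Recode a continuous value into categories the class defined
def acidity(ph):
    return "acidic" if ph < 6.5 else "near-neutral"

for row in merged:
    row["acidity"] = acidity(row["ph"])

# Subset to the rows relevant to the question: are farm sites more acidic?
farm_sites = [r for r in merged if r["use"] == "farm"]
print(farm_sites)
```

Each step here corresponds to a decision students must justify, such as which sites to keep and where to draw the acidity cutoff, which is precisely where "messy" data invite the kind of discourse about data collection and meaning that clean textbook data do not.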
One particular challenge in using publicly available datasets in education concerns the many multivariate relationships that may be present. Students can become overwhelmed searching for meaningful relationships, or they can lose sight of the goals of inquiry as different patterns are revealed. Another challenge lies in manipulating these datasets so that they are appropriate for student-driven goals—which are likely to be quite different from the original motivations for assembling a given public dataset. However, early studies suggest that even young students are capable of some aspects of data wrangling—for example, merging datasets that may each address the same investigation, identifying subsets or specific parameters within a given dataset that are relevant for inquiry, or recalculating or recoding values so that they better align with a student or classroom’s path of inquiry.
Systems of embedded assessments in instructional resources thus need to include ways to assess students’ explanatory models of phenomena and solutions to design challenges, as well as tasks that elicit students’ ability to apply their understanding to reason about novel phenomena and challenges (Ruiz-Primo et al., 2002). Such tasks need to include scoring guides that help teachers interpret students’ responses in light of the overall goal for unit learning, not just discrete elements of disciplinary core ideas, science and engineering practices, and crosscutting concepts. Finally, the tasks need to include supports for “what to do next,” depending on students’ responses to tasks, so that they can be used to support learning (DeBarger et al., 2017).
The Contingent Pedagogies project provides evidence for the value of providing integrated supports for classroom assessment and having teachers elicit and interpret student thinking at multiple times and in multiple ways (DeBarger et al., 2017; Penuel et al., 2017). In that project, a set of formative assessment tasks was integrated into two investigation-based units in middle school science that aligned to the NGSS. Teachers received professional development on three-dimensional learning and on how to use these tasks to elicit students’ initial ideas prior to investigation and to check their understandings at the conclusion of the investigation. The assessment materials included a set of questions for teachers to ask that drew on identified problematic facets of student understanding, clicker technology for collecting student responses, and a set of talk moves to use to support student argumentation about their responses. The materials also included a set of “teaching routines” (DeBarger et al., 2010) for enacting the full cycle of formative assessment that included a set of activities teachers could use if students were having particular difficulties with understanding the focal ideas of an investigation. A quasi-experimental study that compared students in classrooms using the assessment-enhanced resources to students using the original units found that students in the treatment condition scored higher on both post-unit tests than students in the control condition after adjusting for prior test scores4 (Harris et al., 2015). The study also found
4 For the physical science (energy) unit, the estimated effect size of 0.22 was statistically significant (z = 2.16, p = 0.03). For the earth science unit, the estimated effect size of 0.25 was statistically significant (z = 2.02, p = 0.04).
that teachers were able to use the materials to foster norms of supporting claims with evidence, which mediated student learning outcomes.
In order to provide effective foundations for these kinds of assessment conversations (Duschl and Gitomer, 1997), embedded assessments need not only to provide rich questions for teachers to ask students, but also to provide formats for engaging students in self- and peer-assessment, frameworks for interpreting student ideas, and strategies for teachers to employ when student thinking reveals problematic ideas after instruction (Penuel and Shepard, 2016). Furthermore, embedded formative assessments should be based upon research into how student thinking develops in a disciplinary domain, taking into account how students’ lived experiences interact with and inform their developing understandings of disciplinary core ideas and crosscutting concepts. That is, these formative assessments are not domain-general strategies for eliciting student thinking but are specific to the scientific ideas, concepts, and practices being learned. Formal embedded assessment tasks need to be designed using evidence-centered design, which specifies the claims to be made about what students know and can do, along with the evidence needed to support each claim (Harris et al., 2014).
Instructional resources can support equity by providing differentiated supports and multiple options. The science and engineering practices require students to engage in intensive forms of language use for both communication and learning (Lee, Quinn, and Valdés, 2013). Leveraging the communicative resources students bring to class and enabling them to express understanding using different modalities is critical in both instructional and assessment tasks (Brown and Spang, 2008; Buxton et al., 2013). Resources that follow principles of Universal Design for Learning (UDL; Burgstahler, 2012; Center for Universal Design, North Carolina State University, 1997; Duerstock, 2018; Rose and Meyer, 2002) can ensure that a variety of entry points and modalities are intentionally integrated. One strategy for promoting more equitable participation in science classrooms is to focus on phenomena and design challenges that connect to students’ everyday lives. Instruction that builds on students’ own funds of knowledge, everyday experiences, and cultural practices in families and communities shows great potential for supporting active participation in science class for all students (Calabrese Barton et al., 2005; Hudicourt-Barnes, 2003; Rosebery et al., 2010).
For specific populations, instructional resources that reflect principles of contextualization derived from ethnographic research in students’ communities can support students linking everyday ways of making sense of the world and scientific and engineering practices (Sánchez Tapia, Krajcik, and
Reiser, 2018), core ideas, and crosscutting concepts. In addition, focusing on helping students navigate between these different ways of knowing—rather than expecting students to give up their everyday ways of knowing—is critical for promoting respect for different cultural worldviews and epistemologies (Aikenhead, 2001; Bang and Medin, 2010). As Lee and colleagues (2013) concluded, promoting equitable participation across different student populations means an emphasis on making meaning, on hearing and understanding the contributions of others, and on communicating ideas in a common effort to build understanding of the phenomenon or to design solutions for the system being studied.
Instructional resources that develop student understanding over time provide extensive supports for continuous sense-making and incremental building of models and mechanisms, including providing guidance to teachers in how to support students in making connections between their investigations and the questions they are trying to answer, and in how the models they build explain the phenomena under study. They provide tools that students can use to keep track of their questions and the progress they are making to answer them, to help assemble evidence they have gathered into coherent science explanations, and to help students come to consensus about key components and interactions to represent in explanatory models of phenomena and criteria for solutions to challenges (Windschitl and Thompson, 2013; Windschitl et al., 2012). Importantly, these tools and routines are introduced “just in time” rather than “just in case” students need them. They are not “front loaded” at the beginning of the school year or a unit, as has been customary in science textbooks that begin with a first chapter on the scientific method (Osborne and Quinn, 2017; Windschitl, Thompson, and Braaten, 2008).
Greater coherence is essential in attaining the new vision for science education. One can think of the three dimensions working together as a tapestry to help students conceptualize core ideas, in essence building a platform or structure where students use the three dimensions in an integrated manner to reason and make sense of phenomena. A Framework for K–12 Science Education (National Research Council, 2012) asserts that “successful implementation [of science standards] requires all of the components across levels cohere and work together in a harmonious or logical way to support the new vision” (p. 245). The Framework’s vision is that students will acquire knowledge and skill in science and engineering through a carefully designed sequence of learning experiences. Each stage in the sequence will develop students’ understanding of particular science and engineering practices, crosscutting concepts, and disciplinary core ideas. Coherence,
therefore, means the three dimensions are connected together and lead students to an explanation of the phenomena.
Stress on isolated parts can train students in a series of routines without educating them to understand an overall picture that supports the development of integrated knowledge structures and information about conditions of applicability (National Research Council, 2000). The application of practices, core ideas, and crosscutting concepts to make sense of phenomena provides a way for students to internalize, conceptualize, and generalize knowledge in ways that make it part of how they see the natural and engineered world. Understanding the three dimensions is essential, but real transformation occurs when these dimensions are integrated in a coherent instructional approach.
The principle of incremental sense-making is one implication of the Framework’s first strategy of “a developmental progression.” The notion of developmental progressions could be taken in part to reflect a logical sequence based upon the structure of the discipline as disciplinary experts see it, as Bruner (1960) argued. This approach of disciplinary coherence, for example as reflected in the Atlas work of the American Association for the Advancement of Science (2001, 2007), would be a major advance over many existing instructional resources that do not pay adequate attention to connecting ideas and helping students build complex ideas from more simple ones (Roseman et al., 2010). However, it would not necessarily provide students with meaningful encounters with how scientific activity unfolds in practice. The logic of walking through an already-worked out explanation (with 20-20 hindsight) is quite different from what makes sense for students to question and work on a step-by-step basis. Reiser, Novak, and McGill (2017) argued that supporting meaningful engagement in three-dimensional learning requires developing and enacting instructional resources that are coherent from the students’ perspective. They argue that the notion of a social practice suggests that it is insufficient for instructional resources or teachers to present in a top-down fashion what questions or challenges students should work on and what practices they should engage in.
Instead, Reiser et al. (2017) argued that authentically engaging in science and engineering practices should help students address questions or challenges they have identified and committed to address. They build on earlier arguments for project-based learning (Blumenfeld and Krajcik, 2006; Blumenfeld et al., 1991) and learning-for-use (Edelson, 2001; Kanter, 2010) to argue that achieving the Framework’s vision means that students should be partners, along with instructional resources and teachers, in figuring out what to work on next in order to progress in making sense of phenomena and solving challenges. Expectations of what it means to be competent in doing science and understanding science go beyond skillful performance and recall of factual knowledge. Contemporary views of learning value
understanding and application of knowledge to new contexts, both in and beyond the classroom. Students who understand science can use and apply ideas and concepts in diverse contexts, drawing connections among multiple representations of a given concept (National Research Council, 2007). Instructional materials should provide useful tools and expectations for investigating phenomena and challenges beyond the classroom. Building understanding across time will only occur when instruction and the resources supporting it are coherent. There are a number of different ways to interpret coherence when exploring how instructional resources and teaching can support more effective approaches to science teaching and learning.
The shift in the aim of science education away from simply knowing science to using science and engineering ideas and practices to make sense of the world or solve challenges requires working with students’ initial resources for sense-making as valuable starting points, even though they may be piecemeal and contextualized in everyday experiences rather than coherent, generalized theories (diSessa and Minstrell, 1998; Hammer and Elby, 2003; Minstrell, 1992). Therefore, instructional resources need to be organized to help students build on their prior understandings, incrementally extending and revising these understandings as the students use practices in meaningful ways to explore phenomena and design challenges. Furthermore, the target disciplinary core ideas are not mere collections of facts but complex, coherent understandings of mechanisms, such as how matter can be rearranged or how living things get the energy and matter they need. Constructing these ideas is not like simply providing a series of answers to particular questions or testing a series of hypotheses about different variables. Instead, this knowledge building occurs incrementally as students use their prior knowledge to make sense of new situations. The Framework argues that learning should be viewed as a progression “designed to help children continually build on and revise their knowledge and abilities, starting from their curiosity about what they see around them and their initial conceptions about how the world works” (National Research Council, 2012, p. 11). Thus, instructional resources can support students’ building initial models, and continuously extending those models as they encounter new phenomena, connecting to prior explanations, deepening mechanisms to improve their explanatory power, and revising them as they uncover limitations in these models (Berland et al., 2016; Windschitl et al., 2008, 2012).
Instructional resources help students build toward performance expectations by engaging learners in making sense of phenomena or solving
challenges by using a variety of disciplinary core ideas and crosscutting concepts across disciplines to engage in various scientific and engineering practices. In this way, the components of three-dimensional learning are used as flexible tools (Duncan, Krajcik, and Rivet, 2016) to support students in sense-making. By focusing on phenomena and design challenges, instructional resources bring together science and engineering practices with disciplinary core ideas and crosscutting concepts into three-dimensional performances that have three-dimensional learning goals, rather than treating “content” and “process” as separate learning goals. Learning goals are articulated as performance expectations and are necessarily three-dimensional. In order to reach these expectations, the instructional sequences used in lessons must describe what students are doing in terms of three-dimensional student performances. Integration of the Framework’s three dimensions means going beyond simply focusing students’ attention at some point on each of the three dimensions to authentic integration around a compelling question or challenge. This structure is reflected in one of the criteria of the EQuIP5 rubric on Integrating the Three Dimensions in NGSS lessons—“Student sense-making of phenomena and/or designing of solutions requires student performances that integrate elements of the SEPs [science and engineering practices], CCCs [crosscutting concepts], and DCIs [disciplinary core ideas]” (Achieve, 2016, p. 2).
Learning goals that gradually develop understanding toward the performance expectations need to guide the development of materials (Krajcik et al., 2014). A careful sequence of learning goals helps build coherence in a unit. Performance expectations include many components and students cannot be expected to develop understanding of these expectations in a single class period or even a single unit. Rather, learners are expected to build understanding necessary to demonstrate mastery of the performance expectations over time. In order to do so, they need to experience instruction that engages them in three-dimensional learning. As such, learning goals expressed as three-dimensional performances need to guide instruction and instructional materials.
5 EQuIP stands for Educators Evaluating the Quality of Instructional Products. It provides criteria by which to measure the degree to which lessons and units are designed for the NGSS. The purpose of the rubric and review process is to (1) review existing lessons and units to determine what revisions are needed; (2) provide constructive criterion-based feedback and suggestions for improvement to developers; (3) identify examples/models for teachers’ use within and across states; and (4) inform the development of new lessons, units, and other instructional materials. For more information, see http://www.nextgenscience.org/resources/equip-rubric-lessons-units-science [October 2018].
Technology is a tool for facilitating learning and is itself an instructional resource. It can be used for data collection, as a source of data, for data analysis, for modeling, for visualization, for simulations, and for presentations. There are important distinctions that educators must now consider between data collected through familiar modes of measurement (e.g., using common instruments in classroom laboratories, such as rulers and scales) and data collected by automated sensors, generated by simulations or other computational means, or publicly available scientific data reused by educators (Cassel and Topi, 2015; Wallis, Milojevic, and Borgman, 2006). Furthermore, many examinations of students’ data use focus on one specific context, topic, and grade range. For instance, student-collected “first-hand” and educator- or curriculum-provided “second-hand” data each carry different affordances for classroom practice (Hug and McNeill, 2008), with second-hand data requiring additional context-building work in the classroom before the data can be made sensible. There are also equity implications to be considered in terms of which students have access to the tools and time needed to generate and capture their own data.
The use of automated data collection sensors has become more established in science education since publication of America’s Lab Report (National Research Council, 2006), even as research on the conditions for their effective use is still emerging. In this section, we review the latest work and emerging trends in these areas. While we refer to these data collection sensors as “automated,” we do not mean to imply that they require no oversight from a student or a teacher. Indeed, these tools place new demands on teachers and students that differ from manual data collection activities. For instance, electronic probes and accompanying software allow learners to collect data that would be difficult, time-consuming, or impossible to collect without them. Probes are electronic sensors that, attached to various computer devices—including handheld devices such as smartphones and tablets—and paired with associated software, allow students to collect, graph, and visualize a variety of data, including pH, force, light, distance and speed, dissolved oxygen, and much more. Although probes have been used in science classrooms for more than 25 years as laboratory tools and can support learners in multiple scientific practices and investigations, their use is still not commonplace in most secondary science classrooms.
Metcalf and Tinker (2004) demonstrated that probeware indeed could be used with handheld computers and effectively integrated into middle school science classrooms when coupled with supportive instructional resources. In their study, teachers responded positively to the introduction of
probeware in their classrooms. Beyond the classroom, field trip and field work experiences, such as water sampling and ecosystem exploration, have also served as effective and feasible spaces for probeware use (Kamarainen et al., 2013).
Zucker et al. (2008) documented the effectiveness of using probeware up to grade 8 in inquiry-oriented science and engineering, with moderate to large effect sizes across a range of topics. Struck and Yerrick (2010) have also documented the effectiveness of probeware with high school physics students, an effect that can be augmented even further when those students also participate in digital video analysis. Consistent with prior research on probeware (e.g., Linn, Layman, and Nachmias, 1987), students also improved in their graph comprehension capabilities. Together, these studies affirm that the use of probeware in science and engineering classrooms, when coupled with supportive instructional resources and other tools, can be an asset for student learning. Also, see the discussion of modeling physics in Chapter 9.
Other types of technology can also be used for automated data collection, such as wearable sensors (Ching et al., 2016; Klopfer, Yoon, and Perry, 2005; Lee, Drake, and Williamson, 2015; Lyons, 2014); log data, such as records of clicks on Websites or which tools are most frequently used (Rivera-Pelayo et al., 2012); and networked sensors (Hsi, Hardy, and Farmer, 2017; Martin et al., 2010). The measurements and visualizations made available by wearable sensors are not always intuitive or easily comprehended by students (Ching, Schaefer, and Lee, 2014), largely because they were not initially designed with youth or learners’ needs or familiar activities in mind (Lee, Drake, and Thayne, 2016). However, as the range of possible measurements (e.g., time spent standing, heart rate, electrodermal activity) and the ecosystem of wearable devices expands, these off-the-shelf wearable devices appear to offer familiar options for classrooms that can also produce significant gains in students’ ability to reason with data (Lee and DuMont, 2010; Lee, Drake, and Thayne, 2016). One project with networked sensors, the iSense project, seeks to enable remote sensing and analysis of relevant proximal and local data using a network of sensors placed around a classroom or within a neighborhood (Martin et al., 2010). Students could log on to an online data repository that includes analysis and visualization tools to monitor the data generated by sensors. Similarly, the InSPECT project led by the Concord Consortium involves using Internet of Things (IoT) technologies and student-programmed automated data collection technologies to support high school biology lab activities (Hsi et al., 2017). These are coupled with data visualization tools, such as CODAP (Common Online Data Analysis Platform),6 to support data analysis
activities. Another project using IoT at the University of Colorado Boulder and Utah State University is exploring the use of SparkFun’s Smart School IoT platform that will obtain remote sensor data—such as temperature and air quality—for student inquiry activities (NSF Grant No. DRL-1742053). Further infrastructure work is still needed for these tools to be effectively used in educational settings. The aforementioned projects demonstrate feasibility using a range of paradigms, whether they involve students engineering their own sensor networks (Hsi et al., 2017; Martin et al., 2010) or obtaining and examining data from more public remote sensors. However, the abundance of data that can be collected from such projects yields both technological and pedagogical questions. These questions include how to effectively store and archive data for subsequent access and examination by classrooms (Wallis et al., 2006), as well as how to best support students in designing and navigating complex collections of data sources for which relationships are likely to be especially noisy, multivariate, and caused by unknown or unexpected factors.
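To make the data-navigation challenge concrete, consider the kind of scaffold a classroom might use to tame a noisy sensor stream. The short sketch below is a hypothetical illustration (not drawn from the iSense, InSPECT, or SparkFun projects); it applies a simple sliding-window mean to smooth invented temperature readings before students look for trends:

```python
from statistics import mean

def moving_average(readings, window=3):
    """Smooth a noisy series of sensor readings with a sliding-window mean."""
    if not 1 <= window <= len(readings):
        raise ValueError("window must be between 1 and the number of readings")
    return [mean(readings[i:i + window]) for i in range(len(readings) - window + 1)]

# Invented temperature readings (degrees C) from a hypothetical classroom sensor;
# the spike in the third reading stands in for sensor noise.
temps = [21.0, 21.4, 24.9, 21.2, 21.1, 21.6]
smoothed = moving_average(temps)
print(smoothed)  # each value averages three consecutive raw readings
```

Even a scaffold this small shifts the pedagogical question from how to collect data to how to reason about what the data show, which is precisely where the noisy, multivariate datasets described above demand support.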
Computer-based technology has the potential to support learners in conducting all aspects of scientific investigations. Computers, Internet access, and other widely available technologies are used in the examples above and can facilitate a broad and diverse array of investigations and design challenges by providing the means to quickly collect and analyze data, share results, and access additional data and information. The interactive features of computer-based technology have the capabilities to capture, display, and analyze data and information. Traditional formats can, of course, be used for analyzing and displaying information and are a key part of investigation and design. Learners can also use computer-based technologies to explore various complex aspects of science, including building, testing, and revising models; collecting, analyzing, and representing data; and finding, sharing, and presenting information (Krajcik and Mun, 2014). New computer-based technology tools can also scaffold learners in planning and conducting investigations (Quintana et al., 2004).
Portable technologies, including interactive tablets and cell phones, can support students conducting investigations in the field, allowing for more authentic investigations (Tinker and Krajcik, 2001). The flexibility, interactive power and networking capabilities, customization, and multiple representation functionality of computer-based tools, including portable technologies, will change the structure of science classrooms and how students engage in doing investigations. Instead of students receiving information from teachers or computer applications, students can use computer-based technology tools to take part in making sense of phenomena or solving design challenges by building models and developing explanations using evidence. Students and teachers can use tablets and other mobile devices with ubiquitous information access from cloud technologies to support
students in the collection, organization, and analysis of data. They can then use these data to support the development of scientific explanations.
The full potential of new technology tools to promote student learning can be realized only when the tools are used to support learning tasks that cannot be accomplished without them. For example, students can use simulations to explore and visualize the atomic world and the forces that hold atoms together.7 When technology is used to carry out investigations that could have been done with real materials (such as a titration lab), the learning gains are less clear.
Computer-based technologies that have been designed to support students in learning have been referred to as “learning technology” (Krajcik and Mun, 2014; Krajcik et al., 2000). Learning technologies can serve as powerful tools that help learners meet important learning goals and engage in various scientific practices. But not all learning technologies are designed to support students in conducting scientific investigation. E-books that are digital representations of traditional textbooks might have certain features to support student learning, but they typically do not support learners in conducting scientific investigations.
Tools exist that are not necessarily designed to support students conducting investigations, but that can serve that purpose in the hands of a skillful teacher. For example, spreadsheets can accomplish a number of challenging tasks that promote scientific investigations. Although not designed for classrooms, spreadsheets can facilitate scientific investigations by organizing and analyzing data and presenting data in graphical form. The use of presentation tools is another example. Presentation tools allow learners to create multimedia documents to share the results of their investigations. Multimedia documents can be critiqued and shared and serve to make students’ thinking visible.
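The kinds of analyses a spreadsheet supports—tabulating measurements and summarizing them with built-in functions such as AVERAGE, MIN, and MAX—can be sketched in a few lines. The plant-growth data below are invented purely for illustration:

```python
import csv
import io
from statistics import mean

# Invented plant-growth data, formatted as a class might record it in a spreadsheet
raw = """trial,height_cm
1,4.2
2,5.1
3,3.8
4,4.9
"""

heights = [float(row["height_cm"]) for row in csv.DictReader(io.StringIO(raw))]

# A spreadsheet-style summary: AVERAGE, MIN, and MAX of the measured column
summary = {"mean": mean(heights), "min": min(heights), "max": max(heights)}
print(summary)
```

As with a spreadsheet, the value for learners lies less in the arithmetic itself than in deciding which summaries of the data are meaningful for the investigation at hand.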
Computer-based technology tools support students in scientific investigation by helping them access and collect a range of scientific data and information. Krajcik and Mun (2014) discussed several ways in which computer-based technology tools allow and support students in developing scientific explanations: (1) use visualization, interactive, and data analysis tools; (2) collaborate and share information across remote sites; (3) plan, build, and test models; (4) develop multimedia documents that illustrate student understanding (Novak and Krajcik, 2004); (5) access information and data when needed; and (6) use remote tools to collect and analyze data. These features support students in conducting scientific investigation by expanding the range of questions that students can investigate and the variety
and type of phenomena and challenges that students can experience and explain.
Computer-based tools by themselves will not necessarily support learning or students’ engagement in three-dimensional learning. However, when they are embedded within a learning environment in a manner that supports learners in answering meaningful questions, making sense of phenomena, and finding solutions to challenges in ways that support clear and specified learning goals, they can support students in three-dimensional learning. Scaffolds can be provided to support students in being successful with challenging tasks.
Research has also shown the critical role of cognitive tools in learning (Salomon, Perkins, and Globerson, 1991; Jonassen, 1995). Computer applications, such as databases, spreadsheets, networks, and multimedia/hypermedia construction, can function as computer-based cognitive tools, acting as intellectual supports that facilitate cognitive processes. With appropriate supports and instructional components, cognitive tools can amplify and expand what students can do in constructing knowledge (Jonassen, 1995). For example, the periodic table serves as a cognitive tool for many chemists as it represents important ideas and relationships about the properties of matter that chemists can use to make predictions. The chemist understands the underlying ideas and how to apply those ideas. Unfortunately, in school, the periodic table is often seen as something to memorize. However, for computer applications to promote learning, instruction needs to be designed around the relationships and use of core ideas.
Various forms of computer applications also serve as cognitive tools because they allow learners to carry out tasks not possible without the application’s assistance and support. For instance, new forms of computer software allow learners to visualize complex datasets and interact with visualizations that show underlying mechanisms that explain phenomena (Edelson and Reiser, 2006; Linn and Eylon, 2011). In addition, many eLearning environments provide prompts to promote student reflection on the learning process, such as the WISE8 project (Slotta and Linn, 2009).
Designing quality instructional resources requires time, effort, intention, and different types of expertise. Instructional materials are strongest
8 The Web-based Inquiry Science Environment (WISE) offers a collection of free, customizable curriculum projects on topics central to the science standards as well as guidance for teachers on how these Internet-based projects can be used to improve learning and instruction in their science classrooms (grades 6-12).
when they have been developed by teams that include classroom teachers, content experts, and other experts as needed. Many classroom teachers do not have the time to design the instructional materials to support all of their classes, but they have expertise that is critical to designing materials that are effective in supporting students’ sense-making. Likewise, content experts may have deep understanding of the core ideas, crosscutting concepts, and science and engineering practices, and they can identify common and persistent misconceptions and alternate conceptions that should be acknowledged in instruction (and may be educative for the teachers). Others can bring technical expertise in designing simulations and illustrations and in assembling a coherent set of investigations.
Development of a high-quality sequence of instructional resources requires significant work from a team that includes or has access to multiple types of expertise. For example, the team might include an expert in the science to be learned, in instruction for three-dimensional learning goals, in grade-level-appropriate expectations of students and their interests, in equity and inclusion for science and engineering learning, and in assessment of student learning. Instructional materials should also be designed against a rubric to meet clear goals. As highlighted earlier in this chapter, the EQuIP rubric (Achieve, 2016) provides one example that can guide materials development to align with the Next Generation Science Standards and other Framework-based standards and supplements. An iterative process of development allows materials to be tested in the classroom and modified as needed based on their initial use and provides teachers, principals, and districts with confidence that materials can be effectively implemented in their classrooms.
To be effective, instructional resources need to be bundled with professional learning for teachers, along with assessment activities, into an integrated “curricular activity system” (Roschelle, Knudsen, and Hegedus, 2010). Of particular importance is professional learning that helps teachers to discern underlying purposes and structures of the instructional resources, so that when they select and adapt resources, they do so with integrity to the coherence of the resources (Davis and Varma, 2008). Professional learning that supports teachers in critically evaluating instructional resources to ensure that they align with classroom goals and use three-dimensional approaches to student learning is crucial.
In addition, professional learning works best when it is closely tied to what teachers will be expected to do to support students’ productive disciplinary engagement with activities that are part of the resources: that is,
focused on the content of the unit, its underlying theory of how to develop student understanding, and pedagogical strategies hypothesized to support learning in the unit (Ball and Cohen, 1999). Some supports for teacher learning are integrated into the resources themselves, supporting teachers in learning new practices, content, and/or resources. Instructional resources, as concrete reflections of the way instructional shifts can play out in teacher moves and in student work, are a key component of helping teachers shift their practice (Ball and Cohen, 1996; Remillard and Heck, 2014). Instructional resources that incorporate resources to support teacher learning are called educative curriculum materials (Davis and Krajcik, 2005). Their purpose is to help guide teachers in making instructional decisions—such as how to respond to different student ideas—when using the resources. These resources and other types of professional learning are discussed in further detail in Chapter 7.
The treatment of phenomena and challenges in Framework-aligned classrooms requires a key instructional shift in both instructional resources and teaching. Phenomena and challenges need to shift from illustrations or applications of science ideas that students have already been taught to contexts that raise questions or challenges in which students develop, reason through, and utilize these ideas to explain phenomena or develop solutions to challenges. When instructional resources provide a variety of carefully chosen phenomena and design challenges, teachers can select and adapt phenomena and design challenges that are best suited to their students’ backgrounds, prior knowledge and experiences, and culture and place. Instructional resources provide support to teachers in crucial areas, such as scope, sequencing, and coherence of investigation and design, gathering and use of data, and the role and use of technology. These resources can facilitate three-dimensional learning, offer phenomena that are relevant to students, support the use of data as evidence, and support development of argument for how evidence supports an explanation or design solution. They can integrate supports for equitable participation and for assessment and provide an expectation that students will apply learning to novel phenomena and design challenges beyond the classroom.
Achieve. (2016). Educators Evaluating the Quality of Instructional Products (EQuIP) Rubric for Science, Version 3.0. Available: http://www.nextgenscience.org/sites/default/files/EQuIPRubricforSciencev3.pdf [October 2018].
Aikenhead, G.S. (2001). Integrating western and aboriginal sciences: Cross-cultural science teaching. Research in Science Education, 31(3), 337–355.
American Association for the Advancement of Science. (2001). Atlas of Scientific Literacy. Washington, DC: Author.
American Association for the Advancement of Science. (2007). Atlas of Scientific Literacy (vol. 2). Washington, DC: Author.
Ball, D.L., and Cohen, D.K. (1996). Reform by the book: What is—or might be—the role of curriculum materials in teacher learning and instructional reform? Educational Researcher, 25(9), 6–8.
Ball, D.L., and Cohen, D.K. (1999). Developing practice, developing practitioners: Toward a practice-based theory of professional education. In G. Sykes and L. Darling-Hammond (Eds.), Teaching as the Learning Profession: Handbook of Policy and Practice (pp. 3–32). San Francisco: Jossey Bass.
Bathgate, M., and Schunn, C.D. (2017). Factors that deepen or attenuate decline of science utility value during the middle school years. Contemporary Educational Psychology, 49, 215–225.
Bang, M., and Medin, D. (2010). Cultural processes in science education: Supporting the navigation of multiple epistemologies. Science Education, 94(6), 1008–1026. doi:10.1002/sce.20392.
Berland, L.K., Schwarz, C.V., Krist, C., Kenyon, L., Lo, A.S., and Reiser, B. J. (2016). Epistemologies in practice: Making scientific practices meaningful for students. Journal of Research in Science Teaching, 53(7), 1082–1112. doi: 10.1002/tea.21257.
Blumenfeld, P., and Krajcik, J.S. (2006). Project-based learning. In R.K. Sawyer (Ed.), The Cambridge Handbook of the Learning Sciences (pp. 333–354). New York: Cambridge University Press.
Blumenfeld, P., Soloway, E., Marx, R., Krajcik, J.S., Guzdial, M., and Palincsar, A. (1991). Motivating project-based learning: Sustaining the doing, supporting the learning. Educational Psychologist, 26(3 & 4), 369–398.
Bricker, L.A., and Bell, P. (2014). “What comes to mind when you think of science? The perfumery!”: Documenting science-related cultural learning pathways across contexts and timescales. Journal of Research in Science Teaching, 51(3), 260–285. doi: 10.1002/tea.21134.
Brown, B.A., and Spang, E. (2008). Double talk: Synthesizing everyday and science language in the classroom. Science Education, 92, 708–732.
Bruner, J.S. (1960). The Process of Education: A Searching Discussion of School Education Opening New Paths to Learning and Teaching. Cambridge, MA: Harvard University Press.
BSCS. (2017). Guidelines for the Evaluation of Instructional Materials in Science. Colorado Springs, CO: Author.
Burgstahler, S. (2012). Making Science Labs Accessible to Students with Disabilities: Application of Universal Design to a Science Lab. (DO-IT)—Disabilities, Opportunities, Internetworking and Technology. University of Washington, College of Engineering. Available: https://www.washington.edu/doit/sites/default/files/atoms/files/Making-Science-Labs-Accessible-Students-Disabilities.pdf [October 2018].
Buxton, C.A., Allexsaht-Snider, M., Suriel, R., Kayumova, S., Choi, Y.-J., Bouton, B., and Baker, M. (2013). Using educative assessments to support science teaching for middle school English-language learners. Journal of Science Teacher Education, 24(2), 347–366.
Calabrese Barton, A., Koch, P.D., Contento, I.R., and Hagiwara, S. (2005). From global sustainability to inclusive education: Understanding urban children’s ideas about the food system. International Journal of Science Education, 27(10), 1163–1186.
Cassel, B., and Topi, H. (2015). Strengthening Data Science Education through Collaboration. Workshop report 7-27-2016. Arlington, VA. Available: http://computingportal.org/sites/default/files/Data%20Science%20Education%20Workshop%20Report%201.0_0.pdf [October 2018].
Center for Universal Design. (1997). Seven Principles of Universal Design for Learning. Raleigh, NC: North Carolina State University.
Chick, H., Pfannkuch, M., and Watson, J.M. (2005). Transnumerative thinking: Finding and telling stories within data. Curriculum Matters, 1, 87–109.
Ching, C.C., Schaefer, S., and Lee, V. E. (2014). Identities in motion, identities at rest: Engaging bodies and minds in fitness gaming research and design. In Learning Technologies and the Body: Integration and Implementation in Formal and Informal Learning Environments (pp. 201–219). New York: Routledge, Taylor & Francis Group.
Ching, C.C., Stewart, M.K., Hagood, D.E., and Rashedi, R.N. (2016). Representing and reconciling personal data and experience in a wearable technology gaming project. IEEE Transactions on Learning Technologies, 9(4), 342–353.
Crismond, D., and Adams, R. (2012). The informed design teaching and learning matrix. Journal of Engineering Education, 101(4), 738–797.
Crowley, K., Barron, B.J.S., Knutson, K., and Martin, C.K. (2015). Interest and the development of pathways to science. In K.A. Renninger, M. Nieswandt, and S. Hidi (Eds.), Interest in Mathematics and Science Learning and Related Activity (pp. 297–313). Washington, DC: American Educational Research Association.
Davis, E.A., and Krajcik, J.S. (2005). Designing educative curriculum materials to promote teacher learning. Educational Researcher, 34(3), 3–14.
Davis, E.A., and Varma, K. (2008). Supporting teachers in productive adaptation. In Y. Kali, M.C. Linn, and J.E. Roseman (Eds.), Designing Coherent Science Education (pp. 94–122). New York: Teachers College Press.
DeBarger, A.H., Penuel, W.R., Harris, C.J., and Schank, P. (2010). Teaching routines to enhance collaboration using classroom network technology. In F. Pozzi and D. Persico (Eds.), Techniques for Fostering Collaboration in Online Learning Communities: Theoretical and Practical Perspectives (pp. 224–244). Hershey, PA: Information Science Reference.
DeBarger, A.H., Penuel, W.R., Moorthy, S., Beauvineau, Y., Kennedy, C.A., and Boscardin, C.K. (2017). Investigating purposeful science curriculum adaptation as a strategy to improve teaching and learning. Science Education, 101(1), 66–98. doi: 10.1002/sce.21249.
diSessa, A.A., and Minstrell, J. (1998). Cultivating conceptual change with benchmark lessons. In J.G. Greeno and S.V. Goldman (Eds.), Thinking Practices in Mathematics and Science Learning (pp. 155–187). Mahwah, NJ: Lawrence Erlbaum Associates.
Duerstock, B. (2018). Inclusion of Students with Disabilities in the Lab. The American Physiological Society. Available: http://www.the-aps.org/forum-disabilities [October 2018].
Duncan, R., Krajcik, J., and Rivet, A. (2016). Disciplinary Core Ideas: Reshaping Teaching and Learning. Arlington, VA: National Science Teachers Association Press.
Duschl, R. (2008). Science education in 3-part harmony: Balancing conceptual, epistemic and social learning goals. Review of Research in Education, 32, 268–291.
Duschl, R., and Gitomer, D. (1997). Strategies and challenges to changing the focus of assessment and instruction in science classrooms. Educational Assessment, 4(1), 37–73.
Edelson, D.C. (2001). Learning-for-use: A framework for integrating content and process learning in the design of inquiry activities. Journal of Research in Science Teaching, 38(3), 355–385.
Edelson, D.C., and Reiser, B.J. (2006). Making authentic practices accessible to learners: Design challenges and strategies. In R.K. Sawyer (Ed.), The Cambridge Handbook of the Learning Sciences (pp. 335–354). Cambridge, UK: Cambridge University Press.
Greene, K. (2016). For sale: Your lesson plans. Educational Leadership, 74(2), 28–33.
Hammer, D., and Elby, A. (2003). Tapping epistemological resources for learning physics. Journal of the Learning Sciences, 12(1), 53–90.
Harris, C.J., Penuel, W.R., DeBarger, A., D’Angelo, C., and Gallagher, L.P. (2014). Curriculum Materials Make a Difference for Next Generation Science Learning: Results from Year 1 of a Randomized Control Trial. Menlo Park, CA: SRI International.
Harris, C.J., Penuel, W.R., D’Angelo, C.M., DeBarger, A.H., Gallagher, L.P., et al. (2015). Impact of project-based curriculum materials on student learning in science: Results of a randomized controlled trial. Journal of Research in Science Teaching, 52(10), 1362–1385. doi: 10.1002/tea.21263.
Hsi, S., Hardy, L., and Farmer, T. (2017). Science thinking for tomorrow today. @Concord, 21(2), 10–11.
Hudicourt-Barnes, J. (2003). The use of argumentation in Haitian Creole science classrooms. Harvard Educational Review, 73(1), 73–93.
Hug, B., and McNeill, K.L. (2008). Use of first-hand and second-hand data in science: Does data type influence classroom conversations? International Journal of Science Education, 30(13), 1725–1751. doi: 10.1080/09500690701506945.
Hunter, L.J., and Hall, C.M. (2018). A survey of K-12 teachers’ utilization of social networks as a professional resource. Education and Information Technologies, 1–26. doi: 10.1007/s10639-017-9627-9.
Jonassen, D. (1995). Computers as cognitive tools: Learning with technology, not from technology. Journal of Computing in Higher Education, 6, 40. doi: 10.1007/BF02941038.
Kamarainen, A.M., Metcalf, S., Grotzer, T., Browne, A., Mazzuca, D., Tutwiler, M.S., and Dede, C. (2013). EcoMOBILE: Integrating augmented reality and probeware with environmental education field trips. Computers & Education, 68(Supplement C), 545–556. doi: 10.1016/j.compedu.2013.02.018.
Kanter, D.E. (2010). Doing the project and learning the content: Designing project-based science curricula for meaningful understanding. Science Education, 94(3), 525–551.
Kerlin, S.C., McDonald, S.P., and Kelly, G.J. (2010). Complexity of secondary scientific data sources and students’ argumentative discourse. International Journal of Science Education, 32(9), 1207–1225.
Kesidou, S., and Roseman, J.E. (2002). How well do middle school science programs measure up? Findings from Project 2061’s curriculum review. Journal of Research in Science Teaching, 39(6), 522–549.
Klopfer, E., Yoon, S.A., and Perry, J. (2005). Using Palm Technology in Participatory Simulations of Complex Systems: A New Take on Ubiquitous and Accessible Mobile Computing. Available: https://repository.upenn.edu/gse_pubs/54 [October 2018].
Krajcik, J., and Mamlok-Naaman, R. (2006). Using driving questions to motivate and sustain student interest in learning science. In K. Tobin (Ed.), Teaching and Learning Science: A Handbook (pp. 317–327). Westport, CT: Praeger.
Krajcik, J.S., and Mun, K. (2014). Promises and challenges of using learning technologies to promote student learning of science. In N.G. Lederman and S.K. Abell (Eds.), Handbook of Research on Science Education, Volume II (pp. 337–360). New York: Routledge, Taylor & Francis Group.
Krajcik, J., Blumenfeld, P., Marx, R., and Soloway, E. (2000). Instructional, curricular, and technological supports for inquiry in science classrooms. In J. Minstrell and E. Van Zee (Eds.), Inquiring into Inquiry Learning and Teaching in Science (pp. 283–315). Washington, DC: American Association for the Advancement of Science Press.
Krajcik, J.S., Codere, S., Dahsah, C., Bayer, R., and Mun, K. (2014). Planning instruction to meet the intent of the Next Generation Science Standards. Journal of Science Teacher Education, 25(2), 157–175. doi: 10.1007/s10972-014-9383-2.
Lee, O., and Buxton, C.A. (2008). Science curriculum and student diversity: Culture, language, and socioeconomic status. Elementary School Journal, 109(2), 123–137.
Lee, O., Quinn, H., and Valdés, G. (2013). Science and language for English language learners in relation to Next Generation Science Standards and with implications for Common Core State Standards for English language arts and mathematics. Educational Researcher, 42(4), 223–233. doi: 10.3102/0013189X13480524.
Lee, V.R., and DuMont, M. (2010). An exploration into how physical activity data-recording devices could be used in computer-supported data investigations. International Journal of Computers for Mathematical Learning, 15(3), 167–189. doi: 10.1007/s10758-010-9172-8.
Lee, V.R., Drake, J., and Williamson, K. (2015). Let’s get physical: K-12 Students using wearable devices to obtain and learn about data from physical activities. TechTrends, 59(4), 46–53. doi: 10.1007/s11528-015-0870-x.
Lee, V.R., Drake, J.R., and Thayne, J.L. (2016). Appropriating quantified self technologies to improve elementary statistical teaching and learning. IEEE Transactions on Learning Technologies, 9(4), 354–365. doi: 10.1109/TLT.2016.2597142.
Linn, M., and Eylon, B. (2011). Science Learning and Instruction: Taking Advantage of Technology to Promote Knowledge Integration. New York: Routledge.
Linn, M.C., Layman, J., and Nachmias, R. (1987). Cognitive consequences of microcomputer-based laboratories: Graphing skills development. Journal of Contemporary Educational Psychology, 12, 244–253.
Lyons, L. (2014). Exhibiting data: Using body-as-interface designs to engage visitors with data visualizations. In V.R. Lee (Ed.), Learning Technologies and the Body: Integration and Implementation in Formal and Informal Learning Environments (pp. 185–200). New York: Routledge.
Martin, F., Kuhn, S., Scribner-MacLean, M., Corcoran, C., Dalphond, J., Fertitta, J., et al. (2010). iSENSE: A web environment and hardware platform for data sharing and citizen science. In 2010 AAAI Spring Symposium Series. Available: https://www.aaai.org/ocs/index.php/SSS/SSS10/paper/download/1099/1394 [October 2018].
Metcalf, S.J., and Tinker, R. (2004). Probeware and handhelds in elementary and middle school science. Journal of Science Education and Technology, 13(1), 43–49.
Minstrell, J. (1992). Facets of students’ knowledge and relevant instruction. In R. Duit, F. Goldberg, and H. Niedderer (Eds.), Research in Physics Learning: Theoretical Issues and Empirical Studies (pp. 110–128). Kiel, Germany: IPN.
Moulding, B., Bybee, R., and Paulson, N. (2015). A Vision and Plan for Science Teaching and Learning. Essential Teaching and Learning PD, LLC.
National Research Council. (2000). How People Learn: Brain, Mind, Experience, and School. Expanded Edition. Washington, DC: National Academy Press.
National Research Council. (2006). America’s Lab Report: Investigations in High School Science. Washington, DC: The National Academies Press.
National Research Council. (2007). Taking Science to School. Washington, DC: The National Academies Press.
National Research Council. (2012). A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas. Washington, DC: The National Academies Press.
Novak, A., and Krajcik, J.S. (2004). Using learning technologies to support inquiry in middle school science. In L. Flick and N. Lederman (Eds.), Scientific Inquiry and Nature of Science: Implications for Teaching, Learning, and Teacher Education (pp. 75–102). The Netherlands: Kluwer.
Osborne, J.F., and Quinn, H. (2017). The framework, the NGSS, and the practices of science. In C.V. Schwarz, C.M. Passmore, and B.J. Reiser (Eds.), Helping Students Make Sense of the World Through Next Generation Science and Engineering Practices (pp. 23–31). Arlington, VA: NSTA Press.
Penuel, W.R., and Shepard, L.A. (2016). Assessment and teaching. In D.H. Gitomer and C.A. Bell (Eds.), Handbook of Research on Teaching (pp. 787–851). Washington, DC: AERA.
Penuel, W.R., Van Horne, K., Jacobs, J., Sumner, T., Watkins, D., and Quigley, D. (2017). Developing NGSS-Aligned Curriculum that Connects to Students’ Interests and Experiences: Lessons Learned from a Co-design Partnership. Paper presented at the annual meeting of NARST, San Antonio, TX.
Powers, A.L. (2004). An evaluation of four place-based education programs. The Journal of Environmental Education, 35(4), 17–32.
Quintana, C., Reiser, B.J., Davis, E.A., Krajcik, J., Fretz, E., Duncan, R.G., Kyza, E., Edelson, D., and Soloway, E. (2004). A scaffolding design framework for software to support science inquiry. Journal of the Learning Sciences, 13(3), 337–386.
Reiser, B.J., Novak, M., and McGill, T.A.W. (2017). Coherence from the Students’ Perspective: Why the Vision of the Framework for K-12 Science Requires More than Simply “Combining” Three Dimensions of Science Learning. Paper commissioned for the Board on Science Education Workshop on Instructional Materials for the Next Generation Science Standards. Available: http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_180270.pdf [October 2018].
Remillard, J.T., and Heck, D. (2014). Conceptualizing the curriculum enactment process in mathematics education. ZDM: The International Journal on Mathematics Education, 46(5), 705–718.
Rivera-Pelayo, V., Zacharias, V., Müller, L., and Braun, S. (2012). Applying quantified self approaches to support reflective learning. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 111–114). New York: ACM. Available: http://dl.acm.org/citation.cfm?id=2330631 [October 2018].
Roschelle, J., Knudsen, J., and Hegedus, S.J. (2010). From new technological infrastructures to curricular activity systems: Advanced designs for teaching and learning. In M.J. Jacobson and P. Reimann (Eds.), Designs for Learning Environments of the Future: International Perspectives from the Learning Sciences (pp. 233–262). New York: Springer.
Rose, D.H., and Meyer, A. (2002). Teaching Every Student in the Digital Age: Universal Design for Learning. Alexandria, VA: ASCD.
Rosebery, A., Ogonowski, M., DiSchino, M., and Warren, B. (2010). “The coat traps all your body heat”: Heterogeneity as fundamental to learning. The Journal of the Learning Sciences, 19(3), 322–357.
Roseman, J.E., Stern, L., and Koppal, M. (2010). A method for analyzing the coherence of high school biology textbooks. Journal of Research in Science Teaching, 47(1), 47–70.
Ruiz-Primo, M.A., Shavelson, R.J., Hamilton, L.S., and Klein, S. (2002). On the evaluation of systemic science education reform: Searching for instructional sensitivity. Journal of Research in Science Teaching, 39(5), 369–393.
Salomon, G., Perkins, D.N., and Globerson, T. (1991). Partners in cognition: Extending human intelligence with intelligent technologies. Educational Researcher, 20(3), 2–9.
Sánchez Tapia, I., Krajcik, J., and Reiser, B.J. (2018). “We don’t know what is the real story anymore”: Curricular contextualization principles that support indigenous students in understanding natural selection. Journal of Research in Science Teaching. Available: https://onlinelibrary.wiley.com/doi/full/10.1002/tea.21422 [October 2018].
Schmidt, W.H., McKnight, C.C., and Raizen, S.A. (Eds.). (1997). A Splintered Vision: An Investigation of U.S. Science and Mathematics Education. Boston: Kluwer.
Schmidt, W.H., Wang, H.C., and McKnight, C.C. (2005). Curriculum coherence: An examination of U.S. mathematics and science content standards from an international perspective. Journal of Curriculum Studies, 37(5), 525–559.
Senechal, E. (2007). Environmental justice in Egleston Square. In D. Gruenewald and G. Smith (Eds.), Place-Based Education in an Era of Globalization: Local Diversity (pp. 85–111). Mahwah, NJ: Erlbaum.
Slotta, J.D., and Linn, M.C. (2009). WISE Science: Web-Based Inquiry in the Classroom. New York: Teachers College Press.
Sobel, D. (2005). Place-Based Education: Connecting Classrooms and Communities. (2nd ed.) Great Barrington, MA: Orion Society.
Stern, L., and Roseman, J.E. (2004). Can middle-school science textbooks help students learn important ideas? Findings from Project 2061’s curriculum evaluation study: Life science. Journal of Research in Science Teaching, 41(6), 538–568.
Struck, W., and Yerrick, R. (2010). The effect of data acquisition-probeware and digital video analysis on accurate graphical representation of kinetics in a high school physics class. Journal of Science Education and Technology, 19(2), 199–211.
Suriel, R.L., and Atwater, M.M. (2012). From the contribution to the action approach: White teachers’ experiences influencing the development of multicultural science curricula. Journal of Research in Science Teaching, 49(10), 1271–1295.
Tinker, R., and Krajcik, J. (2001). Portable Technologies: Science Learning in Context. Innovations in Science Education and Technology. Hingham, MA: Kluwer Academic.
Wallis, J.C., Milojevic, S., and Borgman, C.L. (2006). The special case of scientific data sharing with education. Proceedings of the American Society for Information Science and Technology, 43(1), 1–13. Available: http://onlinelibrary.wiley.com/doi/10.1002/meet.14504301169/full [October 2018].
Wilkerson, M.H., and Laina, V. (n.d.). Reasoning about Data, Context, and Chance through Storytelling with Repurposed Local Data.
Wilkerson, M.H., Lanouette, K.A., Shareff, R.L., Erickson, T., Bulalacao, N., Heller, J., and Reichsman, F. (2018). Data transformations: Restructuring data for inquiry in a simulation and data analysis environment. In Proceedings of ICLS 2018. London, UK: ISLS.
Windschitl, M., and Thompson, J. (2013). The modeling toolkit: Making student thinking visible with public representations. The Science Teacher, 80(6), 63–69.
Windschitl, M., Thompson, J., and Braaten, M. (2008). Beyond the scientific method: Model-based inquiry as a new paradigm of preference for school science investigations. Science Education, 92(5), 941–967. doi: 10.1002/sce.20259.
Windschitl, M., Thompson, J., and Braaten, M. (2018). Ambitious Science Teaching. Cambridge, MA: Harvard Education Press.
Windschitl, M., Thompson, J., Braaten, M., and Stroupe, D. (2012). Proposing a core set of instructional practices and tools for teachers of science. Science Education, 96(5), 878–903.
Wyss, V.L., Heulskamp, D., and Siebert, C.J. (2012). Increasing middle school student interest in STEM careers with videos of scientists. International Journal of Environmental and Science Education, 7(4), 501–552.
Xie, C., Schimpf, C., Chao, J., Nourian, S., and Massicotte, J. (2018). Learning and teaching engineering design through modeling and simulation on a CAD platform. Computer Applications in Engineering Education, 26(4), 824–840. doi: 10.1002/cae.21920.
Zucker, A.A., Tinker, R., Staudt, C., Mansfield, A., and Metcalf, S. (2008). Learning science in grades 3–8 using probeware and computers: Findings from the TEEMSS II project. Journal of Science Education and Technology, 17(1), 42–48.