Committee member Alfred Aho, a professor of computer science at Columbia University, commented on several topics: motivations for computational thinking in education, potential pitfalls in ineffectively teaching computational thinking, the need for investment in infrastructure and tools to facilitate learning of computational thinking, and the role of assessment.
Motivation for Computational Thinking
Echoing the sentiments of Matthew Stone, Aho described three common motivations for explicitly introducing computational thinking into education. First, he argued, computational thinking has an impact on virtually every area of human endeavor, as illustrated by the first workshop report’s discussion of computational thinking applications in fields as diverse as law, medicine, archeology, journalism, and biology.
Second, he noted dangers in computational thinking done badly. He recounted a story—“A number of years ago when I was doing some consulting for NASA, I came to Washington and noticed an article in the Washington Post that said global warming wasn’t as bad as scientists feared because the empirical measure of the rate of rise of Earth’s oceans wasn’t as bad as the computer models had predicted. It turned out to be a software error. So if we’re going into this world of modeling and simulation, I would like to put in a plea for good software engineering
practices and making sure not only that the data are correct but also that the programs are correct.” To underscore this point, Aho cited an article in Nature about bad software in computational science.1
Third, Aho suggested that computational thinking plays an important role in developing new and improved ways of creating, understanding, and manipulating representations—representations that can change, sometimes dramatically, the way in which people see problems.
Humanization of Computational Thinking
Aho observed that a number of workshop participants pointed to the humanizing effect of computational thinking. Recalling Idit Caperton’s thoughts that using information technology in an appropriate manner “engages people, engages their souls, their passion, and their productivity, and people care,” Aho described similar experiences in working with undergraduates. He found that using creative programming projects to hone and develop computational thinking skills motivated students to pursue further education in computer science. Aho described classes in which students work in small teams to create their own innovative programming language and then to build a compiler for it, and he reported that “often the students say the most important things that they learned from this course are not principles of programming languages or compiler design but the interactions that they had with the other students and the fun they had in doing the projects.”
Aho also suggested that this kind of response to the use of technology was an effective rebuttal to those who argue that computers and information technology are dehumanizing, as illustrated by Jaron Lanier’s arguments in You Are Not a Gadget.
Computational Thinking as a Moving Target
Aho acknowledged the community’s need for a common definition of computational thinking, the development of which is inherently difficult given the rapidly changing world to which computational thinking is often applied. Any static definition of computational thinking would likely be obsolete 10 or 20 years from now, he argued, and thus, “The real challenge for the entire community is to define computational thinking and also to keep it current.”
With that thought in mind, Aho stated that he was particularly taken with a point made during Deanna Kuhn’s presentation, that the computer science and education communities should use computational thinking not just to teach old things but also to teach new things, both new methods and new ideas, to solve new problems, because that is what the people being educated today will be doing in the future.
1 Zeeya Merali, 2010, “Computational Science: … Error: … Why Scientific Programming Does Not Compute,” Nature 467(7317):775-777. Available at http://www.nature.com/news/2010/101013/full/467775a.html.
Need to Apply Learning Science to the Problem of Teaching/Learning Computational Thinking
Echoing Jeannette Wing’s original charge for the workshop series, Aho said he also believed that educational theory and developmental psychology would help to inform the teaching of computational thinking regarding what particular content to teach and when to teach it. For example, developmental psychology could help identify the specific concepts of computational thinking that would be most appropriate for young children. More generally, he argued that for computational thinking to be taught effectively, any curriculum for computational thinking should be phased according to a developmental sequence characteristic of the students engaged with that curriculum.
Finally, he also suggested that developmental psychology might have value in contributing to different pedagogical models for learners with different cognitive styles and in shaping the infrastructure and tools needed to teach computational thinking.
Infrastructure for Computational Thinking
Addressing the issue of the infrastructure needed to support a serious educational effort to promote computational thinking broadly, Aho noted that such an infrastructure did not consist only of hardware but also necessarily included continuing funding streams, instruments for gathering the data needed to analyze outcomes, and an ongoing data collection effort. He added that the infrastructure would also require ongoing maintenance of existing tools and the development of new tools to support computational thinking.
A key element of infrastructure, Aho argued, is the ability to integrate applications. Aho warned that “unless these issues get resolved, we are going to find ourselves in a world of the future which may resemble the software world that we’re currently in, which is largely a Tower of Babel, [with] lots of incompatible infrastructures and a lot of expense.” This comment prompted Stephen Uzzo to argue that interoperability, access, usability, and portability of data are problems that can be explored through collaboration.
How Do You Know What Students Are Learning?
Aho reflected concerns shared by a number of workshop participants that determining what students are learning in computational thinking activities may be difficult. He noted that assessing how a student has internalized the abstractions of computational thinking may be challenging, and even assessing programming skills can be difficult. For example, he indicated that although program correctness is an essential goal of good programming, a student who writes a correct program (i.e., one that exhibits the appropriate behavior) nevertheless may not have made the conceptual connections that one might expect from someone who has written a correct program. He illustrated this point in commenting on Walter Allan’s presentation, in which he observed that “[in thinking about] the kind of thought process that a student is following to get the bunny to eat these carrots, I am not sure what the student is actually learning about some of these much deeper issues that a serious programmer would have to face.”
Committee member Uri Wilensky, a Northwestern University professor and director of the Center for Connected Learning and Computer-Based Modeling, shared his observations on a number of key issues discussed at the workshop, including the motivation and value in teaching computational thinking, the challenges arising from the continuing non-convergence on one definition of computational thinking, and identification of the best environment and tools for conveying computational thinking to different audiences.
Wilensky noted that in recent years, many branches of science and engineering have changed in ways that require researchers to be facile with computational thinking. Disciplines such as biology, physics, and mathematics now use computational methods to analyze problems and model phenomena.2 Computational thinking in many ways offers a new way to interact with and learn about the world and scientific phenomena. According to Wilensky, in order to engage effectively with and contribute to modern science and engineering, future scientists and engineers must be able to do computational thinking. Thus, key computational thinking concepts should be introduced and mastered early in students’ academic careers.
2 National Research Council, 2010, Report of a Workshop on the Scope and Nature of Computational Thinking, Washington, D.C.: The National Academies Press. Available at http://www.nap.edu/catalog.php?record_id=12840. Last accessed February 7, 2011.
A second reason for encouraging computational thinking is the power that it affords for greater automation of tedious tasks and the ability to manage more complexity in all types of learning and discovery. Mechanical automation allows one to delegate certain tedious tasks and simple problem solving in favor of more complex tasks and problem solving. Indeed, as the problems at hand become more complex from a process and computational perspective, computational tools and abstractions are increasingly needed to analyze and understand them.
A third reason is that computational thinking supports the capacity for complex design and simulation. It enables one to naturally create designs within a specific context that do not require access to different kinds of materials because the materials are represented computationally in the form of data and bits. Wilensky also noted that such simulations could be used to inform public debate and discourse about issues of public policy—simulations could be used as modeling tools to explore alternative scenarios for situations in which the interactions and feedback loops among the relevant elements (e.g., resources) are tightly coupled.
Fourth, computational thinking (and computational tools) can enhance self-expression and collaboration, supporting the use of many different forms of expression and the easy sharing of those expressions. The potential for expression and collaboration can be very motivating to many individuals, especially children. Wilensky suggested that the use of computation in art, music, and other kinds of expressive media is under-explored in much of the available research.
As a fifth reason to motivate computational thinking, Wilensky recalled Caperton’s argument that educators do not always have to start with kids but rather can focus on those in positions of leadership. That is, Wilensky paraphrased, “If we are thinking about the citizenship value of computational thinking, then it is shortsighted to not pay attention to the people who are actually empowered to make a difference and to try to change the discourse among that group so that they are computationally literate enough to be able to understand this complex world they are being asked to lead.”
Last, Wilensky argued that computational thinking, much like the use of Arabic numerals, democratizes access to knowledge. He noted that the significance of Arabic numerals was not that they were essential to multiplication and division (indeed, there were algorithms for multiplying and dividing Roman numerals), but rather that, because they were so much less cumbersome, Arabic numerals enabled many more people to perform multiplications and divisions. Wilensky then said, “The claim I was making is that we can now use computational representation,” which similarly affords greater access to knowledge and new knowledge development.
As an example, Wilensky pointed to work he has done with seventh-grade students to use computational thinking and computer modeling to study segregation patterns in Chicago.
They started by using some of the NetLogo variation models that were based on the work of the Harvard economist Thomas Schelling, who actually won a Nobel Prize for that … last year. Schelling, who was a very learned and skilled economist, took many months to build these segregation models by using lots of checkered boards and moving coins around and flipping them back and forth according to determinant rules. He had the basic thinking that was needed to do those models. What he didn’t have was tools that could actually do it quickly enough so that he could consider all kinds of alternative scenarios. Now these seventh graders were doing that and they were asking all kinds of questions that pushed well beyond Schelling, like what would happen if there were some Asians that desired only integrated neighborhoods or what would happen if you had many more sets of groups that had different criteria. All those things could be easily explored within the foundational framework—[but] really [were] pretty much impossible without computational thinking and related tools.
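The dynamic Wilensky describes can be sketched compactly. The following is a minimal, illustrative agent-based version in Python, not the NetLogo models the students used; the grid size, similarity threshold, and other parameters are hypothetical choices for demonstration:

```python
import random

def schelling(size=20, threshold=0.3, empty_frac=0.1, steps=50, seed=1):
    """Minimal Schelling segregation model: two agent types on a torus grid.

    An agent is unhappy if fewer than `threshold` of its occupied neighbors
    share its type; each step, every unhappy agent moves to a random empty
    cell. Returns the mean fraction of like-typed neighbors at the end.
    """
    rng = random.Random(seed)
    cells = [(r, c) for r in range(size) for c in range(size)]
    grid = {}
    for cell in cells:
        u = rng.random()
        # None marks an empty cell; otherwise assign type 0 or 1 evenly.
        grid[cell] = None if u < empty_frac else (0 if u < (1 + empty_frac) / 2 else 1)

    def neighbors(r, c):
        return [grid[(r + dr) % size, (c + dc) % size]
                for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)]

    def similarity(r, c):
        occupied = [n for n in neighbors(r, c) if n is not None]
        if not occupied:
            return 1.0
        return sum(n == grid[(r, c)] for n in occupied) / len(occupied)

    for _ in range(steps):
        unhappy = [cell for cell in cells
                   if grid[cell] is not None and similarity(*cell) < threshold]
        empties = [cell for cell in cells if grid[cell] is None]
        for cell in unhappy:
            if not empties:
                break
            dest = empties.pop(rng.randrange(len(empties)))
            grid[dest], grid[cell] = grid[cell], None
            empties.append(cell)

    occupied = [cell for cell in cells if grid[cell] is not None]
    return sum(similarity(*cell) for cell in occupied) / len(occupied)
```

Even with the mild preference sketched here (each agent tolerates up to 70 percent unlike neighbors), repeated moves typically drive the mean neighbor similarity well above that of the initial random mixture, which is Schelling’s central result. The questions the seventh graders asked correspond to small edits: adding a third agent type, or giving one group a different threshold.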
Epistemological Diversity Regarding Computational Thinking
Wilensky said that although he saw a lot of common ground on certain aspects of computational thinking among educators and researchers, there were a number of significant areas where workshop participants saw things differently. (This diversity of perspective was also reflected in the first workshop report.) Specifically, he thought that the different ways of understanding computational thinking discussed in the second workshop fell into several categories: ways of seeing and knowing, ways of doing or capacities, a method of inquiry, and ways of collaborating.
He noted that some panelists talked about computational thinking as ways of seeing and knowing. For example, he pointed to Robert Tinker’s presentation in which Tinker talked about breaking up the world into different simple processes or pieces as a way of seeing the world not just as objects but rather as various informational pieces that can be attached to objects and processes and manipulated. Wilensky said that in this view, computational thinking as ways of seeing and knowing really represents a different way of understanding the world.
Ways of doing and capacities was another conception of computational thinking present in many of the different presentations. According to Wilensky, this view emphasizes the importance of building, designing, and going through various “constructionist” kinds of activities, as
Seymour Papert would call them. In this sense, ways of doing includes issues of modeling and thinking using computation as a way of representing the world and being able to experiment and explore alternative scenarios within the simulated world.
Computation as a method of inquiry was interesting to Wilensky because the computer is protean enough to afford users the ability to explore and manipulate all kinds of processes within a small space. Illustrating this view, Wilensky cited John Jungck’s presentation in which Jungck talked about the very small range in the evolutionary possibility scale that is actually represented in real creatures. Through computational thinking researchers can create simulated worlds in which one can explore evolutionary trajectories that never actually came into being, creatures that could have evolved but did not.
Last, Wilensky noted that a number of speakers described computational thinking as though it were a way of collaborating. These presenters focused on the ways in which collaboration can be extended as a result of computation and computational tools: new ways of connecting and forming groups are no longer necessarily limited by geography. Wilensky held that “instead of a spatial model of collaboration, we have this kind of network model of collaboration where there are many different opportunities for synching up, and that capacity is becoming more and more important in our society, and computation is another way to facilitate that.”
A Diversity of Venues for Computational Thinking
Represented at the workshop were a number of different perspectives regarding the most effective environment and tools for teaching computational thinking. Wilensky distilled the points of view as those favoring formal curricular learning versus extracurricular learning and those favoring lab-based learning versus in-the-field learning.
The case for making computational thinking a part of the formal school curriculum was made by several speakers. Wilensky pointed to arguments made by Tinker that the right place for computational thinking is in schools, specifically within the science curriculum, because science already uses computational thinking and computers in major ways. With computational thinking, educators can facilitate all kinds of modeling activities in science that represent ways of actually doing real-world science rather than merely learning about science. Wilensky argued that social science research may also be a fertile ground for computational thinking, saying that “social science is another very fertile area to integrate computational thinking because tools now enable us to be able to mine large data sets or to create models that were not possible.” The constructs of new representational infrastructure and meta-representational capacities in computational thinking offer possibilities for substantial advances in social science.
Others argued that computational thinking should be its own subject within the formal curriculum. Wilensky pointed to Caperton’s presentation, which demonstrated that educators can actually design a curriculum around computational thinking. Tinker and others did note a concern that already packed school curricula may generate some pushback against teaching computational thinking as a separate subject. To this point, Wilensky responded that this may be more a strategic discussion than a pedagogical one. Work presented by Paulo Blikstein and others shows that there is room even in current science curricula to introduce computational thinking concepts in a way that fits and also mutually supports the learning of other complex concepts. Still others argue that computational thinking fits best in an extracurricular context. Wilensky argued that each option should be explored.
Lab-based approaches were discussed, as were in-the-field approaches. Wilensky argued that this theme is an important one because it reflects the fact that the public in general and educators in particular “tend to think of computing as these kinds of things that are built into our computers and we tend to do them inside. But there were at least some hints of capacities beyond that.” Wilensky pointed to presenters Tinker, Jungck, and Uzzo, whose presentations discussed the use of various probes and sensors in the field to collect data for computational learners to analyze and manipulate. Wilensky stated that these options illustrate that “we are not necessarily limited by this [indoor] model of what computation is. Instead we can think of ubiquitous computation and all the different kinds of ways in which we can do things.” Thus in-the-field approaches to computational thinking education must be explored, just as lab-based approaches must be developed.
Speaking for himself, Wilensky argued that computational thinking is important enough that it should not have to be squeezed in on the margins or sneaked in on the side. He acknowledged the pragmatic benefits of such an approach but noted that it is perhaps inconsistent with a serious view of computational thinking as a major new mode of thinking that can be powerful for everybody, not just for an elite few.
Wilensky also believed that it is sometimes a red herring to assert that there is no room in the standard curriculum to accommodate a serious examination of computational thinking. Indeed, he argued, sometimes important ideas in computational thinking can be introduced incrementally along with standard content in a way that makes the standard content easier to learn (and vice versa).
Different Tools for Computational Thinking
Wilensky indicated that another theme emphasized in the discussion was what kinds of tools are being used to enable computational thinking. The workshop revealed a range of related approaches in use, and Wilensky noted that the main distinguishing factor among them was whether the tools were developed with a particular target audience in mind. The question, then, is whether educators should use specially designed learning tools, even with children, or whether professional tools should be used. For example, Wilensky proposed that tools such as Scratch are designed more with an audience of children in mind: a target audience especially attuned to learning and motivation through things such as games. That is, games can be a major motivational tool for an audience of children learning computational thinking.
Other tools may have been designed specifically to target a professional community. Caperton argued for use of professional-level tools such as Flash, one of the most widely used animation programs in the world, in computational thinking activities because this use of authentic tools can be a kind of motivation for students to continue learning. Wilensky pointed to modeling tools such as NetLogo, AgentSheets, and many others that particularly help in science. He reiterated that presenter John Jungck demonstrated the “extent [to which] biology has changed dramatically as a result of computation and all the different kinds of tools that are now in the regular toolbox of biologists that just were not there several decades ago.” He went on to say that “these tools have changed what the discipline is and made the science of biology much less a natural descriptive one and much more one that involves modeling and analysis with very large sets of data, for example,” suggesting that students interested in pursuing careers in biology may be motivated to learn computational thinking as a result of having access to authentic professional biology research tools.
Create and Modify as Complementary Approaches
Wilensky noted the difference between students writing programs starting from a blank screen versus students modifying existing programs, but argued that both approaches have value in conveying concepts of computational thinking. However, he did caution that the canonical “use-modify-create” sequence is not the only viable approach to teaching the skills of computational thinking. In his words,
It could be in the very first class that kids might create something, like it might be in a biology class where we might say, “Start with some kind
of creature and give some rule of birthing and dying and see what happens to the system.” That’s a very small creation, but nonetheless it’s a creation, and there will be a diversity of different possible choices that people will come up with, and a comparison of those can lead to lots of insights. So we can think of “creating” in small bites as well, and sometimes creation is a lot easier than modifying as a different kind of entry point, and all of the outcomes are ones that we want.
An Affective Dimension
Wilensky also noted an affective dimension to some of the presentations. Specifically, many of the participants in the activities that were reported in the workshop had done sophisticated programming work in developing genuinely useful applications but nevertheless did not believe that they were, in fact, programming. Wilensky saw this disconnect between their capabilities and their self-reported assessments as a problem worth addressing, and pointed to the importance of boosting the students’ confidence that in fact they can master complex topics.
He further drew an analogy to the teaching of reading—“I am struck by how much effort we as a school system put into reading. It is a really difficult process, yet we think it is so valuable that we invest enormous amounts of resources in it in the schools. I want to think of us as being bold enough to try to make the claim that computational thinking and computational literacies are becoming important enough that we ought to be investing major resources into it.”
Committee member Yasmin Kafai is a professor at the University of Pennsylvania Graduate School of Education. Her research focuses on the design and study of tools for learning programming.
Her comments at the workshop focused on ways to articulate and teach computational thinking more effectively. They included a discipline-oriented approach for identifying key facets of computational thinking, a developmental progression approach for teaching, a real-world problem-solving approach for identifying concepts and teaching, and a cycle approach (use-modify-create) for teaching and assessing learning.
A discipline-oriented approach, Kafai said, means starting from individual disciplines to identify important and useful aspects of computational thinking. This approach may allow the community to articulate more clearly what computational thinking is and what it is not. Kafai noted, “It is within the disciplines that aspects such as programming, visualization, data management, and manipulation can actually help us illuminate and understand processes.”
According to Kafai, the computer science and education communities have not developed what presenter Jill Denner termed a “developmentally appropriate definition of computational thinking.” Kafai acknowledged that
[w]e all have examples of kids of many ages and adults who are being very courageous and interested in doing computational thinking, but we also know from prior experience in mathematics and science education that we really do need a more profound understanding of what kids’ engagement with computational thinking at different ages is, and then how we can kind of build pedagogies, examples, on it. I think we are far away from that point. These presentations here gave us some ideas about where to start looking and where the examples are.
Kafai found the approach presented by Danny Edelson and Robert Panoff helpful. They focused on how computational thinking can help ask interesting questions and solve real-world problems, rather than simply develop algorithms. They used computational thinking to help students answer questions such as, What’s real? Where are the issues? Where are the anomalies and what do they mean? Kafai argued that these questions point to a “kind of social aspect of computational thinking which we don’t talk enough about but which would be really important in the social relevance of bringing computational thinking into the disciplines and judging what the value is.”
Kafai is a fan of the cycle approach to learning and teaching computational thinking; she argued that the workshop’s presentations seemed to come together in favor of this approach as well. “I think we have some convergence here on a kind of cycle approach,” Kafai said, “and I know other presenters before us also alluded to this kind of use-modify-create as a kind of pedagogy to introduce students into approaches to computational thinking.”
Kafai added, “I don’t think it’s so bad that the kids get some pieces of code to start with, rather than … a blank screen and … [the expectation that they] develop all the programming on their own, especially if they don’t have any prior competencies in it.” She argued that learning to use the code and manipulate it is a good way to try out strategies before designing one’s own programs. In addition, the cycle approach works across the disciplines and can be used to facilitate computational learning based in data analysis, visualization, and game design approaches to teaching computational thinking. Kafai felt that the next step was to articulate some extensions and caveats to the cycle approach in order to build better assessment tools.
Marcia Linn’s comments focused on the value and trajectory of computational thinking and on the challenges associated with incorporating computational thinking into the curriculum. Linn, of the University of California, Berkeley, believes that computational thinking is important for everyone for many reasons, including:
• Making personally relevant decisions as a citizen in a scientific and technologically advancing society,
• Succeeding in a growing number of disciplines and jobs,
• Increasing interest in the information technology professions, and
• Enhancing U.S. economic competitiveness in the international sphere.
Linn echoed a point made by presenter Taylor Martin—that computational thinking empowers learners. When learners successfully combine disciplinary knowledge and computational methods, they develop their identity as STEM learners. These opportunities for empowerment and expression can affect the way students think about themselves and their potential for continuing in STEM fields.
Linn argued that computational thinking is a powerful concept that by its very nature involves multiple disciplines. She recommended characterizing the trajectory of computational thinking from elementary to college courses. In her view, computational thinking has a role in nearly every discipline and at every level of learning.
Linn acknowledged that the community has not settled on one consensus concept of computational thinking. She felt that recent work—including the two National Research Council workshops on the topic of computational thinking—has resulted in a growing set of compelling examples and some emerging criteria.
The examples characterize the types of reasoning and disciplinary problems that could illustrate computational thinking at every grade level and for a wide range of courses. Several such examples follow:
• Human genome sequencing. To understand human genome sequencing, learners need to combine computational ideas with disciplinary knowledge about genetics. The computational ideas that students need to integrate include those of repeated applied algorithms; precisely formulated, unambiguous procedures; search, pattern matching, and iterative refinement; and randomization as an asset in repeated fragmentation. The disciplinary knowledge includes the notion of DNA as a long string of base pairs.
• Modeling of economic or sociological systems. To understand modeling of economic or sociological systems, students need to combine computational ideas with disciplinary knowledge of economics and sociology. As an example in economics, consider the idea of aggregating multiple independently specified rule-based agents and sensitivity to initial conditions. As an example in sociology, consider knowledge of community as a collection of independent decision makers.
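The computational ideas named in the genome-sequencing example (repeatedly applied algorithms, pattern matching, iterative refinement, and randomized fragmentation) can be made concrete with a toy greedy shotgun-assembly sketch. This is a deliberately simplified classroom-style illustration, not a real sequencing pipeline; all function names and parameters are invented for demonstration:

```python
import random

def fragment(genome, n_reads=60, read_len=12, seed=7):
    """Randomly sample overlapping reads (randomization as an asset)."""
    rng = random.Random(seed)
    starts = [rng.randrange(len(genome) - read_len + 1) for _ in range(n_reads)]
    return [genome[s:s + read_len] for s in starts]

def overlap(a, b):
    """Length of the longest suffix of a that is a prefix of b (pattern matching)."""
    for k in range(min(len(a), len(b)), 0, -1):
        if a[-k:] == b[:k]:
            return k
    return 0

def assemble(reads, min_overlap=3):
    """Iterative refinement: repeatedly merge the pair with the largest overlap."""
    reads = list(dict.fromkeys(reads))  # drop duplicate reads
    while len(reads) > 1:
        k, a, b = max(((overlap(a, b), a, b)
                       for a in reads for b in reads if a != b),
                      key=lambda t: t[0])
        if k < min_overlap:
            break  # no confident merge remains
        reads.remove(a)
        reads.remove(b)
        reads.append(a + b[k:])  # join the two fragments on their overlap
    return max(reads, key=len)   # longest assembled contig
```

With a source string whose characters are all distinct, overlaps are unambiguous and the greedy merge reconstructs a contiguous stretch of the original; with realistic repeat-laden genomes this naive strategy misassembles, which is precisely why the precise, unambiguous procedures mentioned in the bullet matter.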
Linn argued that the criteria for identifying computational thinking are emerging. They include combining computational ideas with disciplinary knowledge. Successful applications of computational thinking involve a process of design. Learners who do computational thinking engage in a sustained process of investigation that results in a novel solution to a problem.
Linn thinks that the value of computational thinking lies in its ubiquity, contemporary role in scientific research, and potential to motivate learners. As the character and criteria for computational thinking are refined, it will grow in importance in the curriculum.
Linn argued that incorporating computational thinking into the curriculum, especially for precollege learners, faces many challenges. She pointed out that it is still not clear whether computational thinking should ultimately be incorporated into education as a general subject, a discipline-specific topic, or a multidisciplinary topic. She noted the conundrum that the goal of becoming literate in computational thinking may not be achieved by taking a course on computational thinking but rather by studying topics in various disciplines that require computational thinking. Indeed, it may be necessary to study computational thinking in several disciplines to fully understand its scope and nature. Only by exploring computational thinking in multiple disciplines can learners appreciate its common features and the challenges of using computational thinking in a new discipline.
Computational thinking is emerging in new specialties that integrate disciplinary knowledge and computational algorithms. Linn argued that computational thinking would be most effective when integrated into specific disciplines rather than as a stand-alone course. Linn remarked, “It seems more efficient to take a disciplinary course and create activities that use computational ideas to advance understanding, but the case could be made for other solutions.”
She noted the challenges associated with incorporating computational thinking into the already-packed school curriculum. Computational thinking activities require access to technology, development of new curriculum materials that align with standards, teacher professional
development, and building of a community of users who can try out and refine the activities.
Understanding where and how computational thinking fits into current courses will require a concerted effort. Linn remarked, “We could call for emphasizing computational thinking everywhere and end up finding that it is nowhere because no one felt responsible for it. In addition, even if we did incorporate computational thinking into every course we might fail to build competence because the experiences were not cumulative. We need to think about ways to build coherent understanding of computational thinking as students encounter it across disciplines.” Linn saw the overarching goal of the workshop as being to catalyze thought about the steps needed to make computational thinking central to all of education.
Linn also commented on the nature of the curricular materials available for teaching computational thinking. Although materials exist, they are not widely available to educators and may be optimized for home use. The available materials generally result from small-scale grassroots efforts or are centered on technology environments. For example, many students are using computational learning tools outside the classroom (such as Scratch or Alice) but do not see any connection between these tools and what they may be doing in school. Teachers could use these tools to enable students to combine disciplinary knowledge and computational thinking.
Many of the available materials are designed to encourage students to explore complex problems fairly autonomously. They need trial and refinement to meet the needs of a broader audience. It is not clear how to make these materials useful and available throughout the educational system.
Making computational thinking a central part of the curriculum in all the relevant courses, Linn argued, requires making the case that it is essential to each discipline. To convince the K-12 community that computational thinking is central requires proactive work with teachers, school administrators, and policy makers. Linn recalled Christine Cunningham’s comments about the importance of beginning with teachers and administrators to persuade them of the value of incorporating computational thinking into the curriculum. She also related her own experience that getting school administrators on board “has made an enormous difference in creating a willingness to sustain the use of technology-enhanced materials and even to obtain resources dedicated to using those materials consistently.”
Linn drew an analogy to the challenges that exist with respect to adding projects to the STEM subjects. In her view, K-12 students should do at least one 2-week project every year. Such a project would be a natural venue for using computational thinking. She argued that advocating
for allocating time in the curriculum to projects—time that would be used to support the teaching of computational thinking and other STEM subjects—would be effort well spent. Teachers could select projects and appropriate technologies. They could use any technology including paper and pencil but would have an opportunity to use powerful computational tools.
Linn noted that the project-based format is particularly well suited for computational thinking because it allows for the kind of sustained reasoning and iterative refinement that occurs when a student is doing a complex task. By contrast, most K-12 curricula do not require students in STEM courses to engage in sustained activities.
She also argued that this type of effort will require the formation of a community of teachers who support each other and mentor newcomers. A one-time summer workshop will not be sufficient. Computational thinking education cannot succeed in the long term without several teachers at every school doing the same thing, “because if you don’t have a community, you don’t have anything that can sustain this kind of exciting, innovative work.”
Regarding options for starting this integration, there was quite a bit of discussion at the workshop as to whether the best initial approach to incorporating computational thinking would be to start with informal extracurricular activities or with the typical school curriculum. Linn suggested targeting both, since each will reach a different learning audience. “We are always going to be reaching kind of a different population starting in after-school and summer programs than if we start in school… . We know that these after-school and summer programs reach a wide range and often a very deserving group of students, but there are many students that just never have that opportunity.” Ultimately, she argued, the goal should be to target a large audience to maximize the positive empowerment factors of computational thinking.
In summary, Linn saw great potential for computational thinking as a new focus for the curriculum. She was excited about the synergy between course projects and computational thinking. She saw computational thinking as adding motivation for course projects while enticing students into STEM disciplines and preparing them for contemporary careers.
Committee member Larry Snyder is a professor of computer science and engineering at the University of Washington. Snyder has researched the topic of fluency with information technology for the past decade.
Throughout the workshop discussion, Snyder expressed particular interest in several key topics: comparisons of programming pedagogies involving program modification versus novel program creation, opportunities for teacher education and development, and tools and options for integrating computational thinking into the school curriculum. While Snyder felt that assessment of learning is an important topic as well, he argued that in the absence of a firm definition of computational thinking, “assessing how well we’re doing at it is probably a little bit premature.” He felt it is clear that there is much to do in this area and that continued research is important.
Snyder was interested in the rate of transfer of concepts in computational thinking among students who participate in projects that encourage them to read and modify an existing program versus projects in which students create programs from scratch. To illustrate this issue, Snyder cited points made by Jill Denner and Paulo Blikstein. One result of projects in which students modify programs before creating them can be that a student “stalls” at the “modify” stage and never advances to the “create” stage.
Snyder agreed that the use-modify-create model is an excellent way to formulate the computational learning challenge, but he also felt that it is important to understand and assess a student’s overall progress rather than focus on the stage at which the student gets stuck. Denner agreed, replying that much more research is needed before she can be sure, but she believes that a student’s overall progress may be attributable to an aspect of “intrepid exploration, a willingness and a confidence to confront the complexity and not back away from it when they’re confronted with something that’s difficult.” Denner added that nearly every student has some ideas that could be executed using computational thinking, “but students come in with different levels of comfort with engaging and with going through the process of trying something, failing, trying something, and failing.”
Snyder recalled that Blikstein had a different perspective. Blikstein argued that it may be dangerous to assume that models that seem to go from simple to complex—such as modifying a program first, and then creating a new one from scratch—offer pedagogical benefits. Snyder acknowledged this point, noting that young learners are capable of creating programs, even before they can read programs from other people; in fact, students likely prefer to create their own original programs, which in turn may motivate them to learn more computational thinking skills and concepts.
Snyder made several points highlighting some of the different approaches available for teacher education and development related to computational thinking. Presenter Michelle Williams mentioned several options she and her colleagues have explored, including summer and
multiweek programs, in-service training, and so on. Snyder agreed with Williams that teacher development is critical in getting teachers up to speed and prepared to teach computational thinking. Snyder was particularly struck by the concept of teachers learning through working directly on computational thinking-related projects and activities, much like their prospective students would, rather than through rote lecture.
As far as facilitating integration of computational thinking into the school curriculum, Snyder reiterated some of the practices put forth in Jeri Erickson and Walter Allan’s presentation. Practices such as teaching teams in which several instructors in related subjects collaborate to instruct a group of students, introducing technology for the project a year in advance, and making the software available on every laptop throughout the school system are a few examples Snyder believes could really increase dissemination of computational thinking. Snyder also appreciated the use of both the technology-based computer game and the physical grid-mapped tarp to make the computational programming concepts as well as the ecology concepts behind the bunny foraging project more concrete.
Janet Kolodner is a professor of computing at Georgia Institute of Technology. Her research focuses on the cognitive sciences and learning sciences, and the roles of computing technologies in promoting and mediating learning.
Overall, Kolodner found this second workshop to be much more grounded than the first in learning research and in-the-classroom feedback. Rather than talking in the abstract about what computational thinking might be, discussion focused on real examples of the use and promotion of computational thinking, as well as cases where computational thinking may not have been furthered. Kolodner found that some of the discussions delved very deeply into many practical issues associated with developing computational thinking curricula. In particular, she found presentations by Robert Tinker, Mitch Resnick, and Robert Panoff especially impressive in this respect.
She pointed out that the first two panels had helped add maturity and depth to her understanding of what computational thinking is, and that some of the later discussions had helped her further refine and develop aspects of this conception of computational thinking. This was particularly impressive given that Kolodner, through her work with the committee and similar computational thinking activities, already had a sophisticated understanding of computational thinking and its attendant issues. Particularly valuable to her were contributions regarding what educators
have done to help kids learn to become computational thinkers, and the ways educators might integrate those things into the curriculum.
Kolodner’s comments focused on several themes:
• The need for a formal definition of computational thinking,
• Two dueling definitions of computational thinking and their relationship to each other,
• Pedagogy and learning progressions explored in the workshop,
• Pedagogy and its role in assessment,
• Targeting specific goals for assessment,
• Distinguishing learning assessment from project evaluation, and
• Setting standards and baselines for assessment.
Definitions of Computational Thinking
Kolodner argued that the computational thinking community needs to be able to identify exactly what is meant by computational thinking to decide what learners should learn and to assess and evaluate what learners know, what they can do, and their attitudes and capabilities with respect to computational thinking. The community must be specific about the definition of computational thinking. The multiple definitions collected at the first workshop are a good start to the discussion but not enough on which to base assessment tools. She noted, “Interestingly, I am left with two not-quite-consistent views of what computational thinking is and what everyone should be capable of. Furthermore, I think this tension is something that warrants further discussion.”
Kolodner noted that the first view of computational thinking was described in the presentation by Tinker and later elaborated on by Danny Edelson. Tinker defined computational thinking as fundamentally about breaking problems into smaller and smaller problems that are solvable by rather simplistic computational devices. Edelson, in his discussion, talked about the fact that those who are AI (artificial intelligence) experts and are knowledgeable in computational modeling of cognition have learned to describe mental processes well enough so that they could be run on a computer. Further, this community of experts has a disposition toward describing various processes in that amount of detail in anticipation of using computers to assist in determining solutions. Edelson claimed that computational thinkers aim toward solutions that are constrained by the machine and aim toward breaking problems into parts or chunks that make sense computationally. Making sense computationally means that one can specify the sequencing or control within each chunk, and the chunks (or many of them) each have a particular function. Kolodner stated, “Some of us who come from computer science, and especially
cognitive-type AI, are really good at breaking down problems to a set of functional components—the pieces each play useful roles as part of a process, and they can be fit together in a variety of ways to create other processes that perform bigger functions. It’s where we feel comfortable, and we can never understand why someone else would break a problem down any other way (but people do—how odd).” Kolodner found this characterization effective and useful in describing what computational thinkers do.
Kolodner pointed out that this definition of computational thinking is far more constrained than simply thinking about computational thinking as problem solving. Rather, this definition regards computational thinking as “a certain kind of problem solving that computer scientists are pretty good at,” in particular, thinking in terms of processes to be carried out, imagining the functional pieces of those processes, and identifying which of those pieces have been used in solving previous problems and which might be used in solving later ones.
Notice that this approach is not synonymous with programming. In fact, Kolodner pointed to the work of Richard Lipton of the Georgia Institute of Technology, in which he and several colleagues figured out a treatment for the AIDS virus in patients by mapping out the biological processes within a person’s body, the substances those processes use and create, the conditions under which they work that way, and how the processes are sequenced, and then identifying ways in which the sequence of processes might be changed or disrupted. In this way, he used a computational approach to address the problem, but without programming.
This view of computational thinking is consistent with systems thinking and with model-based reasoning, both of which play a huge role both in scientific reasoning and in engaging in computational sciences. Indeed, both Tinker and Panoff proposed integrating model building, simulation, and model-based reasoning into math and science classes as a way to engage kids in computational thinking as they are getting to greater understanding and raising and solving problems in mathematical and scientific domains.
Kolodner added that she believes that computational thinking is a set of skills that transfers across disciplinary domains. She compared computational thinking to the processes involved in inquiry, noting that just as inquiry is not one specific skill but rather a collection of relevant skills specialized for different disciplines, so too is computational thinking a collection of skills that may be applied differently to different disciplines. As an example, Kolodner stated, “If you are a chemist, you are paying attention to different things than if you are a physicist or a biologist, and you answer questions by different means. You might use experimental methods or modeling methods or simulation methods or data-mining methods
as you investigate. But in all sciences, you are, in general, attempting to explain phenomena and collecting evidence to help you answer questions about those phenomena and develop well-formed explanations.” She believes computational thinking may or may not include quantitative elements, but it always includes, in some way, manipulation of variables, decisions about selecting “the right” representations, and decomposition of complex tasks into manageable subtasks, to name a few.
Although Kolodner is partial to the problem-solving view of computational thinking just described, she was also drawn to a second view of computational thinking put forth by Mitch Resnick. In his view, computational thinking is not simply for problem solving. Rather, he believes that for most people, computational thinking means expressing oneself by utilizing computation fluently. For Resnick, computation’s power is in allowing people (everybody, not just those who are good problem solvers) to express themselves through a variety of media. In this view, computational thinking means being able to create, build, and invent presentations and representations using computation. This requires fluency with computational media.
Relationship Between Two Views of Computational Thinking
Kolodner argued that a deep understanding of computational thinking may encompass a synthesis of these two views. She synthesized the Tinker/Edelson view and the Resnick view as follows:
Computational thinking is a kind of reasoning in which one breaks problems/goals/challenges into smaller pieces that are doable by a stupid computational device. This, in general, means thinking in terms of functions that need to be carried out to achieve a goal or solve a problem (not functions in the mathematical sense, but rather in terms of how things work) and pulling apart those problems/goals/challenges into smaller pieces that are functionally separate from each other and where the functions that are pulled out tend to repeat over many different situations. Computational thinkers tend to break problems into functional pieces that have meaning beyond the particular situation in which they are being used. These functional pieces can then be called on repeatedly in solving the problem or combined in new ways to solve new problems and achieve new goals and challenges.
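Kolodner’s notion of functional pieces that have meaning beyond the particular situation in which they are used can be illustrated with a small sketch. The pieces below are hypothetical examples, not drawn from the workshop; the point is only that each performs one role and that the same pieces recombine to solve different problems.

```python
# Hypothetical illustration of reusable functional pieces: each small
# function plays one role, and the same pieces compose in different
# ways to answer different questions.

from collections import Counter

def tokenize(text: str) -> list[str]:
    """One reusable piece: break text into lowercase words."""
    return text.lower().split()

def tally(items) -> Counter:
    """Another piece: count occurrences of any iterable's items."""
    return Counter(items)

def top(counts: Counter, k: int) -> list:
    """A third piece: pick the k most frequent items."""
    return [item for item, _ in counts.most_common(k)]

# The same pieces compose in different ways for different problems:
common_words = top(tally(tokenize("to be or not to be")), 1)  # most common word
common_letters = top(tally("mississippi"), 2)                 # most common letters
```

Each piece is separate from the situation in which it was first written, which is what lets it be called on repeatedly or combined in new ways to meet new goals.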
Resnick’s view of computational thinking comes into play when one thinks about the role the computer might play in helping to break problems into pieces and compose the pieces in new ways. To the extent that the computer can help with this kind of thinking, we become capable of achieving bigger goals or solving more complex problems. But this requires two things: (1) that we develop tools to help people think computationally (e.g., one could think about Scratch in this way) and (2) that we be able to use those tools fluently. A computational thinker is fluent in this kind of thinking and in using some set of tools that help with this kind of thinking.
With respect to computational thinking for everyone, the implication is that all individuals should get as far as being able to use these types of tools well to help them solve problems, meet challenges, or express themselves. Some will become more proficient, being able to manipulate these tools and solutions to create, build, or invent better solutions or creations. At the highest level are those who will be able to use computational media and thinking in the most sophisticated ways—as scientists, computationalists, and even artists.
Yet, the relationship between these two views of computational thinking is not entirely clear, and there may be a certain tension between the two. Certainly, Kolodner argued, there is overlap, for example, for those whose expression is of sophisticated complex systems. Those learning to be computational biologists and computational physicists and so on might need to have capabilities in both domains of computational thinking: problem solving/modeling and expression. But beyond this point, the relationship between the two characterizations of computational thinking is not clear. It is not clear that beginning with developing capabilities within the realm of View 2 (expression) is necessarily the way to get students to develop capabilities within the realm of View 1 (problem solving/modeling). Similarly, it is also not clear whether those who are facile at the skills and practices of View 1 will automatically be facile at the skills and practices of View 2. Kolodner believes this blurred relationship is “a really interesting conundrum that needs more attention from the research community.”
Helping People Learn to Be Computational Thinkers
Presenter Derek Briggs of the University of Colorado, Boulder, put forth a question during one of the panel discussions that Kolodner found helpful in articulating how to promote computational thinking. Briggs questioned the goals sought with respect to learning computational thinking. He wondered whether the goal is solely to build tools that will help people reason better computationally, or whether computational thinking is something we want everybody to learn. He pointed out that if the latter is the case, then we seem to be going against the grain, because we know from the learning sciences and from education best practice that it is hard to learn skills disembodied from the contexts in which they are used.
Kolodner argued that the community has both goals—tool building
for better computational thinking and computational thinking as a core skill for everyone—and that Briggs’s warning about teaching computational thinking in context is a key reminder of best practice. She went on to say that the education community should most definitely be aiming toward helping everybody learn computational thinking and that, yes, the community does seek to promote computational thinking as a set of necessary general-purpose skills. Kolodner believes it is important not to fall prey to the mistaken notion that if one learns computational thinking skills in one context, one will automatically be able to use them in another context. Rather, it will be important to remember that one can learn to use computational thinking skills across contexts only if (1) the skills are practiced across contexts, (2) their use is identified and articulated in each context, (3) their use is compared and contrasted across situations, and (4) learners are pushed to anticipate other situations in which they might use the same skills (and how they would).
These four guidelines come from the transfer literature—the chapter on transfer in How People Learn3 makes them clear. Kolodner pointed out that following these guidelines is absolutely necessary in designing instruction—otherwise, we are only helping kids learn to program or learn to use some set of skills in some particular contexts. This is analogous, she added, to what we now understand about learning to be a scientific reasoner. Scientific reasoning, or inquiry, is not a simple skill that one learns in one domain and applies in a bunch of others. Rather, scientific reasoning is a set of complex cognitive skills that one must learn to carry out flexibly over a variety of domains, and the way to help kids learn that is to help them carry out scientific reasoning over a variety of situations, help them recognize what they are doing, and help them recognize how their reasoning is similar and different over a variety of situations. The workshop touched on these issues in the discussion, but the four guidelines were not entirely articulated.
This set of guidelines is really important for educators to remember with respect to computational thinking; if kids are introduced to computational thinking only in the context of programming and never think about how to use computational thinking, or never have opportunities to use computational thinking in other situations, then they may not develop computational thinking. Mike Clancy’s cases are interesting with respect to this—they make the computational thinking of experts visible as a way to illustrate computational thinking applied to a domain. Kolodner wondered to what extent students who use those cases take their computational thinking outside the computer science class, and what it would take to promote that type of cross-domain application.
3 NRC, 2000, “Learning and Transfer,” in How People Learn: Brain, Mind, Experience, and School: Expanded Edition, Washington, D.C.: The National Academies Press. Available at http://www.nap.edu/catalog.php?record_id=9853. Last accessed May 20, 2011.
Several people, across both views of what computational thinking is, talked about teaching computational thinking concepts and skills through a learning progression paradigm of use, modify, and create. Kolodner thought that many of the examples of computational thinking learning discussed in the workshop reflected adoption of this approach to teaching computational thinking, with varying levels of success.
One example was Tinker’s learning progression for computational thinking in a science class, which involved the following:
• Numbers are associated with things and their interactions (e.g., temperature),
• Values change over time,
• Changes can be modeled,
• Models involve lots of little steps defined by simple rules (e.g., molecular dance),
• Models can be tested to find a range of applicability,
• You can make models, and
• Many applications of computers share these features.
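The middle steps of this progression (values changing over time through lots of little steps defined by a simple rule, and models that can be tested for their range of applicability) can be sketched as a tiny model. The cooling rule, rate constant, and temperatures below are invented for illustration.

```python
# Hypothetical sketch of the progression's core idea: a value changes
# over time through many little steps defined by one simple rule. The
# cooling rule, rate constant k, and temperatures are invented.

def cool(temp: float, ambient: float, k: float, steps: int) -> list[float]:
    """Each step moves the temperature a fraction k of the way toward ambient."""
    history = [temp]
    for _ in range(steps):
        temp = temp + k * (ambient - temp)
        history.append(temp)
    return history

history = cool(temp=90.0, ambient=20.0, k=0.1, steps=50)
# After many small steps the modeled value approaches the ambient
# temperature; varying k or the step count is one way a student might
# test the model's range of applicability.
```

A learner can first run such a model, then modify its rule or constants, and eventually write one from scratch, which mirrors the use-modify-create path discussed below.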
If using models is done repeatedly in science classes, and if kids gradually move from using to modifying to creating their own models, and if they discuss the features behind the models—why they are the way they are, why and how one might want to change them, and how they went about making changes and creating new models—then there is a good chance that kids will learn to think fluently about running, trusting the results of, revising, and maybe, designing computational models. If, in addition, they discuss how what they are doing is similar to what computer programmers do and/or how it is similar to other problem solving and design, they will broaden their understanding and capabilities with respect to computational thinking. Kolodner added, “The deal is that one develops the ability to broadly use cognitive skills to the extent that one has experiences using them in a variety of situations, considers how one was using them, and anticipates their use in other situations.” So, for example, one could start from science class and broaden out from there. Edelson’s analogy between computational thinking and geographic computational reasoning illustrated this point. If one helps kids reason geographically, helps them see that process as computational reasoning, and helps them anticipate other ways that reasoning might be useful, one can use that as a base and broaden knowledge and use of computational thinking from there.
Kolodner was very interested in perspectives on learning progressions associated with older children. Specifically, she wanted to understand at what point students become capable of creating their own computational models using computer programs rather than just using and manipulating existing models. She noted that around middle school age, students seem able to grasp increasingly sophisticated computational and programming concepts. This observation seems consistent with a point presenter Tinker made that around fourth grade is when a number of factors such as student development, teaching resources, and opportunity converge and make computational modeling more likely. Tinker also added that creating a computational model from scratch on a computer can require a great deal of time spent learning programming to realize that model. On the other hand, systems like NetLogo and AgentSheets allow students to manipulate models someone else built without necessarily having to master a whole lot of detail themselves, and then to look inside those models and change them before starting from scratch to build their own models.
Presenter Christina Schwarz added some warnings to this discussion, pointing out again that one cannot just assume that students will learn computational thinking through model building. She pointed out that instructors have to be realistic about students and their motivations to build models. When projects have them focus on concepts that they already understand based on outside or prior knowledge, students may be more likely to explore and try more complex models. If concepts are brand new, however, students need to explore before they can do complex model building. And they certainly won’t be able to learn new computational thinking skills or concepts while they are struggling with some new science concept.
Kolodner agreed and emphasized that those creating curricula should be sure to think longitudinally—the focus should be on creating more opportunities to model year after year, helping learners to gradually build up their ability to model and their computational thinking capabilities. Their progress on both should be tracked over time. She also highlighted one more important caveat about the use and promotion of computational thinking in the classroom: simply programming, or even simply teaching students to program, is not necessarily promoting computational thinking.
Kolodner expressed concern over a thread of discussion running through some of the presentations that seemed to presume that as a part of the process of learning to program, students would learn computational thinking. For Kolodner, a big question is how an instructor can be sure that students engaging in programming activities are actually learning computational thinking. Similarly, do students themselves realize they are learning thinking skills that can be applied outside the constraints
of the particular activity they are engaging in? Or are the students just becoming better programmers or model builders or game players?
To get a clear picture of what is happening in a computational activity in terms of assessment and evaluation, one has to apply an entire toolbox of assessment and evaluation tools, according to Kolodner. One tool or method is not enough. Kolodner believes that a student’s reflecting on a computational activity, being able to teach or help someone else learn the concepts, or being able to effectively articulate the relevant computational process at issue can be seen as likely indications that the student is learning computational thinking. As students are able to use increasingly elegant, efficient, and sophisticated approaches to tackle computational thinking tasks, this ability can also demonstrate learning and improvement in computational thinking, Kolodner believes.
Another important point is that one cannot presume that just because one is programming, one is learning to be a computational thinker. Kolodner pointed out the importance of remembering that separating out the abstract processes from the specifics of what one is doing does not come easily to everyone. Referring back to points from How People Learn,4 she stated that to learn computational thinking from programming experiences, learners need to engage in thinking about what they are doing and under what other circumstances they might use the same type of thinking. Also, she was concerned that perhaps this assumption (that learning to be a computational thinker would arise simply from learning to program) reflected confusion over what computational thinking is. Although programming may be one tool that is used to teach or highlight computational concepts, it is not synonymous with computational thinking, and Kolodner again warned that a good definition of computational thinking is needed—both so that curricula will be designed to promote computational thinking and so that achieving capability in computational thinking can be measured well.
Pedagogy as a Criterion for Assessment: An Elegant Relationship
Kolodner believes that assessment and pedagogy can be rather elegantly related to each other. She pointed to arguments from Clancy and Blikstein, who both talked about pedagogy as a lead-in to assessment. Clancy talked about how the case studies in his lab-centric approach, as well as the derivative pedagogy, provided many criteria for assessing how well learners are actually doing computational thinking. In Clancy's approach, learners learn to program (and could be learning computational thinking) through case studies that show how others have solved similar programming problems. He pointed out that decisions about what content to put into the cases and decisions about how to evaluate and assess learners' computational thinking go hand in hand. Blikstein talked about the animated representations students develop in his activity and how, combined with the activity's underlying pedagogy, analysis of the drawings allows certain kinds of assessments and ways of interpreting what the kids are saying and doing.
4 NRC, 2000, How People Learn: Brain, Mind, Experience, and School: Expanded Edition, Washington, D.C.: The National Academies Press. Available at http://www.nap.edu/catalog.php?record_id=9853. Last accessed May 20, 2011.
Goals of Assessment
In addition to knowing what one wants to assess, one must consider the purpose of the assessment, because the reason for any assessment plays a critical role in determining the data and process necessary to perform it. Kolodner identified three reasons for assessing computational thinking: (1) to judge the curriculum and related materials and pedagogy, (2) to judge the progress of individuals, e.g., for giving grades, and (3) to manage instructor training and support. Kolodner noted that the kinds of data relevant to each reason would not necessarily be identical.
Assessment versus Evaluation
Kolodner explained that assessment is not the same as evaluation, although the terms are often used interchangeably. According to her, assessment is about measuring what people have learned, how they feel about something, or their capabilities. Formative assessments deal with discovering what has been learned along the way in order to inform what comes next. Presenters Jim Slotta and Mike Clancy both noted the importance of capturing, in a formative assessment, some of the reasoning learners are doing that would otherwise be invisible, in order to explore when and how one might change instruction along the way to improve learning. Summative assessment occurs at the end of a module, semester, or project to determine how much knowledge was gained overall. Evaluation, on the other hand, speaks more to how well a curriculum or a software tool is working: its efficacy, its costs, its usability, and so on. Kolodner agreed with presenter Cathy Lachapelle of the Museum of Science, Boston, who also discussed evaluation, specifically the need for usability in a computational thinking project in order to incorporate computational thinking effectively into a curriculum and make it widely available.
In response to discussion from Lachapelle, Kolodner said that the computational thinking community should consider at some point creating its own assessment framework. The National Assessment of Educational Progress currently covers subjects such as science, math, technology education, and pre-engineering, but it does not assess computational thinking.
Standards and Tactics for Assessment and Evaluation
Kolodner echoed the sentiments of several presenters (Briggs, Clancy, and Schwarz) that assessment and evaluation are more than just collecting data points; they are about making comparisons and analyzing outcomes. Sometimes those comparisons are as simple as what a researcher hypothesized versus what actually resulted. Presenter Derek Briggs argued that there must be some standard or baseline against which researchers compare results, and he focused on learning progressions and constructs as one example of such a standard or baseline. Kolodner called the process by which a researcher decides what standards and baselines to use and embeds them in a computational thinking project the "tactics" of assessment. In some cases, the researcher does not select his or her own baselines or learning progression but instead adopts them from an external source. Kolodner pointed to presenter Christina Schwarz's experience with her local school district's biology learning progression guidelines for middle school students as an example of an external baseline.
Repetition and Reflection as an Assessment Tactic
One tactic Kolodner endorsed was repetition across disciplines combined with reflection. She argued that scientific reasoning and computational thinking should be practiced in a number of different subjects and repeated often, helping learners understand both the similarities and the differences in how scientific reasoning and computational thinking are done in each, while also developing general skills in computational thinking. To cross disciplines effectively, Kolodner argued, there should be some sort of reflection on what has been done as well as some anticipation of other circumstances in which the skills and lessons learned would be useful.
Kolodner also felt that reflection on pedagogical content knowledge with respect to computational thinking is important for instructors of computational thinking. In response to Michelle Williams's presentation, Kolodner asked for more information about how the reflection questions posed to teachers after they had completed a teacher development computational thinking learning project were developed. In essence, if the purpose of having the teachers complete the same project that their students would do later was to provide scaffolding in a systematic way, Kolodner wanted to understand the underlying system better.
Embedded Assessments and Tracking/Logging Data
Embedded assessments, especially those that capture online the thinking of learners, allow assessment of student understanding that a researcher may not have access to otherwise. Kolodner noted that Briggs talked about collecting performance data and Slotta mentioned the value of real-time reflections on threads of collaborative discussions among the students. They argued, and Kolodner agreed, that these embedded real-time assessments allow “getting in there and really dealing with the issues that the learners are having at the moment that they are having them. Maybe at the moment they are having them, maybe later, but the talking uncovers things that you might not see otherwise.”
Kolodner believes that tracking of activities is particularly important to analyzing computational thinking. Whether through Blikstein's log files, Schwarz's interviewing to help track thinking, or Clancy's noting of details of collaborative discussions, such tracking enables especially informative project assessment and evaluation.
Kolodner finds it hard to tell whom to go to concerning community building in the education community and the various disciplines. She stated that "people seek environments that align to their ways of thinking and working. We all do it, and this self-sorting process tends to create silos." Kolodner argued that such silos will not help computational thinking have a wide impact.
Brian Blake is a professor of computer science at the University of Notre Dame and associate dean for engineering. His research areas include software engineering and, more recently, methods for making advanced computer science techniques digestible to those outside the specialty; the latter effort is intended to attract underrepresented minorities into computing.
In his comments to the workshop, Blake described the evolution of his thoughts on computational thinking through dialog and interaction with various scholars over the course of the two NRC computational thinking workshops. In the first workshop, he explained, the committee sought to characterize computational thinking by first looking for the existence of computational thinking in other fields and other ways of thinking. From there the committee could classify and describe it as computational thinking in a way that would enable researchers and educators to re-embed it into training students or retooling teachers or professionals.
Blake went on to explain that his experience over the past year, based on the first workshop and his own personal observations of his son’s learning progression from kindergarten to first grade, had caused his thinking to evolve. Now, the notion of developmental milestones is very important to him. He believes that the understanding of computational thinking should be thought of in terms of decomposing computational thinking “elements” into developmental milestones.
Blake noted that Peter Henderson's presentation on the efforts underway at Shodor, with its example of Thomas the Train solving a routing challenge, suggested that there is an opportunity to begin understanding computational thinking at the lowest levels and then, as students move from K-12 into postsecondary education, to explore increasing complexity within the milestones.
Blake summarized several main points he had gathered from the second workshop’s presentations. There may be an opportunity very early in a child’s learning progression to identify significant computational thinking talent. This might be done by looking at specific instances where computational thinking might fold into a learning activity, and then assessing a student’s competency with respect to these computational elements. To illustrate, Blake pointed back to Henderson’s Thomas the Train example and suggested that a simple activity with embedded computational thinking challenges might be a means of identifying talent. Concerning the idea of training, Blake argued that by taking opportunities to identify and assess computational thinking talent in individual students, and to start to enumerate indicators of such talent, a researcher or an educator might be able to recognize when a student either is demonstrating a significant talent in computational thinking or is at least at the appropriate learning progression level for that age range.
Blake argued that the next step would be to use this process of embedding, assessing, and identifying at the macro level over a longer period of time to identify learning progression baselines. This technique utilizes assessment and evaluation to determine where in learning development a particular baseline is situated.
From the perspective of learning progression at the macro level, the types of concepts to be enumerated so as to identify potentially talented computational thinkers at young ages are not limited solely to concepts obviously related to computer science thinking, math thinking, or even scientific thinking. Instead, these concepts are likely to span all of these types of thinking and analysis. As the emerging computational thinking community moves forward, scholars should perhaps target these sorts of concepts to specify them more clearly and possibly re-embed them for identification of talent and for determination of learning progression.