To obtain broad input into its work, the committee publicly released a draft report for comment in August 2016, after completing Phase I of the study. This draft report was intended to elicit feedback from the interested public, to ensure that the committee was comprehensively covering the relevant terrain and proposing reasonable goals and objectives that could be monitored over time without imposing undue data collection burdens. The interim report was available on the committee’s website, with a 7-week period for comment.
Public comments were sought to obtain perspectives and insights from researchers and practitioners knowledgeable about undergraduate STEM reform and education statistics.
The public comment draft included a conceptual framework for the indicator system, identified goals and objectives for improving undergraduate STEM education at both 2-year and 4-year institutions, and reviewed existing systems for monitoring undergraduate STEM education. Table A-1 shows the draft goals and objectives on which the committee sought comment. Based on the committee’s consideration of what information from the public would be most useful for the second phase of the study, the report posed a series of questions for readers:
- The committee proposes five goals to improve the quality of undergraduate STEM education (see Chapter 2). Is this the right set of goals? Should any be deleted or other goals added? Why do you suggest this change?
- The committee identifies 14 objectives around which national indicators for undergraduate STEM education will be developed in Phase II of the study (see Chapter 2). Is this the right set of objectives? Should any be deleted or other objectives added? Why do you suggest this change?
- The committee discusses various data sources on undergraduate STEM (see Chapter 3). Are these the right data sources? Should any be deleted or other sources added? Why do you suggest this change?
- Are there larger issues related to measuring and improving quality in undergraduate STEM that are missing from the committee’s proposed conceptual framework, goals, and objectives?
- How and where, if at all, do you see the national indicators to be developed in Phase II being used to improve undergraduate STEM?
Individuals and representatives of organizations were encouraged to submit their responses to these questions through an online questionnaire that was posted with the public comment draft. The committee received 32 comments through the website questionnaire and 2 comments through letters.
To supplement the input from the online questionnaire, the committee convened a day-long public meeting in October 2016, which included responses from invited individuals and institutions, as well as open microphone sessions for all meeting participants. The meeting drew just over 100 people: 62 in person and 40 by webcast.
Following the public meeting, the committee reviewed all of the feedback and identified possible revisions. The committee used Phase II of the study to revise its work in response to the concerns and suggestions it had received, resulting in this final report. The rest of this appendix summarizes the feedback and describes the steps taken to revise the initial conceptual framework and the preliminary review of data sources and monitoring systems.
Several themes emerged across all comments received through the website, in letters, and at the October meeting:
- Role of Discipline-Based Education Research (DBER). Several commenters asked that the committee consider the role of DBER in improving the quality of undergraduate STEM. They asked the committee to consider adding DBER to one of its objectives, and to consider it as a potential indicator of the use of evidence-based educational practices.
- Meaning of Goal 1. Several commenters raised concerns about the wording of Goal 1, which was “Increase numbers of STEM majors,” and offered suggestions for other ways of approaching the goal.
- Evidence-Based Practices. Commenters asked for a more thorough explanation of the meaning of “evidence-based practices” and suggested broader, more expansive use of the term.
- Expanded Definitions of Equity. Commenters praised the committee’s attention to diverse learners, but several asked that the committee broaden this group to include students with learning disabilities, first-generation college students, and other populations, along with discussion of the ethical dimensions of diversity and inclusion.
- Unit of Analysis. Commenters raised questions and offered suggestions about the most appropriate unit of analysis for measuring improvement in STEM. For example, some called for indicators at the department or institutional level, while others expressed concern that individual institutions would be held accountable for national-level indicators of equity, diversity, and inclusion.
- Defining STEM and Related Terms. Commenters asked for expanded definitions of STEM, STEM literacy, and STEM learning to emphasize the role of the social sciences, the social and civic application of STEM knowledge, the development of ethics, positive attitudes, and “21st century” skills and to more closely integrate STEM with the humanities.
- Future Indicators. A few commenters noted that the framework, goals, and objectives are relevant to current undergraduate STEM but may need to be updated in the future, as student populations and higher education institutions change.
- Use of Existing Rubrics. Many commenters urged that the draft give greater prominence to the PULSE vision and change reform initiative, which has developed rubrics to measure progress toward some of the proposed objectives.
- Data Sources. A few commenters noted that the Science and Engineering Indicators report is not an original data source. In addition, several pointed to the PULSE rubrics as a potential data source.
In response to these comments, the committee made several revisions to the interim goals and objectives shown in Table A-1:

- DBER. Although the committee decided not to identify DBER as a specific objective or indicator, it added discussion of DBER to the chapter on the use of evidence-based educational practices (see Chapter 3).
- Goal 1. The committee revised the wording of Goal 1 from “Increase numbers of STEM majors” to “Ensure adequate numbers of STEM professionals,” and added language clarifying that demand varies across the different STEM disciplines (see Chapters 1 and 5).
- Equity. The committee expanded its focus on equity to include students with disabilities and first-generation college students (see Chapter 4).
- Unit of Analysis. The committee clarified its focus on national indicators, with the nation as a whole as its primary unit of analysis, as called for in the study charge (see Chapter 1).
- Definition of STEM. Considering comments about the lack of clarity around STEM literacy, the committee discussed the recent report on science literacy (National Academies of Sciences, Engineering, and Medicine, 2016b). Given that report’s findings about the difficulty of defining science literacy, as well as the challenge of specifying a level of STEM literacy that all students should master, the committee decided to drop STEM literacy as a formal goal. However, the committee explains that its vision for undergraduate education includes all students developing a basic understanding of STEM concepts and skills (see Chapter 1). While recognizing the value of the social sciences and the humanities and the benefits of an integrated liberal arts education, the committee concluded that it needed to maintain its focus on undergraduate STEM education, as required by the study charge.
- Future Indicators. In response to calls to address the future of STEM education, this report discusses the growth of online education and assessment, noting that as these technologies advance, new indicators of STEM learning may be needed (see Chapters 1 and 7). In addition, to clarify and focus the overarching goals, the committee dropped the goal of continuous improvement but added continuous improvement as an objective within the goal of increasing mastery of STEM concepts and skills by increasing students’ engagement in evidence-based educational practices (see Chapter 3).
- Data Sources. In response to comments about data sources, the committee distinguished between original data sources and compilations of statistics and data (see Chapter 6), clarified its focus on national-level indicators (see Chapter 1), and considered a few specific PULSE rubrics that have the potential to be used to gather information on a national basis for the proposed indicators (see Chapter 3).
TABLE A-1 Goals and Objectives in the Draft for Public Comment
| Goal | Framework | Objective | Strategies to Advance Objective and Possible Measures |
|------|-----------|-----------|-------------------------------------------------------|
| 1 | Input | 1.1 Multiple pathways into and through STEM programs | |
| 1 | Process | 1.2 High retention of students in STEM disciplines beyond core foundational courses | |
| 1 | Process | 1.3 Appropriate general education experiences for STEM students’ foundational preparation | |
| 1 | Outcome | 1.4 STEM credential attainment | |
| 2 | Process | 2.1 Equity of access to high-quality undergraduate STEM education | |
| 2 | Outcome | 2.2 Representational equity in STEM credential attainment | |
| 3 | Process | 3.1 Use of evidence-based STEM educational practices both in and out of classrooms | |
| 3 | Process | 3.2 Equitable access to evidence-based STEM educational practices both in and out of classrooms | |
| 3 | Environment | 3.3 Support for instructors to use evidence-based teaching methods | |
| 3 | Environment | 3.4 Institutional culture that values undergraduate STEM education | |
| 3 | Outcome | 3.5 STEM learning for students in STEM educational experiences | |
| 4 | Outcome | 4.1 Access to foundational STEM experiences for all students, to develop STEM literacy | |
| 4 | Outcome | 4.2 Representational equity in core STEM literacy outcomes | |
| 5 | Environment | 5.1 Ongoing data-driven improvement | |