Engineers strive to apply good practices in their profession, said Barbara Bogue and Betty Shanahan, principal investigator and co–principal investigator of the Society of Women Engineers’ Assessing Women and Men in Engineering project, yet they often fail to do so when engaged in outreach. The survey and interviews conducted before the workshop revealed that assessment is a critical but often missing influence on professional societies’ outreach efforts. Bogue and Shanahan made the case that effective assessment should be the basis for all engineering outreach initiatives.
An assessment-based framework aligns collaboration and outreach practices with typical engineering design and project management practices. It involves identifying audiences, specifying goals and objectives, and defining the metrics and data to be used in measuring outcomes. An assessment-based framework is a core tool for successful initiatives, Bogue and Shanahan said.
An assessment-based framework can apply at a meta level and at the level of specific actions, Shanahan explained. For both an overall collaboration and individual societies, it can ensure that goals are relevant to the mission and goals of each partner as well as the collaboration. It can have built-in measures of the impacts of activities and create a continuous improvement platform that serves the overall outreach programming.
Bogue and Shanahan illustrated some assessment issues by recounting the outcomes of an outreach program Bogue offered at Pennsylvania State University. A one-week residential engineering camp had the objective of recruiting high school girls into the engineering profession. The program specifically recruited girls who did not plan to become engineers. The program emphasized hands-on projects led by role models, and 42 girls participated. Postevent survey results indicated that all 42 participants were enthusiastic about the event. Before the program, 40 of the participants said they did not plan to study engineering; after the program the same number said they wanted to become engineers. In addition, all 12 of the participants who were high school seniors said they planned to apply to engineering at Penn State.
Postcamp tracking revealed, however, that only two participants followed through in applying to Penn State, and only one was accepted. Furthermore, the camp was expensive to administer—about $1,400 per girl—and a time analysis revealed that only about a quarter of the participants’ time at the camp was spent on engineering activities. “In the postreview, [the program] failed on a lot of different points,” said Bogue.
Based on these findings, the camp was radically redone. It moved from an overnight camp to a day camp with modules that had a different interdisciplinary focus each day. Young women could come to the camp on one day, two days, three days, or five days, and the camp served many more girls—more than 300 as opposed to a maximum of 60 in the overnight camp.
Moreover, the revised camp design had a much greater concentration on objectives and outcomes, and better pre- and postcamp assessments produced more relevant data. Ninety percent of time was now spent on engineering-related activities. Resources also were used more efficiently, with a cost of $142 per girl per day. In addition, the organizers avoided areas of nonexpertise: “no more slumber parties,” said Bogue.
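The efficiency comparison can be roughed out from the figures reported above. The sketch below uses only the numbers in the text, except for the assumed average attendance, which is purely hypothetical, since the source gives the revised camp's cost per girl per day but not how many days girls typically attended:

```python
# Back-of-the-envelope cost comparison of the two camp designs.
# Figures marked "source" come from the text; assumed_avg_days is a
# hypothetical illustration, not a reported number.

overnight_cost_per_girl = 1400          # source: ~$1,400 per girl
overnight_participants = 42             # source: 42 girls
overnight_total = overnight_cost_per_girl * overnight_participants

day_camp_cost_per_girl_day = 142        # source: $142 per girl per day
day_camp_participants = 300             # source: "more than 300"
assumed_avg_days = 2                    # hypothetical assumption

day_camp_total = (day_camp_cost_per_girl_day
                  * day_camp_participants
                  * assumed_avg_days)

print(overnight_total)   # 58800 total for the residential camp
print(day_camp_total)    # 85200 total under the 2-day assumption
# Per-participant cost under this assumption:
print(day_camp_total / day_camp_participants)  # 284.0
```

Even under generous attendance assumptions, the per-participant cost of the day camp remains a fraction of the residential camp's, while serving several times as many girls.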
Bogue identified several lessons from this experience:
- Participants having fun is a success indicator only if the only goal is fun.
- Poor or incomplete data can lead to wrong overall evaluations and decision making.
- Surveys that do not ask the right questions produce the wrong answers.
- Adding data through postassessments can lead to more accurate evaluations.
Shanahan noted that engineers would never approach a problem the way outreach is often approached. Companies would not begin product development without reviewing relevant technologies, determining customer needs, and establishing product goals and objectives around costs, performance, and safety. So why does outreach by engineering societies so often fail to incorporate standard engineering design? One reason is a failure to identify and serve an intended audience. In practice, the de facto audience often becomes the member volunteers or funders rather than the kids the activity is supposed to reach. As Bogue put it, “Are we going to make Joe unhappy because we’re not offering his camp the tenth year in a row?”
Another reason for unsuccessful outreach is a failure to define the value added for every partner, including the volunteers. Limits on human and financial resources are a barrier, as are actions that belie the goal of an outreach program. For example, if a goal is to attract underrepresented students to engineering, why are members of a society so often unaware of outreach programs and why are the programs poorly resourced?
An assessment-based framework can help meet these challenges, said Shanahan and Bogue. The framework is based on an agreed-upon, shared, and overarching goal. With the goal established, measurable objectives should be determined that will fulfill that goal, they said. These objectives describe what the initiative will achieve rather than describing an activity, and they create the foundation for planning, assessment, and continuous improvement.
The next step is to leverage resources and define initiatives, using research to inform choices and designing initiatives based on the goal and objectives, not the other way around. This research can come from many different sources, including the social sciences, and can be informed by the practices of other organizations. Shanahan mentioned TED talks as a particularly useful source; they can be distributed to volunteers as 15-minute videos that capture the essence of research results.
A data collection plan with defined metrics needs to be created, Shanahan and Bogue continued. If the goal is to reach underrepresented populations, count the number of people who are reached. Use before and after questions to assess changes in knowledge, interest, skills, or confidence. Surveys, formal observations, and formal interviews are all ways of gathering data; anecdotal information may be interesting or useful, but it is not trackable, comparable, or objective. Longitudinal data are the gold standard, but they are often difficult to gather.
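The before-and-after questions described here reduce to a simple tally of how each participant's answer changed between surveys. A minimal sketch of that tally follows; the function name, the data layout, and the sample responses are all hypothetical, chosen only to illustrate the pre/post comparison:

```python
def interest_shift(pre, post):
    """Count participants whose stated interest changed between surveys.

    pre, post: dicts mapping participant id -> bool
    (True = plans to study engineering).
    Returns (gained, lost, unchanged).
    """
    gained = lost = unchanged = 0
    for pid in pre:
        if pid not in post:
            continue  # no follow-up response; gaps are common in tracking
        if post[pid] and not pre[pid]:
            gained += 1
        elif pre[pid] and not post[pid]:
            lost += 1
        else:
            unchanged += 1
    return gained, lost, unchanged

# Hypothetical example: three of four respondents moved from "no" to "yes".
pre_survey  = {1: False, 2: False, 3: False, 4: True}
post_survey = {1: True, 2: True, 3: True, 4: True}
print(interest_shift(pre_survey, post_survey))  # (3, 0, 1)
```

The same structure extends to longitudinal tracking: matching the same participant ids across a later survey (applications, enrollments) is exactly the step that exposed the gap between the Penn State camp's enthusiastic exit surveys and its actual recruiting outcomes.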
Time needs to be scheduled for data collection and analysis, with online tools to facilitate these steps. Collaboration in data collection and analysis can ease obstacles to sharing information and provide information that is useful for shaping other initiatives.
Assessment can be affordable if it is integrated into an overall plan and scheduled from the beginning. Bogue and Shanahan offered several pointers:
- Identify a volunteer who enjoys working with data, and enlist the educational arm of a society in data collection and analysis.
- Centralize data, use off-the-shelf resources, and create a bank of common surveys and tools.
- Recruiting, selecting, and training volunteers is crucial to success, and they need to understand the goals and objectives, the assessment-based approach, and outcome metrics.
- Break down tasks by interests and skills so that no role is too big and the load does not fall on a few overextended volunteers.
- Enlisting experts in developing and analyzing outreach initiatives can be much more effective than training nonexperts to serve in a key role.
Once data are collected and analyzed, they can be used for continuous improvement. “It’s not just what we know,” said Shanahan. “It’s what we do with what we know. Use the results to say, ‘This went well, this didn’t go well. How do we change? How do we improve? How do we enhance?’ and invest in making those changes.” Initiatives that do not work are more expensive than good assessment.
Finally, tell the story to recruit participants, motivate volunteers, engage and convince board members and sponsors, enhance and expand collaborations, and share positive and negative lessons. A good story, backed by solid data, can enable programs to be scaled up to have greatly magnified impact.
Research in change management shows that change does not happen without leadership driving it, said Shanahan. “You are the ones who are going to make change happen. It’s not going to be your volunteers. . . . As someone who was . . . a society leader, I know it’s not easy to say to your volunteers, ‘You’re going to have to do more work’ or ‘We’ve got some bad news here, and we need to respond to it.’ It is challenging. But our mission as engineering societies [doing] outreach programs is not to create fun for volunteers. It’s to have effective outreach.”