The workshop sought to understand how the development of fair, valid, and reliable selection tools might address some of the challenges facing the hiring and training of pattern evidence examiners in forensic laboratories. The focus was on the pattern recognition task central to the job and on whether particular individual traits or abilities are necessary at the time of hiring in order to succeed on the job and in training. However, the job of a pattern evidence examiner is not just analyzing evidence; it sometimes entails reporting findings to end users, such as lawyers and judges.
The workshop steering committee included presentations on how expert testimony is treated in the courtroom in order to better understand what preparation pattern evidence examiners need. To do this, the steering committee invited four people to present their perspectives: Rockne Harmon, a retired prosecutor who served in California; Marvin Schechter, a defense attorney in New York; Mara Merlino (Kentucky State University), a researcher who has studied expert testimony; and Dan Murrie (University of Virginia), a forensic psychologist who has trained others to testify.
Harmon laid out the nature of the legal system and presented some questions to consider if the forensic science community embarks on developing selection tools. He pointed out that the legal system is an adversarial system. “By definition, there have to be two inconsistent, irreconcilable goals,” he said. In the courtroom, according to Harmon, a defense attorney reserves the right and has the responsibility to challenge testimony on evidence. He noted that most criticisms are not related to the findings but have more to do with the process: how evidence is collected and handled, and how analyses are documented. He recognized that forensic testimony occurs in context and is seldom the only testimony presented in a case.
Harmon identified several questions that might be worth considering before selection tools are put into practice. The legal system has a process called discovery, which generally requires the government to turn over all information relevant to a case. Harmon queried whether selection tools would be deemed discoverable or whether there would be some privilege that applies. He considered that selection tools could be used to bolster testimony if they are shown to speak to an expert’s qualifications.
Schechter concurred that the right to cross-examine is one of the hallmarks of the legal system. He recognized that the courtroom context poses a number of challenges to expert testimony (e.g., a judge may decide to exclude the evidence from the case or put limitations on what can be reported to the jury, and attorneys will question methodologies and interpretations). However, he pointed out that the toughest witness to cross-examine is the one who wants to give objective information.
Merlino remarked on the admissibility of expert testimony in court. She pointed out that judges are required to look at the qualifications of experts and the scientific merits of the experts’ methods and conclusions. These determinations are often the result of interactions that take place among attorneys, experts, and judges and can at times be directed by precedent. Merlino reminded the audience that in this context communication skills become particularly important. A pattern evidence examiner asked to review evidence in the courtroom will be expected to communicate clearly the information judges and attorneys need to make good decisions and effectively represent their clients.
Schechter pointed out that the nature of forensic testimonies and the culture within forensic laboratories have been in the process of changing over the last 5 years or so. He observed these changes are for the better. He emphasized that an examiner’s ability to perform and testify successfully in the courtroom will require an understanding of the courtroom context and training to communicate appropriately in this context.
Murrie recognized that forensic examiners need certain skills to provide good testimony in court. They should be able to communicate well, particularly under pressure, and have the right amount of ego to recognize they have specialized knowledge and can present it in a neutral, objective way. Murrie pointed out that such communication is challenging
because most people are influenced by their context. “It’s really difficult to be an objective scientist in an adversarial system,” he stated.
Murrie referred to a 2009 report from the National Research Council1 that highlighted the vulnerability of forensic analyses to certain cognitive and contextual biases and to studies by Itiel Dror showing that contextual information (e.g., knowledge of a confession) can influence expert opinions.2 He emphasized the importance of understanding that bias is not an ethical issue. “Biases are universal and automatic. They happen without awareness,” he stated. He recognized that bias can be more problematic in situations where the data are ambiguous (e.g., when analyzing low-quality, degraded evidence). The challenge, according to Murrie, is that errors due to bias cannot be avoided solely by good intentions. He added that general knowledge of bias will not eliminate biased judgments.
Historically, according to Murrie, the forensic science field did not do much to manage bias. However, increasing attention to cognitive bias has prompted a number of changes and interventions in the field. Murrie referred to blinding and case management procedures that separate contextual information from the analysis task. He surmised that progress in managing bias will continue to be made procedurally by changing lab procedures in ways that account for human bias. He cautioned, however, that some procedural changes may have implications for job satisfaction because some examiners appreciate exposure to case details.
On another note, Murrie pointed out that little is known about examiners’ motivations. He considered that different examiners might have different motivations for the job, such as fighting crime, solving cases, or doing science, and that these motivations might lead toward different orientations to the task of pattern recognition.
In closing the workshop, Frederick Oswald (Rice University) observed that the attendees had created a community of interest among different disciplines. They had discussed issues and generated ideas on how to move forward. He suggested concerted efforts should be made to continue the conversation to understand each other’s disciplines.
1 National Research Council. (2009). Strengthening Forensic Science in the United States: A Path Forward. Committee on Identifying the Needs of the Forensic Science Community; Committee on Science, Technology, and Law, Policy and Global Affairs; Committee on Applied and Theoretical Statistics, Division on Engineering and Physical Sciences. Washington, DC: The National Academies Press.
2 Dror, I.E., and Charlton, D. (2006). Cognitive bias and fingerprint examiners: Why experts make errors. Journal of Forensic Identification, 56(4), 600-616.
Wendy Becker (Shippensburg University), Andrew Imada (A.S. Imada & Associates), and Ann Marie Ryan (Michigan State University) presented a summary of the workshop discussion and highlighted ways that they believe the fields of forensic science and industrial and organizational (I-O) psychology could work together in the future. Becker started by outlining some of the contextual factors that would continue to impact personnel selection in forensic science. Imada and Ryan then presented some ideas on next steps.
Becker enumerated several contextual challenges that exist in the field of forensic science that will continue to impact personnel selection. One of these challenges is budget. She recognized that workshop presenters had illustrated uneven budgets, where at times there are appropriate resources to hire a new cohort of forensic examiners and at other times hiring is put on hold. There is also the influence of the civil service environment and restrictions that can impose long waits in the hiring process. Several participants recalled lab managers reviewing a list of candidates only to find the best candidates were no longer available.
Becker recognized changes in progress in the forensic science field that will affect the management of forensic laboratories and the culture of these workplaces. These changes include the creation of independent public laboratories, the separation of labs from police departments, the rise of private laboratories, and the expansion of accreditation. Such changes can have positive and negative effects on the hiring process and selection, she said.
In the course of the workshop, Imada said he identified what he called “islands of innovation” in current selection systems. He referred to the form blindness test that had been developed many years ago but is being used today in predictive ways. He said interpretations of this test were consistent with what was heard during the research presentations on cognitive differences in visual attention. He referred to a selection process developed by professional staff at a lab in a large city that employed what is considered a funnel approach.3 Imada felt this example provided a model for a sound selection process that could be developed and implemented. He heard about other signs of change within forensic laboratories. He pointed to remarks made at the workshop that forensic examiners generally are beginning to report findings on evidence as “associated with” as opposed to “identified with” known sources or suspects.
Imada recognized that recruitment is not a problem in forensic science; the problem is more likely the selection ratio. Attendees heard that in
3 A funnel approach is a hiring strategy of multiple assessments to narrow a large pool of applicants to target candidates. See, for example, https://premierhrsolutions.wordpress.com/tag/funnel-approach/ [October 2016].
Los Angeles this year, 450 people applied for 12 positions (see Chapter 2). Although Imada felt this was a good problem to have, such a large applicant pool makes it challenging to narrow down to the best candidates. One fix is targeting efforts toward self-selection. He recalled an advertisement of the past that asked “Can you draw this?” to identify people with artistic talents. He asked if something similar could be developed and publicized to help people recognize whether they have pattern recognition skills and whether they would be a good fit for the job of a forensic examiner. Such an exercise could help attract more of the right applicants, reducing the number who would have to be filtered out.
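The selection ratio and the funnel approach described above can be sketched in code. This is an illustrative sketch only, not a process described at the workshop: the stage names and pass rates below are hypothetical, and only the 450-applicant, 12-position figures come from the text.

```python
# Hypothetical sketch of a multi-stage "funnel" hiring process narrowing
# a large applicant pool. Stage names and pass rates are invented for
# illustration; 450 applicants and 12 positions are the cited figures.

def funnel(applicants, stages):
    """Apply each (stage_name, pass_rate) in order and return the number
    of candidates remaining after each stage."""
    counts = []
    remaining = applicants
    for name, pass_rate in stages:
        remaining = round(remaining * pass_rate)
        counts.append((name, remaining))
    return counts

applicants = 450
positions = 12

# Selection ratio: hires divided by applicants (smaller = more competitive).
selection_ratio = positions / applicants  # about 1 hire per 37 applicants

# Hypothetical pass rates for each assessment stage.
stages = [
    ("application screen", 0.40),    # 450 -> 180
    ("written/pattern test", 0.25),  # 180 -> 45
    ("structured interview", 0.50),  # 45 -> 22
    ("final review", 12 / 22),       # 22 -> 12
]

for name, remaining in funnel(applicants, stages):
    print(f"after {name}: {remaining} candidates remain")
```

The design point the footnote describes is simply sequential filtering: cheap, broad assessments come first, and expensive ones (interviews, practical exercises) are reserved for the shrinking pool.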
Imada also commented on the notion of bias. It is not a weakness but rather a human condition, he said. In trying to accommodate for that human condition, labs need to recognize the existence of bias and cope with it organizationally and through the workplace culture, he stated. He noted that procedural strategies and events such as rotations, inspections, schedules, and work demands can either increase or mitigate the impact of bias depending on how they are managed. He offered a deliberately impractical solution: hiring people who lack the emotional need for satisfaction, so that examiners would not need to know the results of their work and would simply process it. But, he said, “The fact is we all have feelings, we all have biases, and trying to eliminate [bias] is not as important as trying to recognize it and trying to minimize its effect on the kinds of work that we do.”
Imada recognized that developing selection tests and improving the selection process of pattern evidence examiners is too large a problem for any one group of people or lab to do. The kind of funding and support required could come from an organization like the National Institute of Standards and Technology. He expressed that uncovering the critical knowledge, skills, abilities, and other characteristics (KSAOs) and core competencies of this job and validating an appropriate selection test would be extremely useful.
Becker reminded the audience that I-O psychology has a history of methodology to identify critical KSAOs and that information already exists that could be useful for advancing selection in forensic science. She pointed to the Occupational Information Network (O*NET) website from the Department of Labor4 and decades of research from the Dictionary of Occupational Titles.
4 See http://www.onetonline.org/link/summary/19-4092.00 [November 2016] for a summary report on the job of forensic science technician with the tasks to collect, identify, classify, and analyze physical evidence related to criminal investigations. Other related job titles include crime laboratory analyst, crime scene analyst, crime scene technician, crime scene investigator, evidence technician, forensic science examiner, forensic scientist, forensic specialist, latent fingerprint examiner, and latent print examiner.
These resources have pages of detail on the tasks, knowledge, skills, abilities, and other characteristics of the job, including motivations.
Ryan encouraged those in the forensic science field to capitalize on existing initiatives and methods. She was optimistic that it would be possible to validate some of the research presented at the workshop for selection purposes. Predictors of performance that already exist, like the form blindness test, could stand up to the scrutiny of the courts and be defensible as job relevant and consistent with business necessity if a formal validation study were conducted.
Since many forensic labs are very small, Ryan agreed that a validation study could be conducted through a consortium or coalition of labs. She pointed out that in the private sector, selection tools are thought of as belonging to companies as part of their competitive advantage. But there is no real competition between labs in forensic science; people with the right skills and motivations in the field writ large are needed, and the nature of the profession is amenable to selection tools that can be used across labs.
Becker pointed out the synergy and sense of collaboration that developed at this workshop. Ryan recognized both the professional challenge and personal challenge of making sure the conversations continue. She said the workshop was an example of connecting different disciplines. Going forward, there are major conferences and other events to which people with different perspectives and viewpoints might be invited. The personal challenge, suggested Ryan, is to create a network, add the people met at the workshop, and find ways to build on the information that could be useful.
In closing, Melissa Taylor (National Institute of Standards and Technology) noted she has worked in the forensic science community for almost 15 years, dealing with the types of questions that came up in this workshop. (See Box 4-1 for a summary of the key questions and observations that were raised during the workshop.) She said she was pleased to observe the level of enthusiasm from both communities at the workshop in finding answers to tough and important questions. She thanked everyone for participating in not only the workshop, but also in the networking opportunities to create a community of interest that can grow in the future. She expressed her hope that this workshop would be a beginning of a new conversation on personnel selection in forensic science.