In response to the request by the Laura and John Arnold Foundation and with input from foundation staff, the committee developed text for a Request for Proposals (RFP) (see Appendix C) that addresses Recommendation 11 of the 2014 National Research Council (NRC) report, Identifying the Culprit: Assessing Eyewitness Identification. In Recommendation 11, the committee called for
- a broad exploration of the merits of different statistical tools for use in the evaluation of eyewitness performance;
- a broad exploration of the effects of different system variables (e.g., additional variants on lineup procedures, witness lineup instructions) and estimator variables (e.g., presence or absence of weapon, elapsed time between incident and identification task, levels of stress) and—importantly—interactions between these variables using either the ROC approach or other tools for evaluation of binary classifiers that can be shown to have advantages over existing analytical methods; and
- the scientific community engaged in studies of eyewitness identification performance [to] work closely with law enforcement to identify other system and estimator variables that might influence performance and practical issues that might preclude certain strategies for influencing performance […and] that policy decisions regarding changes in procedure should be made on the basis of evidence of superiority and should be made in consultation with police departments to determine which procedure yields the best combination of performance and practicality.8
8See National Research Council. 2014. Identifying the Culprit: Assessing Eyewitness Identification. Washington, DC: The National Academies Press. Contextual information and the complete text of Recommendation 11 appear on pages 117-119 of the report.
In developing the RFP, the committee established criteria to be used in evaluating the scientific merit and research design of proposals submitted to the foundation. While proposals were not to be constrained by the particular ideas raised in Recommendation 11 of the NRC’s 2014 report, proposals designed to further advance the understanding of statistical tools appropriate for validating the reliability of eyewitness performance were to be particularly encouraged.
Via email correspondence and a series of conference calls, the committee drafted and reached consensus on the language of the RFP.
In keeping with the 2014 recommendation, the RFP specifically called for research to explore “the effects on eyewitnesses of different system variables (e.g., variants on lineup procedures or instructions given to eyewitnesses) and estimator variables (e.g., presence or absence of weapon, lighting conditions and distances, elapsed time between the incident and the identification task, levels of stress) and the interactions between these variables.” The RFP further stated that proposed research should “use statistical evaluation tools to improve upon existing methods.” It emphasized that proposed research should exhibit strong research design, using, for example, “random assignment, when feasible, and other research methodologies that allow for the strongest possible causal inferences when random assignment is not feasible.”
Foundation staff asked that the RFP, in keeping with the foundation’s core objective “to address our nation’s most pressing and persistent challenges using evidence-based, multi-disciplinary approaches,” emphasize that “interdisciplinary research partnerships and/or research performed in collaboration with law enforcement agencies and the judiciary” be encouraged.
The RFP described four areas of research of particular interest to the foundation:
- research that yields an improved understanding of the relative merits of simultaneous and sequential lineup procedures;
- research that assesses the effectiveness of other lineup procedures;
- research that helps characterize the evidentiary strength of an eyewitness’s identification or nonidentification of a suspect from a lineup. This is the binary classification problem of eyewitness identifications, which raises questions such as how to properly handle different kinds of error, which have different consequences; and
- research from the broader scientific community that addresses the wide range of issues related to eyewitness identifications, such as how the probability of a correct identification varies with estimator variables including, but not limited to, those mentioned above.
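The binary-classification framing in the third area above can be made concrete with a small numerical sketch. In such analyses, an identification procedure is often summarized by its hit rate (correct identifications on target-present lineups) and false-alarm rate (suspect identifications on target-absent lineups), which together locate one point on an ROC curve. All counts below are hypothetical, for illustration only:

```python
# Hypothetical counts from a mock-lineup experiment (illustrative only).
# "Target-present" lineups contain the culprit; "target-absent" lineups do not.
target_present_trials = 200
hits = 120                     # culprit correctly identified
target_absent_trials = 200
false_alarms = 30              # innocent suspect identified

hit_rate = hits / target_present_trials                  # 0.60
false_alarm_rate = false_alarms / target_absent_trials   # 0.15

# Each (false_alarm_rate, hit_rate) pair is one point on an ROC curve;
# sweeping the witness's confidence criterion traces out the full curve,
# which is one way to compare two lineup procedures as binary classifiers.
print(f"hit rate = {hit_rate:.2f}, false-alarm rate = {false_alarm_rate:.2f}")
```

The two error types carry different consequences, as the third bullet notes: a false alarm can implicate an innocent suspect, while a miss can let a culprit go free, so a single accuracy number is not an adequate summary.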
The RFP indicated that proposals were to be evaluated against four criteria:
- IMPORTANCE: Is the applicant proposing research that could produce important improvement in our understanding of eyewitness identification and the ability to reduce eyewitness identification errors? Could the insights arising from the proposed research be applied within the constraints of real-world conditions?
- EXPERIENCE OF THE RESEARCHER(S) AND RELEVANCE OF THEIR BACKGROUND(S)
- STUDY DESIGN: Is the applicant’s proposed study design likely to result in strong and useful insights? If the proposed study involves a randomized trial, can high-value data be collected?
- PARTNERS: Does the applicant’s team include all parties needed to perform the proposed study?
Finally, the RFP described the grant application process, which began with applicants submitting letters of interest that included a description of work to be performed. The committee assisted the Arnold Foundation in broadly distributing the RFP.
For the letters of interest, applicants were asked to address all four selection criteria listed above, but it was not expected that applicants would have finalized all aspects of the study design and partnership agreements. All four criteria were, however, used as a basis for evaluation when each full proposal was reviewed by the committee. While the committee would provide the foundation with assessments of the merits of each proposal, the foundation would make final decisions as to which proposals were ultimately funded.
In total, 20 letters of interest were received by the Arnold Foundation and then provided to the committee. The committee was asked to evaluate the letters against the criteria above and provide an evaluation of the letters to the foundation. The foundation, taking into account the committee’s evaluation, invited nine applicants to submit formal proposals. Nine proposals were received by the foundation and these were transmitted to the committee for review.
In July 2016, the committee met via conference call and at an in-person meeting to assess and draft text that provided its evaluation of the nine invited full proposals. In considering the full proposals, the committee, in keeping with its charge, evaluated the scientific merit and research design of the proposals. Based upon the criteria set forth by the RFP, the committee developed a list of questions by which to gauge the merit of the individual proposals:
- Does the proposed research ask a new or interesting question?
- Is the proposed research actionable/operationalizable in some way?
- Does the researcher/research team have the appropriate toolkit/expertise to carry out the proposed research?
- Does the proposed research offer an appropriate mechanism to collect data/evaluate collected data?
- Does the proposed research develop new statistical approaches and/or apply existing approaches not previously employed to evaluate the accuracy of eyewitness identifications?
- Does the proposal demonstrate that the researcher(s) has/have appropriate knowledge of the relevant scientific literature?
- Does the proposed research have the potential to significantly impact eyewitness identification procedures and is there a mechanism that would allow for dissemination of the research to the relevant stakeholder community/communities?
- Does the proposed research directly address items a and b of Recommendation 11 of the NRC report (see page 5)?
Based upon the application of these criteria, the committee found three proposals to be superior and found four additional proposals of interest and of significant scientific merit. The committee found the remaining two proposals to be inadequate.
With regard to study design, the committee observed some unevenness among the proposals. While several proposals provided detailed study designs, other proposals made only general statements about how the project would be conducted.
The committee noted that the members of the various research teams involved in the three proposals in the superior category and the four proposals of significant scientific merit possess the necessary academic qualifications and experience with which to carry out the proposed research projects.
The committee noted unevenness with regard to the identification of research partners. While several proposals provided detailed information about extramural partnerships, others did not.
The proposals raised two issues that the foundation might consider as it seeks to advance eyewitness identification research by supporting the current projects or other future research: (1) there is intrinsic value in making research data widely available, and projects that collect large data sets should be encouraged to share the data widely, and (2) a relationship between the scientific and law enforcement communities is necessary at all stages of a research project if the research is to have actionable outcomes.
The committee also noted that many of the submitted proposals commit the “prosecutor’s fallacy.” That is, they assume that we already know who the guilty party is and then seek the probability that the correct person will be identified by a given eyewitness identification process. But in criminal proceedings we do not generally know who committed the crime, so the proper question in examining various processes for eyewitness identification is, “What is the probability that the ‘truth’ revealed by the process will be correct?” That is a very different question and will generally lead to very different probabilities.
To illustrate, consider the analogous situation in drug testing, for which laboratory experiments normally give good information about the reliability of a test, that is, the probability of finding the truth when the truth is known. If the drug test for a person on trial came back positive, and the reliability of the test is known to be, say, 99 percent, a prosecutor might say that the test shows with very high probability that the defendant was using drugs. But that is not the correct assertion for the purposes of trial, because it ignores both the false positives those tests produce (which are analogous to erroneous eyewitness identifications) and the base rate of drug use in the tested population. Rather, the appropriate question is, “What is the probability that, when a drug test comes back positive, the test is correct?”
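The gap between the two questions follows directly from Bayes’ rule. A brief sketch, in which the 99 percent figures and the 2 percent base rate are illustrative assumptions rather than figures from the report:

```python
# Bayes' rule applied to the drug-test illustration (all numbers hypothetical).
sensitivity = 0.99   # P(positive test | user) -- the "reliability" a prosecutor cites
specificity = 0.99   # P(negative test | non-user)
base_rate = 0.02     # P(user) in the tested population (assumed)

# Total probability of a positive test: true positives plus false positives.
p_positive = sensitivity * base_rate + (1 - specificity) * (1 - base_rate)

# The question relevant at trial: P(user | positive test).
p_user_given_positive = sensitivity * base_rate / p_positive

print(f"P(user | positive test) = {p_user_given_positive:.2f}")
# With these assumed numbers the answer is about 0.67 -- far below the 0.99
# that a naive reading of "99 percent reliable" suggests.
```

The same arithmetic applies to eyewitness identifications: the probability that an identified suspect is in fact the culprit depends not only on the witness’s accuracy when the culprit is known, but also on how often the culprit is actually in the lineup.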
The committee raises this point only as a caution about how the research questions are framed. Experienced researchers should be able to focus clearly on the proper questions, and none of the proposals received was excluded because of this apparent imprecision.
The committee is delighted that the foundation has taken this important step to advance eyewitness identification research and looks forward to further activity in this area.