Cognitive and Human Factors in Expert Decision Making
ABSTRACT: Fallacies about the nature of biases have shadowed a proper cognitive understanding of biases and their sources, which in turn leads to ways to minimize their impact. Six such fallacies are presented: it is an ethical issue, it only applies to “bad apples”, experts are impartial and immune, technology eliminates bias, the bias blind spot, and the illusion of control. Then, eight sources of bias are discussed and conceptualized within three categories: (A) factors that relate to the specific case and analysis, which include the data, reference materials, and contextual information; (B) factors that relate to the specific person doing the analysis, which include past experience base rates, organizational factors, education and training, and personal factors; and, lastly, (C) the cognitive architecture and human nature that impact all of us. These factors can impact what the data are (e.g., how data are sampled and collected, or what is considered noise and therefore disregarded), the actual results (e.g., decisions on testing strategies, how analysis is conducted, and when to stop testing), and the conclusions (e.g., interpretation of the results). The paper concludes with specific measures that can minimize these biases.
ethical issues, in books,11 conferences, and training (e.g., “ethics in cognitive bias”12). It is a fallacy that arises from a basic misunderstanding of what cognitive bias is all about. It is not a matter of dishonesty, intentional discrimination, or a deliberate act arising from underlying ethical issues.13 It is true that there are cases of intentional professional misconduct, e.g., when a chemist in a drug laboratory in Boston forged tests.14,15 However, cognitive bias is not about such ethical issues of personal character, integrity, or intentional misconduct.

Second Fallacy: Bad Apples. Often when errors and biases are found, the experts involved are blamed for the errors, rather than acknowledging systemic issues, such as the human element in the analysis and the general susceptibility to bias.16 Surely there are cases where error is due to an examiner's competency, but these are relatively easy to detect and fix. The kind of biases discussed here, implicit cognitive biases, are widespread, are not due to incompetency, and are therefore harder to detect.

Third Fallacy: Expert Immunity. There is a widely held but incorrect belief that experts are impartial and immune to biases.17 However, the truth of the matter is that no one is immune to bias, not even experts.18 In fact, in many ways, experts are more susceptible to certain biases. The very making of expertise creates and underpins many of the biases.19 For example, experience and training make experts engage in more selective attention, use chunking and schemas (typical activities and their sequence), and rely on heuristics and expectations arising from past base rate experiences, utilizing a whole range of top-down cognitive processes that create a priori assumptions and expectations.

These cognitive processes enable experts to often make quick and accurate decisions. However, these very mechanisms also create bias that can lead them in the wrong direction. Regardless of the utility (and vulnerability) of such cognitive processing in experts, it does not immunize experts against bias; indeed, expertise and experience may actually increase (or even cause) certain biases. Experts across domains are subject to cognitive vulnerabilities.20

Although experts tend to be very confident (sometimes even overconfident), more experienced experts can actually perform worse than novices. This has been demonstrated, for example, in environmental ecology, where data collection is critical and underpinned by the ability to correctly identify and collect samples, and novices actually outperform experts.21

Fourth Fallacy: Technological Protection. People think that the mere use of technology, instrumentation, automation, or artificial intelligence eliminates bias. These can reduce bias, but even when they are in use, human biases are still at play because these systems are built, programmed, operated, or interpreted by humans. There is a danger that people will incorrectly believe that using technology is a guaranteed protection from being susceptible to and affected by bias. Furthermore, technology can even introduce biases, be it mass spectral library matching software, Automated Fingerprint Identification Systems (AFIS), or other technological devices.22

Fifth Fallacy: Bias Blind Spot. The experts themselves are often not aware of their biases and therefore hold the fallacious belief that they are not biased. One must remember that these are implicit cognitive biases, not intentional discriminatory types of biases. The bias blind spot23 is well documented and has been demonstrated in a variety of domains, including forensic science17 and forensic psychology.24 While it is relatively easy to see bias in others, we are often blind to our own biases. Research has found that 70% of forensic scientists now acknowledge that cognitive bias is a cause for concern in forensic science as a whole, but only 52% think it is a concern in their own specific forensic domain, and just 25% think it is relevant to them personally, reflecting the hallmarks of the bias blind spot.17,23

Sixth Fallacy: Illusion of Control. Even when experts are made aware of and acknowledge their biases, they nevertheless think they can overcome them by mere willpower.25 This is the illusion of control. Combating and countering these biases requires taking specific steps; willpower alone is inadequate to deal with the various manifestations of bias.

In fact, trying to deal with bias through the illusion of control may actually increase the bias, due to “ironic processing” or “ironic rebound”.26 Hence, trying to minimize bias by willpower makes you think of it more and actually increases its effect. This is similar to a judge instructing jurors to disregard certain evidence: by doing so, the judge actually makes the jurors notice that evidence even more.27

Beliefs in such fallacies (see Table 1) prevent dealing with the biases because they dismiss their power and even their very existence. We need to acknowledge the impact of biases and understand their sources, so we can take appropriate measures, when needed and when possible, to combat their effects.

Table 1. Six Fallacies about Cognitive Bias Commonly Held by Experts

1. Ethical Issues: It only happens to corrupt and unscrupulous individuals; it is an issue of morals and personal integrity, a question of personal character.
2. Bad Apples: It is a question of competency and happens to experts who do not know how to do their job properly.
3. Expert Immunity: Experts are impartial and are not affected, because bias does not impact competent experts doing their job with integrity.
4. Technological Protection: Using technology, instrumentation, automation, or artificial intelligence guarantees protection from human biases.
5. Blind Spot: Other experts are affected by bias, but not me. I am not biased; it is the other experts who are biased.
6. Illusion of Control: I am aware that bias impacts me, and therefore I can control and counter its effect. I can overcome bias by mere willpower.

■ EIGHT SOURCES OF BIAS

Cognitive biases are widespread, a ubiquitous phenomenon in many guises.13 Figure 1 illustrates eight sources of bias in expert decision making. It shows that cognitive bias can arise from a variety of sources, which are categorized into three groups. Category A relates to the specific case: something about this case causes bias in how data are perceived, analyzed, and interpreted. Other sources of cognitive bias, Category B, have nothing to do with the specific case; they arise from factors relating to the specific person doing the work: something about them (e.g., their experience, their personality, their working environment, their motivation) causes the bias. Further sources of cognitive bias, Category C, arise from human nature, the very cognitive architecture of the human brain that we all share, regardless of the specific case or the specific person doing the analysis.

Figure 1. Eight sources of bias that may cognitively contaminate sampling, observations, testing strategies, analysis, and conclusions, even by experts. They are organized in a taxonomy within three categories: starting at the top with sources relating to the specific case and analysis (Category A), moving down to sources that relate to the specific person doing the analysis (Category B), and, at the very bottom, sources that relate to human nature (Category C).
It is important to understand these different sources of bias so that we can more easily recognize and expose them; most importantly, each of the different sources of bias requires specific countermeasures.

(1) The Data. The first source of cognitive bias within the factors that relate to the specific analysis is the data. How can data cause bias? Well, it depends on the data. Some data, such as fingermarks, do not, per se, cause bias, as they convey no information beyond the friction ridge impressions. However, with other types of data (such as in the analysis of voice, handwriting, blood spatter, and bitemarks), the data can contain potentially biasing information. For example, in voice analysis, the content, or even the tone and screaming, can reveal a brutal assault. Similarly, gang-rape mixture DNA evidence can evoke emotions that can impact decision making.28−30

(2) Reference Materials. Reference materials can bias how the data are perceived and interpreted. This applies to a variety of data, including DNA, fingerprinting, and any decisions that are based on making comparisons. If DNA evidence interpretation is influenced by the “target” suspect known reference material (i.e., their DNA profile) so as to better fit them, then there is biased interpretation of the biological material from the crime scene.31

Rather than the actual evidence being the driver of the decision making process, where evidence is interpreted based on the data it contains, the suspect's profile is driving the decision making. Hence, rather than going from the evidence to the suspect (from data to theory), the reference materials cause examiners to go backward (circular reasoning) from the target/suspect to the evidence, thus biasing how the data are interpreted.32

Take the case of Kerry Robinson, who was wrongly convicted of rape based on mistaken analysis of DNA evidence.33 The two DNA examiners in the case made an error, as they seem to have been working backward from Kerry Robinson's known DNA profile to the latent evidence. Thus, the suspect “target” was driving the analysis rather than the evidence itself. He was exonerated and released from jail after serving 17 years.34 Indeed, when the same DNA evidence was later analyzed properly, different results were obtained.35

The problem with “going backward”, circular reasoning, is not limited to DNA and can be found in other comparative domains, such as fingerprinting, handwriting, and firearms (for a review, see ref 5), where the reference materials of the known suspect influence the interpretation of the data from the crime scene. This was exemplified when the FBI (as well as an expert hired by the defense) erroneously identified Mayfield as the Madrid bomber.36 Again, the “target” (in this case, the fingerprints of the suspect) was driving the analysis, resulting in an erroneous identification. For example, in the analysis, a signal in the evidence was perceived as noise and disregarded because it did not match the target: a typical error emerging from going backward and letting the target or expected results drive the analysis.

Furthermore, this source of bias is not limited to circumstances that have a “target” suspect per se; it can also arise from pre-existing templates and patterns, such as in the interpretation of blood pattern analysis or a crime scene. It can even impact what color is observed. Hence, bias is not limited to the interpretation of the appearance of a color (e.g., pink color in
the Griess test, incorrectly interpreted as meaning the suspects handled dynamite; see below), but can even impact what color is observed in the first place when Munsell color charts are used as reference templates.37

This bias goes beyond impacting only the conclusion and interpretation of what the presence of the color means, because it can also bias and impact the observation of what the color itself is (for this important distinction between biases impacting observations vs conclusions, see the Hierarchy of Expert Performance (HEP)6).

These types of cognitive biases are therefore not limited to a target or reference; they can even be caused by having a theory, a chart, or a pattern in mind, and they have been shown to impact a variety of expert domains.38−40

This bias can also occur, for example, in a messy mass chromatogram or infrared spectrum, where an analyst might “pick out” the peaks they are looking for and ignore the others. This is not intentional and happens without awareness. The expectation biases cognitive resources and attention toward a certain stimulus or signal (while suppressing and ignoring others)41 and impacts sensory responses to predictable object stimuli throughout the ventral visual pathway.42,43 It also biases perceptual sensitivity for those targets,44 changing sensory representations in the visual cortex,45 and hence impacts what we actually perceive and how.46 Just as it can bias the visual search of a mass chromatogram or infrared spectrum, it can impact detection in radiography38 and many other domains. In all of these examples, the human examiner is driven by a “target” they expect (or want) rather than by the actual data.

(3) Contextual Information. Experts are often exposed to irrelevant information. In the forensic domain, for example, such information may be that the suspect confessed to the crime, that they have been identified by eyewitnesses and other lines of evidence, or that the suspect has a criminal record.47 Even knowing the name of the suspect may be suggestive of a specific race, evoking biases and stereotypes. These all cause expectations that can impact not only the interpretation of the results obtained from the analysis but also the analysis itself, because the expectations impact the detection of what goes into the analysis as well as testing strategies. This source of bias is not derived from a target generated by the reference materials (see above) but from contextual information that can be irrelevant to the task carried out by the analyst.

In toxicology, for example, contextual information can bias testing strategies. Consider, for instance, when a post-mortem case is provided with contextual information such as “drug overdose” and/or that “the deceased was known to have a history of heroin use.” Such information can impact the established testing strategies, for example, going straight to the confirmation and quantification of a limited range of opiate-type drugs (morphine, codeine, 6-monoacetylmorphine, and other heroin markers) without running the other standard testing, such as an immunoassay, or without looking for other opioids that may have been present (e.g., fentanyl). Hence, the contextual information causes a confirmation bias approach and deviation from the standard testing and screening protocols. This has indeed caused errors in toxicology testing cases.10

In the forensic context, for example, irrelevant contextual information has been shown to impact data collection at the crime scene48 and to bias laboratory DNA analysis35 (for reviews, see refs 5 and 6). Similarly, deviations from standard testing strategies, data collection, and sampling can be biased by contextual information when the laboratory does analyses knowing that a Drug Recognition Expert (DRE) evaluated a person as being impaired and under the influence of a stimulant but there was no alcohol in their blood, or when a dog signals the presence of drugs, or given any other contextual information that creates an expectation of what the results should be.

Contextual expectations impact not only data collection and testing strategies but also the interpretation and conclusions of the analysis. For example, “a poor-quality chromatographic or mass spectral match for a drug could be consciously or subconsciously ‘upgraded’ to a positive result based on the expectation that the sample should be positive for that drug. Conversely, a similarly poor-quality match might be ‘downgraded’ to negative if the analyst is not expecting the drug to be present, or its presence ‘does not fit the circumstances of the case’. In both situations, the context could give the illusion of a stronger basis for the decision than is warranted by the data” (ref 9, page 381). Cognitive biases arise when task-irrelevant context causes some aspect of the analysis to be overweighted, underweighted, or neglected (e.g., not perceived, or determined to be noise, an anomaly, or an outlier). This does not only happen in subjective judgements; it can also bias even established procedures10 and the criteria for accepting evidence and proper judgment.49
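The quoted mechanism can be made concrete with a toy sketch. The following is a minimal illustration only: the function, scores, and thresholds are hypothetical assumptions, not part of any real toxicology software or of the procedures discussed in ref 9. It shows how letting a task-irrelevant expectation shift the decision criterion turns the same borderline match score into opposite calls:

```python
# Toy sketch of context-biased decision criteria (hypothetical values).
# The same borderline match score is "upgraded" or "downgraded" purely by
# the expectation the analyst brings to the case, not by the data.

NEUTRAL_THRESHOLD = 0.80  # assumed criterion for calling a match positive

def biased_call(match_score: float, expects_drug_present: bool) -> str:
    """Positive/negative call in which expectation shifts the criterion."""
    # A biased examiner effectively lowers the bar when expecting the drug
    # to be present and raises it when not; an unbiased one would not.
    shift = -0.10 if expects_drug_present else +0.10
    return "positive" if match_score >= NEUTRAL_THRESHOLD + shift else "negative"

borderline = 0.78  # same poor-quality match score in both scenarios
print(biased_call(borderline, expects_drug_present=True))   # positive ("upgraded")
print(biased_call(borderline, expects_drug_present=False))  # negative ("downgraded")
```

The data are identical in both calls; only the expectation differs, giving exactly the illusion of a stronger basis for the decision than the data warrant.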
The problem with task-irrelevant contextual information is that it can cause many kinds of biases that impact analysis in many different ways. Another impact of bias can be overlooking or underweighting the absence of data, not properly confirming results, or not considering alternatives. Consider, for example, how biasing terrorism contextual information can result in a false positive identification of dynamite. The Griess test, a presumptive color test for explosives,50 was positive for nitroglycerine. However, a positive result can also be obtained from nitrocellulose, found in a range of innocent, non-terrorism-related products. Nevertheless, because of the contextual information, it was wrongly concluded that the appearance of a pink color meant the suspects handled dynamite, when actually the traces were generated from an old pack of playing cards that they had been shuffling.51 The suspects were wrongfully convicted and sentenced to life imprisonment, only to be exonerated and released from jail after serving 16 years, following a review of the testing used to obtain their original convictions.52 Such errors happen when analysts are biased to accept results that are expected or wanted, without carrying out proper confirmation or considering alternatives.

Another example is hair-strand drug and alcohol testing. Even though there is a known false-positive problem with immunoassays, and therefore results must be confirmed by another, more specific technique, these confirmations were not always properly carried out when results were in line with contextual information.53

It is important to emphasize that irrelevant contextual information biases scientists and experts, and it can do so at an unconscious level: they may not be aware of the impact. The expectation biases what and how information is represented and processed in the brain.41−46 These biases impact experts and cannot be properly controlled by mere willpower (see the six bias fallacies above).

(4) Base Rate. An important asset that experts bring to their work is their experience from previous cases. However, such experience brings expectations to new cases that are not derived from the specific case or analysis at hand but nevertheless can still impact their interpretation. Hence, the sampling and analysis being conducted are impacted by factors that have nothing to do with the case at hand, but rather with expected
base rates generated from previous unrelated cases, which influence how this case is conducted.

For example, when the cause of death is cerebral hypoxia caused by hanging, the manner of death is most often suicide. In contrast, when cerebral hypoxia is caused by strangulation, the manner of death is most often homicide. This is not necessarily the case: although rarely, hanging can be the result of a homicide and strangulation can be a suicide. However, the base rate of associations between the cause and manner of death can bias the interpretation and determination of the manner of death. Such base rate biases are common in many domains, from medical diagnosis to security X-rays at airports.54
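As a worked illustration of how a strong base rate interacts with case-specific evidence, consider Bayes' rule with purely hypothetical numbers (illustrative assumptions, not actual mortality statistics): suppose homicide accounts for only 2% of hangings, and suppose a case-specific finding occurs with probability 0.3 in homicidal hangings but only 0.1 in suicidal ones. Then

$$P(\text{homicide} \mid \text{finding}) = \frac{P(\text{finding} \mid \text{homicide})\,P(\text{homicide})}{P(\text{finding})} = \frac{0.3 \times 0.02}{0.3 \times 0.02 + 0.1 \times 0.98} \approx 0.06$$

Even with a finding three times more likely under homicide, the posterior remains low because of the base rate. The bias arises when the examiner lets the familiar base rate substitute for this weighing altogether, so that the determination is driven by previous cases rather than by combining the prior with the data of the case at hand.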
The biasing effects of base rates are not limited to how the results of the analysis are interpreted. They can impact other stages of the analysis, even the sampling and data collection, as well as the detection of relevant marks, items, or signals, and even verification. For example, research on low target prevalence shows that if a target was rare or uncommon in past experience, then observers are more likely to miss it in the future.55,56 Hence, even when observers are searching for an item, mark, or signal, the base rate can bias their search: relatively rare and uncommon items, signals, or objects will make observers more likely to miss them even when they are present.
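A minimal simulation can illustrate this prevalence effect. The observer model below is an assumption for illustration (the distributions, criterion shift, and numbers are hypothetical and not fitted to the experiments in refs 55 and 56); it captures only the qualitative pattern that a rarer target is missed more often:

```python
# Minimal sketch: an observer whose decision criterion drifts with target
# prevalence misses more targets when targets are rare (illustrative only).
import random

random.seed(0)

def miss_rate(prevalence: float, trials: int = 100_000) -> float:
    """Fraction of present targets the observer misses at a given prevalence."""
    # Assumed observer model: a more conservative criterion when targets are rare.
    criterion = 0.5 + 0.4 * (0.5 - prevalence)  # rarer targets -> higher bar
    misses = present = 0
    for _ in range(trials):
        target_present = random.random() < prevalence
        # Assumed evidence: noisy signal, stronger when a target is present.
        evidence = random.gauss(1.0 if target_present else 0.0, 0.8)
        if target_present:
            present += 1
            if evidence < criterion:
                misses += 1
    return misses / present

for p in (0.5, 0.1, 0.02):
    print(f"prevalence {p:4.0%}: miss rate {miss_rate(p):.1%}")
# Output trend: the rarer the target, the more often it is missed.
```

The target-present evidence never changes; only the observer's prevalence-driven criterion does, which is the crux of base rate bias in detection tasks.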
Base rate bias derives from expectations generated from past similar cases.46 The issue is that this case is biased because its analysis is actually based on other cases. The crux of the bias is that perception and decisions are not based on the case itself. This type of bias is even more potent when the similarity to past cases is superficial and more in the eye of the beholder than in reality.57

(5) Organizational Factors. Organizational factors that can cause bias are many and varied, and they have been well documented in a variety of domains. When it comes to DNA and other forensic evidence, where analysis and work are often conducted within the adversarial legal system, cognitive bias may emerge from an allegiance effect and myside bias.58 Indeed, a study showed that when forensic experts are presented with identical data, they nevertheless reach conclusions biased toward the side that retained them:59 an adversarial allegiance and myside bias. These are implicit biases, not explicit partiality in which one side is openly favored over the other.

Many forensic science laboratories are part of law enforcement (or even part of the DA prosecution office). Such organizational influences have been recognized as biasing by the National Academy of Sciences Report, calling for “removing all public forensic laboratories and facilities from the administrative control of law enforcement agencies or prosecutors' offices” (ref 60, Recommendation #4, page 24). The point is that organizational factors and the administration surrounding forensic science induce biases.61

The impact of organizational factors applies to any and every laboratory: laboratories work within a variety of contexts, structures, and frameworks that can bias their work. For example, laboratories have clear hierarchies and dependencies. If there is a senior person who “signs off” on reports or analyses, there can be the danger of “writing what that person wants to read” and a lack of challenge to their scientific decisions. Thus, science is muddled with managerial authority and other organizational pressures.62,63

Other organizational factors relate to time pressure, expectations to reach certain results, stress, budget controls, pressure to obtain publications and other targets, and a whole range of organizational factors that can impact the work carried out in laboratories and other professional settings.

(6) Education and Training. Education and training play an important role in how work is conducted: for example, whether forensic examiners see their role more as supporting the police or as scientists. When approaching a case, training and education may instil the pursuit of a single hypothesis vs examining multiple hypotheses, considering alternative hypotheses (including scenarios proposed by the opposing side), conducting differential diagnosis, or reaching categorical decisions (such as “match” and “nonmatch”, often used in fingerprinting and firearms) vs using statistics and other methods to determine the strength of the evidence. Digital forensics, firearms, fingerprinting, and many other forensic domains have actually grown out of police work with minimal-to-no proper education and training in science.

(7) Personal Factors. Many personal factors impact biases and decision making. These include motivation, personal ideology, and beliefs. Furthermore, some people are more risk taking and others are more risk averse, and people also vary in their tolerance of ambiguity.64 Other individual differences between people can bias results; for example, tests that use color can be biased because people differ in what color they perceive when looking at the same item.37

In areas where there is more objective quantification and instrumentation, these factors are minimized. However, in areas where the human examiner has a greater role in deciding how to collect, sample, and interpret the data, and where there is subjectivity in evaluating the data and conclusions, such personal factors play a greater role in how work is carried out.

Even when technology is used, human biases are still at play. Technology cannot be totally objective, as humans are involved in constructing and operating the technology, as well as calibrating it, maintaining it, interpreting the results, and deciding if and what action to take.65 Indeed, ISO (International Organization for Standardization) standard 17025:2017 (general requirements for the competence of testing and calibration of laboratories that rely more on instrumentation and objective quantification)66 now follows the standard for the more subjective laboratory domains, ISO 17020 and ISO 17025, and includes specific requirements for impartiality and freedom from bias. Hence, it acknowledges the role of the human examiner even when quantification and instrumentation are used. It recognizes that even the use of instrumentation does not guarantee freedom from bias.67

Other personal factors that can cause bias in decisions include the need for closure, which can result in premature decisions or in opting to reach inconclusive decisions,68 how people respond to stress and fatigue, personality, and a whole range of personal factors that can impact expert decision making.69−71

(8) Human and Cognitive Factors, and the Human Brain. The workings of our brain create architectural and capacity constraints that do not allow it to process all the incoming information. The brain therefore engages in a variety of processes (mainly known as “top-down”) to make sense of the world and data around us. The human mind is not a camera; the active nature of human cognition means that we do not see the world “as it is.”

Beyond the many cognitive processes and the way the human brain is wired, which can cause biases, there are biasing effects related to social interaction, in-group and availability biases, processing fluency, and other biasing influences that impact all of us.72−75
■ SNOWBALL AND CASCADE BIAS

Bias does not impact only the individual in isolation or just one aspect of the work; often the bias cascades from one person to another, from one aspect of the work to another, influencing different elements of an investigation. As people and various aspects are influenced, they then influence others, turning from influenced to influencers, perpetuating the bias and impacting others. Then, biases are not only cascaded but gather momentum and snowball.32

■ OVERCOMING BIAS

First, we need to acknowledge the existence of bias and move beyond believing the fallacies about its nature. When people have a bias blind spot, or think that as experts they are immune to bias, or think that by mere willpower they can overcome it (see the six bias fallacies in Table 1), biases only perpetuate.

Second, as a general principle to combat bias, we need to take actions that will cause us to focus solely on the relevant data and not work backward. These need to be part of ongoing training and laboratory procedures. Accreditation to the appropriate standards and certification in the appropriate discipline may not solve the problems, but they do force laboratories to document procedures. External scrutiny can also be very helpful for illuminating areas of bias, especially given that ISO standards (e.g., 17020 and 17025)66 specifically require that steps are taken to make sure there is freedom from bias.67

Third, specifically, we can combat the various sources of bias by the following:

(A) Using blinding and masking techniques that prevent exposure to task-irrelevant information.76

(B) Using methods, such as Linear Sequential Unmasking (LSU), to control the sequence, timing, and linearity of exposure to information, so as to minimize “going backward” and being biased by the reference materials32 (a minimal sketch of this idea appears after this list).

(C) Using case managers who screen and control what information is given to whom and when.

(D) Using blind, double-blind, and proper verifications when possible.

(E) Rather than having one “reference target” or hypothesis, having a “line up” of competing and alternative conclusions and hypotheses.

(F) Adopting a differential diagnosis approach, where all the different conclusions and their probabilities are presented, rather than one conclusion.77,78
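To illustrate the ordering principle behind items (B) and (C), here is a minimal sketch of a sequential-unmasking workflow. It is a hypothetical illustration, not an implementation of the LSU procedure from ref 32; the class, stage names, and rule are assumptions. The point it encodes is that the analyst documents an interpretation of the evidence before the reference material is revealed, so the reference cannot drive the analysis backward:

```python
# Hypothetical sketch of sequential unmasking: evidence is interpreted and
# documented BEFORE the reference material (the potential "target") is seen.
from dataclasses import dataclass, field

@dataclass
class CaseWorkflow:
    evidence: str
    reference: str                      # held back by the case manager
    notes: list = field(default_factory=list)
    evidence_interpreted: bool = False

    def interpret_evidence(self, interpretation: str) -> None:
        # Stage 1: commit to what the evidence itself shows.
        self.notes.append(f"evidence first: {interpretation}")
        self.evidence_interpreted = True

    def unmask_reference(self) -> str:
        # Stage 2: the reference is released only after stage 1 is documented,
        # so it cannot shape the initial interpretation (no "going backward").
        if not self.evidence_interpreted:
            raise RuntimeError("reference withheld until evidence is interpreted")
        return self.reference

case = CaseWorkflow(evidence="crime-scene profile", reference="suspect profile")
case.interpret_evidence("features documented independently")
print(case.unmask_reference())  # only now is the comparison made
```

The enforcement is the point: the order of exposure becomes a property of the workflow rather than of the examiner's willpower.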
Finally, many experts and domains do not want to consider and acknowledge their own biases, let alone research them. However, it is essential to do so if we are to combat and minimize bias. It is the hope of the author that this paper contributes to that direction.

■ AUTHOR INFORMATION

Corresponding Author
Itiel E. Dror − University College London (UCL), London WC1H 9EZ, United Kingdom; orcid.org/0000-0003-4866-209X; Email: i.dror@ucl.ac.uk

Complete contact information is available at:
https://pubs.acs.org/10.1021/acs.analchem.0c00704

Notes
The author declares no competing financial interest.

■ ACKNOWLEDGMENTS

I want to thank Hilary Hamnett, Nikolas P. Lemos, Joseph Almog, Roderick Kennedy, and the anonymous reviewers for their helpful comments on an earlier version of this perspective.

■ REFERENCES

(1) McCord, B. R.; Gauthier, Q.; Cho, S.; Roig, M. N.; Gibson-Daw, G. C.; Young, B.; Taglia, F.; Zapico, S. C.; Mariot, R. F.; Lee, S. B.; Duncan, G. Anal. Chem. 2019, 91, 673−688.
(2) Barrio, P. A.; Crespillo, M.; Luque, J. A.; Aler, M.; Baeza-Richer, C.; Baldassarri, L.; Carnevali, E.; Coufalova, P.; Flores, I.; Garcia, O.; Garcia, M. A.; Gonzalez, R.; Hernandez, A.; Ingles, V.; Luque, G. M.; Mosquera-Miguel, A.; Pedrosa, S.; Pontes, M. L.; Porto, M. J.; Posada, Y.; Ramella, M. I.; Ribeiro, T.; Riego, E.; Sala, A.; Saragoni, V. G.; Serrano, A.; Vannelli, S. Forensic Sci. Int.: Genet. 2018, 35, 156−163.
(3) Butler, J. M.; Kline, M. C.; Coble, M. D. Forensic Sci. Int.: Genet. 2018, 37, 81−94.
(4) Bright, J.-A.; Cheng, K.; Kerr, Z.; McGovern, C.; Kelly, H.; Moretti, T. R.; Smith, M. A.; Bieber, F. R.; Budowle, B.; Coble, M. D.; Alghafri, R.; Allen, P. S.; Barber, A.; Beamer, V.; Buettner, C.; Russell, M.; Gehrig, C.; Hicks, T.; Charak, J.; Cheong-Wing, K.; Ciecko, A.; Davis, C. T.; Donley, M.; Pedersen, N.; Gartside, B.; Granger, D.; Greer-Ritzheimer, M.; Reisinger, E.; Kennedy, J.; Grammer, E.; Kaplan, M.; Hansen, D.; Larsen, H. J.; Laureano, A.; Li, C.; Lien, E.; Lindberg, E.; Kelly, C.; Mallinder, B.; Malsom, S.; Yacovone-Margetts, A.; McWhorter, A.; Prajapati, S. M.; Powell, T.; Shutler, G.; Stevenson, K.; Stonehouse, A. R.; Smith, L.; Murakami, J.; Halsing, E.; Wright, D.; Clark, L.; Taylor, D. A.; Buckleton, J. Forensic Sci. Int.: Genet. 2019, 40, 1−8.
(16) Thompson, W. C. Southwestern Univ. Law Rev. 2009, 37, 971−994.
(17) Kukucka, J.; Kassin, S.; Zapf, P.; Dror, I. E. J. Appl. Res. Mem. Cog. 2017, 6 (4), 452−459.
(18) Dror, I. E.; Kukucka, J.; Kassin, S.; Zapf, P. J. Appl. Res. Mem. Cog. 2018, 7 (2), 316−317.
(19) Dror, I. E. In The Paradoxical Brain; Kapur, N., Ed.; Cambridge University Press: Cambridge, UK, 2011; pp 177−188.
(20) Shanteau, J. In Advances in Design Research; Rohrmann, B., Beach, L. R., Vlek, C., Watson, S. R., Eds.; Elsevier: Amsterdam, 1989; pp 203−215.
(21) Soller, J. M.; Ausband, D. E.; Gunther, S. M. PLoS One 2020, 15 (3), e0229762.
(22) Dror, I. E.; Wertheim, K.; Fraser-Mackenzie, P.; Walajtys, J. J. Forensic Sci. 2012, 57 (2), 343−352.
(23) Pronin, E.; Lin, D. Y.; Ross, L. Person. Soc. Psych. Bull. 2002, 28, 369−381.
(24) Zapf, P.; Kukucka, J.; Kassin, S.; Dror, I. E. Psych. Public Policy Law 2018, 24 (1), 1−10.
(25) Thornton, J. I. J. Forensic Sci. 2010, 55 (6), 1663.
(26) Wegner, D. M. Psych. Rev. 1994, 101, 34−52.
(27) Steblay, N.; Hosch, H. M.; Culhane, S. E.; McWethy, A. Law Hum. Behav. 2006, 30, 469−492.
(28) Zajonc, R. B. Am. Psychol. 1980, 35, 151−175.
(29) Finucane, M. L.; Alhakami, A.; Slovic, P.; Johnson, S. M. J. Behav. Decis. Making 2000, 13, 1−17.
(30) Damasio, A. R. Descartes' Error: Emotion, Reason, and the Human Brain; Penguin Books: New York, 2005; pp 1−312.
(31) Jeanguenat, A. M.; Budowle, B.; Dror, I. E. Sci. Justice 2017, 57 (6), 415−420.
(32) Dror, I. E. Science 2018, 360 (6386), 243.
(33) Starr, D. Science 2016, DOI: 10.1126/science.aaf4160.
(34) Hanna, J.; Valencia, N. DNA Analysis Clears Georgia Man Who Served 17 Years in Wrongful Rape Conviction. CNN, January 10, 2020.
(35) Dror, I. E.; Hampikian, G. Sci. Justice 2011, 51 (4), 204−208.
(36) A Review of the FBI's Handling of the Brandon Mayfield Case; Office of the Inspector General, Oversight & Review Division, U.S. Department of Justice, 2006.
(37) Marqués-Mateu, Á.; Moreno-Ramón, H.; Balasch, S.; Ibáñez-Asensio, S. Catena 2018, 171, 44−53.
(38) Berlin, L. AJR, Am. J. Roentgenol. 2007, 189, 517−522.
(39) Kriegeskorte, N.; Simmons, W. K.; Bellgowan, P. S. F.; Baker, C. I. Nat. Neurosci. 2009, 12, 535−540.
(40) Vul, E.; Kanwisher, N. In Foundational Issues for Human Brain Mapping; Hanson, S., Bunzl, M., Eds.; MIT Press: Cambridge, MA, 2010; pp 71−92.
(41) Richter, D.; Ekman, M.; de Lange, F. P. J. Neurosci. 2018, 38, 7452−7461.
(42) Luck, S. J.; Ford, M. A. Proc. Natl. Acad. Sci. U. S. A. 1998, 95 (3), 825−830.
(43) Simons, D. J.; Chabris, C. F. Perception 1999, 28, 1059−1074.
(44) Stein, T.; Peelen, M. V. J. Exp. Psychol. Gen. 2015, 144, 1089−1104.
(45) Kok, P.; Brouwer, G. J.; van Gerven, M. A. J.; de Lange, F. P. J. Neurosci. 2013, 33, 16275−16284.
(46) de Lange, F. P.; Heilbron, M.; Kok, P. Trends Cognit. Sci. 2018, 22, 764−779.
(47) Gardner, B. O.; Kelley, S.; Murrie, D. C.; Blaisdell, K. N. Forensic Sci. Int. 2019, 297, 236−242.
(48) Eeden, C. A. J.; de Poot, C. J.; van Koppen, P. J. J. Forensic Sci. 2019, 64 (1), 120−126.
(49) Morewedge, C. K.; Kahneman, D. Trends Cognit. Sci. 2010, 14, 435−440.
(50) Moorcroft, M.; Davis, J.; Compton, R. G. Talanta 2001, 54, 785−803.
(51) Almog, J.; Zitrin, S. In Aspects of Explosive Detection; Marshall, M., Oxley, M., Eds.; Elsevier: Amsterdam, 2009; pp 47−48.
(52) Lissaman, C. Birmingham Pub Bombers Will Probably Never Be Found. BBC News, March 14, 2011.
(53) Lang, S. E. Report of the Motherisk Hair Analysis Independent Review; Ontario Ministry of the Attorney General: Canada, 2015.
(54) Egglin, T. K.; Feinstein, A. R. J. Am. Med. Assoc. 1996, 276 (21), 1752−1755.
(55) Wolfe, J. M.; Horowitz, T. S.; Kenner, N. M. Nature 2005, 435, 439−440.
(56) Wolfe, J. M.; Horowitz, T. S.; Van Wert, M. J.; Kenner, N. M.; Place, S. S.; Kibbi, N. J. Exp. Psychol. Gen. 2007, 136, 623−638.
(57) Shafir, E. B.; Smith, E. E.; Osherson, D. N. Mem. Cognit. 1990, 18 (3), 229−239.
(58) Simon, D.; Ahn, M.; Stenstrom, D. M.; Read, S. J. Psych. Public Policy Law 2020.
(59) Murrie, D. C.; Boccaccini, M. T.; Guarnera, L. A.; Rufino, K. A. Psych. Sci. 2013, 24, 1889−1897.
(60) Strengthening Forensic Science in the United States: A Path Forward; National Academies Press: Washington, DC, 2009.
(61) Whitman, G.; Koppl, R. Law Prob. Risk 2010, 9, 69−90.
(62) Howard, J. Cognitive Errors and Diagnostic Mistakes: A Case-Based Guide to Critical Thinking in Medicine; Springer: New York, 2019.
(63) Cosby, K. S.; Croskerry, P. Acad. Emergency Med. 2004, 11, 1341−1345.
(64) Saposnik, G.; Redelmeier, D.; Ruff, C. C.; Tobler, P. N. BMC Med. Inf. Decis. Making 2016, 16, 138.
(65) Dror, I. E.; Mnookin, J. Law Prob. Risk 2010, 9 (1), 47−67.
(66) General Requirements for the Competence of Testing and Calibration Laboratories, 3rd ed.; ISO/IEC 17025; International Organization for Standardization/International Electrotechnical Commission: Geneva, Switzerland, 2017.
(67) Dror, I. E.; Pierce, M. L. J. Forensic Sci. 2020, 65 (3), 800−808.
(68) Dror, I. E.; Langenburg, G. J. Forensic Sci. 2019, 64 (1), 10−15.
(69) Gok, K.; Atsan, N. Intern. J. Business Soc. Res. 2016, 6 (3), 38−47.
(70) Neal, T. PLoS One 2016, 11 (4), e0154434.
(71) Miller, A. K.; Rufino, K. A.; Boccaccini, M. T.; Jackson, R. L.; Murrie, D. C. Assessment 2011, 18 (2), 253−260.
(72) Griffin, D.; Ross, L. In Advances in Experimental Social Psychology; Zanna, M. P., Ed.; Academic Press: San Diego, CA, 1991; pp 319−359.
(73) Ross, L.; Greene, D.; House, P. J. Exp. Soc. Psych. 1977, 13, 279−301.
(74) Oppenheimer, D. M. Trends Cognit. Sci. 2008, 12, 237−241.
(75) Goldstein, D. G.; Gigerenzer, G. Psychol. Rev. 2002, 109, 75−90.
(76) Robertson, C., Kesselheim, A., Eds. Blinding as a Solution to Bias: Strengthening Biomedical Science, Forensic Science, and Law; Academic Press: New York, 2016; pp 1−388.
(77) Maude, J. Diagnosis 2014, 1, 107−109.
(78) Barondess, J. A., Carpenter, C. C., Eds. Differential Diagnosis; Lea & Febiger: Philadelphia, PA, 1994; pp 1−800.