
Controlling Heuristic Biases

(Originally published January 2014)

In common usage, bias refers to fixed ideas that reflect ignorance and/or prejudice, along with a stubborn rejection of conflicting evidence and rational argument. However, cognitive psychologists such as Daniel Kahneman and Amos Tversky have generated a large body of research regarding heuristic biases, i.e., mental shortcuts that lead to systematic errors in judgment. Heuristic biases are not substantive beliefs; rather, they are (mostly unconscious) ways of processing information automatically and quickly, accompanied by a strong sense of conviction despite their fallibility. Approximately 20 heuristic biases associated with intuition have been identified. These biases increase the likelihood of jumping quickly to erroneous conclusions. Any professional who makes high-stakes decisions regarding children and families should become aware of several of these heuristic biases and begin to develop approaches to combating them. Daniel Kahneman’s Thinking, Fast and Slow (2011) contains an invaluable discussion of these biases and how they influence decision making.

 

One of the heuristic biases that often influence child welfare practice is the halo effect, i.e., “the tendency to like (or dislike) everything about a person – including things you have not observed.” According to Kahneman, “the halo effect increases the weight of first impressions, sometimes to the point that subsequent information is mostly wasted.” When caseworkers like parents and empathize with their struggles, they may ignore, minimize or rationalize risk of abuse and neglect; and if they dislike or disapprove of parents, safety threats and risks to children may be exaggerated. One of the challenges of casework in child welfare is the need to make dispassionate, unblinking assessments of parents and family functioning while engaging emotionally with family members – including abusive or neglectful caregivers – in non-punitive and helpful ways. Finding a workable balance between intellectual detachment, which facilitates seeing people as they are, and emotional engagement, which overcomes barriers to developing partnerships with troubled family members, is a developmental challenge for inexperienced practitioners. Some level of self-awareness, hopefully encouraged by reflective supervision, is essential to meeting this challenge successfully.

 

One reason CPS caseworkers and supervisors may be unusually susceptible to heuristic biases, especially the halo effect and confirmation bias, is that they must make decisions regarding child safety quickly, often with inadequate information. Once caseworkers’ initial impressions regarding child safety develop into beliefs, confirmation bias ensures that these beliefs will be difficult to change. The English scholar Eileen Munro has described the impact of confirmation bias on caseworkers’ decision making in 45 child maltreatment fatality cases in England between 1973 and 1994. In these cases, Munro states, “The most striking and persistent criticism (of review teams) was that professionals were slow to revise their judgment. The current risk assessment of a family had a major influence on responses to new evidence.” Similarly, in child welfare agencies that require initial determinations that children are Safe or Unsafe, these judgments are likely to influence the interpretation of new information that invites reconsideration of child safety assessments.

 

One of the striking characteristics of persons with strongly held but questionable beliefs is that they are likely to believe there is strong evidence to support their judgment. One of the reasons that biases are difficult to undo is that they are frequently reinforced by what may seem compelling evidence. Biased persons scan the environment for evidence that supports their beliefs, and ignore evidence that conflicts with their beliefs. If, despite one’s best efforts, conflicting evidence cannot be avoided, then the tendency is to give the evidence in question little or no weight. Munro quotes Sutherland (1992) on strategies for avoiding challenges to beliefs:

 

  • First, people consistently avoid exposing themselves to evidence that might disprove beliefs.

  • Second, on receiving evidence against their beliefs, they often refuse to believe it.

  • Third, the existence of a belief distorts people’s interpretations of new evidence in such a way as to make it consistent with the belief.

  • Fourth, people selectively remember items that are in line with their beliefs.

 

It is easy to understand why persons with strongly held political and religious beliefs would be resistant to entertaining challenges to ideas and values that have become part of their identity. It is less evident why professionals whose jobs require quick assessments based on limited information and contact with specific families would tenaciously resist evidence that suggests they may have been initially mistaken about child safety and/or the risk of maltreatment to children. Kahneman’s discussion of two different types of mental functioning, i.e., System 1 and System 2, in Thinking, Fast and Slow provides a plausible answer. System 1 takes center stage in mental life whenever personal safety is threatened and/or whenever a person has little or no opportunity to reflect on decisions. System 1:

 

  • Operates automatically, quickly, seemingly without effort, and is always “on”

  • Draws on ample stores of energy

  • Is intuitive and confident in its intuitions

  • Is biased to believe and confirm

  • Infers and invents causes and intentions based on whatever information is available

  • Likes coherent stories

  • Neglects ambiguity and suppresses doubt

  • Ignores absent evidence, i.e., What You See Is All There Is (WYSIATI)

  • Is poor at statistics

 

System 2, on the other hand:

 

  • Is cautious and deliberate

  • Is analytical

  • Draws on limited stores of mental energy

  • Is able to control impulses

  • Is capable of doubt

 

System 1 is a set of mental functions oriented around the quick assessments of people and situations required for immediate action. For the most part, System 2 endorses and supports the assessments and actions generated by System 1, partly to conserve limited energy and partly because System 1’s intuitions have led to actions that create an investment in being viewed as “right”. To acknowledge error is to incur a psychological cost, i.e., reduced confidence in one’s intuitive judgments, and a social cost resulting from the public admission that one’s mistakes may have resulted in harm to another person, perhaps a child.

 

Moderating Confidence In Intuition

 

According to Daniel Kahneman, “A remarkable aspect of your mental life is that you are rarely stumped. True, you occasionally face a question such as 17 x 24 = ? to which no answer immediately comes to mind, but these dumbfounding moments are rare. The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way. … Whether you state them or not, you often have answers to questions that you do not completely understand, relying on evidence that you can neither explain nor defend.” Practitioners and policymakers need to grasp that intuition, though remarkably powerful and invaluable in some circumstances, is vulnerable to systematic errors that are difficult to control. Unquestioning dependence on intuition leads to overconfidence in decision making and to disregard for evidence that conflicts with strongly held beliefs. System 1, to which intuitive capacities belong, is poor at statistics and given to overestimating the likelihood of extremely rare events. According to Kahneman’s description, System 1 likes coherent stories with clear causal connections and is prone to filling in the blanks with assumptions when information is missing.

 

Decision makers should train themselves to carefully examine evidence, and gaps in evidence, before jumping to conclusions, even when they have a powerful sense of subjective certainty in the truth of their intuitions. Initial impressions of people, and the beliefs that develop from those impressions, should be viewed and presented to others as provisional, with full awareness of the possibility of error. The capacity to postpone judgment when confronted with ambiguity and complexity should be cultivated despite the uncomfortable emotional tension likely to result.

 

Kahneman describes one characteristic of System 1 that can be used as a check on intuition: the capacity to be surprised. System 1 employs prototypes, i.e., standard patterns, in making intuitive judgments. Fortunately, System 1 is able to recognize anomalies, that is, behaviors or characteristics that don’t fit expectations. Chronically neglectful parents may be nurturing in some situations; abusive parents may sometimes demonstrate surprising patience and empathy with a disobedient child. Anomalies are indicators that persons, situations or research findings contain unexpected complexities that require further study, and they can activate System 2, i.e., patient analytical effort to understand someone or something in a deeper way.

One of the antidotes for the halo effect and confirmation bias is the capacity to be surprised, and to be opened to inquiry by the unexpected in people, situations and experiences that have previously seemed familiar and well understood.

 

Developing Expert Intuitions

 

One of the most productive and praiseworthy collaborations between scholars with very different perspectives has been the sustained discussion regarding intuition between Daniel Kahneman and Gary Klein. Klein is the author of Sources of Power: How People Make Decisions (1998), a discussion of studies of naturalistic decision making. Klein bases his analysis of expert decision making on studies of chess masters, expert fire-ground commanders and submarine captains. Experts faced with time pressure, in Klein’s account, do not use analytical methods; instead they depend on holistic pattern recognition in which cues ignored or not even perceived by non-experts immediately generate effective responses. Klein has a low regard for actuarial methods of predicting events of interest, methods of which Kahneman has been a strong proponent. Klein emphasizes the power of intuition, while Kahneman has written at length about its pitfalls. Nevertheless, through a sustained civil and rational discussion, Klein and Kahneman have been able to agree on the conditions for developing expert intuitions.

 

Both Klein and Kahneman agree that subjective certainty is not evidence for the validity of intuitions. Kahneman writes, “If subjective confidence is not to be trusted, how can we evaluate the probable validity of an intuitive judgment? When do judgments reflect true expertise? The answer comes from the two basic conditions for acquiring a skill:

 

  • an environment that is sufficiently regular to be predictable

  • an opportunity to learn these regularities through prolonged practice

 

When both of these conditions are satisfied, intuitions are likely to be skilled.”

 

Expert intuition recognizes meaningful patterns that have explicit implications for decision making. For example, expert practitioners will recognize indicators of deliberate cruelty in abusive parents’ punishment of children; they will not confuse common instances of excessive discipline with torture of a child or battered child syndrome. Situational neglect, sporadic neglect and chronic neglect require different responses from child welfare systems; ignoring these differences leads to unskilled and ineffective interventions. Operationally useful typologies are helpful in developing pattern recognition, but development of sound typologies usually requires scholarly help, which has been in short supply in child protection during the past two decades. Still, there have been some notable attempts to develop operationally useful typologies, for example, Carole Bowdry’s typology of physical abuse, and these typologies deserve more attention than they have received.

 

Ideally, experts can both intuitively and accurately identify meaningful patterns and explain the rationale for their intuitions. However, one of the interesting features of intuitive expertise is that skilled pattern recognition often develops before an analytical understanding of the skill does. An outstanding baseball player may be a great hitter but a poor coach. Nevertheless, development of intuitive expertise requires repeated exposure to a standard set of elements, features, (game) moves or processes. Training programs can lay the groundwork for pattern recognition, but there is no substitute for experience, as both Klein and Kahneman have pointed out. Kahneman’s estimate is that 10,000 hours of practice are required to produce a chess master, the equivalent of roughly 5 hours of practice per day for 6 years.

 

Truth Seeking Is A Social Enterprise

 

Heuristic biases frequently go unrecognized. Any manager who has been part of hiring panels has probably heard job candidates assert that they have no biases, by which they may mean ugly racial biases or biases regarding sexual preferences. I have occasionally heard otherwise intelligent adults in classroom settings confidently state that they lack a culture, a term they perhaps associated with distinctive food tastes, cultural traditions or information regarding their family’s lineage. Heuristic biases are more difficult to recognize than racist beliefs or cultural traditions because they affect how information is processed and interpreted. These processes are mostly unconscious, and even when there is some degree of awareness of how beliefs are protected from challenges, the influence of biases is likely to be minimized. Even thoughtful adults tend to be naïve realists: the world is as it appears. It’s other people who can’t see the obvious and have weird ideas.

 

It is not enough to seek self-awareness of biases. The social milieu in which high stakes assessments occur should encourage open expression of differences and debate. Acknowledging intellectual mistakes based on a reconsideration of evidence should be viewed as a personal strength, not a humiliating loss of face.

 

According to Kahneman, “the way to block errors that originate in System 1 is simple in principle: recognize the signs that you are in a cognitive minefield, slow down and ask for reinforcement from System 2.” In practice, this is difficult to do without the help of others. Kahneman suggests that “… it is much easier to identify a minefield when you observe others wandering into it than when you are about to do so.” “Observers are less cognitively busy and more open to information than actors,” he states.

 

Kahneman has a surprising confidence in the capacity of organizations to avoid errors through orderly procedures and quality control, but he also identifies the importance of organizational culture when he asserts in the final sentence of Thinking, Fast and Slow that “(Decision makers) will make better choices when they trust their critics to be sophisticated and fair, and when they expect their decision to be judged by how it was made, not only by how it turned out.”

 

References

 

Bowdry, Carole, “Toward a Treatment-Relevant Typology of Child Abuse Families,” Child Welfare, Volume 69, number 4, pp. 333-340, July-August 1990.

 

Kahneman, Daniel, Thinking, Fast and Slow, Farrar, Straus and Giroux, 2011.

 

Klein, Gary, Sources of Power: How People Make Decisions, MIT Press, 1998.

 

Munro, Eileen, “Common Errors of Reasoning in Child Protection Work,” Child Abuse and Neglect, Volume 23, number 8, pp. 745-758, 1999.

  

deewilson13@aol.com

    
