When we make decisions, we tend to expect consistency and therefore to be overconfident in our choices. This is a common error.
As Daniel Kahneman, the psychologist and author, writes in “Thinking, Fast and Slow” (Farrar, Straus & Giroux, November 2011): “We are prone to think that the world is more regular and predictable than it really is, because our memory automatically and continuously maintains a story about what is going on, and because the rules of memory tend to make that story as coherent as possible and to suppress alternatives.” According to Kahneman, fast thinking is not prone to doubt.
Kahneman, a Nobel laureate, helps us understand how we generate intuitive opinions on complex matters. We have answers to questions we do not completely understand, relying on evidence we can neither explain nor defend.
System 1, or unconscious/fast/automatic thinking, will find a related question that is easier and answer it. Kahneman calls this the operation of answering an easier question in place of a more complex one. It is an act of substitution, with the heuristic question being the simpler question that gets answered instead.
The technical definition of a heuristic is a simple mechanism that helps us find adequate, albeit incomplete or flawed, answers to difficult questions.
This heuristic comes into play when a clinician confronts a complex patient and is expected to provide an immediate answer to a complex question. Patients want and expect a fast response, and they get it. Relying on System 1, or intuitive/fast/automatic thought, the clinician answers an easier question, providing the wrong answer, but with great confidence and at times moxie.
This substitution of questions leads to systematic errors. Is that surgery necessary or not? The errors matter if you are the patient.
Confidence is a “feeling” determined by the coherence of the story we tell ourselves and by the ease with which we recall its details. This confidence occurs even when evidence is lacking. The bias toward a coherent story generates overconfidence in the decision.
Kahneman calls this the “illusion of validity.” He states that when a compelling impression of an event conflicts with general knowledge, the subjective impression commonly prevails. System 1 thinking (fast/unconscious/automatic) suggests the incorrect intuition, and System 2 thinking (slow/conscious/rational), if ignorant or lazy, endorses it.
In healthcare, patients interact with doctors who exercise their judgment with evident confidence. That is part of the culture of being a doctor. At times, doctors pride themselves on having powerful intuition.
It is important that patients who rely on doctors’ decisions and intuition have the opportunity to question those decisions. It is important to shift to System 2, or conscious/analytical/slow thinking.
Another heuristic, the representativeness heuristic, may also play a role in producing errors. System 1 is triggered by stereotypes that resemble the issue being considered. For example, if a stereotype (a pattern of a disease) is similar to a patient’s presentation, the representativeness heuristic will rule even when statistics and other data tell us that the disease is unlikely. The fact that stereotypes govern judgments may be another source of clinical error.
That is why we bring multiple opinions to clinical decisions and challenge all the assumptions and conclusions. Bringing multiple independent opinions to a complex case is a powerful way of avoiding mistakes. This is what Kahneman calls the “principle of independent judgments (and decorrelated errors).” Seeking independent opinions makes good use of the diversity of knowledge and opinion in the group.
Kahneman describes the inferiority of expert judgment when considering complex problems. Experts are inconsistent in making summary judgments of complex information, and the extent of that inconsistency matters. For example, radiologists asked to evaluate X-rays as “normal” or “abnormal” contradict themselves 20% of the time when they see the same X-rays on separate occasions. This level of inconsistency is typical even when a case is re-evaluated within a few minutes. Unreliable judgments cannot be valid predictors of anything.
True intuitive expertise is learned from prolonged experience with effective and timely feedback on mistakes. That is why Rupert Case Management (RCM) maintains a database of all errors relating to knowledge and process. Our objective is to learn from mistakes and missteps and to embed that knowledge in our institutional processes to help future clients.
The accuracy of decision making depends on the doctor’s experience and on the quality and speed with which they discover their mistakes.
Anesthesiologists have a better chance of developing intuition than radiologists because of the rapid feedback anesthesiologists receive when a patient on the operating table starts to fail. Radiologists’ mistakes might never be discovered, which prevents learning from occurring.
Our work at RCM is to assist patients in the comprehensive assessment of the clinical decisions made about their treatment and care.