Listen to your patient; he is telling you the diagnosis. Sir William Osler

The process of reaching a conclusion in science and medicine based on data input should be rational and reproducible. But is it? Consider a scenario where you believe adding tacrolimus to posttransplant cyclophosphamide after an HLA-haplotype-mismatched transplant will markedly reduce the incidence and severity of acute graft-versus-host disease (GvHD). Your belief is based on biological considerations, data in mice, in vitro experimental data and your experience with tacrolimus in other clinical settings. To see if you are correct you do a phase-2 clinical trial of 50 subjects to whom you give posttransplant cyclophosphamide and tacrolimus. You compare the rate and severity of acute GvHD with your data from similar transplants using posttransplant cyclophosphamide only. This is referred to as a historical control but, as we shall see, a hysterical control might be more appropriate. You also review the biomedical literature for similar studies. You conclude, as you predicted, that adding tacrolimus decreased the risk and severity of acute GvHD. This is a breakthrough and you hastily submit a typescript for publication (not BONE MARROW TRANSPLANTATION which has too high standards, perhaps BIOLOGY of BLOOD and MARROW TRANSPLANTATION) where it is under consideration. Because this is an important clinical observation and you fear being wrong you repeat the trial, again with 50 subjects. This time the data, surprisingly to you, suggest no benefit of adding tacrolimus to posttransplant cyclophosphamide. What should you do: (1) a 3rd trial with more subjects; (2) be more circumspect about your conclusion; (3) withdraw your typescript from consideration; or (4) try to find an explanation for the different results of the two trials.

Almost all the transplant experts to whom I presented this question recommended trying to find an explanation for the difference between results of the two trials rather than doing a larger trial with more subjects or, better, a large randomized trial. Respondents had so much faith in the results of each trial that, despite the contradictory outcomes and conclusions, they sought an explanation for the discordance. It turns out clinicians rarely attribute a deviation of results from expectations to sampling variability, because they can always find a causal explanation for the discordance. The most likely explanation, of course, is sampling variability. To avoid this cognitive error one should repeat the trial with more subjects or do a large randomized trial. However, because most clinical trialists are subject to the way the human mind operates (discussed below) they believe in the law of small numbers. Namely, they fail to appreciate that observations from a small sample may insufficiently represent the population of interest. They are also victims of a confirmation heuristic or bias, as we shall see below.
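How easily sampling variability alone can produce the discordance described above is worth making concrete. The sketch below is a minimal simulation, assuming, purely for illustration, that tacrolimus has no effect, that the true acute GvHD rate equals a hypothetical historical-control rate of 45% and that a trial is called "positive" if its observed rate falls at least 10 percentage points below that control; none of these numbers comes from real data.

```python
# Illustrative only: how often do two 50-subject single-arm trials drawn from
# the SAME underlying acute GvHD rate reach opposite conclusions by chance?
# All rates and thresholds below are hypothetical.
import random

random.seed(1)

TRUE_RATE = 0.45            # assumed true acute GvHD rate, unaffected by tacrolimus
HISTORICAL_CONTROL = 0.45   # assumed historical-control rate
N = 50                      # subjects per trial
BENEFIT_MARGIN = 0.10       # call a trial "positive" if >= 10 points below control
REPS = 10_000               # number of simulated trial pairs

def trial_rate(rate: float, n: int) -> float:
    """Observed acute GvHD rate in one simulated single-arm trial of n subjects."""
    return sum(random.random() < rate for _ in range(n)) / n

discordant = 0
for _ in range(REPS):
    first_positive = trial_rate(TRUE_RATE, N) <= HISTORICAL_CONTROL - BENEFIT_MARGIN
    second_positive = trial_rate(TRUE_RATE, N) <= HISTORICAL_CONTROL - BENEFIT_MARGIN
    if first_positive != second_positive:
        discordant += 1

print(f"Pairs of 50-subject trials reaching opposite conclusions: "
      f"{100 * discordant / REPS:.1f}%")
```

Even with no true effect, a non-trivial fraction of such trial pairs reach opposite conclusions, which is precisely the situation in the scenario above; enlarging the trials or randomizing shrinks this play of chance.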

This cognitive mistake was first described 30 years ago by Tversky and Kahneman in an article in Psychological Bulletin entitled: Belief in the Law of Small Numbers. It describes a heuristic or cognitive shortcut. They wrote:

People have erroneous intuitions about the laws of chance. In particular, they regard a sample randomly drawn from a population as highly representative, that is, similar to the population in all essential characteristics. The prevalence of the belief and its unfortunate consequences …. are illustrated by the responses of professional psychologists to a questionnaire concerning research decisions [1].

The ubiquity of this cognitive error in diverse sciences is such that Kahneman, a psychologist, received the 2002 Nobel Prize in Economics. (Tversky died in 1996).

A heuristic, from the Greek heurískō (I find or discover), is a rapid approach to problem solving using a practical method such as trial and error, a rule of thumb or an educated guess. A heuristic can be thought of as a cognitive shortcut based on recognizing a prior pattern or experience (more on this below). A heuristic approach is not guaranteed to give the correct answer or to be rational, but it is hopefully sufficient to reach an immediate conclusion or an approximation of the truth. Heuristics reduce the effort of decision-making. They challenge the notion humans come to conclusions through a deliberative, rational process. In some regards they resemble thin slicing, which I have discussed previously and return to below [2]. Heuristics are useful but, as we shall see, they sometimes result in cognitive biases or errors with untoward consequences. These errors are common and potentially dangerous in medicine.

Cognitive scientists have identified two processes humans use to analyze incoming data, each associated with different brain regions. System 1 is rapid and intuitive. For example, incoming data are viewed in the context of previously observed patterns, termed pattern recognition. System 2, in contrast, is slower, analyzing incoming data in greater depth. It is the system we associate with rational decision-making. Because of its greater speed and reduced effort the tendency is for us to analyze incoming data using System 1. Although this approach often works it is subject to error, sometimes grave.

When Sir William Osler said: Listen to your patient; he is telling you the diagnosis he was, of course, unaware of recent progress in understanding how the human mind works and how cognitive biases might lead you to a wrong diagnosis even when you listen carefully to your patient. Diagnostic errors are common, with an estimated frequency of 10–40% [3, 4], and are a serious source of morbidity and death [5, 6]. Many of these diagnostic errors are the result of heuristics.

Another important mental process resulting in diagnostic errors is thin slicing. Thin slicing describes the ability to find patterns in events based only on a thin slice of experience: making very quick inferences about a subject or question from few data, such as he looks intelligent or he looks honest. Used car salesmen thrive on this. Decisions or judgments based on thin slicing can be as accurate as, or even more accurate than, those based on much more information. However, thin slicing can result in monumental errors, an example of which is the Warren Harding effect. On 4th March 1921, Warren G. Harding became the 29th President of the United States. This confused many people for good reason; Harding lacked experience and intellectual ability and many deemed him unfit for office. So how did he beat his more qualified opponent? Had his campaign team cleverly outwitted the opposition? No, the reason is even though Harding wasn’t the best qualified candidate he looked the most presidential. As many readers will know, Harding is judged one of the worst US Presidents, although he currently faces strong competition.

At this point readers may wonder what heuristics and thin slicing have to do with haematopoietic cell transplants. A lot. Consider, for example, the role of heuristics in diagnosing acute GvHD. There are no accurate or precise tests to diagnose acute GvHD. Skin and rectal biopsies early posttransplant are unreliable (for example [7,8,9,10,11,12,13]) and radiological studies lack sensitivity and specificity [14, 15]. Biomarker studies are useful to predict severity and outcome but not to diagnose acute GvHD (reviewed in [16, 17]). The result is that diagnosis of acute GvHD is probabilistic, not deterministic, relying on assessment of diverse, often inaccurate and/or imprecise, data inputs. Consequently, it is not surprising there are considerable discordances between clinical observers (reviewed in [18]) and between autopsy findings and clinical diagnoses [19].

Next, assume you are a visiting professor at a transplant center. The house staff ask you to see a transplant recipient they suspect has acute GvHD. You are told the recipient is > 60 years old and received a bone marrow graft from a young HLA-haplotype-matched relative who was cytomegalovirus (CMV)-sero-positive whilst the recipient was CMV-sero-negative. The pretransplant conditioning regimen was fludarabine and cyclophosphamide and posttransplant immune suppression was cyclophosphamide, tacrolimus and mycophenolate mofetil. It is now 21 days posttransplant and the recipient has a diffuse rash and modest diarrhea with normal liver function tests. Your highly anticipated Medicine Grand Rounds lecture, for which you traveled 3000 km missing your child’s starring role in a school play, starts in 15 min and you need to load the PowerPoint on your thumb drive onto the lecture hall computer. You quickly examine the recipient, concluding he has acute GvHD, a textbook case. Off to your lecture followed by lunch with the Head of Medicine and the Dean. Perhaps they want to recruit you?

Let’s consider which heuristics might operate in this setting. The 1st is the representativeness heuristic or bias. What happens is you search for a pattern and declare the recipient has acute GvHD because it matches the pattern you recalled, without considering pretest probabilities of other potential diagnoses such as an antibiotic- or CMV-induced rash and diarrhea, amongst others.
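A short worked example shows what ignoring pretest probabilities costs. The sketch below applies Bayes’ theorem to the consultation above; every pretest probability and likelihood in it is invented purely to illustrate the arithmetic, not to estimate real frequencies of acute GvHD, drug rash or CMV disease at day +21.

```python
# Illustrative only: why pretest probabilities matter. All numbers are
# hypothetical, chosen merely to show the arithmetic of Bayes' theorem.

# Hypothetical pretest probabilities for this recipient at day +21
pretest = {
    "acute GvHD": 0.40,
    "drug rash + infectious diarrhea": 0.35,
    "CMV disease": 0.25,
}

# Hypothetical probability of seeing THIS clinical picture (diffuse rash,
# modest diarrhea, normal liver function tests) under each diagnosis
likelihood = {
    "acute GvHD": 0.60,
    "drug rash + infectious diarrhea": 0.45,
    "CMV disease": 0.30,
}

# Bayes' theorem: posttest probability is proportional to pretest x likelihood
unnormalized = {dx: pretest[dx] * likelihood[dx] for dx in pretest}
total = sum(unnormalized.values())
posttest = {dx: p / total for dx, p in unnormalized.items()}

for dx, p in sorted(posttest.items(), key=lambda kv: -kv[1]):
    print(f"{dx}: {p:.0%}")
```

With these invented inputs acute GvHD remains the single most likely diagnosis, yet roughly half the posttest probability still sits on the alternatives, which is exactly what a rapid System 1 pattern match ignores.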

On to the confirmation heuristic or bias. When the house staff presented the case in a small conference room before you saw the recipient, you remarked this sounds like acute GvHD. Now you hurriedly examine the recipient and confirm your prior impression of acute GvHD, failing to give appropriate weight to findings which might support the other diagnoses discussed above.

Next we have the availability heuristic or bias. When you made rounds with the house staff at your home hospital yesterday afternoon you saw two recipients you diagnosed as having acute GvHD. This memory is rapidly recalled the next day even though today’s recipient differs in many ways from yesterday’s recipients. This is also referred to as the recall heuristic or bias.

Two other cognitive errors warrant discussion. The 1st is premature closure, closing the diagnostic process before all potentially relevant data are considered. In the recipient we are discussing, it was diagnosing acute GvHD before determining which antibiotics the recipient was receiving or reviewing results of posttransplant CMV-testing. The 2nd is anchoring. Anchoring means focusing on one aspect of a diagnostic situation, such as whether the rash involved the palms and soles, whilst discounting other findings which might have resulted in different diagnoses.

Although I focused on accurate diagnosis of acute GvHD, heuristics operate in many other settings related to haematopoietic cell transplants. Perhaps the most important is whether to recommend a transplant versus an alternative therapy or no therapy. The two most important heuristics in this setting are the confirmation and availability heuristics or biases. Considerable data indicate physicians performing a technical procedure have biased opinions regarding the procedure’s value. Elsewhere I and others discuss how the probability of physician risk-taking correlates with sex and with risk-taking in other domains such as a person’s stock market and gambling behavior [20]. Most transplant experts acknowledge the considerable inaccuracy and imprecision predicting the fate of someone with acute myeloid leukemia in 1st histological complete remission [21]. However, a transplant expert’s evaluation of publications reporting contradictory results of randomized clinical trials comparing transplants versus conventional therapy in this setting is subject to the confirmation heuristic or bias. We accept studies confirming our beliefs and reject discordant results. As the poet Samuel Butler wrote in the seventeenth century, long before heuristics were identified:

He that complies against his will, Is of his own opinion still, Which he may adhere to, yet disown, For reasons to himself best known [22].

The 2nd important heuristic in this setting is the availability heuristic or bias. Transplant experts are likely to be influenced in their recommendations by outcomes of their most recent transplants. If the last three recipients died with extensive chronic GvHD they are less likely to recommend a transplant to a new person than when the last three transplants were successful, even if success cannot be attributed to the transplant with certainty, such as when someone with acute myeloid leukemia in 1st complete remission already cured by chemotherapy receives an autotransplant.

What are the medical consequences of these cognitive errors? There are books on this [23]. Data from several large surveys indicate about one-half of medical interventions regarded as standard-of-care are not evidence-based, have been proved ineffective or have been proved harmful [24]. This batting average includes data from large randomized trials published in high impact factor journals like the New England Journal of Medicine [25]. It is difficult to do no harm under these circumstances.

Here, I discuss only a few common heuristics which apply to diagnosis and clinical decision-making. More heuristics are described elsewhere and many others potentially operate in the setting of haematopoietic cell transplants [26]. The bottom line is transplant physicians need to be aware of these potential cognitive errors in clinical decision-making.

We would do well to recall the policy of John Locke, the seventeenth-century British physician and philosopher. Locke believed in the ideology of science whereby a claim must be capable of being tested repeatedly and nothing is exempt from being disproved. He said, admirably:

Whatever I write, as soon as I discover it is not to be true, my hand shall be the forwardest to throw it into the fire [27].

Would that we were so quick to abandon our beliefs in the face of facts and move to System 2. And if you doubt the potential import of the confirmation heuristic I refer you to a recent survey of transplant experts participating in Blood and Marrow Transplant Clinical Trials Network (BMT CTN) randomized clinical trials, in which 60–90% of respondents had an opinion on which arm of an ongoing trial would prove better [28].

For readers wanting to learn more about heuristics, cognitive biases, thin slicing and related topics I suggest books by Daniel Kahneman (Thinking, Fast and Slow), Michael Lewis (The Undoing Project), Jerome Groopman (How Doctors Think) and Malcolm Gladwell (Blink).

And remember to be on guard against the human tendency to want to be right even when you’re wrong. For this I recommend Robert Burton’s On Being Certain: Believing You Are Right Even When You’re Not. I’m certain you will find it interesting.