Monday, March 5, 2012

Goldfinger's pan-man-scan: Available at a hospital near you

As my regular readers know all too well, I think a lot about our proneness to cognitive biases and how they impact our healthcare choices. I have been reading Sam Harris's book The Moral Landscape (Free Press, 2010), in which he tries to lay a scientific foundation for our values and beliefs. It is a rather dense book, full of linguistic convolution and intellectual contortions. And although I do not agree with Harris on some of his points (I think he would say that my disagreement stems more from my preconceived notions and values than from the weakness of his arguments), there are some very enlightening ideas in the book.

On page 123 he discusses bias:
...we know that people often acquire their beliefs about the world for reasons that are more emotional and social than strictly cognitive. Wishful thinking, self-serving bias, in-group loyalties, and frank self-deception can lead to monstrous departures from the norms of rationality. Most beliefs are evaluated against a background of other beliefs and often in the context of an ideology that a person shares with others.
Doesn't this echo my assertion that scientific knowledge acquisition tends to be unidirectional? What occurs to me is that these emotional and ideological factors gang up with our cognitive predispositions to steer us toward doom. Too nihilistic? Well, that's not what I am going for. Bear with me, and I will show you how this combination is driving us into bankruptcy, particularly where the healthcare system is concerned.

We now know a lot about how the human brain works. Creatures of habit, bent on conserving energy, we develop many mental shortcuts that drive our decisions. Think of them like talking on your cell phone while driving: most of the time, under usual circumstances, you can pay enough attention to both driving and talking. But on occasion something unexpected comes along, and you find yourself plowed into the backside of an eighteen-wheeler, your face firmly pressed into an airbag, if you are lucky. So it is with these mental shortcuts: most of the time they serve us well, or at least don't get us into trouble, but just when we least expect it, reality ambushes us and this habitual way of deciding leads us into error.

Let's bring it around to medicine. Back in 1978, the dark ages, Casscells and colleagues published an intriguing paper in the New England Journal of Medicine. Here is what they did:
We asked 20 house officers, 20 fourth-year medical students and 20 attending physicians, selected in 67 consecutive hallway encounters at four Harvard Medical School teaching hospitals, the following question: "If a test to detect a disease whose prevalence is 1/1000 has a false positive rate of 5 per cent, what is the chance that a person found to have a positive result actually has the disease, assuming that you know nothing about the person's symptoms or signs?"
Here are the results:
Eleven of 60 participants, or 18 per cent, gave the correct answer. These participants included four of 20 fourth-year students, three of 20 residents in internal medicine and four of 20 attending physicians. The most common answer, given by 27, was 95 per cent, with a range of 0.095 to 99 per cent. The average of all answers was 55.9 per cent, a 30-fold overestimation of disease likelihood.
Since we are all experts on the positive predictive value of a test, I hardly have to go through the calculation. Briefly, though, and just to be complete: among 1,000 people, 1 has the disease and an additional 50 test falsely positive. The PPV is then 1/51, or about 2%. This is the chance that a person found to have a positive result actually has the disease. Yet nearly half of the respondents said 95%. What did you estimate? Be honest. And this was Harvard! What chance do the rest of us, mere mortals, stand before such formidable cognitive traps? We might as well throw in the towel and prepare for an onslaught of tail-chasing physicians and patients spinning their gears in pursuit of false positive results. Because once you let a positive finding out of the bag... Well, you get the picture. I would wager that this is at least in part what has hijacked our healthcare system, its finances and our reason. After all, as Sam Harris pointed out, it doesn't take much for us to form beliefs: no rational thinking required. How can I be so sure, you ask?
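
Before I answer, and for anyone who wants the calculation spelled out, here is a minimal sketch in Python (my own illustration, not anything from the paper; following the wording of the question, it assumes a perfectly sensitive test):

    # PPV for the Casscells question: prevalence 1/1000, false positive
    # rate 5%, sensitivity assumed to be 100% (implied by the question).
    def positive_predictive_value(prevalence, false_positive_rate, sensitivity=1.0):
        true_positives = prevalence * sensitivity
        false_positives = (1 - prevalence) * false_positive_rate
        return true_positives / (true_positives + false_positives)

    print(round(positive_predictive_value(0.001, 0.05) * 100, 1))  # 2.0 -- not 95!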

In their paper, Casscells and his team bring up the phrase "pan-man scan." They use it to refer to a battery of 25 screening tests, and they ask what the chances are that all of them come back normal if the person is perfectly healthy. Care to take a guess? Twenty-eight per cent. That's 28%! This means that out of 10 patients who walk into the office healthy, 7 will walk out with an abnormal finding and either be assigned a disease or be sent for further testing, even if it is only a repeat of the previously abnormal test. Now think back to your last "routine physical." Did you get "routine" blood work, a cholesterol test, a urinalysis? How many separate values were run? What was your risk of having an abnormal value, and what ensued when you did?
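
The arithmetic behind that 28%, by the way, is nothing more than the false positive risk compounding across tests. A quick sketch, assuming 25 independent tests, each with the usual 5% false positive rate:

    # Chance that all 25 tests in the "pan-man scan" come back normal in a
    # perfectly healthy person: each test is normal with probability 0.95,
    # so all 25 are normal with probability 0.95 ** 25.
    p_all_normal = 0.95 ** 25
    print(round(p_all_normal, 2))  # 0.28 -- about 7 in 10 healthy people
                                   # will get at least one "abnormal" result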

The point here is once again to think about the pre-test probability of the disease and the characteristics of the test. Without this information there is no way to make an informed decision about (a) whether or not to bother getting the test, and (b) what to make of the result. But how do you rewire clinicians' and the public's brains to break out of these value-laden habitual errors that drive us to all kinds of bad behaviors?
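
Before we get to the rewiring, here is what using those two pieces of information looks like in practice: a minimal sketch of a post-test probability calculation (again my own illustration, not an existing tool; the 30% pre-test figure below is an assumption chosen for contrast):

    # Post-test probability from the pre-test probability and the test
    # characteristics (sensitivity and specificity), via Bayes' theorem.
    def post_test_probability(pre_test, sensitivity, specificity):
        p_positive = pre_test * sensitivity + (1 - pre_test) * (1 - specificity)
        return pre_test * sensitivity / p_positive

    # The same test (sensitivity 100%, specificity 95%) gives wildly
    # different answers depending on who is being tested:
    print(round(post_test_probability(0.001, 1.0, 0.95), 2))  # 0.02 (screening)
    print(round(post_test_probability(0.30, 1.0, 0.95), 2))   # 0.9 (symptomatic patient)

The test did not change; only the pre-test probability did, and with it the meaning of a positive result.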

A psychologist from Northwestern University in Chicago by the name of Bodenhausen studies the human brain's proclivity for stereotyping. How is this relevant to medical decision making? Well, in medicine we tend to think in stereotypes. What I mean is that when a middle-aged male smoker with diabetes and hypercholesterolemia comes to the ED with crushing substernal chest pain, we invoke the stereotype of an acute MI patient, and in this case such stereotyping allows us to make some good choices very rapidly. Where we fall into a trap is in the example above, where we stereotype a patient with a positive screening test into a group with the disease. While it is natural for our habit-addicted brains to do so, it lands us in hot water, individually and as a society. Bodenhausen, in this paper (subscription required) in the journal Medical Decision Making, states plainly that certain circumstances predispose us to stereotyping: complex decisions, cognitive overload, and happy, angry and anxious moods. Do you recognize any of these risk factors for stereotyping in medicine? I thought so.

So, what do we do? Here are 5 potential solutions to consider:
1. Go back and think some more.
Metacognition training early and often would be a great addition to the medical curriculum.
2. Get our medical appointments out of the microwave.
Give the clinician more cognitive time. Not possible, you say? Pity! That is the simplest answer.
3. Educate the public about these cognitive thought-traps and rational approaches to making medical decisions.
4. If we are so committed to the proliferation of medical stuff, then we really need better HIT infrastructure. We need bedside decision aids that automatically calculate the patient's pre-test probability of having a disease, as well as the post-test probability once the test characteristics are added into the mix (as in the sketch above). Having these data before ordering tests might reduce some of the waste and tail-chasing. It may also decrease some of the harm that comes with over-utilization in our dogged pursuit of a definitive diagnosis.
5. Finally, the cost-benefit equation of innovations in medicine must begin to incorporate these costs to cognition. They may be higher than we have admitted so far.

 

6 comments:

  1. While I respect and agree with the majority of this article, I totally disagree with your comment about cell phones. We cannot, and should not, talk on a cell phone while driving and expect our attention to be unimpaired. It is not.

    Witness the increasing number of accidents that occur while people are talking on cell phones. Apologists will say, "Well, we talk to others in our car while we're driving, so what's the difference?"

    There is a huge difference. When talking on the phone (any phone) we disconnect ourselves from the reality of what we're actually doing, which is fine if we're at home, on our feet, and perhaps pacing the kitchen with a landline's long, twisted cord, or roaming the house on a cordless (or a cell) phone. The dynamics change when we're in charge of a motor vehicle barreling down the highway at 60 miles/hour, or even in a parking lot, when we're yapping with a neighbor while trying to back into a parking space and then wondering why the hell we just creamed a car in the next slot.

    Okay .. rant over ... back to your normal programming.

  2. Maggie, thank you for your comment. I am sorry I came across as condoning mobile calls while driving. I should clarify that I do not, even though I do occasionally slip into the behavior myself. The point I wanted to get across was that when everything is as it has always been, slipping into autopilot gets us by. But you cannot predict the unpredictable, so autopilot in many situations is a cognitive error. When the stakes are high, as they are while barreling down a highway at 65 mph, we should not rely on habits. So, for the record, hang up and drive, everyone!

  3. I agree wholeheartedly with your sentiments on education. The importance of teaching humans often and early about critical thinking cannot be overemphasized. For the past few years I have done my best to rewire my own metacognition to be more critical and less biased, but such student-driven methods *by themselves* won't work for society as a whole. After all, the materials that I have used to educate myself largely had to be purposefully sought out. They were seldom given to me (with a few notable exceptions). In our modern age of cheap and ubiquitous information technology, that kind of oversight is unacceptable. (Access to information is effortless. Understanding it: as priceless as it is rare.)

    The biggest complaint I've had about higher education throughout my studies has consistently been, "Why did no one tell me this earlier?" Nowhere has this been more prevalent than in learning how to combat cognitive bias. Neuroscience, psychology and sociology have been telling the world for at least a decade about the dozens of mental pitfalls that we are *all* subject to, as flawed biological organisms equipped by evolution to deal with an entirely different era. It's time schools actually started listening.

    To date, my own higher learning has been less about 'education' than about self-directed 're-education'. We, and in this case I mean the larger public, should not have to re-educate ourselves. Our educational system should get it right in the first place!

    1. "After all, [t]he materials that I have used to educate myself largely had to be purposefully sought out."

      Will you recommend some reading materials?

  4. Hey, Qath, great to have you here! I agree completely, as you know. Learning to metacognate should be the core curriculum for all, the earlier the better.

    Hey, everyone, meet my fabulous niece Qatherine Dana -- she is only getting started. Some day you will say "I met her on Healthcare, etc."

  5. Q's jaw-dropping way with words leaves me stunned with admiration. In a word: gobsmacked.
