Learning from How Doctors Think

When I picked up Jerome Groopman’s How Doctors Think, I imagined that it would give me a useful window into the mind of the busy clinician. On medical projects we often find it challenging to get enough research time with physicians. (Aside from maybe lawyers and CEOs, there are no better exemplars of the “time is money” mentality than American doctors.)

Dave Cronin, Doug LeMoine and Noah Guyot learning how surgeons think.

While I appreciated the informal history of medical education, the interesting anecdotes of diagnostic challenges and the satisfying dose of medical atmosphere, I learned just as much about design and design research as I did about medicine. (I’m not surprised to discover that I’m not the first to make this connection. In her blog, Elegant Hack, Christina Wodtke argues that design education should look more like medical education, with a focus on gaining experience over several years in industry rather than just technical skill in a design program. She’ll get no argument out of me there.)

The part of the book that I found most striking is Groopman’s discussion of what he calls “classic cognitive errors” in diagnosing and treating medical conditions. I have made and seen each of these errors in understanding people and devising products and services to meet their needs. While explicit knowledge of these categories of flawed thinking isn’t a guarantee against them, I do think that by naming them and affirming their reality (often by reference to the work of psychologists), this book can help us remember the kinds of mistakes that top-notch professionals make when they’re tired, stressed, egotistical, or just lazy.

Here are a few of my favorite cognitive errors (all of which, it turns out, Alan discussed in his talk at Agile 2008):

  • Confirmation bias: “Confirming what you expect to find by selectively accepting or ignoring information.” This runs rampant in discussions about users and their needs. This is a fancy way of saying that your research observations can be swayed by stereotypes. A great (and relevant) example of this is when people say “doctors hate computers.” It’s true that lots of doctors have learned the hard way to avoid them in a clinical environment, but I suspect that a real study would show that doctors’ attitudes towards technology would follow a similar pattern to any demographically similar audience.
  • Anchoring: “Where a person doesn’t consider multiple possibilities but quickly and firmly latches on to a single one, sure that he has thrown his anchor down just where he needs to be.” This is often exacerbated by confirmation bias: “your skewed reading of the map ‘confirms’ your mistaken assumption that you have reached your destination.” If you’ve ever even met a designer, I probably don’t need to explain that most designers do this at least occasionally, especially when under time pressure. The good news is that most folks I’ve had the pleasure of working with can be talked down from this condition, though it might take some gentle words and a little space.
  • Availability: “The tendency to judge the likelihood of an event by the ease with which relevant examples come to mind.” If a left-hand document organizer pane was useful on 3 of the last 5 software design projects, it’s probably going to come to mind as a likely solution on the 6th. (But then again, you might be really bored of the organizer pane and do everything within your power to avoid it.)

Interestingly, when I’m teaching the Interaction Design Practicum, I often relate the appropriate stance of a designer to that of a doctor. Imagine this: A patient walks into a doctor’s office and says “Doc, my stomach really hurts. I think it’s my appendix. I’ve got to have it out right now.” Obviously, the good doctor does not immediately slice open the patient. She thinks to herself “Hmm… this guy must be in a lot of pain to submit to surgery right now, but his hand is over his stomach,” and then she probably asks some questions and uses diagnostic imaging or lab tests to come to some conclusions about his condition.

Unfortunately, in the technology industry, the most common approach to involving user feedback in product definition seems to be to take the patient’s diagnosis at face value and roll him into the operating theater without much thought. Here’s hoping that we can all watch out for our cognitive errors and take a little more responsibility for exercising good judgment.

Dave Cronin
