Learning from How Doctors Think

When I picked up Jerome Groopman’s How Doctors Think, I imagined that it would give me a useful window into the mind of the busy clinician. On medical projects we often find it a bit challenging to get enough research time with physicians. (Aside from maybe lawyers and CEOs, there are no better exemplars of the “time is money” mentality than American doctors.)

Dave Cronin, Doug LeMoine, and Noah Guyot learning how surgeons think.

While I appreciated the informal history of medical education, the interesting anecdotes of diagnostic challenges, and the satisfying dose of medical atmosphere, I learned just as much about design and design research as I did about medicine. (I’m not surprised to discover that I’m not the first to make this connection. In her blog, Elegant Hack, Christina Wodtke discusses how she thinks design education should be more like medical education, with a focus on gaining experience over several years in industry rather than just technical skill in a design program. She’ll get no argument out of me there.)

The part of the book that I found most striking is Groopman’s discussion of what he calls “classic cognitive errors” in diagnosing and treating medical conditions. I have made and seen each of these errors in understanding people and devising products and services to meet their needs. While explicit knowledge of these categories of flawed thinking isn’t a guarantee against them, I do think that by naming them and affirming their reality (often by reference to the work of psychologists), this book can help us remember the kinds of mistakes that even top-notch professionals make when they’re tired, stressed, egotistical, or just lazy.

Here are a few of my favorite cognitive errors (all of which, it turns out, Alan discussed in his talk at Agile 2008):

  • Confirmation bias: “Confirming what you expect to find by selectively accepting or ignoring information.” This runs rampant in discussions about users and their needs; it’s a fancy way of saying that your research observations can be swayed by stereotypes. A great (and relevant) example of this is when people say “doctors hate computers.” It’s true that lots of doctors have learned the hard way to avoid them in a clinical environment, but I suspect that a real study would show that doctors’ attitudes towards technology follow a pattern similar to that of any demographically similar audience.
  • Anchoring: “Where a person doesn’t consider multiple possibilities but quickly and firmly latches on to a single one, sure that he has thrown his anchor down just where he needs to be.” This is often exacerbated by confirmation bias: “your skewed reading of the map ‘confirms’ your mistaken assumption that you have reached your destination.” If you’ve ever even met a designer, I probably don’t need to explain that most designers do this at least occasionally, especially when under time pressure. The good news is that most folks I’ve had the pleasure of working with can be talked down from this condition, though it might take some gentle words and a little space.
  • Availability: “The tendency to judge the likelihood of an event by the ease with which relevant examples come to mind.” If a left-hand document organizer pane was useful on 3 of the last 5 software design projects, it’s probably going to come to mind as a likely solution on the 6th. (But then again, you might be really bored of the organizer pane and do everything within your power to avoid it.)

Interestingly, when I’m teaching the Interaction Design Practicum, I often relate the appropriate stance of a designer to that of a doctor. Imagine this: A patient walks into a doctor’s office and says “Doc, my stomach really hurts. I think it’s my appendix. I’ve got to have it out right now.” Obviously, the good doctor does not immediately slice open the patient. She thinks to herself “Hmm… this guy must be in a lot of pain to submit to surgery right now, but his hand is over his stomach,” and then she probably asks some questions and uses diagnostic imaging or lab tests to come to some conclusions about his condition.

Unfortunately, in the technology industry the most common approach to involving user feedback in product definition seems to be to take the patient’s diagnosis at face value and roll him into the operating theater without a lot of thought. Here’s hoping that we can all watch out for our cognitive errors and take a little more responsibility for exercising good judgment.

4 Comments

Vijay
Very nicely written, Dave. The connection you made between doctors and designers reminded me of Robert Martin's keynote speech at Agile 2007, in which he related developers to doctors in the context of writing clean code. Doctors make time to wash their hands even though they are very busy. Imagine a doctor performing surgery coming straight from the morgue, which apparently used to happen. -Vijay
Doug LeMoine
@Vijay: I like your metaphor a lot. Today, it's unthinkable that doctors would fail to wash their hands. In Louis Pasteur's day, hand-washing would likely have been seen as superstitious (and therefore a waste of time). In doing design projects for medical informatics, it's often difficult to demonstrate to doctors that the additional time and energy to learn and apply a bit of technology, however small, is worth it. Often, it's because they're thinking about near-term cost, when the benefits of using a tool are realized through consistent, long-term application. Kind of like washing one's hands before providing care.
Michael Long
I believe it was Agile 2008, not 2007, where Alan discussed this topic (probably just a typo). This is a great start for a broader discourse on the effects of cognitive errors in professions where people and their needs are paramount, and on how we, as practitioners, hold a great deal of responsibility to ensure needs are met through our solutions (hospital and/or user interface). As someone who is currently working on a product for hospital care providers, I am now looking for opportunities where these cognitive errors could inform new features, or at least improve my own thinking (another good "feature"). Thanks for drawing the connection.
Dave Cronin
Nice catch, @Michael. Of course, it was Agile 2008. I think I may be having some cognitive issues with what feels like the rapidly accelerating pace of time. @Vijay, @Doug: interestingly enough, in Atul Gawande's Better: A Surgeon's Notes on Performance, he talks about how difficult it actually is to get doctors and nurses to properly wash and sanitize their hands. Some of the best results he saw were facilitated by an industrial designer.
