Interface design as a life-or-death proposition

In the mid-1980s, a team of physicians, lawyers, and public health experts conducted a lengthy study of the nature and causes of medical errors. They published their findings, entitled "Incidence of adverse events and negligence in hospitalized patients," in the New England Journal of Medicine in 1991.[2] Their research indicated that "there is a substantial amount of injury to patients from medical management, and many injuries are the result of substandard care." As the industry evaluations and reforms sparked by these findings have taken effect, physicians and clinicians have simultaneously adopted more sophisticated technologies to provide more accurate and efficient care.

Since we place a great deal of trust in physicians and clinicians, most of us would rather not imagine a scenario in which our lives depend on the successful navigation of a series of menus, selections, or form fields. Yet consider this chilling description of an event involving the Therac-25, a radiation therapy machine that could deliver both electron-beam and X-ray treatments:

[The technician got the patient] set up on the table, and went down the hall to start the treatment. She sat down at the terminal, and hit "x" to start the process. She immediately realized she made a mistake, since she needed to treat [the patient] with the electron beam, not the X-ray beam. She hit the "Up" arrow, selected the "Edit" command, hit "e" for electron beam, and hit "Enter", signifying she was done configuring the system and was ready to start treatment.

The system presented the technician with a "Beam Ready" prompt, indicating it was ready to proceed; she hit "b" to turn the beam therapy on. She was surprised when the system gave her [an] error message …

Although the machine told the operator it was in electron beam mode, it was actually in a hybrid photon [X-ray] beam mode. As a result, the system was delivering a radiation blast of 25,000 rads with 25 million electron volts, more than 125 times the normal dose. ["Human Error: Designing for Error in Medical Information Systems," 5]

While the most critical errors in this example are engineering-related, numerous interface pitfalls are evident. Switching between the two device modes—electron beam and X-ray beam—clearly merits rich feedback. "Beam Ready?" Which beam? In the situation described above, the user needs only to ensure that the device is in the correct mode and then activate it. The complex series of interface actions—"Up" arrow, "Edit" command—introduces more points of failure than necessary.
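
To make this concrete, here is a minimal sketch of what mode-explicit feedback and a software interlock might look like. It is written in Python purely for illustration; the TherapyConsole class, its method names, and the simplified reconfiguration logic are all hypothetical and bear no relation to the actual Therac-25 software.

    from enum import Enum

    class BeamMode(Enum):
        ELECTRON = "ELECTRON"
        XRAY = "X-RAY"

    class TherapyConsole:
        """Hypothetical console illustrating mode-explicit feedback."""

        def __init__(self) -> None:
            self.requested_mode = None   # what the operator selected
            self.hardware_mode = None    # what the machine is actually set up for

        def select_mode(self, mode: BeamMode) -> None:
            # The operator's edit takes effect immediately on screen...
            self.requested_mode = mode

        def reconfiguration_complete(self) -> None:
            # ...but the physical machine catches up only later, once its
            # filters and magnets have finished moving.
            self.hardware_mode = self.requested_mode

        def ready_prompt(self) -> str:
            # Name the active mode instead of showing an ambiguous "Beam Ready".
            if self.hardware_mode is None or self.hardware_mode != self.requested_mode:
                return "CONFIGURING - BEAM LOCKED OUT"
            return f"{self.hardware_mode.value} BEAM READY - confirm to proceed"

        def fire(self) -> None:
            # Software interlock: refuse to fire on any mode mismatch.
            if self.hardware_mode is None or self.hardware_mode != self.requested_mode:
                raise RuntimeError("Mode mismatch: treatment blocked")

    console = TherapyConsole()
    console.select_mode(BeamMode.ELECTRON)
    print(console.ready_prompt())      # CONFIGURING - BEAM LOCKED OUT
    console.reconfiguration_complete()
    print(console.ready_prompt())      # ELECTRON BEAM READY - confirm to proceed

The essential point is not the code but the contract: the prompt always names the beam, and the system refuses to proceed while the display and the hardware disagree.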

In a 1993 issue of IEEE Computer, two software engineers published an analysis of the software and interface design of the Therac-25. They discovered that the device’s software had been built for an earlier generation of the device and then modified to run on the newer models. There was very little documentation for the application, and its designers had failed to consider basic usability or human factors principles. The authors preface a lengthy list of recommendations for medical device development by stating that more sophisticated design procedures "must be incorporated into safety-critical software projects."[7]

In recognition of this need, the FDA is stepping up its oversight of medical device companies by overhauling its requirements for user research documentation for new medical devices. In a report entitled "Do It by Design: An Introduction to Human Factors in Medical Devices," the FDA notes that "poor user interface design greatly increases the likelihood of error in equipment operation."[4] While the FDA has always required thorough documentation of product development, recent initiatives have instituted a more prescriptive, design-focused procedure encouraging extensive user research at the beginning of the development process.

FDA Design Control

By requiring companies to spend more time and money to understand the needs and behaviors of physicians and clinicians, the devices they use, their information systems, and the environments in which they work, the FDA hopes to error-proof the technology used for the hands-on care of patients.

The FDA’s objective is to "require manufacturers to establish and maintain procedures to control and verify that a design appropriately addresses the needs of device users."[9] Their five-step procedure, called "Design Control," is intended to provide a clear and repeatable process for developing medical equipment:

  • Concept – What do users need?
  • Design input – How do we appropriately address these needs?
  • Design output – How do we build it?
  • Verification & Validation – Are we addressing the right needs in an appropriate manner? (i.e., Does it work for our users?)

In this article I focus on the Concept phase, in which a company comes to thoroughly understand the needs and behaviors of patients and clinicians before designing a device to meet those needs. Turning behavioral research into a solid rationale for product design decisions isn’t easy, and there are few models from any industry worth emulating. But careful documentation of user research provides advantages above and beyond FDA approval. In my experience, a product team that understands and documents the goals, motivations, and needs of its users can significantly shorten the product definition and development cycle, resulting in better products and a shorter time-to-market.

Using design research

To satisfy the Concept phase of the FDA’s Design Control process, a company has to answer this question: How can the new product help physicians or clinicians provide better care? The only way to do this is to fully understand the product’s usage environment.

I recently had the opportunity to help facilitate this process during a design project for a large medical equipment manufacturer. By conducting field research and interviews at hospitals around the country, my teammate and I gained valuable insights into the complexities of the usage environment of our client’s product. Below are some of the strategies that we used in our research:

  • Decide who the user is. Is the user a physician, a nurse, a nurse assistant? Each of these roles represents a vastly different set of skills and expectations, and often the user turns out to be someone other than who you first expected. In our case, the client had targeted the first version of the device at nurses. But through our research we found that nurses were delegating its use to nurse assistants, a group that tends to have less training and higher turnover than nurses. By understanding exactly who the user is, the product development team can focus its energy on the workflows and skill sets of that specific group.
  • Get into the usage environment. Imagine that a researcher asks you to come into a conference room and explain how you use the steering wheel in your car. How would you respond? Could you list all the kinds of turns you make? Could you describe how your hands grip and turn the wheel?

    Most medical device companies conduct this kind of research, sending marketers to interview clinicians while they’re on break. Unfortunately, the information derived from a question like "Explain how you use this device" is vastly different from, and usually inferior to, the data generated by asking, "Do you mind if I watch you use this device?" Watching a clinician interact with the product in situ exposes the unconscious behaviors that most users would never think to mention in a conference room. This kind of information is invaluable in fleshing out usage scenarios for the FDA.

  • Understand the usage context. Watching a clinician operate a device will give you some information about its ergonomics and interface, but even more valuable is understanding how this task fits into the clinician’s overall workflow. What does she do before she uses the device? What does she do next? What does she have in her hands? Answers to these questions point to the objectives that users are trying to achieve, and reveal opportunities to help them do their jobs better.
  • Ask users about problems, not solutions. While it’s tempting to ask current users of a product questions like "What features do you want in the next version?" all you’ll end up with is a long list of features with no good basis for prioritizing them. Worse, the features users come up with tend to be mere quick fixes for their current pains. Much more useful information is generated by asking questions like "How does the current product frustrate you?" and then following up with an inquiry into what the user is actually trying to accomplish. This approach can lead to solutions that provide value in ways users couldn’t have imagined, but that they recognize as soon as they see the product.

    Also, don’t be fooled by the tantalizing, but spurious, precision of marketing data like "78% of respondents say they want a touchscreen in the next version." What statistics like this don’t provide is any information on what problem this feature would actually be solving. Without that background, there’s no way to determine whether adding a touchscreen is actually worth the extra cost.

  • Distinguish tasks from goals. Performing a successful blood test is not a nurse’s goal; her goal is to help the patient get better. She uses different tests, devices, instruments, and tools to achieve this goal. Understanding the ways in which a device can help the user in this broader context can expose new opportunities and provide a rationale for addressing engineering and development challenges.

Thoroughly understanding the needs and goals of users

While it’s unclear what portion of the 44,000 to 180,000 deaths caused by medical errors each year is due to interaction problems, a 1998 article on medical error in the Journal of the American Medical Association found that the absence of human factors research is also "fairly common in other safety-critical systems." It cites research on "organizational accidents" showing that many systems are not designed for safety: instead of user interfaces and software workflows that support, rather than thwart, the typical behaviors of users, they rely on "error-free performance enforced by punishment."[1]

Consider a recent friendly-fire tragedy in Afghanistan:

[An] Air Force combat controller was using a Precision Lightweight GPS Receiver … to calculate the Taliban’s coordinates for a B-52 attack.

Minutes before the fatal B-52 strike … the controller had used the GPS receiver to calculate the latitude and longitude of the Taliban position in minutes and seconds for an airstrike by a Navy F/A-18 … The controller [then] did a second calculation in "degree decimals" required by the bomber crew. The controller had performed the calculation and recorded the position, the official said, when the receiver battery died.

Without realizing the machine was programmed to come back on showing the coordinates of its own location, the controller mistakenly called in the American position to the B-52. The JDAM landed with devastating precision. [Vernon Loeb, Washington Post, March 24, 2002]

While the device delivered its coordinates accurately, it failed to provide any context for them. The user of the device was a combat controller, and his success depended on relaying accurate information about the location of the enemy. During the device research phase, interaction designers would have surfaced this user’s crucial need: precise enemy coordinates, presented in clear relation to his own position. Moreover, recognition of the device’s usage environment—the battlefield—would have identified the need for smart power-off and battery-failure defaults.
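
As a sketch of what such defaults might look like (again in Python, with an invented Receiver class that is in no way based on the actual PLGR firmware), the device could persist the operator’s last saved target across a battery change and label every coordinate it displays:

    from dataclasses import dataclass
    from enum import Enum

    class FixKind(Enum):
        OWN_POSITION = "OWN POSITION"
        TARGET = "TARGET"

    @dataclass
    class Fix:
        kind: FixKind
        lat: float
        lon: float

        def display(self) -> str:
            # Every readout carries an unambiguous label.
            return f"{self.kind.value}: {self.lat:.5f}, {self.lon:.5f}"

    class Receiver:
        """Hypothetical GPS receiver with safe battery-failure defaults."""

        def __init__(self) -> None:
            self.saved_target = None  # persisted in nonvolatile storage

        def save_target(self, lat: float, lon: float) -> None:
            self.saved_target = Fix(FixKind.TARGET, lat, lon)

        def power_on(self) -> Fix:
            # After a battery change, restore the operator's last target...
            if self.saved_target is not None:
                return self.saved_target
            # ...and only fall back to own position, labeled as such.
            return Fix(FixKind.OWN_POSITION, *self._own_fix())

        def _own_fix(self) -> tuple:
            return (0.0, 0.0)  # placeholder for a live GPS fix

Either branch would have prevented the error above: the restored target preserves the operator’s work, and the explicit OWN POSITION label makes the dangerous fallback impossible to mistake for an enemy coordinate.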

In safety-critical systems, a thorough understanding of user needs and goals helps to establish the workflows, environments, and behaviors that need to be supported. As technology improves and systems change, these needs remain the same, whether the user is a clinician administering a radiation treatment or a combat controller calling in an airstrike. Accurate documentation of this user research enables a company to keep user needs at the fore of the development process while incorporating technological advances, and reduces the need to reinvent the product during every development cycle.

[1] Lucian L. Leape, David D. Woods, Martin J. Hatlie, Kenneth W. Kizer, Steven A. Schroeder, George D. Lundberg, "Promoting Patient Safety by Preventing Medical Error," Journal of the American Medical Association, October 28, 1998.

http://jama.ama-assn.org/cgi/content/extract/280/16/1444

[2] Troyen A. Brennan, Lucian L. Leape, Nan M. Laird, Liesi Hebert, A. Russell Localio, Ann G. Lawthers, Joseph P. Newhouse, Paul C. Weiler, Howard H. Hiatt, "Incidence of adverse events and negligence in hospitalized patients: Results of the Harvard Medical Practice Study," The New England Journal of Medicine, February 7, 1991; 324(6): 370-376.

[3] Rodney A. Hayward, Timothy P. Hofer, "Estimating Hospital Deaths Due to Medical Errors: Preventability Is in the Eye of the Reviewer," Journal of the American Medical Association, July 25, 2001.

[4] Dick Sawyer, Office of Health and Industry Programs, et al., "Do It by Design: An Introduction to Human Factors in Medical Devices," Center for Devices and Radiological Health, U.S. Food and Drug Administration, Public Health Service, U.S. Department of Health and Human Services. Pages iii, 4, 18, 33.

http://www.fda.gov/cdrh/humfac/doitpdf.pdf

[5] Ramon M. Feliciano, "Human Error: Designing for Error in Medical Information Systems."

http://www-smi.stanford.edu/people/felciano/research/humanerror/humanerrortalk.html#RTFToC6

[6] Ron Kaye, Jay Crowley, "Medical Device Use-Safety: Incorporating Human Factors Engineering into Risk Management: Identifying, Understanding and Addressing Use-Related Hazards," Division of Device User Programs and Systems Analysis, Office of Device and Industry Programs, Center for Devices and Radiological Health, U.S. Food and Drug Administration, Public Health Service, U.S. Department of Health and Human Services. Page 14.

http://www.fda.gov/cdrh/humfac/1497.pdf

[7] Nancy Leveson, Clark S. Turner, "An Investigation of the Therac-25 Accidents," IEEE Computer, Vol. 26, No. 7, July 1993, pp. 18-41.

http://courses.cs.vt.edu/~cs3604/lib/Therac_25/Therac_1.html

[8] Christine Engelke, Daniel Olivier, "Putting Human Factors Engineering Into Practice," Medical Device & Diagnostic Industry, July 2002.

http://www.devicelink.com/mddi/archive/02/07/003.html

[9] Peter B. Carstensen, "Overview of FDA’s New Human Factors Program Plan: Implications for the Medical Device Industry."

http://www.fda.gov/cdrh/humfac/hufacpbc.html

[10] "Design Control Guidance for Medical Device Manufacturers," U.S. Food and Drug Administration, March 1997.

http://www.fda.gov/cdrh/comp/designgd.html

[Figure: the FDA Design Control "design waterfall" diagram]
