Mark Wess, MD (1), Jason Saleem, PhD, MS (2), Sara E. Luckhaupt, MD (3), Joseph A. Johnston, MD, MSc (4), Ruth Shaull, MSN, RN (5), Jonathan Kopke (1), Joel Tsevat, MD, MPH (1), and Mark Eckman, MD, MS (1). (1) University of Cincinnati, Cincinnati, OH, (2) Cincinnati Veterans Affairs Medical Center, Cincinnati, OH, (3) University of Michigan, Ann Arbor, MI, (4) Eli Lilly, Indianapolis, IN, (5) University of Cincinnati College of Medicine, Cincinnati, OH
Purpose: Decision support tools (DSTs) are increasingly available as electronic information management and computer access expand. Some clinical decisions are complex and involve uncertainty that must be communicated. Our goal was to perform usability testing on a DST for the complex decision of anticoagulation in patients with nonvalvular atrial fibrillation. In usability testing, the first 3-5 participants typically provide the majority of the information.

Methods: We developed a web-based DST in which patient-specific recommendations regarding anticoagulation were conveyed using a red-yellow-green graphic. We performed usability testing with 8 physicians across 9 hypothetical outpatient visits. Some visits contained hidden and/or conflicting information to mimic real-world patients. We recorded the simulated patient visits and debriefed participants. Two independent observers reviewed the tapes, and discrepancies were resolved. Critical incidents were tallied by frequency. Finally, we compared results across levels of training.

Results: By the fifth participant, we had obtained the majority of unique information. We identified 14 positive and 29 negative critical incidents. The most common positive observations included: participants used the ability to change patient risk factors to understand the effect on anticoagulation recommendations (5 of 8), and users generally trusted the tool's results but sometimes wanted greater transparency into the tool's calculations (5 of 8). Negative observations included: alert messages were mistaken for Windows alerts and ignored (6 of 8), and users wanted to further classify a diagnosis as treated or remote in some cases (3 of 8). Pre-populated information from the medical record was assumed to be accurate and was not regularly checked (4 of 8). Participants felt the red-yellow-green graphic effectively communicated the risk-benefit tradeoff and its uncertainty. Interns tended to question recommendations less frequently, senior residents tried to learn from the tool, and staff physicians were more likely to disregard tool recommendations and rely on their own judgment.

Conclusions: Usability testing can identify design gaps prior to clinical implementation, uncover potential errors caused by DST use, and guide revision. Factors likely common to other DSTs include: pre-populating data from an electronic medical record is acceptable to clinicians, clinicians wish to understand the logic behind recommendations, and graphics can be constructed to communicate both recommendations and uncertainty. Further study is needed on how level of training influences the needs and use of DSTs.
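The abstract does not specify the decision logic driving the red-yellow-green display. As a minimal, hypothetical sketch of how such a traffic-light recommendation might be computed, the example below uses the published CHADS2 stroke-risk score as a stand-in for the tool's actual model; the cutoffs and the bleeding-risk flag are invented for illustration only.

```python
# Minimal illustrative sketch of a red-yellow-green recommendation display.
# The risk model and thresholds below are hypothetical placeholders, not the
# study's actual decision logic.

def chads2_score(chf: bool, hypertension: bool, age_ge_75: bool,
                 diabetes: bool, prior_stroke_or_tia: bool) -> int:
    """CHADS2 stroke-risk score for nonvalvular atrial fibrillation."""
    return (chf + hypertension + age_ge_75 + diabetes
            + 2 * prior_stroke_or_tia)

def recommendation_color(stroke_score: int, high_bleeding_risk: bool) -> str:
    """Map estimated stroke risk vs. bleeding risk to a traffic-light color.

    Cutoffs here are invented; a real DST would derive them from a
    decision-analytic model of expected benefit and harm.
    """
    if stroke_score >= 2 and not high_bleeding_risk:
        return "green"   # anticoagulation clearly favored
    if stroke_score == 0 and high_bleeding_risk:
        return "red"     # anticoagulation clearly disfavored
    return "yellow"      # benefit and harm comparable; judgment needed

# Example: 76-year-old with hypertension and no major bleeding risk factors
score = chads2_score(chf=False, hypertension=True, age_ge_75=True,
                     diabetes=False, prior_stroke_or_tia=False)
print(recommendation_color(score, high_bleeding_risk=False))  # -> "green"
```

A display built this way also supports the behavior observed in testing: letting participants toggle individual risk factors and immediately see the recommendation color change.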
Poster Session II, The 27th Annual Meeting of the Society for Medical Decision Making (October 21-24, 2005)