CAPTURING THE PERCEPTIONS OF CANCER PATIENTS TO GUIDE THE DESIGN OF DECISION AIDS THAT INCORPORATE EVIDENCE ABOUT COMPARATIVE EFFECTIVENESS

Tuesday, October 26, 2010
Sheraton Hall E/F (Sheraton Centre Toronto Hotel)
Karen B. Eden, PhD1, Anais Tuepker, PhD, MPH2 and David H. Hickam, MD, MPH1, (1)Oregon Health and Science University, Portland, OR, (2)Portland VA Medical Center, Portland, OR

Purpose: In developing interactive tools to promote shared decision making, a fundamental design goal is to ensure that the tool presents decision-critical evidence in ways that are easy to understand. Patients who have recently made cancer treatment decisions can be useful informants for testing presentation approaches.

Method: We had previously developed an interactive Web-based decision aid on treatment choices for newly diagnosed prostate cancer. The decision aid includes evidence on the comparative effectiveness of prostatectomy, radiation therapy, and expectant management and elicits patient preferences about mortality and side effects. We conducted cognitive interviews with a convenience sample of 17 men who had either undergone treatment or were currently deciding among treatment options for recently diagnosed localized prostate cancer. Each participant reviewed the decision aid and described his understanding of the information it presented. We then performed inductive coding of the full interview transcripts using a grounded theory method.

Result: The men frequently emphasized information that justified their previous or planned treatment choices. They positioned themselves as self-taught experts, commenting on the strength of the evidence in order to minimize or emphasize data on outcome risks. They also positioned themselves as possessors of experiential knowledge, which allowed them to minimize or emphasize the impact of an outcome should it occur and to frame comparative effectiveness data as inherently “not the full picture.” These descriptions indicated that the men used a variety of strategies to combine preferences and data and that they had difficulty separating their own beliefs about treatment results from the information provided by the decision aid.

Conclusion: These findings indicate that patients rarely divide “evidence,” “preferences,” (personal) experience, and (expert) knowledge neatly into separate domains that function as discrete variables in a decision-making algorithm. Past experience has a strong influence on how patients interpret comparative effectiveness data. Developers should account for these tendencies when evaluating new decision aids. Our findings also have implications for how patient expertise is conceptualized and activated by decision aids. Decision aid designers need to be aware that, when obtaining user feedback, the question “what does this information mean?” extends beyond comprehension alone.