PM9 MARKOV DECISION PROCESSES - ANALYTIC METHODS FOR SEQUENTIAL DECISIONS

Sunday, October 24, 2010: 2:00 PM
Gingersnap (Sheraton Centre Toronto Hotel)
Course Type: Half Day
Course Level: Advanced

Format Requirements: Participants should be comfortable with basic notions of probability. In addition, they should be familiar with one or more techniques for modeling in medical decision making, such as Markov models or simulation.

Background: Participants will learn how to formulate Markov decision processes, how to solve them, and how they have previously been applied to medical decision making.

Description and Objectives: Markov decision processes (MDPs) are a mathematical technique for solving sequential decision problems under uncertainty. While their use is common in operations research, there have been relatively few successful medical applications. However, many medical problems appear well suited to this technique; for example, questions related to the timing of an intervention are essentially sequential decisions. We will begin with a general description of Markov decision processes and show how they differ from other common modeling techniques, such as simulation and Markov models. For example, an MDP is a natural structure for modeling a problem that would otherwise require the typically forbidden “embedded decision nodes” of standard decision analysis structures. We will work through an inventory example of the kind commonly encountered in industrial settings. We will describe the various types of MDP models and discuss which might be appropriate for particular applications. We will then focus on the five components of an MDP and on how MDPs model sequential decisions under uncertainty. We will describe solution techniques for MDPs, as well as potential difficulties in implementation. We will conclude with case studies of previous successful applications of MDPs to medical decision making.
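The sketch below is not part of the course materials; it is a minimal Python illustration of the five MDP components the description mentions (decision epochs, states, actions, transition probabilities, rewards) and of value iteration, one standard solution technique, applied to a toy inventory problem. All numbers (capacity, demand distribution, price, costs, discount factor) are illustrative assumptions, not values from the course.

# Toy infinite-horizon discounted inventory MDP solved by value iteration.
# States: units on hand. Actions: order quantity. Transitions: driven by
# random demand. Rewards: revenue minus ordering and holding costs.
# All parameter values below are assumed for illustration only.

import numpy as np

CAPACITY = 3                 # states: 0..3 units on hand (assumed)
DEMAND_P = [0.3, 0.4, 0.3]   # P(demand = 0, 1, 2) each period (assumed)
PRICE, ORDER_COST, HOLD_COST = 4.0, 2.0, 0.5   # assumed economics
GAMMA = 0.95                 # discount factor (assumed)

states = range(CAPACITY + 1)

def actions(s):
    # Feasible order quantities: cannot exceed remaining shelf space.
    return range(CAPACITY - s + 1)

def transitions(s, a):
    # Yield (next_state, probability, one-period reward) for each demand level.
    stock = s + a
    for d, p in enumerate(DEMAND_P):
        sold = min(stock, d)
        s_next = stock - sold
        reward = PRICE * sold - ORDER_COST * a - HOLD_COST * s_next
        yield s_next, p, reward

def value_iteration(tol=1e-6):
    # Repeatedly apply the Bellman optimality operator until values converge.
    V = np.zeros(CAPACITY + 1)
    while True:
        V_new = np.empty_like(V)
        for s in states:
            V_new[s] = max(
                sum(p * (r + GAMMA * V[s2]) for s2, p, r in transitions(s, a))
                for a in actions(s)
            )
        if np.max(np.abs(V_new - V)) < tol:
            break
        V = V_new
    # Recover a greedy (optimal) ordering policy from the converged values.
    policy = {
        s: max(actions(s),
               key=lambda a: sum(p * (r + GAMMA * V[s2])
                                 for s2, p, r in transitions(s, a)))
        for s in states
    }
    return V, policy

if __name__ == "__main__":
    V, policy = value_iteration()
    print("Optimal order quantity by stock level:", policy)

Running the script prints an order-up-to style rule (more is ordered when stock is low), which is the kind of structured optimal policy the course's inventory example is meant to illustrate.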

Course Director:
Andrew Schaefer, PhD
Course Faculty:
Mark S. Roberts, MD, MPP and Lisa Maillart, PhD