Markov Models: Overview
Gerald F. Kominski, Ph.D.
Professor, Department of Health Services

Markov Models: Why Are They Necessary?
Conventional decision analysis models assume:
- Chance events
- A limited time horizon
- Events that do not recur
What happens if we have a problem with:
- An extended time horizon, say, over a lifetime
- Events that can recur throughout a lifetime

Decision Tree for Atrial Fibrillation

State-Transition Diagram for Atrial Fibrillation
[Figure: three states, Well, PostStroke, and Dead, with transition probabilities p11 = 0.7, p12 = 0.2, p13 = 0.1, p22 = 0.9, p23 = 0.1, p33 = 1.0]
The probabilities for all paths out of a state must sum to 1.0. Death is known as an absorbing state, because individuals who enter that state cannot transition out of it.

Transition Probabilities
State of current cycle (rows) by state of next cycle (columns):

              Well   PostStroke   Dead
Well          0.7    0.2          0.1
PostStroke    0.0    0.9          0.1
Dead          0.0    0.0          1.0

Transition probabilities that remain constant over time are characteristic of stationary Markov models, aka Markov chains.

Markov Model Definitions
Any process evolving over time with uncertainty is a stochastic process, and models based on such processes are stochastic or probabilistic models.
If the process is both stochastic and the behavior of the model in one time period (i.e., cycle) does not depend on the previous time period, the process is Markovian.
- The process has "lack of memory"
- Even processes where the previous state does matter can be made Markovian through the definition of temporary states known as tunnel states

Tunnel States
[Figure: the Well state leads into a sequence of temporary tunnel states, PostStroke 1, PostStroke 2, and PostStroke 3, followed by the permanent PostStroke state, with transitions to Dead]

Defining a Markov Model
- Define the initial states
- Determine the cycle length
- Consider possible transitions among states
- Determine transition probabilities
- Determine utilities, and costs (if cost-effectiveness analysis), for each state

Evaluating Markov Models: Cohort Simulation

Cycle   Well     PostStroke   Dead     Sum of Years Lived   Survival
0       10,000   0            0        --                   --
1       7,000    2,000        1,000    9,000                0.9000
2       4,900    3,200        1,900    8,100                0.8100
3       3,430    3,860        2,710    7,290                0.7290
4       2,401    4,160        3,439    6,561                0.6561
5       1,681    4,224        4,095    5,905                0.5905
6       1,176    4,138        4,686    5,314                0.5314
7       824      3,959        5,217    4,783                0.4783
...
93      0        1            9,999    1                    0.0001
94      0        0            10,000   0                    0.0000

The data in the last column are used to produce a survival curve, aka a Markov trace (see the cohort-simulation sketch at the end of this section).

Estimating Markov Models: Monte Carlo Simulation
Instead of processing an entire cohort and applying probabilities to the cohort, simulate a large number (e.g., 10,000) of individual cases proceeding through the transition matrix.
- Monte Carlo simulation
- TreeAge will do this for you quickly, without programming
The advantage of this approach is that it provides estimates of variation around the mean.
Monte Carlo simulation is most valuable because it permits efficient modeling of complex prior history (see the Monte Carlo sketch at the end of this section).
- Variables that record such history are known as tracker variables

Example of a 5-State Markov Model
Source: Kominski GF, Varon SF, Morisky DE, Malotte CK, Ebin VJ, Coly A, Chiao C. Costs and cost-effectiveness of adolescent compliance with treatment for latent tuberculosis infection: results from a randomized trial. Journal of Adolescent Health 2007;40(1):61-68.
Key Assumptions of the Markov Model

Variable | Value (Range) | Reference
Efficacy of IPT | 0.85 (0.75-0.98) | 19
Cost of treating active TB | $22,500 ($17,000-$30,000) | 17
Cost of IPT | Varies by study group and whether 6-month IPT is completed | Current study
TB cases per 100,000 | 250 (120-560) | 20
TB case fatality rate | 0.0045-0.16 (varies with age) | 17
All-cause mortality rate per 100,000 | 19-15,476 (varies with age) | National Center for Health Statistics, 1999 mortality tables
Hepatotoxicity of IPT | 0.0008 (age<35, started IPT); 0.0012 (age<35, completed IPT) | 21
Hepatitis fatality rate | 0.002 | 21
Cost of treating IPT-induced hepatitis | $11,250 ($8,500-$15,000) | Authors' assumption
QALY – Healthy | 1.00 (0.95-1.00) | Authors' assumption
QALY – Positive skin test, but incomplete IPT | 0.90 (0.80-0.95) | Authors' assumption
QALY – Active TB | 0.50 (0.20-0.90) | Harvard Center for Risk Analysis
QALY – IPT-induced hepatitis | 0.75 (0.75-0.90) | Harvard Center for Risk Analysis
Discount rate | 0.03 (0.00-0.07) | Panel on Cost-Effectiveness
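Cohort-simulation sketch. The following minimal Python/NumPy sketch propagates the 10,000-person cohort through the Well/PostStroke/Dead transition matrix from the atrial fibrillation example above and reproduces the cohort-simulation table and Markov trace; the variable names and output formatting are illustrative, not from the lecture.

```python
import numpy as np

# Transition matrix for the 3-state atrial fibrillation example
# (rows = state of current cycle, columns = state of next cycle):
# Well, PostStroke, Dead.
P = np.array([
    [0.7, 0.2, 0.1],   # Well
    [0.0, 0.9, 0.1],   # PostStroke
    [0.0, 0.0, 1.0],   # Dead (absorbing state)
])

cohort = np.array([10_000.0, 0.0, 0.0])  # everyone starts in Well at cycle 0
total_years_lived = 0.0

print(f"{'Cycle':>5} {'Well':>8} {'PostStroke':>11} {'Dead':>8} {'Alive':>8} {'Survival':>9}")
for cycle in range(1, 95):
    cohort = cohort @ P                  # apply one cycle of transitions
    alive = cohort[0] + cohort[1]        # "sum of years lived" in this cycle
    total_years_lived += alive
    survival = alive / 10_000            # one point on the Markov trace
    print(f"{cycle:5d} {cohort[0]:>8,.0f} {cohort[1]:>11,.0f} "
          f"{cohort[2]:>8,.0f} {alive:>8,.0f} {survival:9.4f}")

# Crude life-expectancy estimate (no half-cycle correction): sum of the
# survival curve, i.e., expected cycles lived per person.
print(f"Expected cycles lived per person: {total_years_lived / 10_000:.2f}")
```

Running this reproduces the tabulated rows (e.g., cycle 1: 7,000 / 2,000 / 1,000, survival 0.9000), and the printed survival column is exactly the Markov trace.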
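Monte Carlo sketch. The Monte Carlo slide above describes simulating many individual cases instead of processing the whole cohort at once. Here is a minimal plain-Python sketch of that idea for the same 3-state atrial fibrillation model; the per-cycle utilities are illustrative assumptions of mine (only the 3% discount rate comes from the assumptions table above), and a full microsimulation would also accumulate costs and use tracker variables for prior history.

```python
import random

# States for the 3-state atrial fibrillation example.
WELL, POSTSTROKE, DEAD = 0, 1, 2

# Per-cycle transition probabilities out of each state.
TRANSITIONS = {
    WELL:       [(WELL, 0.7), (POSTSTROKE, 0.2), (DEAD, 0.1)],
    POSTSTROKE: [(POSTSTROKE, 0.9), (DEAD, 0.1)],
    DEAD:       [(DEAD, 1.0)],
}

# Illustrative per-cycle utilities (assumed, not from the lecture) and the
# 3% discount rate from the assumptions table above.
UTILITY = {WELL: 1.0, POSTSTROKE: 0.7, DEAD: 0.0}
DISCOUNT_RATE = 0.03

def simulate_one(max_cycles=100):
    """Walk one simulated individual through the model; return discounted QALYs."""
    state, qalys = WELL, 0.0
    for cycle in range(1, max_cycles + 1):
        # Draw the next state from the current state's transition distribution.
        r, cumulative = random.random(), 0.0
        for next_state, p in TRANSITIONS[state]:
            cumulative += p
            if r < cumulative:
                state = next_state
                break
        if state == DEAD:                      # absorbing state: stop accruing
            break
        qalys += UTILITY[state] / (1 + DISCOUNT_RATE) ** cycle
    return qalys

random.seed(1)
results = [simulate_one() for _ in range(10_000)]   # e.g., 10,000 simulated cases
mean = sum(results) / len(results)
sd = (sum((x - mean) ** 2 for x in results) / (len(results) - 1)) ** 0.5
print(f"Mean discounted QALYs: {mean:.2f} (SD {sd:.2f})")
```

Because each case is simulated individually, the run yields not only the mean outcome but also its spread (the SD printed above), which is the advantage of Monte Carlo simulation noted in the slides.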