Comparative Effectiveness Research:
Scope of the Problem
J. Sanford Schwartz, MD
Leon Hess Professor of Medicine and
Health Management & Economics
School of Medicine & The Wharton School
University of Pennsylvania
Evaluation of Medical Care
Safety: Side effects acceptable?
Efficacy: Can it work? (net benefit under optimal conditions)
Evaluation of Medical Care
Safety: Side effects acceptable?
Efficacy: Can it work? (net benefit under optimal conditions)
Effectiveness: Does it work? (net benefit under average conditions)
Efficiency: Is there sufficient value?
Evidence Based Medicine
“…conscientious, explicit, and judicious use of
current best evidence in making decisions about…
individual patient.”
“The practice of evidence-based medicine means
integrating individual clinical expertise with the
best available external clinical evidence from
systematic research and individual patients’
predicaments, rights and preferences in making
clinical decisions about their care…”
Sackett, Rosenberg, Gray et al. BMJ. 1996
Comparative Effectiveness Research:
“…comparison of effective interventions among
patients in typical patient care settings, with
decisions tailored to individual patient needs.”
“…generation and synthesis of evidence that
compares the benefits and harms of alternative
methods to prevent, diagnose, treat and monitor a
clinical condition, or to improve…delivery of care.”
“The purpose of CER is to assist consumers,
clinicians, purchasers, and policy makers to make
informed decisions that will improve health care at
both the individual and population levels.”
Institute of Medicine. Initial National Priorities for Comparative
Effectiveness Research. National Academies Press. 2009.
CER, HTA and EBM
Comparative effectiveness research (CER)
• Absolute and relative clinical effectiveness of alternative
management strategies across patients, populations and
routine practice settings
• Evidence generation and synthesis
Evidence based medicine (EBM)
• Individual clinical decision making and policy- and group-focused evidence-based decision processes (clinical guidelines, reimbursement and coverage decisions)
Health technology assessment (HTA)
• Explicit, comprehensive assessment of long-term benefit–risk tradeoffs (benefits and harms)
International Working Group for HTA Advancement. Luce BR, Drummond MF,
Jonsson B, Neumann PJ, Schwartz JS, Siebert U, Sullivan SD. EBM, HTA, and CER:
Clearing the Confusion. Milbank Memorial Fund Quarterly. In press.
Evidence-based medicine (EBM)
“An evidence synthesis and decision process…to
assist patient/physician decisions… considers
evidence on… effectiveness of interventions and
patient values… mainly concerned with individual
patient decisions, but is also useful for developing
clinical guidelines...”
International Working Group for HTA Advancement. Luce BR,
Drummond MF, Jonsson B, Neumann PJ, Schwartz JS, Siebert U,
Sullivan SD. EBM, HTA, and CER: Clearing the Confusion. Milbank
Memorial Fund Quarterly. In press.
Comparative Effectiveness Research
“CER includes both evidence generation and
evidence synthesis. It is concerned with the
comparative (absolute and relative) assessment of
interventions and alternative management
strategies in routine practice settings.
The outputs of CER activities are useful for clinical
guideline development, evidence-based medicine
and the broader social and economic assessment
of health technologies…”
International Working Group for HTA Advancement. Luce BR,
Drummond MF, Jonsson B, Neumann PJ, Schwartz JS, Siebert
U, Sullivan SD. EBM, HTA, and CER: Clearing the Confusion.
Milbank Memorial Fund Quarterly. In press.
Health technology assessment
“HTA is a method of evidence synthesis that
considers evidence on clinical effectiveness,
safety, cost-effectiveness and, when broadly
applied, may include social, ethical and legal
aspects of the use of health technologies…in
informing reimbursement and coverage
decisions… HTA should include economic
evaluation.”
International Working Group for HTA Advancement. Luce BR, Drummond MF,
Jonsson B, Neumann PJ, Schwartz JS, Siebert U, Sullivan SD. EBM, HTA, and
CER: Clearing the Confusion. Milbank Memorial Fund Quarterly. In press.
Relationships of Evidence Processes:
EBM, CER, and HTA
International Working Group for HTA Advancement. Luce BR, Drummond MF,
Jonsson B, Neumann PJ, Schwartz JS, Siebert U, Sullivan SD. EBM, HTA, and CER:
Clearing the Confusion. Milbank Memorial Fund Quarterly. In press.
Evidence Processes
International Working Group for HTA Advancement. Luce BR, Drummond MF, Jonsson B, Neumann PJ, Schwartz JS, Siebert U, Sullivan SD. EBM, HTA, and CER: Clearing the Confusion. Milbank Memorial Fund Quarterly. In press.
Evidence Assessment: Goal
To provide a rigorous scientific basis for clinical
and policy decision making to:
• Optimize health outcomes
• Reduce incorrect decision making and resulting
missed opportunities and harms
• Improve quality of care, patient and population
health and efficiency of health services delivery
Comparative and Cost–Effectiveness
Assessment of the most effective (and efficient?)
care, defined in terms of patient outcome (and
cost?)
• How much benefit and value? (see the worked example after this list)
• In which patients?
• Under which conditions?
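To make the "benefit and value" question concrete, here is a minimal worked example of an incremental cost-effectiveness calculation; all costs, QALY gains, and the willingness-to-pay threshold below are invented purely for illustration.

```python
# Hypothetical incremental cost-effectiveness ratio (ICER) calculation.
# All numbers are invented for illustration only.
cost_new, cost_standard = 50_000.0, 30_000.0   # cost per patient ($)
qaly_new, qaly_standard = 6.5, 6.0             # quality-adjusted life-years per patient

incremental_cost = cost_new - cost_standard    # $20,000
incremental_qaly = qaly_new - qaly_standard    # 0.5 QALYs gained
icer = incremental_cost / incremental_qaly     # $40,000 per QALY gained

willingness_to_pay = 50_000.0                  # illustrative $/QALY threshold
print(f"ICER = ${icer:,.0f}/QALY; below threshold: {icer <= willingness_to_pay}")
```

The same arithmetic applied within patient subgroups or practice settings is what links the "in which patients?" and "under which conditions?" questions to value.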
Comparative Effectiveness Research:
How will the evidence be used?
• Individual patient care
• Clinical guidelines
• Reimbursement
Outcomes of interest and quality and integrity of
data analysis and interpretation may differ across
uses and users
National Health Expenditures ($ billions and % of GDP), 1960-2015 (est.)
[Chart: National Health Expenditure as % of GDP; data labels: 5.2, 7.2, 9.1, 12.3, 13.8, 15.9, 17.1, 18.7]
Source: Adapted from Kaiser Family Foundation, CMS Office of the Actuary, National Health Statistics Group, at http://www.cms.hhs.gov/NationalHealthExpendData/ (see Historical; NHE summary including share of GDP, CY 1960-2006; file nhegdp06.zip).
CER Costs and Rationing: Theory vs. Reality
Amendments to U.S. Healthcare Reform Bills
“in no case may any research conducted,
supported, or developed by the Center on CER
…be used by the federal government to deny
or ration care”
“CMS may not use federally funded clinical
CER data… for medical treatments, services,
items… on the basis of costs”
Outcomes of Care
Mortality
Morbidity
Functional status
• Cognition
• Emotional/Psychological function
• Energy, fatigue & vitality
• Physical activities
• Role activities/Social activities
• Sexual function/Sleep pattern
• Symptoms/Health perceptions
Cost
Randomized Clinical Trials:
Causal/Etiology Reference Standard
• Internal validity
• Data reliability
• Adjustment for confounders
Randomized Clinical Trials:
Limitations as Traditionally Conducted
Representativeness / Generalizability
• Patient selection / eligibility criteria
• Comparators
• MD/Patient adherence
• ‘Real world’ practice patterns / variations
Data Limitations
• Outcomes assessed
• Time horizon / follow–up
• Clinically–relevant subgroups
• Resource use / cost
Ethical and logistical barriers
Comparative Effectiveness Research:
Methodological Challenges and Development
Outcomes assessed:
• Physiologic
• Function / patient reported outcomes (PROs)
• Economic
Range of methods:
• Experimental (RCT; pragmatic/practical)
• Observational (case–control, cohort, registry,
administrative claims, EMR)
• Synthesis (meta–analysis; systematic review; see the pooling sketch below)
• Modeling
Consistency of results across methods is confirmatory; disagreement requires understanding and explication
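As an illustration of the synthesis methods listed above, a minimal fixed-effect, inverse-variance pooling sketch; the study-level estimates and standard errors are invented for illustration.

```python
import numpy as np

# Hypothetical study-level effect estimates (e.g., log relative risks) and standard errors.
estimates = np.array([-0.25, -0.10, -0.30, -0.05])
std_errors = np.array([0.12, 0.08, 0.20, 0.10])

# Fixed-effect, inverse-variance pooling.
weights = 1.0 / std_errors**2
pooled = np.sum(weights * estimates) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))
ci_low, ci_high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se

print(f"Pooled estimate {pooled:.3f} (95% CI {ci_low:.3f} to {ci_high:.3f})")
```

A random-effects model would add a between-study variance term when studies disagree by more than sampling error alone, which connects directly to the point above about explaining inconsistent results across methods.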
Comparative Effectiveness Research:
Methodological Challenges and Development
Design
• All relevant alternatives
• Clearly defined, rigorous, analytical methods
• Best available experimental, quasi–experimental,
observational, qualitative data
• Incremental impact/trade-offs
• Clinically relevant outcomes
• Evidence from relevant clinical endpoints and validated surrogate outcomes
Explicitly characterize uncertainty
Address issues of generalizability and transferability
Adapted from Drummond MD, Schwartz JS, Jonsson B, Luce B, Neumann PN, Siebert U, Sullivan SD. International Journal of Technology Assessment in Health Care. 2008;24:244-58; discussion 362–368.
Comparative Effectiveness Research:
Methodological challenges
Evidence definition
• Rigor
• Validity
• Reliability
Evidence generation
• Pragmatic (routine practice settings)
• Relevant patients (targets of intervention)
• Patient reported outcomes and preferences
• Clinically-relevant subpopulations
• Validity vs. Reliability vs. Generalizability
• Timely
Comparative Effectiveness Research:
Methodological challenges
Evidence Analysis
• Intent-to-treat vs. actual Rx received
• Adjust for confounding inherent in observational data (one common approach is sketched after this list)
• Indirect comparisons
• Data synthesis and integration (modeling)
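One common approach to the confounding-adjustment point above is a propensity-score analysis with inverse probability of treatment weighting (IPTW). The following is a minimal sketch on simulated data; the covariate, treatment-assignment mechanism, and true effect of -0.3 are all invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000

# Simulated observational data: sicker patients (higher x) are more likely to be treated,
# and higher x also raises the outcome (e.g., a symptom score), so a naive comparison is confounded.
x = rng.normal(size=(n, 1))
treated = rng.binomial(1, 1 / (1 + np.exp(-1.5 * x[:, 0])))
outcome = 0.5 * x[:, 0] - 0.3 * treated + rng.normal(size=n)   # true treatment effect = -0.3

naive = outcome[treated == 1].mean() - outcome[treated == 0].mean()

# Propensity-score model and inverse-probability-of-treatment weights.
ps = LogisticRegression().fit(x, treated).predict_proba(x)[:, 1]
w = treated / ps + (1 - treated) / (1 - ps)

adj_trt = np.average(outcome[treated == 1], weights=w[treated == 1])
adj_ctl = np.average(outcome[treated == 0], weights=w[treated == 0])
print(f"Naive difference {naive:.2f} vs IPTW-adjusted {adj_trt - adj_ctl:.2f} (truth -0.3)")
```

Weighting recovers an estimate close to the true effect here because all confounders are measured; unmeasured confounding remains the central limitation of any observational analysis.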
Evidence Interpretation
Absolute Risk vs. Relative Risk
“All policy decisions should be based on
absolute measures of risk;
relative risk is strictly for researchers only.”
– Geoffrey Rose
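A minimal worked example (with invented event rates) of why the distinction matters: the same relative risk reduction can correspond to a small or large absolute benefit depending on baseline risk.

```python
# Hypothetical event rates, invented for illustration only.
control_risk = 0.04     # 4% event rate without treatment
treated_risk = 0.03     # 3% event rate with treatment

relative_risk = treated_risk / control_risk            # 0.75, i.e., a 25% relative risk reduction
absolute_risk_reduction = control_risk - treated_risk  # 0.01, i.e., 1 percentage point
number_needed_to_treat = 1 / absolute_risk_reduction   # 100 patients treated to prevent one event

print(f"RR {relative_risk:.2f}, ARR {absolute_risk_reduction:.3f}, NNT {number_needed_to_treat:.0f}")
```

At a 0.4% baseline risk, the same 25% relative reduction would yield an absolute reduction of only 0.1 percentage points and a number needed to treat of 1,000.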
Traditional Comparative Trials
• Expense
• Length
• Risk (for proprietary sponsor)
“Without major changes in how we
conceive, design, conduct, and analyze RCTs,
the nation risks spending large sums of money
inefficiently to answer the wrong questions—
or the right questions too late.”
Luce BR, Kramer JM, Goodman SN, Connor JT, Tunis S, Whicher D,
Schwartz JS. Rethinking randomized clinical trials for comparative
effectiveness research: The need for transformational change. Annals
of Internal Medicine. 2009;151:206-209.
Transforming Clinical Trials:
Objectives:
Enhance structural, organizational and operational
efficiency
• Enrollment
• Standardization of agreements
• Quality monitoring
Enhance analytic design and efficiency
• Bayesian and adaptive approaches (reduce sample size, time, and cost; see the sketch after this list)
Make trials more useful for decision-makers
• Pragmatic clinical trials
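A minimal sketch of the kind of Bayesian interim analysis such adaptive designs rely on; the arm counts, Beta(1, 1) priors, and stopping bounds are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical interim data: responders / enrolled in each arm.
events_trt, n_trt = 18, 60
events_ctl, n_ctl = 9, 60

# With Beta(1, 1) priors, each arm's response probability has a Beta posterior.
post_trt = rng.beta(events_trt + 1, n_trt - events_trt + 1, size=100_000)
post_ctl = rng.beta(events_ctl + 1, n_ctl - events_ctl + 1, size=100_000)

prob_superior = np.mean(post_trt > post_ctl)  # posterior P(treatment response rate > control)
print(f"P(treatment > control | interim data) = {prob_superior:.3f}")

# An adaptive design pre-specifies bounds for stopping enrollment early.
if prob_superior > 0.99:
    print("Stop early for efficacy")
elif prob_superior < 0.05:
    print("Stop early for futility")
else:
    print("Continue enrollment")
```

Stopping enrollment as soon as the evidence crosses a pre-specified bound is one way such designs reduce sample size, time, and cost relative to a fixed-sample trial.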
Luce BR, Kramer JM, Goodman SN, Connor JT, Tunis S, Whicher D, Schwartz JS. Rethinking randomized clinical trials for comparative effectiveness research: The need for transformational change. Annals of Internal Medicine. 2009;151:206-209.
“In the midst of every challenge
lies opportunity”
-Albert Einstein