PPT from James Macinko - Third Global Symposium on Health Systems Research

RECOGNIZING RESEARCH PARADIGMS,
METHODS, AND IMPACT FOR
PEOPLE-CENTRED HEALTH SYSTEMS
James Macinko, PhD
Associate Professor of Public Health & Health Policy
New York University
Main themes/guiding questions
• What are some contributions and challenges of
quantitative research traditions to Health Policy and
Systems Research in/for people-centered health
systems?
• What are some new approaches to the use of quantitative
data that may facilitate more people-centered health
systems?
• How can we strengthen existing and emerging
quantitative research traditions and tools to promote
better programs and policies to achieve and improve
people-centered healthcare?
What have we learned from (mostly) quantitative approaches?
• How to quantify differences in healthcare costs and quality
• How to weight healthcare payments to reflect the underlying socio-economic and health needs of different populations (a minimal weighting sketch follows this list)
• Approaches (albeit still imperfect) to understanding the
burden of disease.
• Some policy lessons learned:
• User fees should be abolished.
• Health workers need to be paid.
• ITNs reduce mortality and should be free of charge.
• Early child development programs work.
• Many others! (over 2500 impact evaluations and 200 systematic
reviews at 3ie alone, but are we taking full advantage of this?)
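As a purely illustrative sketch of what weighting payments by need can look like (the districts, index weights, and figures below are invented, not from the talk), a needs-adjusted capitation formula allocates a fixed budget in proportion to population weighted by a simple needs index:

```python
# Purely illustrative needs-weighted capitation: a fixed budget is split across
# districts in proportion to population weighted by a toy needs index.
# All names, weights, and figures are invented for illustration.

budget = 1_000_000.0

districts = {
    # name:    (population, poverty rate, share aged 65+)
    "North":   (120_000, 0.35, 0.06),
    "Central": ( 80_000, 0.10, 0.14),
    "South":   (100_000, 0.22, 0.09),
}

def need_weight(poverty: float, elderly: float) -> float:
    # Toy index: higher poverty and an older age structure imply greater need.
    return 1.0 + 0.5 * poverty + 0.8 * elderly

weighted_pop = {name: pop * need_weight(pov, old)
                for name, (pop, pov, old) in districts.items()}
total = sum(weighted_pop.values())

for name, w in weighted_pop.items():
    print(f"{name}: {budget * w / total:,.0f}")
```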
The field of quantitative analysis
(especially impact evaluation) is in turmoil
• The quality of many empirical analyses is weak because assessment of impact requires the construction of a counterfactual.
• This is extremely hard to accomplish and has led to a scramble for opportunities to use experimental methods and to exploit natural experiments (a minimal counterfactual sketch follows below).
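To make the counterfactual problem concrete, here is a minimal numeric sketch (all figures are invented for illustration) of a difference-in-differences estimate, one common way quantitative studies approximate what would have happened without the program:

```python
# Minimal difference-in-differences sketch: the counterfactual trend for the
# treated group is approximated by the pre/post change in the comparison group.
# All numbers below are made up for illustration.

pre_treated,  post_treated = 42.0, 35.0   # e.g., mortality per 1,000 before/after the program
pre_control,  post_control = 40.0, 38.0   # comparable districts without the program

change_treated = post_treated - pre_treated    # -7.0
change_control = post_control - pre_control    # -2.0 (secular trend)

# The comparison group's change stands in for the treated group's counterfactual.
did_estimate = change_treated - change_control  # -5.0 attributed to the program

print(f"Estimated program effect (DiD): {did_estimate:+.1f} per 1,000")
```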
Meet the perfect natural experiment
Monster randomly attacks
Tokyo, but leaves other
comparable cities
unscathed.*
*Note: this usually happens
just after you’ve completed
your study and published
the results. The monster
attack will dominate the
news and, consequently,
there will be no press
coverage of your important
research finding.
The rise of the “Randomistas”
• Randomized controlled trials seem to offer solutions to many weaknesses of impact evaluation designs
  • They can establish causality (a minimal sketch follows this slide)
  • They provide solutions to statistical problems of bias, selection, omitted variables (confounding), etc.
• But their promise may have been blown out of proportion
  • "The World Bank is finally embracing science" (Lancet, 2004)
  • "Britain has given the world Shakespeare, Newtonian physics, the theory of evolution, parliamentary democracy—and the randomized trial" (BMJ editorial, 2001)
• There are major limitations to these methods
  • Well-designed RCTs can tell us what happened, but not why
  • Limited generalizability (and the problem of heterogeneous effects)
  • RCTs may fail just as often as any other research method
• And this may have led to irrelevant research and other unintended consequences
  • Sometimes absurd generalizations based on small, specialized RCTs
  • Could lead to poor or irrelevant policy choices, stigmatization of vulnerable groups, wasted resources, and missed opportunities
Source: adapted from Deaton, A. 2009; Picciotto 2014
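As a purely illustrative sketch with synthetic data (not drawn from the talk or its sources), the code below shows why randomization makes a simple difference in group means an unbiased estimate of the average treatment effect, while leaving individual effects heterogeneous and unexplained, which is exactly the "what happened, but not why" limitation noted above:

```python
import random
import statistics

random.seed(1)

# Synthetic trial: outcomes improve by 2.0 on average under treatment, but the
# true effect varies across individuals (heterogeneity). Invented for illustration.
def simulate_person(treated: bool) -> float:
    baseline = random.gauss(10.0, 3.0)
    individual_effect = random.gauss(2.0, 2.0)   # heterogeneous effect
    return baseline + (individual_effect if treated else 0.0)

treatment = [simulate_person(True) for _ in range(200)]
control   = [simulate_person(False) for _ in range(200)]

# Because assignment was random, the difference in means estimates the *average*
# effect -- it says nothing about why the program works or for whom it works best.
ate = statistics.mean(treatment) - statistics.mean(control)
print(f"Estimated average treatment effect: {ate:.2f}")
```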
HE USES STATISTICS AS A
DRUNKEN MAN USES LAMP POSTS:
FOR SUPPORT RATHER THAN FOR
ILLUMINATION.
- Andrew Lang
Counting and accounting: strengthening
people-centered approaches to health
policy and health systems
• With the systematic use of linked datasets and improved
data architecture for more reliable and systematic data
collection, integration, and analysis, there are powerful
opportunities to improve outcomes, reduce harms, and
promote greater equity in health.
• Several promising technologies offer some tantalizing
opportunities to bring people into health policy and system
research in new and potentially meaningful ways.
• These and other approaches offer some complementary ways (NOT SUBSTITUTES) to bring data use and interpretation into participatory forums and to strengthen health systems' analytic capacity (a minimal record-linkage sketch follows).
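As a hedged illustration of what linked datasets can enable (the field names, values, and join key below are hypothetical), record linkage often amounts to joining clinical and social-program extracts on a shared identifier so that questions can cross sectors:

```python
import pandas as pd

# Hypothetical extracts: a clinical registry and a social-protection roster.
clinical = pd.DataFrame({
    "person_id": [1, 2, 3, 4],
    "hba1c":     [8.1, 6.4, 9.3, 7.0],           # clinical indicator
})
social = pd.DataFrame({
    "person_id":     [2, 3, 4, 5],
    "cash_transfer": [True, False, True, True],  # enrolled in a CCT program
})

# Linking on a shared identifier lets analysts ask cross-sector questions,
# e.g., do program beneficiaries have different clinical outcomes?
linked = clinical.merge(social, on="person_id", how="inner")
print(linked.groupby("cash_transfer")["hba1c"].mean())
```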
1. Making data work for the people
• Data must communicate their intent and purpose more clearly.
• This requires more productive
interaction with data producers
and users at all levels.
2. Linking health and social protections: the potential of clinical records for people-centered health systems
• Facilitating electronic medical record–based shut-off protection letters
(link with legal protections)
• Using electronic medical records to improve team-based care for
homeless veterans (screening for risk and linking with housing
services)
• Brazil: CCT conditions certified through medical records, CHW
provides link to programs at the household level.
EMR roles along the pathway (Screening → Triage → Referral → Tracking → Sharing):
• Screening: EMR triggers social screening
• Triage: EMR triages the referral
• Referral: EMR automates the social referral
• Tracking: EMR tracks referrals and outcomes
• Sharing: EMR facilitates data sharing for advocacy
Adapted from: Gottlieb LM, Tirozzi KJ, Manchanda R, Burns AR, Sandel MT. Moving Electronic Medical
Records Upstream: Incorporating Social Determinants of Health. Am J Prev Med. 2014 Sep 9.
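The pathway above can be made concrete with a minimal, purely hypothetical sketch (the record fields, thresholds, and service names are assumptions for illustration, not drawn from the paper or the examples above) of how an EMR rule might trigger social screening and generate a trackable referral:

```python
from dataclasses import dataclass

# Hypothetical EMR fields, thresholds, and service names -- invented to
# illustrate the screening -> referral -> tracking steps described above.

@dataclass
class PatientRecord:
    patient_id: str
    missed_appointments: int
    housing_status: str  # e.g. "stable", "unstable", "homeless"

def screen(record: PatientRecord) -> bool:
    """Screening: the EMR flags social risk markers found in the chart."""
    return record.housing_status != "stable" or record.missed_appointments >= 3

def refer(record: PatientRecord) -> dict:
    """Referral: the EMR automates a social referral as a trackable entry."""
    return {"patient_id": record.patient_id,
            "service": "housing support",
            "status": "referred"}

patients = [PatientRecord("A1", 4, "unstable"), PatientRecord("B2", 0, "stable")]
referrals = [refer(p) for p in patients if screen(p)]
print(referrals)  # only the at-risk record generates a referral to track
```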
3. Crowdsourcing: bringing outsiders into
the research team
Though 200+ years old, crowdsourcing has only begun to be used in public health (a minimal label-aggregation sketch follows the citation below). Examples include:
• Supplement traditional survey research (promote inclusion of underrepresented groups in research)
• Formative research (solicit feedback about health promotion/education
materials)
• Evaluating the scientific literature (solicit additional literature and
updates on systematic reviews)
• Program evaluation (annotating public webcam images to determine
how addition of a bike lane changed transportation patterns)
• Data processing (classifying polyps on computed tomography (CT)
colonography images; identification of malaria infection in RBCs)
• Surveillance/monitoring (tracking influenza via cellphones and social
media, monitoring teacher attendance and physician availability)
• Identifying resources (mapping automated external defibrillators).
Ranard BL, et al. Crowdsourcing--harnessing the masses to advance health and medicine, a systematic
review. J Gen Intern Med. 2014 Jan;29(1):187-203
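As one hedged, self-contained sketch of how crowdsourced judgments become usable data (the images and labels below are invented, not taken from the studies cited), the simplest aggregation rule is a majority vote with an agreement score:

```python
from collections import Counter

# Hypothetical crowd labels for three images (e.g., "infected" vs "clean"
# red blood cells). Majority vote is the simplest way to turn many
# non-expert judgments into a single usable label.
crowd_labels = {
    "image_01": ["infected", "infected", "clean", "infected", "infected"],
    "image_02": ["clean", "clean", "clean", "infected", "clean"],
    "image_03": ["infected", "clean", "infected", "clean", "infected"],
}

for image, labels in crowd_labels.items():
    label, votes = Counter(labels).most_common(1)[0]
    agreement = votes / len(labels)
    print(f"{image}: {label} (agreement {agreement:.0%})")
```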
4. Enhancing learning and experimentation
through simulations and games
Results from “HealthBound” policy
simulation game:
“expanding insurance coverage and improving health care quality is cost-effective, but …if
implemented without other interventions would likely yield… increasing costs and worsening
health inequities. Expanding primary care capacity for the disadvantaged could dramatically
improve access and equity and would help to lower costs…”
Bobby Milstein, Jack Homer, and Gary Hirsch. Analyzing National Health Reform Strategies With a Dynamic Simulation Model.
American Journal of Public Health: May 2010, Vol. 100, No. 5, pp. 811-819
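To give a flavor of what a dynamic policy simulation does (this toy model is NOT HealthBound; its structure and parameters are invented for illustration), the sketch below reproduces the qualitative point of the quote: expanding coverage without expanding primary care capacity pushes unmet demand into costlier care.

```python
# Toy stock-and-flow simulation in the spirit of policy simulation games:
# expanding insurance coverage raises demand, and if primary care capacity
# does not grow with it, unmet need spills into costlier emergency care.
# All parameters are invented for illustration.

coverage, capacity = 0.60, 0.65            # insured share; share of demand primary care can absorb
cost_per_visit, er_multiplier = 1.0, 3.0   # unmet primary care demand is met at 3x the cost

def simulate(years: int, capacity_growth: float) -> float:
    cov, cap, total_cost = coverage, capacity, 0.0
    for _ in range(years):
        cov = min(1.0, cov + 0.05)              # coverage expansion policy
        cap = min(1.0, cap + capacity_growth)   # primary care capacity policy
        met = min(cov, cap)
        unmet = max(0.0, cov - cap)
        total_cost += met * cost_per_visit + unmet * cost_per_visit * er_multiplier
    return total_cost

print("Coverage expansion only:    ", round(simulate(10, 0.00), 2))
print("Coverage + capacity growth: ", round(simulate(10, 0.05), 2))
```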
Barriers to making quantitative
approaches more people-centered
• Most countries don't have the capacity to take advantage of many of these approaches in a systematic way
• Current investments in health systems do not tend to favor development of local health information infrastructure and capacity
• Fragmented and low-resourced health systems have fragmented and incomplete data systems
• There are large global inequities in basic STEM training, and other sectors (besides health) exert strong pull forces
• People could be entirely cut out of participation except to create and provide increasingly vast amounts of data about themselves
• Use of new technologies can create dependencies on the (mostly commercial) suppliers of this technology
• None of these approaches can (or should) be performed in a laboratory setting
What do we need to do to make quantitative approaches more relevant to people-centered health systems?
• Integrate implementation science into impact evaluations: expand from efficacy to effectiveness to empowerment
• Embrace multiple types of evidence and the complexity of different methodological approaches
• Embed new forms of transparency and accountability into impact and other evaluations
• Institutionalize multi-sectoral approaches to quantitative monitoring and evaluation alongside other approaches
• Democratizing data will require links with education to increase numerical literacy through better learning tools
• Seek global commitments to support more people-centered evaluations
• Expand and enhance global communities of practice to demonstrate how these approaches might work
NOT EVERYTHING THAT CAN BE
COUNTED COUNTS, AND NOT
EVERYTHING THAT COUNTS
CAN BE COUNTED.
- Albert Einstein
Acknowledgments
References:
• Angus Deaton (2009) “Instruments of development:
Randomization in the tropics and the search for the elusive
keys to economic development”
• J. Larry Aber (2014) “Child development and social policy:
building science for action”
• Guijt and Roche (2014). Does impact evaluation in development matter? and Camfield and Duvendack (2014). Impact evaluation: are we off the gold standard? Both in European Journal of Development Research 26(1): 1-11, 46-54.
• Thanks to Diana Silver, Dina Balabanova, Lucy Gilson, Rene
Loewenson, and Barbara McPake for comments and
conversations that helped to shape this presentation.
THANK YOU!
Questions?
James.macinko@nyu.edu