Power to the People: Evidence from a Randomized Field Experiment on Community-Based Monitoring in Uganda
Martina Björkman (IGIER, Bocconi University, & CEPR) and Jakob Svensson (IIES, Stockholm University, NHH, & CEPR)

Background
- Millions of children die from easily preventable causes
- Weak incentives for service providers
- Top-down approaches to monitoring also lack appropriate incentives
- Recent focus on strengthening providers' accountability to citizen-clients
- Beneficiaries lack information
- Inadequate participation by beneficiaries

Research Questions
- Can an intervention that facilitates community-based monitoring lead to an increased quantity of health care? Increased quality of health care?
- Did the intervention increase treatment communities' ability to exercise accountability?
- Did the intervention result in behavioral changes among staff?

Intervention
- 50 rural dispensaries in Uganda, drawn from 9 districts
- Households within a 5 km catchment area
- 18 local NGOs
- Provide communities with information on relative performance
- Encourage beneficiaries to develop a plan identifying steps the provider and community should take to improve service performance, and ways to get the community more actively involved in monitoring

Intervention Specifics
- Pre-intervention survey data used to compile a unique "report card" for each facility
  - Translated into the community's main language
  - Posters by a local artist for non-literate community members
- Information provided to the community through participatory/interactive meetings
  - Community meeting: suggestions summarized in an action plan
  - Staff meeting: review and analyze performance
  - Interface meeting: contract outlining what needed to be done, how, and by whom

Timing
- Intervention intended to "kick-start" community monitoring
- Mid-term review after 6 months, but no other outside presence in the communities
- Not able to document all actions taken by communities

Data
- Pre-intervention survey to collect data for the report cards
- Post-intervention survey 1 year after the intervention
- Quantitative service delivery data from facilities' own records
- Households' health outcomes and perceptions of health facility performance parameters
  - Whenever possible, supported by patient records
- Child mortality (under 5)
- Weight of all infants
- Roughly 5,000 randomly sampled households in each survey round

Evidence of Increased Monitoring
- More than 1/3 of Health Unit Management Committees in treatment communities reformed or added members; no change in control communities
- 70% of treatment communities had some sort of monitoring tool (such as suggestion boxes, numbered waiting cards, or duty rosters); only 16% of control communities did
- Performance of staff was more often discussed at local council meetings in treatment communities
- NGO reports suggest that discussions shifted from general issues to specific issues regarding the community contract

Treatment Practices
At facilities in treatment communities, significantly:
- More likely to have equipment used during the examination (19% increase)
- Shorter wait times (10% decrease)
- Less absenteeism (14 percentage points lower)
- More on-time vaccinations
- Larger share of patients received information on the dangers of self-treatment and on family planning
- Also a possibility of less drug leakage

Utilization
At facilities in treatment communities, significantly:
- Higher utilization of general outpatient services (16%)
- More deliveries at the facility (68%)
From household surveys:
- Consistent increases in use of the treatment facilities
- Reduction in visits to traditional healers and in the extent of self-treatment

Health Outcomes
- Child mortality (under 5): 3.2% in treatment communities vs. 4.9% in control communities
  - 90% confidence interval for the difference ranges from 0.3 to 3.0 percentage points
  - Corresponds to roughly 540 averted deaths (per 55,000 households in treatment communities)
- Infant weight: comparing distributions of weight-for-age (z score), the difference in means is 0.17 z score (definition sketched after this list)
  - Based on the risk of death from infectious disease among underweight children, the implied reduction in average mortality risk is estimated at 8%
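A brief aside on the two pieces of arithmetic above, offered as a minimal sketch rather than the authors' own derivation: the weight-for-age measure is the standard reference-population z score, and the averted-deaths figure can be read as the mortality difference applied to the exposed under-five population, written here as N because that count is not reported on these slides.

\[ z_i = \frac{w_i - \mu_{a,s}}{\sigma_{a,s}} \]

where \(w_i\) is child \(i\)'s weight and \(\mu_{a,s}\), \(\sigma_{a,s}\) are the reference median and standard deviation for age \(a\) and sex \(s\).

\[ \text{averted deaths} \approx (m_{\text{control}} - m_{\text{treatment}}) \times N = (0.049 - 0.032) \times N \]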
Institutional Issues
- Did district or sub-district management react to the intervention?
- Check that treatment and control communities remained comparable in:
  - Monthly supply of drugs
  - Funding
  - Construction or infrastructure improvements
  - Visits from government or parish staff
  - Employment (dismissals, transfers, hiring)

External Validity
- The process was idiosyncratic, differing from community to community in the experiment
- In another context, the process could play out entirely differently
- Cultural factors are key

Scaling Up
- What actually caused the observed effects?
- How would the intervention be replicated?
- The process depended on NGO facilitators
- No way to know which components of the monitoring were influential

An Alternative Explanation
- Possible (but unlikely) that the intervention directly influenced providers' behaviors
- Outcomes are not necessarily the result of increased monitoring
- An additional treatment arm consisting of staff meetings only was considered but decided against
  - Financial reasons
  - Ethical reasons

Conclusion?
- Impressive effects, but the intervention is difficult to replicate
- An important piece of the causal chain remains undocumented