Achieving rigour in qualitative research: an examination of tensions in the international development sector

Susan Johnson and Saltanat Rasulova

Qualitative Methods Symposium

University of Bath

27th January 2015

Background

• Qualitative methods used widely in ID, especially in research and evaluation

• Evidence-based policy making has become increasingly important

• Has raised demand for rigour in qualitative research

• 95% of ID interventions are not evaluable using experimental methods, creating pressure to broaden the range of acceptable methods (Stern et al., 2012)

• Knowledge Transfer Partnership between UoB and a private sector international development consultancy to improve the practices of qualitative research and evaluation, by developing and implementing protocols and procedures for conducting rigorous research

• Review of five case study projects undertaken to assess the quality of existing practice and the issues involved in improving it

The politics of evidence and the quality debate

• The politics of evidence – who has the power to determine?

• Disagreement over whether standards/ guidelines are appropriate.

• Foundationalists – all research can conform to shared criteria (Cochrane and Campbell collaborations operate in this space)

• Quasi-foundationalists – a specific set of criteria unique to qual research is needed

• Non-foundationalists – need understanding not prediction; “inquiry and its evidence is always political and moral” (Denzin, 2011)

• Denzin: Turning any evidence into data is a complex interpretive process and lack of understanding of this creates a self-validating view of scientific process as about testing and refining theory

• Concern about compliance with audit culture at the expense of values and purpose, and of concern with transformation and justice

Approaching quality: how to engage?

• Cabinet Office Guidelines (Spencer et al., 2003) – narrow the paradigms and what is/is not covered; questions ask how well aspects of the enquiry have been conducted, based on the explanations given

• Criticised by Torrance (2011:574)

• For resolving the issues in a “bloodless, technical and strangely old-fashioned counsel of perfection”

• As inoperable – no report could cover everything listed

• The reality of making choices amid contingencies and pressures is absent

• How then to bring this context of the power relations and the dynamics it produces into view?

• AEA Program Evaluation Standards: useful, practical, ethical and accurate

• Fourth generation evaluation: contracting process to address objective, methodology, ethics and starting points (as expected to be emergent)

Lincoln and Guba: trustworthiness as rigour

Approaching methodology

• Case studies of five qualitative research and evaluation projects

• Selected to reflect a range of types of project within social sector with respect to approaches to the research, scale, scope and involvement of stakeholders

• Data collection:

• Project documents review (project reports, qualitative research guides, training manuals, analysis sheets)

• 6 semi-structured interviews with lead consultants in the research organisation (RO) (including Skype)

• 6 semi-structured interviews with the commissioning organisation (CO) (Skype)

• Double-researcher analysis of data using NVivo

• Case studies and theme construction: operational and methodological/strategic tensions

Case study projects:

India
• Research objective: to formulate communication strategies to raise awareness and empower families to prevent and reduce child labour
• Commissioner: International organisation
• Methods/approaches: in-depth interviews, group activities with participatory tools, district/village workshops to discuss findings

Indonesia
• Research objective: to assess whether targeted social assistance increased social conflict among communities
• Commissioner: Multiple stakeholders
• Methods/approaches: comparative (across communities) qualitative case study approach; interviews, KIIs, FGDs with participatory tools

Somalia
• Research objective: to assess the impact of a social assistance programme
• Commissioner: International organisation
• Methods/approaches: KIIs, FGDs with participatory tools

Cape Verde
• Research objective: to understand the experience of poor and vulnerable people in a water and sanitation programme
• Commissioner: International organisation
• Methods/approaches: KIIs and FGDs, interactive analysis in the field with community participation

Africa multi-country
• Research objective: to assess the impact of a social assistance programme
• Commissioner: International organisation
• Methods/approaches: comparative (across 6 countries) qualitative study with KIIs, FGDs and participatory tools

Principles (Lincoln and Guba): Credibility, Confirmability, Dependability, Transferability

Strategies in use:

• Peer review in developing design
• External peer examination (within client’s organisation)
• Independent external examination
• Joint analysis between OPM and local researchers in the field (village, district and national levels)
• Immediate analysis in the field with fresh recollections and memories of activities
• Negative case review
• Triangulation of sources and methods
• Member check and consultation (including nationwide consultations)
• Structural coherence and consistency between data and interpretations
• Using quotes in writing up
• FGD and KII notes, debriefing notes, district notes (audit trail – analysis specific)
• Researcher journal documenting major decisions, changes in design, methods, etc. (audit trail – process specific)
• Systematic coding and data reduction materials
• Peer reviewing of data analysis: triangulation of researchers (stepwise replication technique)
• Peer reviewing of reports (thematic and methodological)
• Audio recording, visual materials (e.g. diagrams)
• Thick description of the context and methods
• Comparison of sample against wider population
• Purposeful sampling

Key issues in case studies

• India

• Access to working children without families

• Initial request for random sample negotiated to be purposeful

• RO would like more time for developing deeper rapport with children

• Tight budget did not allow RO to hire more local researchers

• Indonesia

• CO wanted large sample to cover diversity across country

• Large national programme meant sufficient budget for sample and coverage

• Diverse skills sets among local researchers

• Somalia

• Initially a mixed-methods (MM) design, but the quant component was dropped in discussion between CO and RO

• Safety risks limited consultants’ engagement in the field and made supervision more difficult

• Lack of qualitative skills among local researchers

Key issues in case studies

• Cape Verde

• Most heavily participatory / some analysis done in field and checked back

• CO had to argue for a bigger budget as qual not seen as important

• Community involvement led to strong sense of ownership

• First in-depth study on gender and social issues in the WASH sector

• Study produced a model for community participation in WASH design

• Africa multi-country

• Greater involvement of CO’s manager in research design and process

• Different expectations regarding methodology

• Led to different expectations of skills sets

• Different perspectives on extensive thick description

Quality challenges

• Methodology:

• “Numbers do not speak!” - Qual valued to understand context and social processes

• Pressure for representativity: “stakeholders wanted representative evidence, we explained that… it is about depth”

• Post-positivist approach dominates

• But expectations vary – methodological backgrounds of CO managers matter

• Resources:

• Budget adequacy – qual seen as cheaper / less important

• Tender processes: price is easier to assess than quality / negotiate on deliverables later – how to break the low-level equilibrium trap?

• Training: needed at all levels – field, RO and CO

• Reflects wider environment of training and understanding of issues of quality

Conclusions

• Dilemma: does having criteria necessarily undermine the potential for research to promote transformation / justice?

• Role of qualitative research is valued, but adopting technical criteria ignores the processes and choices involved

• Case studies show context and highlight issues surrounding quality

• A vicious circle is in place – lack of discussion of underlying methodological assumptions, budgets and human resources

• How to break the circle and engage in a discussion over how quality is judged?

• Require greater transparency from COs about process

• Adapt Fourth Generation Evaluation criteria to include the ethics of the process, e.g. willingness to share power, discuss paradigms …

• Denzin concludes that there is a need to resist the new orthodoxy in which “everything we do is inextricably empirical” (ibid: 652) – but resisting it takes resources and must be empirical!
