The Myth of Research-Based Practice: The Critical Case of Educational Inquiry

Martyn Hammersley, The Open University, UK

‘Evidence and Judgment: When evidence-based research meets the everyday practice of teachers’, Conference organised by the Department of Education, Aarhus University and the Laboratory for Research-based Development of Schools and Pedagogic Practice, Aalborg University, December 2013

Key issues
• What is, and what should be, the relationship between educational research and practice?
• What are the implications of this for educational research? What form(s) should it take?
• What are the implications for educational practice? Must it be transformed?
• How does, and should, educational policy relate to research and practice?

The Research-Practice Relationship
• There is a long history of debate about the role of research in educational policy and practice (see Nisbet and Broadfoot 1980).
• The focus of this debate has been the ‘research-practice gap’.
• In the past, a range of strategies has been proposed for addressing this ‘gap’:
a) improving the dissemination of findings;
b) developing mediation strategies; and
c) transforming the character of research to make it more ‘relevant’ and ‘effective’.

Complaints about research
• Not closely enough focused on the concerns of policymakers or practitioners;
• Fails to produce findings at the time they are needed;
• Generates conflicting and confusing evidence;
• Provides evidence that is at odds with what is well known to policymakers and practitioners, so that its validity seems weak;
• Produces conclusions that are inaccessible to practitioners, for example because they are too elaborate and qualified, or jargon-ridden.

Complaints about policymakers and practitioners
• Closed-minded or set in their ways, and therefore resistant to new perspectives;
• Committed to the dominant ideology and unwilling even to consider radical challenges that research findings may imply;
• Untrained in the capacity to understand and make use of research;
• Lacking in the motivation required to seek out research evidence and to reflect on their decisions in light of it.

The idea that educational practice can and should be ‘research-based’
• This idea, in various forms, can be traced back to the first half of the 20th century, for example in efforts to identify the nature of ‘effective teaching’, and in the various forms of action research that became influential from the 1950s onwards.
• The most recent version of the argument emerged from the evidence-based medicine movement, and in the UK this led to a major crisis for educational research.

Evidence-based Medicine
The evidence-based medicine movement, from the 1980s onwards (Pope 2003), required that:
• Clinicians must access research evidence about ‘what works’, and use only what has been scientifically validated, rather than relying upon their own past experience or outdated training.
• Funding must be directed into research aimed at discovering ‘what works best’. This research should use randomised controlled trials (RCTs), and funds must be allocated for systematic reviews designed to synthesise findings from multiple studies.

Government take-up of ‘evidence-based practice’
• While evidence-based practice began as a movement within medicine, championed particularly by clinical epidemiologists, it came to be supported by health service managers and government policymakers, and was extended to new areas, including education.
• In large part, this was because it seemed to fit with the ‘new public management’ that became influential in the 1990s, with its concern to make public sector professionals more ‘transparently’ accountable (Pollitt 1990).

Changes in UK educational context
• 1980s onwards: a sequence of policies aimed at increasing competitive pressure on schools to boost test and examination results.
• An environment of continual change in policies as regards both curriculum and pedagogy.
• Gradual shift of teacher training into schools.
• University research increasingly restructured via an ‘investment’ model based primarily on the example of industrial science and technology.

David Hargreaves’ intervention
• 1996: A public lecture, sponsored by a quasi-governmental body, calling for change in the funding and organisation of educational research, since much of it is ‘frankly second-rate’, ‘does not make a serious contribution to fundamental theory or knowledge; [and] is irrelevant to practice.’ (Hargreaves 1996:7)
• What is required instead is research that ‘demonstrates conclusively that if teachers change their practice from x to y there will be a significant and enduring improvement in teaching and learning.’ (p.5)

More from Hargreaves
• ‘The £50-60 million we spend annually on educational research is poor value for money […]’ (2007:3)
• ‘The teaching profession has, I believe, been inadequately served by [researchers]’ (pp3-4).
• Educational research is ‘a private, esoteric activity, seen as irrelevant by most practitioners’ (p6).
• Teachers justify their practices by appeal to ‘tradition’, ‘prejudice’, ‘dogma’, and ‘ideology’ (p12). Instead, these practices must be grounded in research evidence.

Crisis in education research (UK)
• Recurrent attacks on academic educational research, and moves to transform it into policy- or practice-focused inquiry.
• Charles Clarke, then Parliamentary Under-Secretary of State, declared that the aim was to ‘resurrect educational research in order to raise standards’ (Clarke 1998, my emphasis).
• Chris Woodhead (1998), then Chief Inspector of Schools, announced that ‘considerable sums of public money are being pumped into research of dubious quality and little value’.

Subsequent developments in the UK
• Funding of teacher research and its dissemination via government-sponsored organisations and websites.
• Funding of units devoted to coordinating the development of systematic reviews on educational and other issues.
• The ESRC Teaching and Learning Research Programme directed funds into applied studies.
• The National Educational Research Forum, aimed at identifying research priorities, specifying quality, and increasing ‘impact’.

Resulting trends
• A sustained shift in research funding towards a focus on teaching and learning, and on policy issues.
• Increased emphasis on researcher engagement with policymakers and practitioners to ensure the ‘impact’ of research-based knowledge.
• At the same time, a liberalising of the original rhetoric (now, ‘evidence-informed practice’), as regards both what kinds of research are of value and the relationship between research evidence and the work of practitioners.

The current situation
Signs of a revival of the original model of evidence-based policymaking and practice:
• The ‘behavioural insights team’ (the ‘nudge unit’) in the Cabinet Office (Haynes et al 2012; Goldacre 2013) promoting RCTs in various areas of Government policy.
• Randomised controlled trials being funded on educational issues by other agencies, for example by the Education Endowment Foundation.

My position
• The original model of evidence-based practice made unreasonable and unnecessary demands on both researchers and practitioners.
• Both professions could probably be improved, but there is no strong evidence that the reforms have advanced the quality of their work, and there have been damaging consequences.
• Research, of different kinds, can contribute to practice in a variety of ways (see Hammersley 2000), but rarely in the direct and conclusive manner assumed by the idea of research-based practice.

Myth of research-based practice
The myth = that research can tell us what is the best way to teach, or to do anything else. Three reasons why this is misleading:
a) Research cannot validate value conclusions: the ambiguity of ‘what works’.
b) It can only provide limited and fallible evidence about the effects of particular policies or practices; and this is not its main function.
c) Research evidence must always be combined with local knowledge in professional judgments about what is best in particular contexts.

Consequences for research
• A threat to quality from ideological bias;
• Further decline in the funds available for research that is not directly related to what are currently high-priority policy issues;
• An increase in the amount of research which attempts to answer questions that simply cannot be answered effectively at the present time;
• A further reduction in the turn-around times demanded of research projects, so that sustaining the quality of research becomes ever more difficult.

The Fallacy of the Gold Standard: Failings of RCTs
1. The problem of studying specific interventions: the drug trial model doesn’t apply in education.
2. There is a trade-off between internal and external validity (Cartwright and Hardie 2012).
3. There are major practical problems in enforcing control over variables (Gueron 2002), not least as regards blinding.
4. As with other forms of educational research, there are severe measurement problems.

School-based action research as an alternative
• This was central to Hargreaves’ position in the 1990s, and was widely sponsored by quasi-governmental bodies after the crisis.
• The focus on the practical issue of ‘what works’ was retained here, but most teacher research did not employ RCTs. In this respect it fails to match the original evidence-based practice model. Moreover, the quality of much of it is open to question in broader methodological terms (see Foster 1999).

Action research: a contradiction?
There is a fundamental tension between the demands of practice and those of research. This tends to result in a predominant focus either on action or on research: it is impossible to maintain a balance between the two so that they can both be pursued well (Hammersley 2013:ch7). Action research is no substitute for more traditional kinds of investigation. To the extent that it seeks to conform to Lewin’s scientific model, it also raises ethical questions analogous to those surrounding RCTs.

Further questions
• Lewin: ‘There is nothing so practical as a good theory’ (Lewin 1951:169). This contrasts with the position of many advocates of evidence-based practice, who treat theory as unimportant (see Oakley 2000; Chalmers 2003).
• If theory is the intended product of practitioner inquiry, theory of what kind? Scientific or normative? Is the task of teachers ‘knowledge-creation’ or improving their practice?
• Is the focus only on ‘what works’, or also on the goals of education? The dangers of instrumentalism (Biesta 2007).

Types of inquiry
• Inquiry-subordinated-to-another-activity.
• Practical research: concerned with providing specific information of value for policymaking or practice.
• Academic research: aimed at contributing to a body of knowledge about particular social and educational topics, for example social class, ethnic, and gender inequalities.
All are valuable, but much action research is inquiry-subordinated-to-another-activity disguised as research (Hammersley 2013:ch7).

Mode 2 inquiry?
• There are currently pressures for academic research to be transformed into practical research, or even into inquiry directly serving policy or practice.
• Some have argued that this is an essential feature of post-modern society (Gibbons 2000). But, even if this trend is a fact, it does not tell us what form educational research should take.
• It is increasingly difficult to obtain sufficient funding for academic educational research to be carried out properly. And, generally, it has to be policy- or practice-focused, and claim ‘impact’.

Conclusion
• The notion of research-based policymaking and practice was prompted by the rise of the evidence-based medicine movement.
• It was taken up by the UK Government as a result of the ‘new public management’.
• This caused a crisis in, and substantial reorientation of, UK education research.
• The movement was based upon a fanciful notion of the role that research can play.
• It has had damaging consequences for research, and perhaps also for educational practice.

References
Biesta, G. (2007) ‘Why “what works” won’t work’, Educational Theory, 57, pp1-22.
Cartwright, N. and Hardie, J. (2012) Evidence-Based Policy, Oxford, Oxford University Press.
Chalmers, I. (2003) ‘Trying to do more good than harm in policy and practice’, Annals of the American Academy of Political and Social Science, 589, pp22-40.
Dunne, J. (1997) Back to the Rough Ground: practical judgment and the lure of technique, Notre Dame IN, University of Notre Dame Press.
Foster, P. (1999) ‘“Never Mind the Quality, Feel the Impact”: a methodological assessment of teacher research sponsored by the Teacher Training Agency’, British Journal of Educational Studies, 47, 4, pp380-398.
Gibbons, M. (2000) ‘Mode 2 society and the emergence of context-sensitive science’, Science and Public Policy, 26, 5, pp159-63.
Goldacre, B. (2013) ‘Building evidence into education’, available at (accessed 28.11.13): http://www.badscience.net/2013/03/heres-my-paper-on-evidence-and-teaching-for-the-educationminister/#more-2849
Gueron, J. (2002) ‘The Politics of Random Assignment’, in Mosteller, F. and Boruch, R. F. (eds.) Evidence Matters: Randomized trials in education research, Washington D.C., Brookings Institution Press.
Hammersley, M. (1993) ‘On the Teacher as Researcher’, Educational Action Research, 1, 3, pp425-445.
Hammersley, M. (2000) ‘The Relevance of Qualitative Research’, Oxford Review of Education, 26, 3-4, pp393-405.
Hammersley, M. (2002) Educational Research, Policymaking and Practice, London, Paul Chapman/Sage.
Hammersley, M. (2005) ‘Is the evidence-based practice movement doing more good than harm? Reflections on Iain Chalmers’ case for research-based policymaking and practice’, Evidence and Policy, 1, 1, pp1-16.
Hammersley, M. (ed.) (2007) Educational Research and Evidence-based Practice, London, Sage.
Hammersley, M. (2013) The Myth of Research-Based Policy and Practice, London, Sage.
Hargreaves, D. H. (1996) Teaching as a Research-Based Profession: possibilities and prospects, Annual Lecture, London, Teacher Training Agency. [Reprinted, along with some responses, in Hammersley ed. 2007.]
Hargreaves, D. H. (1998) Creative Professionalism, London, Demos. Available at (accessed 28.11.13): http://www.bucksgfl.org.uk/pluginfile.php/2366/mod_resource/content/0/creativeprofessionalism.pdf
Hargreaves, D. H. (1999) ‘Revitalizing educational research’, Cambridge Journal of Education, 29, 2, pp239-49.
Haynes, L., Service, O., Goldacre, B. and Torgerson, D. (2012) Test, Learn, Adapt: Developing Public Policy with Randomised Controlled Trials, London, Behavioural Insights Team, Cabinet Office, UK Government.
Lewin, K. (1951) Field Theory in Social Science, New York, Harper & Row.
Nisbet, J. and Broadfoot, P. (1980) The Impact of Research on Policy and Practice in Education, Aberdeen, Aberdeen University Press.
Oakley, A. (2000) Experiments in Knowing, Cambridge, Polity Press.
Pollitt, C. (1990) Managerialism and the Public Services, Oxford, Blackwell.
Pope, C. (2003) ‘Resisting evidence: evidence-based medicine as a contemporary social movement’, Health: An Interdisciplinary Journal, 7, 3, pp267-282.