‘On the research side, evidence-based education seems to favour a technocratic model in which it is assumed that the only relevant research questions are questions about the effectiveness of educational means and techniques, forgetting, among other things, that what counts as “effective” crucially depends on judgements about what is educationally desirable. On the practice side, evidence-based education seems to limit severely the opportunities for educational practitioners to make such judgements in a way that is sensitive to and relevant for their own contextualized settings. The focus on “what works” makes it difficult if not impossible to ask the questions of what it should work for and who should have a say in determining the latter.’
Biesta (2007)
‘In short, social interventions are complex systems thrust amidst complex systems.’
Pawson et al. (2004)
The Learning Development team, and the LLI as a whole, have been discussing how we might devise meaningful and workable approaches to evaluating the work we do. Alongside practical matters (models, frameworks and so on), this process inevitably invites critical reflection on the nature of what we’re studying (student learning within complex social and institutional contexts), the possibilities for knowledge about this, and what, if anything, should be done in light of our knowledge. This, in turn, has prompted me to return to some of the literature I’ve encountered in recent years (thanks, in large part, to completing the University of Leicester’s excellent MA International Education) concerning the limitations and potential harms of taking too mechanistic an approach to the relationships between research, evidence, policy and practice.
Some problems with ‘evidence-based’ policy and practice
The belief that the knowledge we claim and the decisions that we make should be grounded in legitimate and credible evidence carries an understandable, and justified, power in HE. It explains, for example, why such methodologically flaky instruments as the NSS are met with so much ire and frustration. However, as numerous authors have pointed out (for a sample, see below), the common insistence that educational and other social policies and practices should be ‘evidence-based’, or more specifically that evidential bases should be geared towards telling us ‘what works’ in order that we may be ‘led by the evidence’ in deciding what to do, is frequently far less straightforward, and far less benign, than it might at first appear. For a number of important reasons, we should be more cautious and critical when deciding how much credence we give, and practical utility we afford, to so-called ‘what works’ approaches to developing policy and practice in light of research evidence. Some of the work listed below specifically targets the claims, assumptions and agendas of those advocating ‘evidence-based policy and practice’ and the ‘what works’ agenda, whilst some is more generally critical of certain research paradigms and assumptions. The varied critiques deal with the following broad themes:
- The naïve ontological and epistemological assumptions often underpinning much ‘what works’-oriented empirical research, and the relationships it’s believed this research can and should have to matters of policy and practice.
- The similarly naïve and empiricist theories of causation on which the ‘what works’ movement often seems to base its arguments.
- The limitations of (necessarily and unavoidably) complexity-reducing and historical research when it comes to informing the decisions and future actions of practitioners and policy-makers operating in ineluctably complex, unpredictable, open and values-laden social settings.
- The apparent belief (sometimes tacit, sometimes more explicitly stated) that questions of ‘what works’ can somehow be disaggregated and therefore considered in isolation from questions of ethics, values and politics – what we might call those questions concerning ‘what matters’ in education (Smeyers and Smith, 2014).
Importantly, these critiques reach well beyond technical discussions about ‘sampling procedures’, ‘effect sizes’ etc. (important though these are, too) to pose more fundamental questions concerning the ontological, epistemological and axiological assumptions that inform so much social research. I take from this and similar scholarship the important reminder that we need to work with the complexity and dynamism of social research settings and to understand that questions of policy and practice are always moral and political ones, framed and responded to in arenas of struggle, characterised by numerous multi-layered and intersecting power relations. Learning and teaching do not take place in neutral, ahistorical, values-free spaces, in which we can simply implement what the evidence ‘instructs’ us are the most technically efficacious interventions. It would be self-deluded and dangerous to pretend otherwise.
So, where does this leave us?
None of this is to suggest that scholarly enquiry (of many varieties, including, but not limited to, the kinds of empirical studies usually favoured by proponents of evidence-based policy and practice) shouldn’t play a central role in helping us make context-aware, ethically informed judgements about what can and should be done. Nor is it to argue that developing educational policies and practices should henceforth become an exercise in evidence-free guesswork! Quite the reverse: it’s precisely because evidence matters so much that the complexities of its production and interpretation need to be acknowledged and taken seriously, in order that we might reflect more critically on the research we encounter. Work such as that cited below helps empower us to ground our critical reflections in: a) more philosophically literate understandings of the assumptions that underpin different types of research; b) a recognition of the necessary limitations of all complexity-reducing research when it comes to informing practical, value-laden judgements about what we can and should actually do in complex and unpredictable social settings like schools and universities; and c) an alertness to the ethical and political dangers inherent in pretending we can suspend normativity and develop policies and practices from the purely instrumental, and mythically neutral, perspective of ‘just tell me what works’. Engaging in this fashion should – far from marginalising the role of scholarship in helping us to make our (always fallible, always provisional) decisions – actually help to re-affirm scholarship’s true value as a resource to help stimulate and enable critical praxis.
Indeed, I would argue that such approaches – rooted as they are in better understandings of what evidence actually is (and isn’t), how and why it’s produced, and what it can and can’t tell us about what we should do – are more authentically evidence-engaged than those often favoured by the more reductionist voices within the ‘what works’ movement.
References and suggested further readings:
Biesta, G.J. (2007) Why ‘what works’ won’t work: Evidence-based practice and the democratic deficit in educational research. Educational Theory, 57(1), 1-22.
Biesta, G.J. (2010) Why ‘what works’ still won’t work: From evidence-based education to value-based education. Studies in Philosophy and Education, 29(5), 491-503.
Clegg, S. (2005) Evidence-based practice in educational research: a critical realist critique of systematic review. British Journal of Sociology of Education, 26(3), 415-428.
Clegg, S., Stevenson, J. and Burke, P.J. (2016) Translating close-up research into action: a critical reflection. Reflective Practice, 17(3), 233-244.
Davies, B. (2003) Death to Critique and Dissent? The Policies and Practices of New Managerialism and of ‘Evidence-based Practice’. Gender and Education, 15(1), 91-103.
Hammersley, M. (2005) The Myth of Research-based Practice: The Critical Case of Educational Inquiry. International Journal of Social Research Methodology, 8(4), 317-330.
Parkhurst, J. (2017) The Politics of Evidence: From evidence-based policy to the good governance of evidence. Abingdon, UK: Routledge. (Open access version available.)
Pawson, R., Greenhalgh, T., Harvey, G. and Walshe, K. (2004) Realist synthesis: an introduction. Manchester: ESRC Research Methods Programme, University of Manchester.
Smeyers, P. (2008) On the Epistemological Basis of Large-Scale Population Studies and their Educational Use. Journal of Philosophy of Education, 42(1), 63-86.
Smeyers, P. and Smith, R. (2014) Understanding Education and Educational Research. Cambridge: Cambridge University Press.