About the HE Education Research UK Blog Series
To raise awareness of the HE Education Research Census and contribute to a conversation about HE education research in the UK, this blog series explores a wide range of issues at the forefront of education research today. It includes blogs from colleagues at all career stages, research areas and nations of the UK. Please get in touch if you too would like to contribute.
In 2010 we produced a 20-page guide, summarising thousands of studies, to provide teachers with evidence-informed best bets for improving the progress of poorer pupils. It went on to become the Education Endowment Foundation (EEF) Teaching and Learning Toolkit. In both the 2014 and 2021 Research Excellence Framework exercises the toolkit was judged to be a world-leading impact case study. Around 70% of schools in England report using it. Our follow-up What Works? book won a prize. And as Nature recently documented, the toolkit has also been used across the world as governments try to address stark learning gaps in the post-pandemic era.
Told this way, the impact of the toolkit might indicate that evidence-informed policy and practice has been nothing short of a complete success. Yet there is another side to its development that highlights the fundamental challenges that remain for the effective communication of research findings to benefit education practice. These tensions are usefully framed in the model below (Higgins, 2017). We use this to list some questions researchers and teachers need to ask as they aim to work together. The key point is that these can be in tension with each other; addressing one may compromise another.
Is it accessible?
The inaccessibility of research is the first stumbling block in efforts to improve links between research and practice. Journal articles are often impenetrable to teachers, and researchers are given scant training (or time) to communicate beyond their narrow discipline.
A critical decision when designing our toolkit was translating average effect sizes for different school approaches into a simple scale of school progress. Without this translation, the guide would likely have ended up as another academic publication gathering dust on a bookshelf. Yet this translation is certainly controversial (e.g. Lortie-Forgues et al., 2021). Researchers need time to consider how best to communicate their research, engaging directly with practitioners to decide what compromises and assumptions are warranted.
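To illustrate the kind of translation involved, here is a minimal sketch (the function name and the d = 0.25 figure are hypothetical, for illustration only) that converts a standardised effect size into approximate months of additional progress, assuming the rule of thumb that a year of progress is roughly one standard deviation:

```python
def effect_size_to_months(d, months_per_year=12, sd_per_year=1.0):
    """Convert a standardised effect size (Cohen's d) into approximate
    months of additional progress, assuming ~1 SD is roughly one school year."""
    return d / sd_per_year * months_per_year

# e.g. an average effect of d = 0.25 translates to roughly three
# additional months of progress over a school year
print(round(effect_size_to_months(0.25)))  # → 3
```

This is exactly the kind of simplification that makes the findings usable in a staff room, and exactly the kind that critics argue smooths over statistical uncertainty.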
Is it accurate?
This last point relates to the second key question: how accurate is the research? We must aspire to use the best knowledge to inform learning. Research produced by university academics and reviewed by other experts is a good place to start. But what has been found to work in one classroom for one set of teachers and pupils may not apply elsewhere in different contexts. Summaries of research inevitably simplify the detail, and this may limit how useful such summaries can be.
A weakness of education research is the absence of replication studies to provide universally replicable findings (Perry et al., 2022). In the toolkit we struck a compromise: using meta-analysis to produce approximate estimates of which school approaches had worked best on average. The danger is that these can be interpreted as sure-fire conclusions about ‘what works’. Teachers need to understand that it’s never going to be that simple.
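For readers unfamiliar with how such averages are produced, here is a minimal sketch of fixed-effect meta-analytic pooling (the study numbers are invented for illustration): each study's effect size is weighted by the inverse of its variance, so more precise studies count for more — and the single pooled number conceals the spread across studies.

```python
def pooled_effect(effects, variances):
    """Fixed-effect meta-analytic average: weight each study's effect
    size by the inverse of its variance, so more precise studies
    contribute more to the pooled estimate."""
    weights = [1 / v for v in variances]
    total = sum(w * d for w, d in zip(weights, effects))
    return total / sum(weights)

# Three hypothetical studies of one approach, with varying precision:
# the pooled average hides the fact that estimates range from 0.10 to 0.30
print(round(pooled_effect([0.10, 0.30, 0.20], [0.01, 0.04, 0.02]), 3))  # → 0.157
```

The pooled estimate looks reassuringly exact, but it is an average over heterogeneous contexts — which is why treating it as a sure-fire conclusion is risky.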
Is it actionable?
Providing specific practical steps for teachers to take is challenging for researchers. It’s hard to make strong claims from social science on the exact causal pathways leading to improved learning in the classroom. Research is a messy business with so many human interactions at play.
In our What Works? book we offered overall principles to consider when implementing different strategies. Our Bananarama Principle highlighted that ‘it’s not what you do but the way you do it’ that counts. The challenge is that this applies particularly to the approaches with the biggest impact on pupil learning, such as developing metacognition or providing effective feedback in the classroom. This relates directly to the heterogeneity of effects found in educational meta-analyses.
Is it applicable?
A key question for teachers, meanwhile, is to what extent research from other fields applies to their particular classroom and curriculum context. They can be bombarded, for example, with ‘brain-based learning’ packages presenting bold claims. But neuroscience research is not necessarily applicable to the classroom.
Finding a theory that backs up intuition can be useful. A recent fad among teachers is learning about cognitive load theory (CLT), which suggests we shouldn’t overload children with too much information at once. But teachers have always known this! How do you know that CLT has improved practice? Which particular aspects of CLT might help your pupils?
Is it appropriate?
It is also important to identify whether research is appropriate for the particular teacher and pupils involved. It should meet an identified need or a perceived problem, rather than being plucked at random from successful research findings.
A classic example is shiny iPads, introduced into classrooms because they are thought to be good for children rather than to meet specific learning aims. What education challenges are they seeking to address?
Is it acceptable?
To stand a chance of being successful, research findings have to be acceptable to the teachers involved. If the findings conflict with deeply held beliefs about effective practice, they may either be rejected and never tried, or adopted resentfully and set up to fail.
The toolkit confirms that grouping children into different sets by ‘ability’ leads to equivocal academic gains. The progress seen for higher achievers flourishing in the top sets is offset by the damage done to pupils languishing in the bottom classes. Yet this practice remains stubbornly present in secondary schools due to the enduring belief that teaching is more effective and efficient with a narrower range of attainment in a class.
Is it achievable?
Finally, we must ask how achievable impact from evidence-informed practice is given all the tensions highlighted. The story of the toolkit reveals the non-linear, unpredictable, relational, complex and long-term nature of processes that underlie good communication between researchers and teachers.
We must acknowledge these uncertainties and the extensive time these processes take for both academics and practitioners. Otherwise the endeavour risks becoming a tick-box and ultimately illusory exercise, benefitting no-one at all.
Lee Elliot Major is Professor of Social Mobility at the University of Exeter; Steve Higgins is Professor of Education at Durham University.
* We estimated that a year of progress is about equivalent to one standard deviation.
Find out more:
Have you had your say yet?
The HE Education Research Census is live. If you engage in any form of education research and/or scholarship, and are a paid employee of a UK university (on any contractual basis), we want to hear from you!
Please visit the survey page to take part.
Lee Elliot Major is the country’s first Professor of Social Mobility. Appointed by the University of Exeter to be a global leader in the field, his work is dedicated to improving the prospects of disadvantaged young people. As a Professor of Practice he focuses on research that has direct impact on policy and practice, working closely with schools, universities, employers and policy makers.
Steve Higgins is Professor of Education at Durham University. Before working in higher education, he taught in primary schools in the Northeast where his interest in children’s thinking and learning developed. His research interests include the use of evidence from research to support policy and practice decisions in education, the effective use of digital technologies for learning in schools, understanding how children’s thinking and reasoning develops, and how teachers can be supported in developing the quality of teaching and learning in their classrooms.
References and further reading
- Elliot Major, L. & Higgins, S. (2019). What Works?: Research and evidence for successful teaching. London: Bloomsbury Publishing.
- Higgins, S. (2018). Improving Learning: Meta-analysis of intervention research in education. Cambridge: Cambridge University Press.
- Higgins, S., Kokotsaki, D. & Coe, R. (2011). ‘Toolkit of strategies to improve learning: Summary for schools spending the Pupil Premium’. London: Sutton Trust. www.cem.org/attachments/1toolkit-summary-final-r-2-.pdf
- Lortie-Forgues, H., Sio, U. N., & Inglis, M. (2021). How should educational effects be communicated to teachers? Educational Researcher, 50(6), 345–354.
- Perry, T., Morris, R., & Lea, R. (2022). A decade of replication study in education? A mapping review (2011–2020). Educational Research and Evaluation, 27(1-2), 12–34.