Meta-analysis: Don’t do it, or do it more carefully?


Meta-analyses (like the popular Sutton Trust Toolkit and Hattie’s Visible Learning) apparently offer a more sophisticated and orderly approach to our world than the messy reality of day-to-day practice. So do similar approaches, like Best Evidence Syntheses (BES, such as Viviane Robinson’s seminal work on school leadership), Systematic Research Reviews and even reviews of reviews, such as Developing Great Teaching. But how, exactly, do they help?

To answer the question posed by The Big Evidence Debate team, we also need to ask, as Monty Python asked about the Romans, “What do meta-analyses and Best Evidence Syntheses actually do for us?”

By providing a systematic and transparent overview of all the evidence from a given field, meta-analyses and related approaches help educators and researchers do two things: take a more strategic, bird’s eye view of research findings, and frame future research questions and designs more accurately and cumulatively.

Why does taking a bird’s eye view of research findings in a given field matter?

First, these overviews help us stop ricocheting between the (often contradictory) results of different experiments or analyses about the effects of x or y.

For example, they help us make more strategic personal decisions about healthy diets or lifestyles, as opposed to clinging serendipitously to the latest ‘finding’ about the impact of a couple of glasses of red wine. An equivalent in the professional sphere might be help with navigating the many approaches to group work and collaborative learning.

Before these reviews came to education, we and our leaders were stuck with (or free to, depending on your perspective) grabbing hold of the specific study findings that looked comfortably self-justifying or conveniently challenging. Now we can weigh up the evidence about the latest fads by checking individual, intriguing studies (such as Sweller’s original work on memory and cognitive load) and plausible think pieces and practical theories (from multiple intelligences to the more plausible, but still contestable, expanding theories about memory and retrieval) against systematic overviews of all the evidence.

Framing future research questions and designs to be more cumulative

Second, we are all admonished to Stand on the Shoulders of Giants and root our current research in what is already known. In the absence of meta-analyses and BES, this is about as easy as it sounds.

Whilst I do vaguely remember standing on the shoulders of others in gym clubs in my youth, I was too busy balancing to look around and benefit from the view!

While education researchers have always been required to lay out clearly which previous research they were building on (usually as a ‘literature review’), this was almost inevitably a haphazard or partial process; marshalling all the evidence is technical and time-consuming.

Meta-analyses and systematic reviews provide an important handrail, helping researchers take a much more comprehensive and systematic approach to drawing on previous research findings when framing questions for future studies - an important step towards genuinely cumulative knowledge building. They also help researchers and research funders compare the weight of evidence, and the gaps, across fields. This also matters, or at least should matter, for deciding what research gets funded and/or promoted to practitioners and policy makers and taken up by them.

So this challenging, technical work is important. Carry on!

Doing them better - a research user perspective

In that case, what might doing them better mean?

Meta-analyses aren’t perfect, but they are quite new, and the more we do, the better we understand their strengths and limitations. Technical aspects of reviews can always be improved; science is cumulative and involves identifying and learning from mistakes.

Other blog writers will have important things to say about detailed aspects of methods and statistical technicalities. Meta-analyses can be over-driven by concerns about the purity of the evidence and the algorithms used in the analysis. And there is no use doing them better if no one takes any notice of them or understands them.

My key concern is what quality means from a user perspective. Having an overview is a helpful starting point. But the averaging effect of meta-analysis, with its degree of abstraction and lack of detail, limits the value of meta-analyses in specific school and classroom contexts.
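To make the averaging point concrete, here is a minimal sketch, in Python and with entirely invented effect sizes, of the fixed-effect, inverse-variance weighting commonly used to pool study results in a meta-analysis:

    # A hypothetical illustration: the effect sizes (d) and standard
    # errors below are invented purely for the purpose of the example.

    def pooled_effect(studies):
        """Fixed-effect, inverse-variance weighted mean of effect sizes."""
        weights = [1 / se ** 2 for _, se in studies]
        return sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)

    # Three studies that broadly agree...
    consistent = [(0.30, 0.10), (0.28, 0.12), (0.32, 0.11)]
    # ...and three that point in sharply different directions.
    mixed = [(0.85, 0.10), (0.30, 0.12), (-0.30, 0.11)]

    print(f"Consistent studies: pooled d = {pooled_effect(consistent):.2f}")  # 0.30
    print(f"Mixed studies:      pooled d = {pooled_effect(mixed):.2f}")       # 0.32

Two very different bodies of evidence produce almost the same headline figure; the pooled number alone cannot tell a teacher which kind of field they are looking at.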

Effective and accurate interpretation and implementation of key messages from meta-analyses and related syntheses depend just as much on teachers’ and school leaders’ own, more precise, evidence about their pupils’ starting points and capacities. And teachers and school leaders need contextual detail to make these connections.

So the high-quality, mixed-methods individual studies on which meta-analyses depend also have an important role to play in fleshing out the abstractions and filling in the gaps in detail that are an inevitable outcome of creating an overview. Using meta-analyses remains a demanding professional learning process. Meta-analyses need to acknowledge and allow for this; for example, they need to signpost the textured detail that the high-level overview irons out. It is interesting and important, in this context, that the Education Endowment Foundation (EEF) has commissioned a more in-depth analysis of the individual studies behind the reviews on which the Sutton Trust Toolkit stands.

The needs of users

The other key to improving meta-analyses from a user perspective is ensuring that their foci, the research they mine, and their reporting forms and protocols are mindful of the needs of users.

Let’s look at how this is similar and different for three major groups of practice-based (rather than research-based) users:

  • For practitioners, the priority will always be research findings that help them solve real problems arising through the actions they take to plan for and support learning interactions with pupils, and/or findings that help make their work with learners more efficient and effective in enhancing children’s learning and life chances - ideally both. They want an overview of a research field organised around the problems they experience.
  • For school leaders, the priority is evidence that helps them make choices between alternative strategies: improving the quality and quantity of teachers and teaching, organising, articulating and sequencing learning cumulatively within and across key stages, engaging stakeholders in deeper learning, or deploying scarce resources more creatively and effectively.
  • For policy makers, the priority is helping to build ownership of values-based policies through better-informed implementation, increasing both efficiency and effectiveness or, of course, achieving election promises.

Through these lenses, doing meta-analyses better means paying more attention to which meta-analyses get done and how this maps onto practitioner and policy maker concerns, alongside consideration of what might be interesting from the perspective of researchers who are expert in such work.

It also means much more extensive involvement of users and much more extensive analysis of how the work will help to make a difference.

It is not an accident that some of the New Zealand Best Evidence Syntheses, such as those on leaders’ contributions to student success and on the contributions of continuing professional development (CPD) to students’ success, have been acknowledged and promoted by UNESCO.

At each step of commissioning and execution, payments to the research teams were dependent on a panel of teachers, school leaders, parents and policy makers confirming that they could see how the outputs would be useful.

It isn’t that these user panels ignored all the important technical issues about quality. Rather, they set these alongside user needs and interests, insisting on clear, concise explanations of the technical issues, so that practitioner and policy users could understand what was involved, and on the provision of considerable illustrative detail.

References

Melby-Lervåg, M. and Hulme, C. (2013) Is working memory training effective? A meta-analytic review. Developmental Psychology, 49(2), pp. 270–291. [Online].
https://www.apa.org/pubs/journals/releases/dev-49-2-270.pdf

Newton, I. (1675) Letter to Robert Hooke. [Online].
https://digitallibrary.hsp.org/index.php/Detail/objects/9792

Sweller, J. (2011) Cognitive load theory. In Mestre, J. P. and Ross, B. H. (eds.) The Psychology of Learning and Motivation: Cognition in Education, Vol. 55. San Diego, CA: Elsevier Academic Press, pp. 37–76. [Online].
https://doi.org/10.1016/B978-0-12-387691-1.00002-8

Timperley, H. (2008) Teacher professional learning and development. Educational Practice Series 18, International Academy of Education and International Bureau of Education. Paris: UNESCO. [Online].

About the author:

Philippa Cordingley (@PhilippaCcuree)

Philippa Cordingley is an expert in research- and evidence-informed policy and practice and Chief Executive of the Centre for the Use of Research and Evidence in Education (CUREE). Her current research includes a comparison of the characteristics of Exceptional Schools and those seeking and gaining momentum in school improvement (for Teach First), a UK-wide investigation of subject-specific professional development, and R&D support for developing capacity for research- and evidence-informed practice across school districts.

She is Chair of the Research Council for the National Foundation for Leadership, a member of the OECD’s Expert Group for country reviews of teacher preparation and development, and of the Eton College Research Advisory Group.

Find out more:

Read more about meta-analysis on the CEM blog: 

Systematic Reviews and Weather Forecasts – how purpose shapes the significance of systematic reviews for different education stakeholders, by Philippa Cordingley, Paul Crisp & Steve Higgins

Is meta-analysis the best we can do? by Steve Higgins

Sign up for regular updates on the CEM blog