By Dr Gary Jones
A major challenge for anyone interested in the use of research, both in schools and across the wider education system, is making a judgement about its trustworthiness and quality.
This is a real problem, as teachers, heads of department and school leaders are unlikely to have the level of expertise required to critically examine the claims being made.
To help get around this lack of expertise and improve your chances of judging the usefulness of research, my new book, Evidence-based School Leadership and Management, looks at a framework which provides a scaffold for making efficient and effective use of research evidence to bring about improvements in your classroom, school or educational setting:
The 6 A’s of the usefulness of research evidence
Professor Steve Higgins, of Durham University, developed the '6 A's of the usefulness of research' framework by asking a number of questions. Is the research you are attempting to use: accessible, accurate, applicable, acceptable, appropriate and actionable?
I have subsequently worked with Professor Higgins to develop a more fleshed-out version of the 6 A's, which draws upon both the TAPUPAS framework developed by Pawson et al. (2003) and Critical Reading and Writing for Postgraduates by Wallace and Wray (2016), and which is illustrated in the following table.
The 6 A’s framework in use
It needs to be remembered that the 6 A's is just a framework to help you think through the usefulness of a piece of research for your setting and its associated problems of practice.
With that in mind, the framework should not be used in a 'tick-box' manner, with every sub-question answered. Rather, it should be used as an aide-memoire, to be dipped into as time, knowledge and understanding allow. That said, we would recommend that you attempt to answer at least one sub-question within each section.
Accessible
- Physical: Google Scholar, the Chartered College of Teaching, the EBSCO database, open-access journals
- Is it a contribution to policy, theory or practice?
- What do the authors assume about the knowledge of the readers?

Accurate
- How robust is the evidence?
- Are the methods used suitable for the research aims?
- To what extent are the claims made supported by others' work?
- What evidence that challenges their claims is not mentioned?
- Does the knowledge generated meet the specific standards of that type of knowledge?

Applicable
- What groups of learners might benefit from the findings?
- What degree of certainty do the authors attach to their claims?
- How generalisable are their claims?
- Context: system, age, phase, type of school, subject, content
- Level of use: teacher, head of department, business manager, executive head, CEO, board of trustees, governing body, policy-maker

Acceptable
- What values stance is being adopted, and is it implicit or explicit?
- How might the values stances taken by the authors affect their claims and the acceptability of those claims to colleagues?
- To what extent are the claims consistent with my experience?
- Is the research relevant to the problem which is most interesting to you and your colleagues?
- Are there any ethical issues arising from the research, either in its conduct or in its subsequent implementation?
- Might you be rejecting something because it doesn't 'feel' right?

Appropriate
- Is the research relevant to the most recurring problems in your department, key stage, school or multi-academy trust?
- Is the research relevant to a problem within your sphere of influence?
- Is the research relevant to problems for which resources are available: staff, time, expertise and finance?

Actionable
- Does the research specify causal statements: if this …, then …?
- Are concrete behaviours specified to bring about the intended outcomes?
- Do the teaching staff have the skills required, or can they be taught the skills required, to put the research into effect?
Amended from Jones (2018)
Some observations about the framework
Observant readers will notice that the original order of the first two A's has been switched, with accessibility now coming before accuracy. This was done for a very practical reason: you are unlikely to be able to judge the accuracy of a piece of research unless you have first been able to access it.
Furthermore, it needs to be remembered that this is not the ‘only framework in town’ which you can use to appraise the usefulness of research.
Willingham (2012) has come up with the following four-step process:
First, 'strip it': get rid of the fluff surrounding the idea and get right to the heart of the claim being made. What specific intervention, strategy or actions is the school leader being asked to adopt, and what outcomes, whether for student learning, achievement, staff well-being or something else, are being promised?
Second, 'trace it': where did the idea come from? Is the idea supported by a leading educational authority? Unfortunately, in education, this can be a weak indicator of validity and reliability. Do other 'experts' support the idea?
Third, 'analyse it': what are you, the evidence-based school leader, being asked to believe? What is the evidence to support the claims being made? How does this evidence relate to your experience as a school leader?
Fourth, ‘should I do it?’ Is it something which is already being done? Is it an old idea wrapped in new language and terminology? Has it worked previously in other settings with other students? Has it failed previously in other settings with other students? What are the opportunity costs – the things necessary to give up or forego - in pursuing this intervention or strategy?
What is hopefully clear from this blog post is that, no matter your knowledge and skill level, there are ways in which you can make structured judgements about the usefulness of research.
What is more, the more you use these frameworks and become familiar with them, the better you will get at applying them, which will in turn help you make efficient and effective use of research evidence in the improvement of schools.
Higgins, S. (2018). Improving Learning: Meta-analysis of Intervention Research in Education. Cambridge: Cambridge University Press.
Jones, G. (2018). Evidence-Based School Leadership and Management: A Practical Guide. London: SAGE Publishing.
Pawson, R., Boaz, A., Grayson, L., Long, A. and Barnes, C. (2003). Types and Quality of Social Care Knowledge. Stage Two: Towards the Quality Assessment of Social Care Knowledge. ESRC Centre for Evidence Based Policy and Practice: Working Paper.
Wallace, M. and Wray, A. (2016). Critical Reading and Writing for Postgraduates (Third Edition). London: SAGE.
Willingham, D. (2012). When Can You Trust the Experts? How to Tell Good Science from Bad in Education. San Francisco: John Wiley & Sons.