What is value-added and how does it help?
We all know that academic progress is an individual thing.
Making progress depends on a whole range of factors, and students make progress at different times and at different rates. A focus on final exam results often fails to take account of the huge steps students and teachers may have taken along the way.
So value-added measures are intended to offer a fairer indication of how far a student has come, a fairer measure of how well the school has brought that student on, and a fairer way to compare different schools’ performances.
What is value-added?
If we looked at all pupils of a certain age and measured their progress over a certain period of time, we would find that this progress followed a ‘bell curve’, or normal distribution. Some pupils would make a small amount of progress, most would make near-average progress, and a few would make exceptional progress.
As well as being subject to natural variation, the improvement in achievement owes a lot to the environments in which pupils find themselves.
The quality of teaching, the availability of resources and many other factors may have an effect on the progress of individual pupils. Over a fixed period of time, then, we can talk about the amount of improvement a school has added to a pupil’s achievement.
All schools add to their pupils’ achievement in this way. However, if one school increases the achievement of its pupils more than other schools do, then its pupils gain an additional advantage.
It is this relative advantage that has come to be called value-added.
The residual, or value-added score
We call this value-added measure the residual, because it is what is left over after we have taken into account the pupil’s baseline ability.
The residual, or value-added score, will be influenced by many factors such as effort, health, home experiences and help from others, in addition to the teaching provided by the school. The residual will also be influenced by chance ‘error’ such as guessing, luck with item spotting, careless mistakes and so on.
However, the residual is the best available indicator of the net effect schools have on the progress of a pupil.
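As a purely illustrative sketch (the grade-point scale and the predicted value below are invented, not taken from any CEM report), the residual is simply the gap between the result a pupil actually achieved and the result predicted from their baseline:

```python
# Hypothetical illustration only: the residual is the difference between a
# pupil's actual result and the result predicted from their baseline assessment.
# The grade-point values below are made up for the example.

predicted_points = 5.2   # expectation derived from the pupil's baseline score
actual_points = 6.0      # the grade the pupil actually achieved, as points

residual = actual_points - predicted_points
print(f"Value-added (residual): {residual:+.1f} grade points")
# A positive residual suggests more progress than similar pupils typically make;
# a negative residual suggests less.
```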
What method is used to calculate value-added?
Value-added is based on a regression line, which represents the relationship between every student’s baseline score (Alis, Yellis or MidYIS) and the actual grade achieved in that subject.
Our ‘concept of value-added’ information sheet further explains the regression line that we use to calculate value-added.
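As a rough sketch of the idea, rather than CEM’s actual model or data, the example below fits a straight regression line through a handful of invented baseline scores and grade points, then reads each student’s value-added residual off that line:

```python
import numpy as np

# Minimal sketch (invented data, not CEM's model): fit a regression line
# relating baseline scores to the grade points achieved in a subject, then
# take each student's residual as the distance from that line.

baseline = np.array([95, 100, 104, 108, 112, 118, 121, 126])   # baseline-style scores
achieved = np.array([4.0, 4.5, 5.0, 5.5, 5.0, 6.5, 6.0, 7.5])  # grade points achieved

slope, intercept = np.polyfit(baseline, achieved, 1)  # least-squares regression line
predicted = slope * baseline + intercept

residuals = achieved - predicted   # the value-added score for each student
for b, a, r in zip(baseline, achieved, residuals):
    print(f"baseline {b}: achieved {a:.1f}, residual {r:+.2f}")
```

Students sitting above the line have positive residuals (more progress than similar students); those below it have negative residuals.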
Value-added reports available from CEM assessments
MidYIS, Yellis, Alis and CEM IBE all provide value-added reports by subject and cohort, once you have uploaded your exam results.
The average standardised residual chart enables you to compare value-added achievement across subjects.
The standardised residuals also enable you to compare value-added between qualification types and across years, so you can confidently compare this year’s value-added with previous years’, even where the qualifications differ.
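The standardisation step can be pictured with a short sketch. CEM’s exact procedure may differ, and the raw residuals below are invented; the point is simply that dividing by the spread of residuals puts subjects and qualifications with different grade scales onto a common footing:

```python
import numpy as np

# Sketch of standardisation (CEM's exact method may differ): expressing each
# residual relative to the spread of residuals makes scores comparable across
# subjects, qualification types and years with different grade scales.

residuals = np.array([0.8, -0.3, 0.1, -1.2, 0.5, 0.2, -0.4, 0.9])  # invented raw residuals

standardised = residuals / residuals.std(ddof=1)
print(np.round(standardised, 2))
```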
All our value-added analysis is based on a nationally representative sample of schools.
We calculate value-added differently for independent schools
For MidYIS and Alis, we also generate further value-added analysis specifically for independent sector schools (in addition to the nationally representative analysis that is available for all schools).
The data from independent sector schools shows that, in general, the ability intake of their student population is skewed towards higher ability, and the final results achieved by this group of students tend to be skewed towards the higher grades.
This means that the relationship between the CEM assessment (such as MidYIS or Alis) and final exam results benefits from being modelled separately, resulting in separate, additional value-added reports for independent schools.
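In outline, this simply means fitting the baseline-to-results relationship again on the independent-sector subsample. The sketch below uses invented data and a hypothetical ‘independent’ flag to show the idea; it is not CEM’s actual modelling:

```python
import numpy as np

# Illustrative only: fit the regression line on all schools, and again on the
# independent-sector subsample, giving a second set of predictions (and hence
# residuals) benchmarked against similar-ability independent-school peers.

baseline    = np.array([100, 105, 110, 115, 118, 120, 124, 128])
achieved    = np.array([4.5, 5.0, 5.5, 6.0, 6.5, 6.5, 7.0, 7.5])
independent = np.array([False, False, False, True, True, True, True, True])  # hypothetical flag

national_fit    = np.polyfit(baseline, achieved, 1)
independent_fit = np.polyfit(baseline[independent], achieved[independent], 1)

print("all schools line:  slope %.3f, intercept %.2f" % tuple(national_fit))
print("independent line:  slope %.3f, intercept %.2f" % tuple(independent_fit))
```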
For more information on how value-added is modelled in MidYIS for the independent sector, see our ‘concept of value-added for independent schools’ information sheet.
Tracking value-added performance over time
Statistical process control charts enable you to monitor school value-added performance over time, by subject. They can help you identify when value-added is not down to chance but is a reflection of the quality of teaching and learning, enabling you to share best practice across departments.
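The control-chart idea can be sketched as follows. This is a simplified illustration with invented yearly figures, not CEM’s actual charting method: if pupil-level standardised residuals have a spread of roughly one, a cohort’s mean residual has a standard error of about one over the square root of the cohort size, and a point falling outside limits of about two standard errors is unlikely to be down to chance alone.

```python
import numpy as np

# Simplified sketch of a statistical process control chart for value-added
# (invented figures, not CEM's actual method): compare each year's mean
# standardised residual with two-standard-error control limits.

years       = [2019, 2020, 2021, 2022, 2023]
mean_va     = [0.10, -0.05, 0.15, 0.55, 0.60]   # mean standardised residual per year
cohort_size = [60, 58, 62, 61, 59]

for year, va, n in zip(years, mean_va, cohort_size):
    limit = 2 / np.sqrt(n)                      # approximate two-standard-error limit
    flag = "  <-- unlikely to be chance alone" if abs(va) > limit else ""
    print(f"{year}: {va:+.2f} (control limits +/-{limit:.2f}){flag}")
```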

Evidence, consistency and stability
Objective value-added progress measures provide vital evidence for internal monitoring, inspections and some of those tricky conversations with governors, as well as providing consistency and stability during a time of ever-changing government policy on assessment, tracking and accountability.
Assisting school leadership and management
CEM value-added reports play an important role in providing evidence for self-evaluation, school improvement plans and inspections.
The breakdown of feedback supports school leaders in driving improvement, giving an understanding of progress across the whole institution and identifying performance above or below expectation in every curriculum area.
The value-added feedback can be easily imported into management information systems, allowing you to monitor trends over time with year-on-year comparisons.
Supporting effective teaching and learning
Value-added is a fair measure of the progress that students have made. Rather than relying solely on exam results, it takes account of where each student started from and the progress they made relative to other, similar students.
The value-added reports help you ask the right questions about individual subject strengths, share best practice between departments, support judgements about assessment and support, and tailor aspirational target-setting.
Find out more
Full user guides for all our assessments are available for CEM assessment users through our secure sites (login via the menu).
Training and support are also available to help you understand CEM reports and translate them into powerful information for use in the classroom.