Value-added calculations (all key stages)
Value added and contextual value added. Value added/progress is a model which calculates the amount of progress made by pupils between key stages, compared with the average for their statistical peers. Where a measure appears in the performance tables, Aspire uses prior attainment to establish similar pupils, following the DfE's approach. 'Contextual value added might not be the holy grail when it comes to holding schools to account.' Rather than trying to find the perfect measurement of progress, we need a system where pupil performance data is seen as only one part of a much larger picture.
The new measure takes in many factors. The secondary school league tables for England include a new measure this year: "contextual value added". What is it and how does it work? Value added has been with us for some time, comparing each pupil's results with those of all pupils nationally with similar attainment when they left primary school. Critics pointed out that the measure made no allowance for the very different circumstances in which schools operate, and over which they have no control.
Thus contextual value added (CVA) was conceived. The idea is, in effect, to create a level playing field, so that how pupils actually performed - better or worse than similar pupils elsewhere - can be attributed to the school's influence. It is important to grasp that this is an entirely statistical, mechanistic model. It is not as if someone in Whitehall sits down and decides what weighting to give to the results of, say, a Bangladeshi girl with behavioural difficulties who gets free school meals.
Instead, the formula looks at the actual prior attainment and current attainment of all pupils with such characteristics. This gives a predicted outcome for any individual pupil. Their CVA score is the difference between that predicted figure and their actual results. To get the score for the whole school, all its pupils' scores are averaged, then a so-called "shrinkage factor" is applied, related to the number of pupils.
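The mechanics described above can be sketched in code. Everything here is an illustrative assumption - the prediction coefficients, the characteristic adjustments, the shrinkage constant and the pupil data are invented, not the DfE's actual model:

```python
# Illustrative sketch of a CVA-style school score, NOT the DfE's actual model.
# Coefficients, the shrinkage constant and the data are invented for illustration.

def predicted_points(prior_attainment, adjustment):
    """Predicted GCSE points from prior (KS2) attainment, plus an
    adjustment reflecting the pupil's characteristics."""
    return 12.0 * prior_attainment + adjustment  # assumed national relationship

def school_cva(pupils, k=30):
    """Average the pupils' residuals (actual - predicted), then apply a
    shrinkage factor that pulls small cohorts towards the average."""
    residuals = [actual - predicted_points(prior, adj)
                 for prior, actual, adj in pupils]
    n = len(residuals)
    mean_residual = sum(residuals) / n
    shrinkage = n / (n + k)  # fewer pupils -> more shrinkage towards zero
    return mean_residual * shrinkage

# (prior attainment, actual GCSE points, characteristic adjustment)
pupils = [(27.0, 340.0, 0.0), (29.5, 355.0, 12.0), (26.0, 310.0, -5.0)]
print(school_cva(pupils))
```

The shrinkage step is why very small schools rarely post extreme scores: with few pupils, the measured average is unreliable, so it is pulled towards zero.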
The final outcome is a number based around 1,000.

Unexpected outcomes

The effects of the various coefficients can be, officials point out, counter-intuitive. Remember they are measures of progress. For example, you might expect older pupils to attain better GCSE results, but the predictions show that younger pupils make more progress. This makes sense because younger pupils tend to have been further behind at Key Stage 2 and close the gap as they move up through secondary school. Likewise, those for whom English is not their first language tend to make more progress at each successive key stage as their grasp of it improves.
So, for instance, Chinese children on free school meals make more progress than anyone else, so their predicted grades are adjusted upwards more than anyone else's, by almost 40 points; they therefore have to do really well for their schools to get any credit. Conversely, travellers do extremely badly. The biggest loading, a downward adjustment of predicted GCSE points, is for travellers of Irish heritage.
Concerns

There is a warning in the official explanations about all this, in bold, that CVA must not be used to set expectations for any pupils: they must all be treated equally. But inevitably, critics point out, using CVA risks building lower expectations for some groups of pupils into the system.
And they say there are other things wrong with it. For example, no account is taken of different funding levels per pupil.
But schools in deprived areas or with certain types of children get extra funding to take account of such things - so why count them again in the progress measure? There are no independent schools in the CVA tables, by the way. The government does not have the necessary data on them. But why do grammar schools not give as good a showing under CVA as under the old, simpler value added system?
Officials at the Department for Education and Skills have suggested this is because they and other highly popular state schools in better off areas, with good headline results, often attract pupils who have done well in the past.
Consequently their scope for adding value is reduced. Another factor is that each pupil's score is capped at their best eight GCSE results. Many of the high-flying schools enter pupils for more exams than that, so capping the score artificially holds down their CVA indicator.
Not surprisingly, some are crying foul. Others just say parents will be baffled. Statisticians will be pleased to see that the government has included in its tables "confidence intervals" alongside the raw CVA scores. These allow people to decide how much significance to attach to the basic indicator. For most, the rule of thumb is that schools with a CVA of more than 1,000 are doing better than expected, and those below 1,000 are doing worse.
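The idea behind those confidence intervals can be sketched as follows, using a plain normal approximation on the pupil residuals; the DfE's published method may differ in detail:

```python
# Sketch of a 95% confidence interval for a school's mean value added,
# using a normal approximation; the official calculation may differ.
import math

def cva_confidence_interval(residuals, z=1.96):
    n = len(residuals)
    mean = sum(residuals) / n
    var = sum((r - mean) ** 2 for r in residuals) / (n - 1)  # sample variance
    half_width = z * math.sqrt(var / n)
    return mean - half_width, mean + half_width

low, high = cva_confidence_interval([2.0, -1.0, 3.0, 0.0])
print(low, high)
```

If the interval straddles zero (or 1,000 on the published scale), the school's score is not reliably different from "expected"; larger cohorts give narrower intervals.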
How much better or worse? Officials say a score of 1,006 means that on average each of the school's pupils achieved the equivalent of one GCSE grade higher in one subject than the average attained by similar pupils. So a score of 1,048 (plus 6 x 8) means that on average each pupil achieved one GCSE grade higher in each of their best eight subjects than the average attained by similar pupils. And a score of 994 means that the school's pupils achieved one grade lower in one subject, and so on.
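Assuming the scale is centred on 1,000 with six capped points per GCSE grade (as the "6 x 8" arithmetic above implies), converting a score into its grade equivalent is simple:

```python
# Assumes a scale centred on 1,000 and 6 capped points per GCSE grade,
# as implied by the "6 x 8" arithmetic in the article.
def grade_equivalent(cva_score, centre=1000.0, points_per_grade=6.0):
    """How many grade-subjects above (or below) expectation the score represents."""
    return (cva_score - centre) / points_per_grade

print(grade_equivalent(1048))  # -> 8.0: one grade higher in each of eight subjects
```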
Should we bring back contextual value added?
Did CVA really lead to lower expectations?
Contextual value added (CVA) is a technique used to analyse the progress made by pupils which takes into account a wider range of factors than value added. Typically, value added only takes into account pupils' prior attainment, whereas CVA can include things such as ethnicity, free school meals status, mobility and first language.
We (FFT) were calculating CVA before it was introduced into performance tables and have carried on calculating it since the government abandoned it. But do the reasons given for binning it stack up? And should the government bring it back? Well, CVA is undoubtedly a much more complex calculation than value added, but if, as argued by supporters of CVA, it was a more accurate reflection of school effectiveness, then surely it would have been better to find ways of explaining it more clearly instead of dropping it?
The second reason suggests a misunderstanding of the role of CVA. Like any value added measure, it is retrospective and tells us only what happened in the past.

The third reason - that CVA contributed to lower expectations for pupils - was summarised by the DfE in The Importance of Teaching white paper: "For example, we do not think it right to expect pupils eligible for free school meals to make less progress from the same starting point as pupils who are not eligible for free school meals, particularly once the introduction of the Pupil Premium ensures that schools receive extra resources for pupils from poorer backgrounds. We should expect every child to succeed and measure schools on how much value they add for all pupils, not rank them on the make-up of their intake."

I would argue that this represents a misunderstanding of the role of CVA. Thinking about what pupils might attain in the future is part of the process of setting targets.
But, like any value added calculation, CVA provides an evaluation of what has already happened. And while some schools and local authorities undoubtedly did use it to set expectations, anecdotal evidence would suggest that this was only a minority of schools. So, is there any evidence that the DfE were correct in the assertion that CVA entrenched low aspirations? If this were true, we might expect to see some changes when we compare trends over two periods: the years in which CVA was published, and the years after it was dropped. The schools most likely to be affected would typically have a high proportion of white British pupils from disadvantaged backgrounds.
We can look at this question by dividing schools into five equal-sized groups, or quintiles, based upon the percentage of their pupils who are white British and entitled to free school meals, focusing in particular on two of these groups: the quintile with the highest such percentage and the quintile with the lowest.
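The grouping itself is straightforward to sketch; the field names and data below are assumptions for illustration:

```python
# Sketch: assign schools to quintiles by the percentage of pupils who are
# white British and entitled to free school meals. Data is illustrative.
def assign_quintiles(schools):
    """Return {school name: quintile 1-5}; quintile 5 has the highest percentages."""
    ranked = sorted(schools, key=lambda s: s["pct_wb_fsm"])
    n = len(ranked)
    return {s["name"]: (i * 5) // n + 1 for i, s in enumerate(ranked)}

schools = [{"name": f"School {i}", "pct_wb_fsm": p}
           for i, p in enumerate([5, 12, 18, 25, 33, 41, 48, 55, 63, 70])]
quintiles = assign_quintiles(schools)
print(quintiles["School 9"])  # highest percentage -> quintile 5
```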
When analysing KS4 data over the period in question, we need to think about the impact of early exam entry. The chart below shows value added scores for these two groups of schools over that period. It shows standardised value added scores; that is, the scores are expressed in standard deviations from the mean, which is zero.
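Standardising in this sense is just converting each year's scores to z-scores; a minimal sketch:

```python
# Minimal sketch: convert raw value added scores to standardised (z) scores,
# i.e. standard deviations from the mean, which becomes zero.
def standardise(scores):
    mean = sum(scores) / len(scores)
    sd = (sum((x - mean) ** 2 for x in scores) / len(scores)) ** 0.5
    return [(x - mean) / sd for x in scores]

z = standardise([990.0, 1000.0, 1010.0])
print(z)
```

Standardising each year separately makes scores comparable across years even when the underlying points scales change.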
Looked at another way, the chart below shows the average annual change in value added for the CVA years and for the subsequent value added (VA) years, for these two groups of schools.
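Average annual change here is simply the total change divided by the number of year-on-year steps; a trivial sketch with invented numbers:

```python
# Sketch: average annual change across a series of yearly value added scores.
def average_annual_change(yearly_scores):
    """Total change divided by the number of year-on-year steps."""
    return (yearly_scores[-1] - yearly_scores[0]) / (len(yearly_scores) - 1)

print(average_annual_change([0.20, 0.28, 0.35, 0.41]))  # invented series
```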
Could it be that, instead of reducing expectations, the message that CVA gave to schools serving economically disadvantaged areas was that they were doing well despite the impact of disadvantage? And that this resulted in both the school staff and inspectors having a more positive view of the school, aiding recruitment and retention?
It would, though, seem reasonable to conclude that there is no evidence to support the view that CVA led to lowered expectations for disadvantaged pupils. So, should we bring back CVA? As Dave Thomson shows here, taking a CVA approach to Progress 8 leads to a view that differences between schools are smaller than they appear from a value added perspective. Might the way forward be to use CVA, along with value added and attainment, but only in the context of inspection and self-evaluation?
Why did the government stop calculating CVA? The main reasons for dropping CVA, as discussed in this paper by George Leckie and Harvey Goldstein, were: it was hard to understand; it was a poor predictor of success (though of what type of success the DfE never made clear); and it expected different progress from different pupil groups. Do these stand up to scrutiny?
Did CVA entrench low expectations? Overall, changes in value added over time are relatively modest.
About the Author: Mike Treadaway. A former teacher, lecturer and LA adviser, Mike has over 20 years of experience in working with education data in the context of school improvement. Since founding the FFT Data Analysis Project he has developed models for analysing pupil progress, has led the processing, matching and data analysis for the National Pupil Database, and has provided advice to the Department for Education to support the development of new school accountability indicators and analysis of national trends in school performance.