Geraint Talfan Davies casts an eye over the school banding data and asks what we can learn about local education authorities
The controversy over the Welsh Government’s insistence on banding schools according to their levels of ‘performance and progress’ has centred on a professional distaste at the ‘naming and shaming’ of individual schools, and on a theological debate as to whether the bands are a league table or not. The Welsh Government’s website says:
“Banding is NOT (sic) about ranking, naming and shaming. Banding IS (sic) about being transparent about the relative performance of our schools”.
I see. In diplomatic circles this would be known as creative ambiguity, even if, in this case, a bit short on ambiguity. Either way, shedding more light on the true performance of public services in our ‘land of the pulled punch’, especially when it is a service so vital for the future of our children and our society, has my support.
But while much of the debate has been about what the new banding tells us about individual schools, little has been said about what this tells us about local authorities. This is surprising given the other current debate about the value of local authorities, set against their cost which, it is argued, diverts money from the schools themselves.
Such an evaluation of local education authorities may need another exercise, since additional criteria will come into play. But what can we glean about our councils from the school scores that have emerged already – significantly, as a result of a Freedom of Information request by journalists at BBC Wales? (Who says we don’t need journalism?)
If you aggregate scores by local authority you can start to get a rough measure, although I would admit that it is a fairly crude approach. But just to get the debate going, here are some quick observations.
The five bands set out in the scheme are determined by the following scores, where the lower the score the better the result.
- Band 1: 11 to 17.6 points
- Band 2: 17.6 to 24.2
- Band 3: 24.2 to 30.8
- Band 4: 30.8 to 37.4
- Band 5: 37.4 to 44.0
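The arithmetic behind the thresholds is simple: the 33-point scale from 11 to 44 is cut into five equal bands of 6.6 points each. A minimal sketch of that mapping follows; it is my own illustration, not the Welsh Government's published method, and the convention for scores landing exactly on a boundary (the published thresholds overlap) is an assumption.

```python
# Sketch of the five equal-width bands implied by the 11-44 point scale.
# Lower scores are better; Band 1 is the best, Band 5 the worst.
BAND_WIDTH = (44.0 - 11.0) / 5  # = 6.6 points per band

def band_for_score(score: float) -> int:
    """Map a school's points score (11.0 to 44.0) to Band 1-5."""
    if not 11.0 <= score <= 44.0:
        raise ValueError("score outside the 11-44 point scale")
    # Assumption: a score exactly on a boundary (e.g. 17.6) falls into
    # the higher-numbered band, since the published ranges overlap.
    band = int((score - 11.0) // BAND_WIDTH) + 1
    return min(band, 5)  # a score of exactly 44.0 stays in Band 5

print(band_for_score(21.2))  # Neath Port Talbot's average score -> Band 2
```

So an authority-wide average of 21.2 points, for example, sits in Band 2.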
The 220 secondary schools are spread across the bands in a classic bell curve:

*(Table: number of schools and percentage of schools in each band.)*
It is not surprising, therefore, that when you aggregate the scores for all schools within an authority and produce an average number of points per school, you see the same pattern, although with the extremities trimmed: that is, no local authorities in Bands 1 or 5, three local authorities in each of Bands 2 and 4 and 16 out of the 22 local authorities in Band 3.
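The aggregation itself is nothing more than grouping schools by authority and averaging their points. A sketch of the method, with invented school-level scores purely for illustration (only the approach, not the figures, comes from the article):

```python
# Group school banding scores by local authority and average them.
# The individual school scores below are invented for illustration.
from collections import defaultdict

school_scores = [            # (local authority, school's points score)
    ("Neath Port Talbot", 14.0),
    ("Neath Port Talbot", 20.0),
    ("Pembrokeshire", 28.0),
    ("Pembrokeshire", 33.0),
]

by_authority = defaultdict(list)
for authority, score in school_scores:
    by_authority[authority].append(score)

averages = {a: sum(s) / len(s) for a, s in by_authority.items()}
for authority, avg in sorted(averages.items(), key=lambda kv: kv[1]):
    print(f"{authority}: {avg:.1f}")   # lower average = better banding
```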
This is very simple arithmetic – perhaaps too simple – and it is beyond my expertise to come at this with some form of regression analysis. But whichever way you look at it, the same authorities come out on top.
The local authority with the best average score – Neath Port Talbot – is the authority with the largest proportion of its schools in Bands 1 or 2 (72.7 per cent), and the lowest proportion in Bands 4 or 5 (9.1 per cent).
Remarkably, the authority placed second for its average score is Denbighshire, which was subject to a damning report into its schools as recently as 2007. Clearly, there has been something of a turnaround: it is now the fourth highest authority in terms of schools in Bands 1 and 2 (50 per cent) and fourth best in terms of a low proportion in Bands 4 and 5 (25 per cent). In fact, only 2 of its 8 schools are in Band 4, and none in Band 5.
Third place in terms of average score goes to Conwy, which comes second to Neath Port Talbot in having a high proportion in Bands 1 and 2 (71.4 per cent), and in having a low proportion (14.3 per cent) in Bands 4 and 5. Only 1 of its 7 schools is in Band 4 and none in Band 5.
The top of the table looks like this:
| Authority | Avg score | Rank | Authority | % in Bands 1 and 2 | Rank | Authority | % in Bands 4 and 5 | Rank |
|---|---|---|---|---|---|---|---|---|
| Neath PT | 21.2 | 1 | Neath PT | 72.7 | 1 | Neath PT | 9.1 | 1 |
| V. of Glam | 24.6 | 4 | Denbs. | 50 | 4 | Denbs. | 25 | 4 |
| Wrexham | 25.8 | 7 | V. of Glam | 50 | 4 | V. of Glam | 25 | 4 |
At the other end of the scale there are some surprises. One might expect some of the more disadvantaged Valley areas to struggle for good scores, and some do, but one would not have expected Monmouthshire or Pembrokeshire to score so badly.
| Authority | Avg score | Rank | Authority | % in Bands 1 and 2 | Rank | Authority | % in Bands 4 and 5 | Rank |
|---|---|---|---|---|---|---|---|---|
| Pembs. | 30.3 | 19 | Bl. Gwent | 20 | 19 | Bl. Gwent | 60 | 19 |
This also confirms that it is perfectly possible for good schools – and good local authorities – to surmount the challenges posed by economic and social disadvantage – something that was evident in last year’s report by the IWA on Key Stage 3 performance. One has only to put alongside these scores the data from the Welsh Index of Multiple Deprivation. This plots deprivation across 1,896 small areas within local authorities – Lower Layer Super Output Areas (LSOAs) – each of roughly equal size, with a population of around 1,500. One chart in the index shows what proportion of the LSOAs in each authority fall within the most deprived 50 per cent in Wales.
For instance, 87.2 per cent of Blaenau Gwent’s LSOAs fall within the most deprived 50 per cent in Wales, followed by Merthyr Tydfil (77.8), Rhondda Cynon Taf (73.7), Caerphilly (68.2) and Neath Port Talbot (68.1). The least deprived is Monmouthshire (22.4). Yet Neath Port Talbot is at the top of the education performance list and Monmouthshire at the bottom. Meanwhile, Merthyr Tydfil is in joint fourth place in terms of schools in Bands 1 and 2 (50 per cent) – though admittedly it has only four schools in its care.
It is also interesting to look at our bigger towns and cities. In terms of multiple deprivation Newport is the most deprived with 56.4 per cent of its LSOAs amongst the worst 50 per cent in Wales, followed by Swansea (49.0), Cardiff (44.3) and Wrexham (43.5). Wrexham scores well, but in the two biggest cities, Swansea seems to come out best with five schools in Band 1, against Cardiff’s two. Cardiff also has a much higher proportion of its schools in Bands 4 and 5 – 35 per cent against Swansea’s 26.6 per cent.
It would be foolish to suggest that this is in any way a definitive assessment of the performance of local authorities, but it is a factor that we should be looking at. There will be as many ways of assessing them as of assessing their schools. But a core issue will be how well they can make best practice travel through all the schools in their area. Creating greater consistency is, surely, a central responsibility for every education authority. The differing spread of attainment in each council area tells us that some education authorities are providing a lot more value than others.