This piece summarises my reaction to Ofsted’s recently published report evaluating the impact of the focus on the curriculum within the EIF.
Observers have been pressing for an evaluation of the impact of the EIF for a number of years. Ofsted indicated it would produce such a report, but what it has published covers only one aspect of the Framework. It is limited to the curriculum and is narrow in its consideration of impact, in that it doesn’t consider the contribution the curriculum makes in other areas, and vice versa.
I have a long history of writing and editing subject and survey reports within Ofsted. I am deeply concerned by this report: its lack of robustness makes it close to meaningless. I would not encourage any school leader, teacher or researcher to use it to form or confirm a view on the effectiveness of curriculum work in English schools. The size and nature of the cohort are deeply worrying. No empirical study worth its salt would try to draw conclusions from such a small number of schools.
The report draws attention to all of this itself: ‘It is worth noting the limitations of this study, especially its small scale. This must be considered when interpreting our findings. The experiences of the schools in this study are not necessarily representative of schools more broadly.’ In addition, we have no contextual information on the schools visited and nothing on their improvement journeys. Surely it is fair to assume that these are, on the whole, schools that are or were improving, so is the attention on the curriculum just a staging post on the improvement journey? We don’t know, and Ofsted doesn’t know, but the inspectorate appears keen to make it look as though some or much of the improvement it identified is actually down to the inspectorate itself. If ever there was a review that required an external evaluator, it is this one.
The report is clear on the type of schools selected, even if we are not told their names or locations: ‘we have a higher proportion of schools with an overall effectiveness grade of ‘outstanding’ and with high scores for overall curriculum quality in our sample. It is notable that we saw improvement despite the limited room for improvement in these high-performing schools. It may be that these schools were more likely to have made improvements than other schools that did not take part.’ Isn’t this the fundamental pointlessness of the evaluation? It’s flawed from the outset, and the report itself provides the evidence to make this case.
The elephant in the room is why Ofsted has chosen to undertake such a small survey when every inspection inspects the curriculum. Is it because the evidence bases drawn from school inspections are not focused or secure enough to identify what has made the difference to the quality of provision? Or is it that the evidence was gathered without the underlying hypothesis required for this kind of evaluative work? We don’t know, because Ofsted doesn’t explain why it didn’t use the significant evidence base that already exists from school inspections.
The report contains a number of key findings. I have added a personal commentary on each one, often in the form of questions and possible explanations for why the finding has been identified. My comments follow each finding below.
Key findings
We saw improvements in many of the areas emphasised in the EIF
Overall:
School leaders were positive about the greater focus on the quality of curriculum during inspection
This is probably inevitable, because the sample selected for the evaluation drew from schools awarded strong grades, so respondents would view the focus as positive. The small sample and the emphasis on strong schools largely invalidate this finding.
We saw broader, more in-depth, ambitious curriculums, in most of the schools we visited.
Yes, it is more likely that this will be found in more successful schools, which tend to be in more affluent areas, so the lack of information on the context of those schools makes this finding rather meaningless.
School leaders prioritised reading, which continued to be one of the highest-quality areas of curriculum practice we measured.
Most schools had a deep dive in reading and the schools in the report are heavily weighted towards Outstanding, so it would be unusual not to find this.
Schools had improved how they sequence and map subject knowledge and skills; the greatest improvements were in foundation subjects (all subjects beyond English, mathematics and science) in primary schools. School leaders’ views demonstrated that the EIF has played a part in influencing these improvements:
Bearing in mind that many of the schools in the sample were judged highly by Ofsted, it is not unusual to find a strong emphasis on sequencing, especially in foundation subjects, because that was a determinant of a strong Ofsted grade. Ofsted is reporting on the internal bias in its grade descriptors here: if you do not sequence, then the curriculum must be poor.
Around a third of the schools we revisited had made major changes as a direct result of the focus on high-quality curriculum in the inspection framework.
Another way of saying this is that in around seven schools that Ofsted visited, many of which were outstanding, this feature was observed.
Many of the schools were already taking an evidence-informed approach to developing a high-quality curriculum. The inspection framework helped them to affirm and speed up the changes they were making.
This is suggesting that these schools were already making improvements to their curriculum and the EIF increased the speed of attention on this issue. Another way of looking at this is that schools already improving their curriculum approach were forced to reprioritise to meet Ofsted’s requirements. The downside was that they had to delay or stop other essential developments.
The concepts of curriculum intent, implementation and impact, as set out in the EIF, influenced almost all school leaders’ curriculum thinking. Many had not thought in detail about the rationale of their curriculums before we introduced the framework.
What!!! So, Ofsted is saying that leadership training and development had not covered any issues regarding curriculum development until Ofsted focused on it? This cannot be substantiated, can it?
Leaders told us that the EIF gave them a shared language that they could use to facilitate change, and to collaborate on the curriculum with other schools. This finding supports our aim, set out in the theory of change, that the EIF would contribute to a shared concept of quality in the sector.
No, it enabled a common language to be shared; it didn’t necessarily lead to a shared concept of quality. Surely this is what the report should be commenting on, even if it is difficult to judge the quality of a curriculum, especially from such a small sample.
Many school leaders said reading was a curriculum area that the EIF had influenced. Some further credited the EIF with encouraging them to use standardised phonics programmes.
Really!! The DfE has been promoting approved phonics programmes for a number of years.
We heard that the EIF’s focus on the curriculum, across all subjects, led to subject leaders having more responsibility for decisions about the curriculum. In many schools, this had helped to increase professional development opportunities for subject leaders.
This was especially true for subjects other than English and mathematics. ‘We heard’ is an interesting phrase, because it ignores the impact of that engagement, doesn’t it? I assume ‘we heard’ is a way of saying that we don’t know its real impact, but a few colleagues thought it was positive, so we will write that up.
The EIF also had a role in helping schools to improve their curriculum planning, mapping and progression across subjects.
But the sample is so small that Ofsted is admitting here it really doesn’t know how big a role the EIF played.
However, unintended consequences can arise from what we include in our inspection frameworks:
Some school leaders told us that our focus on curriculum quality across all subjects put pressure on staff who lead multiple subjects or who are not subject specialists.
This is a fundamental weakness of the EIF, and here the report skates over the issue. I’d have expected it to say that this was a particular issue in small primaries, in areas where recruitment is tough and where there has been a history of weak leadership. Because the sample is so small, and probably didn’t include many small schools in coastal areas, it can’t make these points.
In the early stages of the EIF, schools did not always fully understand the concepts of intent, implementation and impact. This was particularly true of ‘intent’. It sometimes led to schools producing ‘intent statements’ for subjects, rather than setting out clearly what pupils should learn in a logical order. For some schools, ‘intent’ seemed to dominate other areas of curriculum thinking.
So, this is a weakness in the way Ofsted explained the focus on curriculum, isn’t it? Schools didn’t invent the term ‘intent’; it was an Ofsted creation. Wouldn’t it have been more accurate for Ofsted to admit that its roll-out of the EIF had weaknesses, especially for schools already under pressure from the inspectorate or where leadership capacity was light (namely small primary schools and some special schools)?
Although the focus on reading in the EIF has led to further improvements, it may have resulted in other areas being left behind. For example, mathematics did not receive the same attention as reading across the curriculum in the schools we visited. We do recognise that it takes time for schools to develop their curriculum.
In other words, the deep dives on reading that occurred in all inspections ended up with schools ignoring other important aspects of the curriculum. No wonder we now have pupils who are not too keen to attend, or a drive for greater freedoms to evolve the curriculum offer further.
After commenting on all of these points, I am left with the same feeling that Ofsted is keen to identify weaknesses in others but not in itself. Some of the weaknesses it identifies are down to the way Ofsted introduced and carried out the EIF: the way it reached out to schools, the way it trained its inspectors, the way it piloted the Framework, the way it didn’t self-evaluate in a meaningful way. What’s clear is that this sample is hardly worth referencing as a compelling review of the EIF’s focus on curriculum. It enables Ofsted to say it is reviewing its own work, but if this is the best it can do, I suggest it gets back to ensuring its inspectors are up to the job.