David Kernohan is Deputy Editor of Wonkhe

The Office for Students has been evaluating the most recent iteration of the Teaching Excellence Framework (TEF), which took place in 2023.

The 2023 TEF was a very different beast to previous iterations, focusing more on qualitative evidence (submissions from providers and students) and less on the quantitative experience and output measures. But to be clear, this work does not appear to assess the impact or likely effects of these changes – it treats the 2023 exercise very much as a one-off event.

We get an independent evaluation report, written by IFF Research. There are the findings of a survey of students involved in preparing the student submissions (aspects of which contribute to a student guide to evidence collection for TEF), findings from a survey of applicants (conducted with Savanta), and an analysis of the estimated costs to the sector of TEF2023. The whole package is wrapped up with a summary blog post from OfS TEF supremo Graeme Rosenberg.

Of all this, the blog post is the only bit that touches on what most of us probably care about – the future of the TEF, and the wider idea of the “integrated quality system”. Perhaps predictably, OfS has heard that it should

“build on the elements of the TEF that worked well and improve on areas that worked less well for some providers.”

The top-line summary of everything else is that OfS is pleased that TEF seems to be driving change in institutions, particularly where that change is informed by student perspectives. There’s less confidence that the TEF outcomes are useful for prospective students – the regulator wants to explore this as part of a wider review of information provision. And while institutions do find TEF valuable, the cost involved in participation is considerable.

How much does TEF cost then?

It cost OfS £3.4m, and the mean estimate for costs to the wider sector was £9.96m. That’s about £13.4m in total but with fairly hefty error bars.

What else could the taxpayer buy for £13.4m? There’s the much-needed Aylesbury link road, an innovation hub in Samlesbury near the new National Cyber Force headquarters (promising jobs paying upwards of £3,000 according to the headline), or enough money to keep Middlesbrough Council solvent for a while. In the higher education world, it’s equivalent to a little under 1,450 undergraduate annual tuition fees.

The sector numbers come from a survey of 73 providers (52 higher education providers, 21 FE colleges) – 32.3 per cent of those involved in the 2023 TEF – conducted in September and October 2024 (so significantly after the event). It looked at both staff costs and non-staff costs (stuff like consultancy fees).

As you’d probably expect, costs and time commitments vary widely by institution – one provider spent 30 staff days on the exercise, while another spent 410 (the median? 91.6). Likewise, there was variation in the seniority of staff involved – one institution saw senior leaders spend a frankly astonishing 120 days on the TEF. Your median higher education provider spent an estimated £37,400 on the exercise (again, huge error bars here). It is asserted that Gold-rated providers spent slightly more than Silver-rated providers – the data is indicative at best, and OfS is careful not to assert causality.

We also get information on the representations process – the mechanism by which providers could appeal their TEF rating. The sample size here is necessarily tiny (11 higher education providers, 8 colleges) – we are given a median cost of £1,400 for colleges and £4,400 for higher education providers.

Was it worth it?

The picture painted by the independent IFF evaluation is positive about the TEF’s role in driving “continuous improvement and excellence” at providers. The feeling was that it had encouraged a greater use of data and evidence in decision making – but in some cases these positive impacts were negligible given the volume of the input required. Students were also broadly positive, citing limited but positive impacts.

The evaluation also made it clear that the TEF was burdensome – a large drain on available staff or student resource. However, it was generally felt that the TEF was “worth” the burden – and there was a broad satisfaction about the guidance and support offered by OfS during the process (although as you might expect, people generally wanted more examples of “good” submissions – and the “woolly” language around learning gain was difficult to deal with, even though the purpose was to drive autonomous reflection on measures that made sense in a provider context).

One of the big 2023 cycle innovations was a larger role for the student submission – seen as a way to centre the student perspective within TEF assessment. This wasn’t as successful as OfS might have hoped – responses were split as to whether the process had “empowered the student voice” or not – the bigger institutions tended to see it as replicating pre-existing provider-level work.

Students themselves (not many of them – there were 20 interviews with students involved in preparing the submissions) saw this empowerment as limited – greater student involvement in quality systems was good, but largely the kind of thing that a good provider should be doing anyway.

But the big question, the overall purpose, really needs to be whether TEF2023 improved the student experience and student outcomes. And the perspective on this was… mixed. Commonly TEF complemented other ongoing work in this area, making it difficult to pick out improvements that were directly linked to TEF, or even to this particular TEF. Causality – it’s difficult.

If we are going to have a big, expensive exercise like TEF it is important to be able to point to tangible benefits from it. Again, the evidence isn’t quite there. About half of the providers surveyed used TEF (as a process or as a set of outputs including the “medals” and the feedback) to inform decision making and planning – but few examples of decisions predicated on TEF were offered. And most student representatives were unable to offer evidence of any change as a result of TEF.

Finally, I was gratified to note that coverage in “sector publications like Wonkhe” was one key way of sharing good practice around TEF submissions.

The value to applicants

Any attempt within the sector to provide a better experience, or better outcomes, for students is surely to be welcomed. However, for a large and spendy intervention the evidence for a direct contribution is limited. This is perhaps not surprising – there have been numerous attempts to improve the student experience and outcomes even since the birth of the OfS: by the regulator itself, by other sector bodies with an interest in the student experience (the Quality Assurance Agency, Advance HE, the sector representative bodies and so forth), and autonomously by institutions or parts of institutions.

Somewhat curiously, the main evaluation document has little to say about the realisation of TEF’s other main proposed benefit – supporting applicants in choosing a provider to study at. Providers themselves are unsure of the value of TEF here (feeling that it was unlikely that applicants would understand TEF or be able to place due weight on its findings), though there is some suggestion that a “halo effect”, drawing in part from the liberal use of logos and that job lot of gold paint, could help present a positive image of the provider. It is a hell of a reach, but some noted that the use of TEF ratings and logos in institutional marketing and recruitment is itself evidence that someone, somewhere, thinks it might work.

The thing to do here would be to ask applicants – which OfS commissioned Savanta to do on its behalf as a separate exercise. This research was based on six focus groups covering 35 prospective students aged between 17 and 20 and applying to study in England. In four of these groups participants had heard of the TEF – in two they had not – and in every case the applicants had ended up applying to silver-rated universities.

This is backed up by what initially looks like a decent survey instrument – a big survey (2,599 respondents, drawn from various existing online panels, with quotas on age, gender and ethnicity, and weighted post-fieldwork by provider type, mode of study, domicile, and neighbourhood participation marker) conducted in April and May of 2024. The headline finding here is that 41.7 per cent of applicants (n=798) had seen TEF ratings for any university they had looked at.

Somewhat mystifyingly, the survey then focuses entirely on the experience of those 333 applicants in using the TEF information, before asking the whole sample whether they thought TEF ratings would be important when applying to university (52.2 per cent reckoned they would be, despite a fair number of these applicants not having even noticed the ratings).

Can I just stop here and say this is a weird methodology? I was expecting a traditional high-n survey of applicants, asked to rate the importance of various factors in application choices, ideally with no prompting. This would give a clearer picture of the current value of TEF for such decisions, which is what you would expect in an evaluation. That’s not to say that the focus groups, or a specific awareness or use survey, wouldn’t be a valid contribution to a proper mixed methods analysis – or a means of generating a survey instrument for wider use.

Even so, participants in the focus groups were happy to list the factors that affected their choices – these included the obvious winners like location, course content, and graduate outcomes, plus a “significant role” for the cost of living. Secondary (less important) factors included university reputation, teaching quality, and other personal preferences. Though some of these factors are covered within the TEF exercise, not one single applicant mentioned TEF results as a primary or secondary factor.

For those who had heard of TEF it was seen as a “confirmatory tool rather than a decisive factor.” Applicants did not understand how TEF ratings were determined, the criteria used, or what – say – gold rather than silver meant when comparing providers.

The focus groups chucked the supplementary information (panel statements, submissions, the data dashboard) at applicants – they tended to quite like the student statements (viewing these as authentic), but saw the whole lot as lengthy, overcomplicated, and lacking in specificity.

I enjoyed this comment on the TEF data dashboards:

I feel like there is definitely some very useful information on this page, but it’s quite hard to figure out what any of it means.

On the main ratings themselves, participants were clear that gold or silver probably pointed to a “high standard of education,” but the sheer breadth of the assessments and the lack of course level judgements made the awards less useful.

There was, in other words, a demand for course-specific information. Not only did applicants not mention Discover Uni (a government funded service that purports to provide course level data on student outcomes and the student experience), the report as a whole did not mention that it even exists. Oh dear.

Unlike IFF, Savanta made some recommendations. There needs to be better promotion of the TEF to applicants, clearer ratings and rationales, and a more concise and direct presentation of additional information. Which is nice.

What to make of it all

Jim will be looking at the student submission aspects in more detail over on the SUs site, but even this first reading of the evaluation documents does not offer many hints on the future of the TEF. In many ways it is what you would expect: TEF has changed mainly when OfS decided it should, or when (as with the Pearce review) the hand of the regulator is forced.

While providers are clearly making the best of TEF as a way to keep the focus on the student experience (as, to be clear, one stimulus among many), it is still difficult to see a way in which the TEF we have does anything to realise the benefits proposed way back in the 2015 Conservative manifesto – to “recognise universities offering the highest teaching quality” and to allow “potential students to make decisions informed by the career paths of past graduates.”
