Jim is an Associate Editor (SUs) at Wonkhe

When the Office for Students’ (OfS) proposals for a new quality assessment system for England appeared in my inbox, I happened to be on a lunch break from delivering training at a students’ union.

My jaw had hit the floor several times during a first skim of its 101 pages – and so, to test my initial reactions, I attempted, in good faith, to explain the emerging system to the student leaders who had reappeared for the afternoon.

Having explained that the regulator was hoping to provide students with a “clear view of the quality of teaching and learning” at the university, I found that their first confusion was over whether that was even possible in a university with 25,000 students and hundreds of degree courses.

They’d assumed that some sort of dashboard might be produced that would help students differentiate at least between departments, if not courses. When I explained that the “view” would largely be in the form of a single “medal” of Gold, Silver, Bronze or Requires improvement for the whole university, I was met with confusion.

We’d spent some time before the break discussing the postgraduate student experience – including poor induction for international students, the lack of a policy on supervision for PGTs, and the isolation that PGRs had fed into the SU’s strategy exercise.

When I explained that OfS was planning to introduce a PGT NSS in 2028 and then use that data in the TEF from 2030-31 – such that their university might not have the data taken into account until 2032-33 – I was met with derision. When I explained that PGRs might be incorporated from 2030-31 onwards, I was met with scorn.

Keen to know how students might feed in, one officer asked how their views would be taken into account. I explained that as well as the NSS, the SU would have the option to create a written submission to provide contextual insight into the numbers. When one of them observed that “being honest in that will be a challenge given student numbers are falling and so is the SU’s funding”, the union’s voice coordinator in the corner – who’d been involved in the 2023 exercise – offered a wry smile.

One of the officers – who’d had a rewarding time at the university pretty much despite their actual course – wanted to know if the system was going to tackle students like them not really feeling like they’d learned anything during their degree. Given the proposals’ intention to drop educational gain altogether, I moved on at this point. Young people have had enough of being let down.

I’m not at home in my own home

Back in February, you might recall that OfS published a summary of a programme of polling and focus groups that it had undertaken to understand what students wanted and needed from their higher education – and the extent to which they were getting it.

At roughly the same time, drawing on that research, it published proposals for a new initial condition – C5: Treating students fairly – to apply at first to newly registered providers.

As well as issues it had identified with things like contractual provisions, hidden costs and withdrawn offers, it was particularly concerned with the risk that students might make decisions about what and where to study based on false, misleading or exaggerated information.

OfS’ own research into the Teaching Excellence Framework 2023 signals one of the culprits behind that misleading information. Polling by Savanta in April and May 2024 and follow-up focus groups with prospective undergraduates over the summer both showed that applicants consistently described TEF outcomes as too broad to be of real use for their specific course decisions.

They wanted clarity about employability rates, continuation statistics, and job placements – but what they got instead was a single provider-wide badge. Many struggled to see meaningful differences between Gold and Silver, or to reconcile how radically different providers could both hold Gold.

The evidence also showed that while a Gold award could reassure applicants, more than one in five students aware of their provider’s TEF rating disagreed that it was a fair reflection of their own experience. That credibility gap matters.

If the TEF continues to offer a single label for an entire university, with data that are both dated and aggregated, there is a clear danger that students will once again be misled – this time not by hidden costs or unfair contracts, but by the regulatory tool that is supposed to help them make informed choices.

You don’t know what I’m feeling

Absolutely central to the TEF will remain the results of the National Student Survey (NSS).

OfS says that’s because “the NSS remains the only consistently collected, UK-wide dataset that directly captures students’ views on their teaching, learning, and academic support,” and because “its long-running use provides reliable benchmarked data which allows for meaningful comparison across providers and trends over time.”

It stresses that the survey provides an important “direct line to student perceptions,” which balances outcomes data and adds depth to panel judgements. In other words, the NSS is positioned as an indispensable barometer of student experience in a system that otherwise leans heavily on outcomes.

But set aside the fact that it surveys only those who make it to the final year of a full undergraduate degree. The NSS doesn’t ask whether students felt their course content was up to date with current scholarship and professional practice, or whether learning outcomes were coherent and built systematically across modules and years – both central expectations under B1 (Academic experience).

It doesn’t check whether students received targeted support to close knowledge or skills gaps, or whether they were given clear help to avoid academic misconduct through essay planning, referencing, and understanding rules – requirements spelled out in the guidance to B2 (Resources, support and engagement). It also misses whether students were confident that staff were able to teach effectively online, and whether the learning environment – including hardware, software, internet reliability, and access to study spaces – actually enabled them to learn. Again, explicit in B2, but invisible in the survey.

On assessment, the NSS asks about clarity, fairness, and usefulness of feedback, but it doesn’t cover whether assessment methods really tested what students had been taught, whether tasks felt valid for measuring the intended outcomes, or whether students believed their assessments prepared them for professional standards. Yet B4 (Assessment and awards) requires assessments to be valid and reliable, moderated, and robust against misconduct – areas NSS perceptions can’t evidence.

I could go on. The survey provides snapshots of the learning experience but leaves out important perception checks on the coherence, currency, integrity, and fitness-for-purpose of teaching and learning, which the B conditions (and students) expect providers to secure.

And crucially, OfS has chosen not to use the NSS questions on organisation and management in the future TEF at all. That’s despite its own 2025 press release highlighting it as one of the weakest-performing themes in the sector – just 78.5 per cent of students responded positively – and pointing out that disabled students in particular reported significantly worse experiences than their peers.

OfS said then that “institutions across the sector could be doing more to ensure disabled students are getting the high quality higher education experience they are entitled to,” and noted that the gap between disabled and non-disabled students was growing in organisation and management. In other words, not only is the NSS not fit for purpose, OfS’ intended use of it isn’t either.

I followed the voice you gave to me

In the 2023 iteration of the TEF, the independent student submission was supposed to be one of the most exciting innovations. It was billed as a crucial opportunity for providers’ students to tell their own story – not mediated through NSS data or provider spin, but directly and independently. In OfS’ words, the student submission provided “additional insights” that would strengthen the panel’s ability to judge whether teaching and learning really were excellent.

In this consultation, OfS says it wants to “retain the option of student input,” but with tweaks. The headline change is that the student submission would no longer need to cover “student outcomes” – an area that SUs often struggled with given the technicalities of data and the lack of obvious levers for student involvement.

On the surface, that looks like a kindness – but scratch beneath it, and it’s a red flag. Part of the point of Condition B2.2b is that providers must take all reasonable steps to ensure effective engagement with each cohort of students so that “those students succeed in and beyond higher education.”

If students’ unions feel unable to comment on how the wider student experience enables (or obstructs) student success and progression, that’s not a reason to delete it from the student submission. It’s a sign that something is wrong with the way providers involve students in what’s done to understand and shape outcomes.

The trouble is that this light-touch response ignores the depth of feedback OfS has already commissioned and received. Both the IFF evaluation of TEF 2023 and OfS’ own survey of student contacts documented the serious problems that student reps and students’ unions faced.

They said the submission window was far too short – dropping guidance in October, demanding a January deadline, colliding with elections, holidays, and strikes. They said the guidance was late, vague, inaccessible, and offered no examples. They said the template was too broad to be useful. They said the burden on small and under-resourced SUs was overwhelming, and even large ones had to divert staff time away from core activity.

They described barriers to data access – patchy dashboards, GDPR excuses, lack of analytical support. They noted that almost a third didn’t feel fully free to say what they wanted, with some monitored by staff while writing. And they told OfS that the short, high-stakes process created self-censorship, strained relationships, and duplication without impact.

The consultation documents brush most of that aside. Little in the proposals tackles the resourcing, timing, independence, or data access problems that students actually raised.

I’m not at home in my own home

OfS also proposes to commission “alternative forms of evidence” – like focus groups or online meetings – where students aren’t able to produce a written submission. The regulator’s claim is that this will reduce burden, increase consistency, and make it easier to secure independent student views.

The focus group idea is especially odd. Student representatives’ main complaint wasn’t that they couldn’t find the words – it was that they lacked the time, resource, support, and independence to tell the truth. Running a one-off OfS focus group with a handful of students doesn’t solve that. It actively sidesteps the standard in B2 and the DAPs rules on embedding students in governance and representation structures.

If a student body struggles to marshal the evidence and write the submission, the answer should be to ask whether the provider is genuinely complying with the regulatory conditions on student engagement. Farming the job out to OfS-run focus groups allows providers with weak student partnership arrangements to escape scrutiny – precisely the opposite of what the student submission was designed to do.

The point is that the quality of a student submission is not just a “nice to have” extra insight for the TEF panel. It is, in itself, evidence of whether a provider is complying with Condition B2 – which requires providers to take all reasonable steps to ensure effective engagement with each cohort of students, and says students should make an effective contribution to academic governance.

If students can’t access data, don’t have the collective capacity to contribute, or are cowed into self-censorship, that is not just a TEF design flaw – it is evidence of non-compliance with B2. The fact that OfS has never linked student submission struggles to B2 is bizarre. Instead of drawing on the submissions as intelligence about engagement, the regulator has treated them as optional extras.

The refusal to make that link is even stranger when compared to what came before. Under the old QAA Institutional Review process, the student written submission was long-established, resourced, and formative. SUs had months to prepare, could share drafts, and had the time and support to work with managers on solutions before a review team arrived. It meant students could be honest without the immediate risk of reputational harm, and providers had a chance to act before being judged.

TEF 2023 was summative from the start, rushed and high-stakes, with no requirement on providers to demonstrate they had acted on feedback. The QAA model was designed with SUs and built around partnership – the TEF model was imposed by OfS and designed around panel efficiency. OfS has learned little from the feedback from those who submitted.

But now I’ve gotta find my own

While I’m on the subject of learning, we should finally consider how far the proposals have drifted from the lessons of Dame Shirley Pearce’s review. Back in 2019, her panel made a point of recording what students had said loud and clear – the lack of learning gain in TEF was a fundamental flaw.

In fact, educational gain was the single most commonly requested addition to the framework, championed by students and their representatives who argued that without it, TEF risked reducing success to continuation and jobs.

Students told the review they wanted a system that showed whether higher education was really developing their knowledge, skills, and personal growth. They wanted recognition of the confidence, resilience, and intellectual development that are as much the point of university as a payslip.

Pearce’s panel agreed, recommending that Educational Gains should become a fourth formal aspect of TEF, encompassing both academic achievement and personal development. Crucially, the absence of a perfect national measure was not seen as a reason to ignore the issue. Providers, the panel said, should articulate their own ambitions and evidence of gain, in line with their mission, because failing to even try left a gaping hole at the heart of quality assessment.

Fast forward to now, and OfS is proposing to abandon the concept entirely. To students and SUs who have been told for years that their views shape regulation, the move is a slap in the face. A regulator that once promised to capture the full richness of the student experience is now narrowing the lens to what can be benchmarked in spreadsheets. The result is a framework that tells students almost nothing about what they most want to know – whether their education will help them grow.

You see the same lack of learning in the handling of extracurricular and co-curricular activity. For students, societies, volunteering, placements, and co-curricular opportunities are not optional extras but integral to how they build belonging, develop skills, and prepare for life beyond university. Access to these opportunities features heavily in the Access and Participation Risk Register precisely because they matter to student success and because they’re part of the educational offer in and of themselves.

But in TEF 2023 OfS tied itself in knots over whether they “count” – at times allowing them in if narrowly framed as “educational”, at other times excluding them altogether. To students who know how much they learn outside the lecture theatre, the distinction looked absurd. Now the killing off of educational gain excludes them entirely.

You should have listened

Taken together, the proposals are a masterclass in how little OfS has learned from students. As a result, the body that once promised to put student voice at the centre of regulation is in danger of constructing a TEF that is both incomplete and actively misleading.

It’s a running theme – more evidence that OfS is not interested enough in genuinely empowering students. If students don’t know what they can, should, or could expect from their education – because the standards are vague, the metrics are aggregated, and the judgements are opaque – then their representatives won’t know either. And if their reps don’t know, their students’ union can’t effectively advocate for change.

When the only judgements against standards that OfS is interested in come from OfS itself, delivered through a very narrow funnel of risk-based regulation, that funnel inevitably gets choked off through appeals to “reduced burden” and aggregated medals that tell students nothing meaningful about their actual course or experience. The result is a system that talks about student voice while systematically disempowering the very students it claims to serve.

In the consultation, OfS says that it wants its new quality system to be recognised as compliant with the European Standards and Guidelines (ESG), which would in time allow it to seek membership of the European Quality Assurance Register (EQAR). That’s important for providers with international partnerships and recruitment ambitions, and for students given that ESG recognition underpins trust, mobility, and recognition across the European Higher Education Area.

But OfS’ conditions don’t require co-design of the quality assurance framework itself, nor proof that student views shape outcomes. Its proposals expand student assessor roles in the TEF, but don’t guarantee systematic involvement in all external reviews or transparency of outcomes – both central to ESG. And as the ongoing QA-FIT project and ESU have argued, the next revision of the ESG is likely to push student engagement further, emphasising co-creation, culture, and demonstrable impact.

If it does apply for EQAR recognition, our European peers will surely notice what English students already know – the gap between OfS’ rhetoric on student partnership and the reality of its actual understanding and actions is becoming impossible to ignore.

When I told those student officers back on campus that their university would be spending £25,000 of their student fee income every time it has to take part in the exercise, their anger was palpable. When I added that, according to the new OfS chair, Silver and Gold might enable higher fees, while Bronze or “Requires improvement” might cap or further reduce their student numbers, they didn’t actually believe me.

The student interest? Hardly.
