Were students heard in TEF 2023?

Jim is an Associate Editor (SUs) at Wonkhe

Students were able to – and in most providers did – feed into the 2023 Teaching Excellence Framework (TEF).

The Office for Students has published an independent evaluation report of the 2023 Teaching Excellence Framework (TEF), accompanied by an analysis of the estimated costs to the sector, a survey of applicant attitudes to TEF, a survey of students involved in preparing the student submissions and a student guide to evidence collection for TEF.

An accompanying blog post says that OfS will “build on the elements of the TEF that worked well and improve on areas that worked less well for some providers” in future iterations. DK has some narrative up on the docs on the main site.

Its student rep survey found most student representatives were “positive” about their overall experience of the TEF. They felt it helped them influence positive changes at their provider, and the process had strengthened students’ voices in discussions about learning and teaching. Over half of staff surveyed by IFF Research also agreed that the student submission empowered the student voice within their provider.

Student representatives also reported “challenges” with the time they had to produce submissions, handling existing data and engaging with the student outcomes area of the TEF.

If those are the key conclusions that OfS has gathered from the reports it commissioned, it’s missed several tricks. Because what’s been published actually reveals some much more problematic things about the way the TEF was structured and designed from a student input point of view.

The what now

If you’re new to all of this, the Teaching Excellence Framework (TEF) is supposed to assess the quality of teaching in higher education institutions in England. It evaluates providers based on factors like student experience, learning environment, graduate outcomes, and teaching quality. Institutions receive ratings of Gold, Silver, or Bronze based on their performance, and the idea is that TEF aims to help students make informed choices and encourage universities to improve teaching standards.

A review of TEF in 2018 identified a need for students to be able to contribute in a way that was both more independent of, and more current than, the satisfaction and outcomes metrics driving at least part of the process. So for TEF 2023 OfS proposed that a lead student contact at every provider would be able to choose to develop and submit their own report, offering “additional insights” to the TEF panel on what it is like to be a student at that provider, and what they gain from their experience.

Submissions included qualitative feedback on teaching, learning resources, academic support, and overall student satisfaction across two areas – student experience and student outcomes. Now we have the results of a survey of the lead reps – summarising 74 full responses from student contacts at 70 providers, eight of whom had not made a separate student submission.

Guidance, timing and length

Unsurprisingly, student submission guidance was widely used and rated as the second most helpful resource. Respondents would have liked it earlier to avoid busy periods, some said they’d have liked clearer expectations from OfS, and others argued for a more detailed template with content examples. A few found the template too broad to be useful, though one person appreciated the flexibility (!). Other requests included better advice on using existing data and engaging students.

Regulatory Advice 22, aimed at providers and panel members, was used by nearly three-quarters of SU/student respondents but had mixed feedback – 56 per cent found it helpful, but some struggled with its accessibility. OfS drop-in sessions also received mixed feedback – some valued peer interaction and direct engagement with OfS staff, others wanted more sessions, discussions, and better communication.

Most respondents (73 per cent) found the student submission timeframe challenging, with 55 per cent managing but feeling pressured and one in five unsatisfied with the time available. Only 27 per cent felt they had enough time. The short timeframe was the most common concern, especially for student reps balancing this with existing responsibilities.

Timing was another key issue. The guidance was released in early October – one of the busiest times for SUs – and the Christmas break further cut working time. Other difficulties included overlapping election periods, industrial action, and challenges in getting students involved. Some described the process as stressful or overwhelming, stressing the additional pressure it placed on SUs.

A common frustration was that the short timeframe forced SUs to divert time from student-facing activities. One respondent from a small SU highlighted their limited budget and resources, calling for significantly more time to complete the submission properly.

As was the case throughout 2022 and 2023, for some reason at no stage did OfS link capacity issues in SUs to its own regulatory conditions on universities – the frustrations are framed as TEF design issues rather than as a lack of support from a provider to actually undertake the role implied by the exercise.

That OfS repeatedly dodged the opportunity to link the quality (or indeed existence) of a submission to its B2 (minimums) and SE7 (features of excellence) descriptions of student engagement was frustrating – but regardless, anecdotally it’s clear that in several cases whatever support was offered to the SU over the project manifested as a one-off rather than resulting in a permanent improvement in relationships and resourcing.

As we often note, in countries like Norway not only must student bodies “be heard” in all questions concerning them, institutions must “provide conditions” in which student bodies are able to perform their functions in a “satisfactory manner”. OfS having the guts to say something similar itself – regardless of the size or programme portfolio of a provider – would really help. Its own student panel doesn’t work for free, after all.

One other weird thing was the length – a page limit applied to providers as large as UCL and as small as the Chicken Shed Theatre Company. Most student/SU respondents felt the submission length was appropriate, though nearly a third wanted more space to include valuable content.

A small group from specialist providers found the limits too long, feeling pressured to add unnecessary detail. Six specifically mentioned the page limit as a challenge – three struggled to condense their content, one found editing time-consuming, and another felt time spent cutting down could have been better used for data collection. One size, as they say over and over and over again, does not seem to fit all – and didn’t here.

The good and bad news

In the survey SUs fed in several strategies that made the submission process smoother. Key recommendations for future guidance include reviewing existing data early, centralising relevant information, maintaining regular communication with provider staff, and securing support from fellow reps or a research partner.

Other tips included conducting student data collection before the winter break, using benchmarked TEF data to guide surveys, involving student committees for feedback, and structuring submissions around the TEF assessment criteria. Respondents also said that early planning and active collaboration with institutional leads eased the workload.

But there were problems. Handling data was a major challenge, with respondents citing the time, expertise, and resources needed to compile TEF-relevant data – especially when it wasn’t originally collected for that purpose.

Many appreciated provider support but wanted more help accessing and using the TEF data dashboard. Issues included inconsistent data across courses, format variations, and apparent “GDPR-related restrictions” (better known as excuses).

Some struggled to reconcile their knowledge of student experience with NSS data, while others felt the emphasis on quantitative data undervalued qualitative insights and student representatives’ tacit knowledge. Inconsistent data availability also complicated drawing provider-wide conclusions.

Others talked of the support they wished they had, including funding for extra resources, better internal staff briefing to avoid surprising academics, access to data and analytics support, and early access to the provider’s draft educational gains section.

Truth and impact

Throughout the process, we heard endless tales of SUs feeling both indirect and direct pressure to bend the truth so as not to upset or shame the university, or jeopardise the provider’s TEF rating.

The survey tested that with this question:

I felt free to say whatever I wanted in the student submission.

Five per cent of respondents felt they were not able to say whatever they wanted in their student submission, and just under a third (30 per cent) only felt it was “somewhat true”. Why might that have been? Over half (57 per cent) said it was definitely or somewhat true that they wanted to ensure their provider got the highest possible rating.

You might have expected that over a third of respondents fessing up that they felt unable to be fully truthful, partly because so many felt they needed to help the university get a great rating, would be a key source of concern in the report. Not really.

All of that is then underlined by OfS asking whether TEF involvement had made respondents’ relationships with providers more difficult – a quarter (presumably those who felt they had been as accurate as possible) agreed that it was definitely or somewhat true.

Arguably the better news is that the majority felt their participation strengthened the student voice in discussions with providers – and many noted an improved understanding of the student experience and a stronger relationship between the SU and its provider.

Separately, OfS has also published an IFF report that’s mainly about provider views on the process – with some reflection on the student input stuff, partly from 20 interviews with student representatives, ten of whom were involved in producing a TEF student submission and ten of whom had come into post since the submission (!).

That finds that some students found it hard to know what data was needed for the student submission and where to get it, and that in many cases students were working on the submission alone. One student even reported that they had a staff member in the room with them while they wrote the submission – and felt that being monitored in that way impinged on their freedom to write exactly as they would have liked.

A couple of providers felt that students did not have the knowledge or experience to give robust evidence – presumably shared without thinking “and I wonder whose fault that is”. Similarly, when provider staff were asked how easy or difficult it was to support students in preparing their contribution, the reasons given for not having a student submission were a lack of student time, a lack of awareness of the TEF among student representatives, and the absence of an embedded student representation structure. Now what might fix those issues?

It’s the design, stupid

The idea behind the student submission was that honest student feedback sparks institutional reflection and change. But there was always a catch – identifying excellence also means exposing its absence, which can threaten reputations, block grant funding, and even jobs.

Plenty of student reps at borderline Bronze institutions were in the provider submission meetings too – and were “reminded” that failing to secure Bronze might mean budget cuts and layoffs. Even without direct pressure, many backed off from pushing hard on focus group findings about poor assessment fairness or lack of placement support, for inevitable reasons.

More broadly, student leaders are often proud of their university and see themselves as part of it. The last thing many want is for their feedback to damage the institution they were elected to represent.

When you add the blurred line – sometimes crossed – between giving an SU constructive feedback on its draft and subtly pressuring it to change the content, and you mix in the odd tacit threat to SU funding or dire warnings of institutional collapse, you create an environment where, even with a signed declaration of independence, the power dynamics make true autonomy impossible. Ironically for a body set to regulate on free speech, it made some self-censorship inevitable.

In many ways, the provider and student submissions were framed as almost identical – but the crucial difference was never really front and centre. Providers were always going to aim to get the highest grade while pretending what they were saying was all true – while student submissions should only have been about truth.

We did our best to explain TEF’s excellence features – the actual marking scheme – but OfS should also know that asking students for evaluative feedback without properly explaining the criteria is counterproductive. If student and SU input is meant to be meaningful, clarity isn’t just helpful – it’s essential.

But the fatal mistake was not clocking the importance of formative versus summative feedback. One of the smartest aspects of an older QAA “student written submission” process – one actually designed in consultation with SUs – was the long lead time for the submission and the sharing of drafts.

It gave university managers a chance to work through and come to terms with their initial defensiveness, collaborate with SUs on potential solutions, and start making improvements before the review team arrived. This TEF process offered no such buffer, unless the SU was sufficiently resourced to finish its submission even earlier than the deadline.

And while it would be a great idea for OfS to require providers to demonstrate real action on student feedback – revoking awards if they don’t – that hasn’t happened and shows no sign of happening.

Student representation, from individual complaints to university governance, relies on a kind of fearlessness – the ability to raise uncomfortable truths without fear of retaliation, for both students and staff.

If TEF is to continue, then unless OfS acknowledges that reality and actively works to empower an independent student voice, the integrity of the process won’t be shaped by evidence, students’ ability to engage, or its support materials.

Instead, it will be dictated by confidence – something that’s usually shaped less by experience and more by economic precarity.
