What is happening to the SSR?

How has the staff-student ratio in universities been changing as the value of tuition fees declines? Jim Dickinson dives into the data

Jim is an Associate Editor at Wonkhe

I was delivering some training for some student reps the other day when the objective of “improving academic support” emerged on a post-it note.

The officer who had scratched it out told of enrolling late (visa issues) and never getting any sort of academic induction, either on stuff like “what the UK means by critical evaluation” or more basic stuff like the rules around academic misconduct or extenuating circumstances.

He then described a litany of subsequent issues – poor to non-existent supervision on his dissertation, “personal” tutoring being run in groups of 35, endless waits for answers to emails and a curriculum overly focussed on theory – in sharp contrast to the implied employability promises outlined by his agent.

There was also mention of a racist incident involving a tutor that he’d been encouraged to “resolve” “informally” – but that’s another article for another day.

As I always try to do when this sort of thing comes up, I encouraged a look at the marketing materials, the relevant university policies and the SSR for the subject he was studying. It was enlightening.

Promises promises

The blurb promised that his Business MSc would see him gain a “wealth” of international contacts – if “international” means “150 people, 140 of whom are from the same country that I’m from”, that is.

He’d also been promised “small group discussions”, but as far as he could make out had never been in a session of fewer than 30 people. And he described competing with everyone else to obtain a placement in the city he was in as “impossible”, and the support available to do so as “useless”.

One student’s tale doesn’t tell us much about the state of the sector, but unless every international PGT sabbatical officer in an SU just happens to have had a poor experience when everyone else is thriving, it does at least feel like something’s going on, on the evidence that I’ve picked up this summer.

And probably the closest there is to evidence supporting that is in the apparent rapid and dramatic deterioration in his subject’s staff-student ratio.

I’m not going to disclose the precise figure here – suffice to say that it’s rapidly increased since 2019 (having already been pretty high) and, looking at HESES, I’m guessing was higher in the academic year in which the officer actually enrolled.

His PVC, apparently, has a review underway of the “academic support and student success” policy. She may well do, but if that SSR is near-accurate, it’s unlikely to make much difference.

The question is how widespread the problem is.


Here via my colleague David Kernohan we have the difference between the 2018/19 SSR and the 2021/22 SSR for Business and Management, as per the Guardian University Guide numbers derived from HESA stats:


This chart allows a look at the four years’ worth of data by subject and provider:


This one allows a comparison by provider for each subject for a given year:


And this one allows a comparison by subject for each provider for a given year:


As well as Business and Management, it looks like Marketing, Computer Science, Economics, Law and Psychology have all seen dramatic increases in the SSR at plenty of providers.

Crucially, back in 2018/19 there were 17 providers with an overall SSR of 18 or more. That had reached 32 by 2021/22.

“Taught” postgraduates

Unhelpfully, the SSR as supplied to the compilers includes PGT provision, which has been growing at an unprecedented rate – and of course it could well be that teaching at this level on these sorts of courses is cheaper in general.

That may be the case, but it may also be that the students enrolling on these programmes don’t know any better and are in a worse position to complain.

It’s also true that economies of scale can kick in for very large programmes where a decent proportion of the credits are taken up with project or dissertation supervision.

But some of the declines on offer here are so eye-watering that it’s hard to believe that whatever standard of teaching and support was being offered when the SSR was much lower has in any way been maintained.

And the idea that those departments at the top of the provider views are going to be able to meaningfully engage with that working group on authentic assessment in the year ahead feels pretty far-fetched. It’s much more likely that beleaguered markers will be turning to AI tools themselves to get their marking and feedback done.

It’s also worth remembering that these are figures for students that each of those providers teaches – who knows what the numbers look like in the burgeoning franchised part of the sector. But it all points to a real and serious need to improve the SSR calculation to take these things into account – or else not use it at all.

(Note that in the SSR supplied to table compilers, students on industrial (or other) placement or on a year abroad for the year as a whole are counted at 20 per cent of an FTE; students on placement or a year abroad for a proportion of the year are counted at 60 per cent; and franchised students aren’t counted at all. Meanwhile staff figures include academic “teaching only” and “teaching and research” staff. HESA consulted on changes to the method of calculation last year – but in the end resolved not to change it.)
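For anyone wanting to see how those weighting rules shake out in practice, here’s a minimal sketch of the calculation as described above. All of the function and field names are my own invention – this is emphatically not HESA’s actual code or data schema, just an illustration of the stated rules.

```python
# Illustrative sketch of the SSR calculation under the weighting rules
# described above. Names and record structure are invented for illustration.

FULL_YEAR_AWAY_WEIGHT = 0.2   # whole year on placement / year abroad
PART_YEAR_AWAY_WEIGHT = 0.6   # part of the year on placement / abroad

def student_fte(students):
    """Sum weighted student FTE, excluding franchised students entirely."""
    total = 0.0
    for s in students:
        if s["franchised"]:
            continue  # franchised students aren't counted at all
        if s["away"] == "full_year":
            total += s["fte"] * FULL_YEAR_AWAY_WEIGHT
        elif s["away"] == "part_year":
            total += s["fte"] * PART_YEAR_AWAY_WEIGHT
        else:
            total += s["fte"]
    return total

def ssr(students, staff):
    """Students per staff member: weighted student FTE over teaching staff FTE."""
    staff_fte = sum(m["fte"] for m in staff
                    if m["function"] in ("teaching only", "teaching and research"))
    return student_fte(students) / staff_fte

# Toy example: 30 on-campus student FTE plus 10 on a full-year placement
# (each counted at 0.2), taught by 2.0 FTE of teaching staff.
students = ([{"fte": 1.0, "franchised": False, "away": None}] * 30
            + [{"fte": 1.0, "franchised": False, "away": "full_year"}] * 10)
staff = [{"fte": 1.0, "function": "teaching and research"},
         {"fte": 1.0, "function": "teaching only"}]
print(ssr(students, staff))  # (30 + 10 * 0.2) / 2 = 16.0
```

The toy example also shows why the placement weighting matters: ten placement students add only two student FTE to the numerator, so a provider’s headline SSR can look considerably healthier than the number of bodies in the room would suggest.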

Faulty signals

From a student point of view, the numbers in some cases really are remarkable. A current September-start final year undergraduate student will have looked at the SSR for their subject back in 2020/21, and so will have been looking at SSR calculations from 2018/19 – and even PGTs that have just enrolled will have been looking at SSR calculations from 2020/21.

Where there are big falls (or increases, if you will), it reminds us why lagged data is so dangerous in a scenario where the sector’s size and shape is changing so rapidly. Its “consumer signalling” function literally depends on changes and declines being much, much slower than we’ve been seeing in some subjects in some providers in recent years – and so if the signals can’t catch up, its use in student decision making is downright dangerous.

To the extent to which declines tell us something real about the academic experience, I suspect providers would argue that the decline in the unit of resource has pushed them in these directions. That’s a real issue – and needs to be understood.

But in most cases, students are paying for something that they were promised – and if Alton Towers were touting 10-minute queue times, only to admit so many revellers as to make it impossible to get on more than one rollercoaster in a given day, I’d be due a refund.

It also may well be the case that providers are intending, over time, to increase the number of staff in departments and subject areas that have been growing fast. But it seems pretty unforgivable that providers are somehow allowed to do so after increases in student numbers, rather than before, with pretty much complete impunity.

One other interesting exercise on each of the provider pages is to look at the subject areas listed at the bottom. There are strong arguments for maintaining that provision for as long as possible – not least because of the loss of opportunity that closure would represent for many bits of the country’s geography, and the astonishing weakness of the student protection regime in any part of the UK when closures or restructures do happen.

But there does come a point where the cross-subsidies involved from international PGTs on these “cheap to teach” courses – increasingly from the Global South – to subjects which students apparently aren’t too keen on studying, become a viscerally difficult moral problem. Especially when it’s resulting in unacceptably inconsistent academic support across a university, with failings landing on those both paying the most and in the least powerful position to complain.

In England at least, the Office for Students (OfS), with its focus on outcomes and ageing satisfaction data, feels like it’s miles away from having a strategy to capture any of this, and the rest of the UK’s lackadaisical focus on quality “enhancement” feels like an industry of denial rather than a method for improvement. If ever there was evidence that students are on “rip off” courses, this is surely it.
