Jim is an Associate Editor at Wonkhe


David Kernohan is Deputy Editor of Wonkhe

In an earlier era of the pandemic, when Universities Minister Michelle Donelan was roaming around the country fobbing off students demanding lower tuition fees by telling them they could be charged in full “as long as the quality is there”, students and their representatives could be heard saying “yes, but what do you mean by quality?”

They might have been surprised to learn that for much of the intervening period, OfS has been both saying that it has been “monitoring” that quality and consulting on what “quality” means – so that it can produce formal proposals, which will now go out for further consultation.

Back in November last year OfS put out a broader – principles-based, if you like – consultation canvassing the sector’s thoughts on a set of ideas that it said would define “quality” and “standards” more clearly for the purpose of setting minimum baseline requirements for all providers. You can read more about that exercise on Wonkhe from the time.

And now after many months of waiting, we have the fruits of that process – a formal consultation on new conditions of registration, and the usual two accompaniments of a tough-talk press release signalling a future “clampdown”, and a summary of consultation responses that spends most of its time revelling in the misunderstandings and misconceptions of respondents about OfS and its role.

Yes but what do you mean?

In the press release new chair Lord James Wharton says that the regulator sees too many examples of courses that cause concern, and argues that OfS must be able to investigate concerns like this vigorously:

Published data tells us that employment outcomes can vary significantly for students studying the same subject at different providers.”

Once we’ve all stopped screaming into the void about contextual factors, maybe someone should tell him that his proposals here don’t include all the stuff on baseline numerical outcomes (the B3 conditions) – that’s all to follow in November. What we do have here is everything else to do with quality and standards, the sort of stuff that gets judged qualitatively rather than quantitatively, along with material on how that will be judged and who it will be judged by.

Annex B of the consultation response tells us that of 250 responses, 52 per cent (140) disagreed with the definitions of quality set out last November, which established expectations against six areas – access and admissions; course content, structure, and delivery; resources and academic support; successful outcomes; and secure standards.

Revised condition B1 is to cover academic experience, B2 resources, support and student engagement, B4 assessment and awards, B5 sector-recognised standards, along with a couple of new ones on getting onto the register in the first place.

And as suggested, there are new, expanded definitions of these things. B1 for example says that each higher education course has to be up-to-date, provide educational challenge, be “coherent”, be effectively delivered, and should require students to develop relevant skills.

And there’s further definition on the meaning of those terms. “Effectively delivered” means the manner in which a course is taught, supervised and assessed (both in person and remotely) including, but not limited to, ensuring an appropriate balance between lectures, seminars, group work and practical study, as relevant to the content of the course and an appropriate balance between directed and independent study or research, as relevant to the level of the course.

And “relevant skills” apparently means: knowledge and understanding relevant to the subject matter and level of the higher education course and other skills relevant to the subject matter and level of the course including, but not limited to, cognitive skills, practical skills, transferable skills and professional competences.

Definitional issues

Now when you’ve read and responded to as many consultations as Team Wonkhe have, you tend to gravitate towards definitional issues – because, fundamentally, if we can’t agree what we are talking about then there is no chance whatsoever of solving the problem. The design of definitions actually constrains and shapes the debate.

What people didn’t like about the definitions was that they were neither measurable nor unambiguous. Language like “up-to-date” content and “effective” assessment was meant to give providers flexibility to make their own decisions – in practice it made them concerned that their definitions of these terms may not match the regulator’s own impressions.

If you took them on their own and you weren’t heavily involved in all of this, you might just say “well it’s all semantics”. And to some extent you’d be right. It’s whose semantics that matters.

The big issue in the feedback was the failure of the definitions to map or match what the UK Quality Code says. The Quality Code, fully revised in 2018, is a longstanding agreed sector standard developed by the Quality Assurance Agency (the designated quality body) on behalf of the UK Standing Committee for Quality Assessment (kind of the sector’s representative body on quality assurance). The code is short, clear, comprehensible – literally one side of A4 plus a handful of paragraphs on the way it will be used and a short annex of definitions. Everybody knows where they are with it (from PSRBs to providers), it is popular, UK-wide, and internationally recognised. And it’s symbolic – insofar as it is a piece of co-regulation.

But OfS appears to hate it. If that sounds simplistic, impressionistic, or taking sides, then be assured that it is a simple matter of evidence. It would be dead simple for OfS to point everyone to the Quality Code and say – do that please, we’ll send some peers round via the QAA every few years to check that you are (as happens elsewhere in the UK and used to happen in England) and we’ll check up on your outcomes metrics. Instead we suffer through the pain of endless rounds of consultations and definitional problems. It can’t be for the sake of convenience.

Instead, OfS proposes to kill off any remaining mentions of the Quality Code from the regulatory framework guidance. The core Code was heavily revised and simplified a few years back to retain the idea of a component of UK-wide co-regulation within the overall framework – but now it’s not detailed enough. And so on.

We’re not told how many respondents didn’t like the idea (it was a “substantial number”, apparently) – but, in a nutshell, they felt that introducing another set of definitions that nearly duplicated the code but differed in implication and shades of meaning would add to burden; that OfS’ attempt didn’t cover everything covered in the code; that the code was better on student engagement; and that the new definitions were incoherent with everything else that goes on in quality (from professional requirements to the European Standards and Guidelines).

The closest we get to a meaningful rationale is this:

It is also our view that the UK Quality Code, including its common practices, advice and guidance, risks creating a homogeneous approach to quality and standards assurance that stifles innovation and overly focuses on policy and process rather than outcomes for students. By contrast, our intention is to establish an approach to regulation that protects all students through the articulation of a clear minimum baseline for quality and standards in the regulatory framework, while enabling competition, student choice, provider autonomy and innovation to develop freely above the baseline.” (208)

Kill the Code

Let’s take a step back. The Code “risks creating a homogeneous approach… that stifles innovation”? Even the previous version was clear that the manner in which providers meet the expectations is “their own responsibility”. Focuses on policy and process rather than outcomes? The code never mentions the word “policy”, and the only mention of processes relates to assessment and classification – an almost identical clause in the OfS definitions simply substitutes the word “arrangements”.

And then paragraphs 209 and 210 see the regulator tantalise us with a “well, we could just use the code…” before doubling down on the idea that the QAA version is unclear and key terms are not defined – criticisms, you’ll recall, that the sector made of OfS’ own failed attempt.

On and on it rumbles. The code goes “well beyond” what the OfS would see as minimum standards for quality. The sector and three other UK regulators are fine with that level of ambition, but poor old England can’t cope? The definition of quality we’ve had for a few years now was too strong? The code is “articulated in a way that makes it suitable for some approaches to the external assurance of quality and standards but not others”? We’d love to see an example, because re-reading the code we can’t see anything that would cause an issue.

Ultimately, if OfS was interested in retaining a component of internationally recognised and UK-wide co-regulation it could have fed some of its developing concerns back to the UK Standing Committee for Quality Assessment and asked it to revise the Code again. It isn’t. It didn’t.

Getting in as well as getting on

Responses to November’s consultation expressed the opinion that the requirements for registration should not be less demanding for new providers – B7 and B8 are the result of this. It has more usually been the case that applicants have had at least some experience delivering higher education on behalf of a registered provider before applying – this was the case, say, for Dyson via an agreement with Warwick.

In other words you’ve pretty much had to deliver higher education before (or have some significant evidence that you are working with people that do) to make your way on to the OfS register. This is not necessarily an ideological position – though it’s hard not to recall the “Byron Burger” stuff from the passage of HERA – it’s more the way that the conditions are drafted.

Many providers in this position have already reported issues with complying with the B conditions – how can you prove you can support students to success (B2) if you don’t have any students yet? The “Designated Quality Body” has generally been wheeled in, but reading the entrails for stuff like this is not a core QAA competency. Now the anomaly is dealt with via the proposal of two new initial conditions of registration – B7 (initial condition relating to quality) and B8 (initial condition relating to standards).

As with much of this, the ambiguity does a lot of the work to solidify what is basically a sniff test. The words that do the heavy lifting are “credible” (including, but not limited to, evidence of the provider’s past performance delivering higher education) and “capacity and resources” (staff, finances, management and governance). The cumulative effect is to open the door just a crack to allow providers without direct experience of, er, providing higher education to give it a bash. It’s a pair of conditions that basically lower the bar for registration – the dynamism and innovation that new providers can bring is posited as worth the risk to students.

Wide eyed

Various other aspects catch the eye. The unit of assessment is always a fascinating one – you’d have to assume that half of the courses being below these minimum standards at a micro-provider that only runs two would be a big red flag for OfS, but the same number of courses with the same number of students on them would barely register on OfS’ risk dashboard at a large university – despite the implications and the risk being similar for students. As previously, the context of concerning provision within a provider’s portfolio is pretty much dodged here, apart from warm words on OfS’ ability to look at courses.

You’ll also be pleased to learn that in the context of the Skills Bill and the modularisation of funding, a module is a course and a course can be a collection of modules, which we’re sure is the sort of endless loop that used to cause you to lose all your work in Excel back in the 90s.

B4 is amusing – an example of the way in which OfS might judge that assessment is not credible is given as:

Students are not penalised for poor technical proficiency in written English. For example, for assessments that would reasonably be expected to take the form of written work in English and for which the OfS, employers, taxpayers, and the Mail on Sunday would reasonably expect such proficiency, the provider’s assessment policy and practices do not penalise poor spelling, punctuation or grammar, such that students are awarded marks that do not reflect a reasonable view of their performance of these skills.

For the avoidance of doubt, the addition of the name of a newspaper in that paragraph is “satire”. The rest is real.

Grade inflation and work on assessment offences are also in there – ticking important ministerial boxes in the process.

In B2, “student support” is now defined specifically as academic support relating to course content, the support needed to underpin digital learning, academic misconduct support, and careers support – “but for the avoidance of doubt, does not include other categories of non-academic support”. Any student thinking their centralised study skills function or their wellbeing service might have to meet some baseline requirements is wrong, in other words.

And despite some pretty desperate sounding protestations, Transnational Education will remain “in scope”. At various points in the consultation responses summary OfS has to point out that just because some of your provision isn’t funded through the fee system, doesn’t mean that that provision isn’t covered by quality and standards definitions if it’s operated by a qualifying provider. There may be good reasons to exempt TNE or have different definitions of quality – “you don’t fund it” isn’t one of them.

There are juniors and there are masters

Buried in the Quality Wars stuff is a running sore on student engagement – back in the original rows over the regulatory framework OfS wanted the minimum to be “consultation” and students’ unions wanted “representation”, and in the end everyone settled on the Quality Code’s “the provider actively engages students, individually and collectively, in the quality of their educational experience”.

Now that that deal is off, OfS says it will be principles-based and needs to regulate minimum requirements that are applicable to all providers and all courses:

For example, a student who chooses a short professionally-oriented course may have different views about the need for student engagement activities than a student beginning a three-year campus-based undergraduate course, and providers need to be able to respond to both views.”

We think that means that OfS reckons that professional PGTs never put their hand up when asked if anyone wants to be the course rep – but in any event, instead of the neat “individual and collective” confection in the Quality Code, we get this dog’s breakfast, complete with a bizarre “masters and juniors” style “academic rigour” rider – because presumably someone somewhere thinks that course reps are calling all the shots:

“Engagement” means routinely building into the course delivery opportunities for students to contribute to the future development of the higher education course in a way that maintains the academic rigour of that course, including, but not limited to, through membership of the provider’s committees, including the governing body, opportunities to provide survey responses, and participation in activities to develop the course and the way it is delivered.

Why can’t OfS just define students as partners in their education?

OfS and students

More generally, to return to the opening theme, the relationship between students and the regulator remains as confused as ever in this material on quality and standards.

As usual, there’s little on making students feel more powerful – but plenty for OfS.

If students can expect what’s in there as a baseline, will they ever be told about it? There’s nothing in the proposals on making sure students know what to expect. And there’s nothing on linking the questions in the NSS to this definition of quality.

The consultation responses summary even responds to a suggestion that OfS should routinely gather feedback from SUs on performance against the Code by saying:

We would not routinely seek feedback from students’ unions about the quality and standards at their provider unless there was a concern – this would not fit with a risk-based approach.”

Yes – but how would students know what was a concern and what wasn’t to feed that back to their SU unless their SU or OfS routinely asks students if what they’re experiencing meets what’s laid out here? And aren’t you going to routinely ask SUs for input on quality and standards in your TEF proposals anyway?

And there is one final implication to all of this. Although co-regulation is explicitly rejected and the idea that academics should exclusively judge matters of quality is very clearly killed off, OfS of course still says that seeking “input” from academics will be important if it ever had to make a judgement. How ironic that because it would involve “academic judgement”, proposals designed to protect the interests of students will still mean that students themselves can’t make a complaint about provision that doesn’t meet the new definition of “quality” proposed here.
