Jim is an Associate Editor (SUs) at Wonkhe


David Kernohan is Deputy Editor of Wonkhe

The Office for Students Register of Providers for the 2019-20 academic year remains incomplete.

In some ways this is as expected – after all, universities and colleges in England can apply at any time for registration. But the initial plan assumed that providers who applied in the first round would be registered (or refused) by now – and this did not happen.

So why so tardy? In OfS’ new “Registration process and outcomes 2019-20: Key themes and analysis”, its Director of Competition and Regulation, Susan Lapworth, blames the quality of the applications – 72 per cent were incomplete in some way (expressed as 66 per cent in other parts of the document), and the responses to requests for further information were of poor quality.

Some applications, it seems:

“demonstrated a deficiency of understanding over and above what might reasonably be attributed to providers’ lack of familiarity with the new requirements. They were of significantly poor quality: it was clear that a number of providers were not ready to be regulated”.

In all, 387 higher education providers are now registered with the Office for Students. This is not the complete set – around a further 100 are still somewhere within the process – but we do know that eight applications have been refused and a further 13 are contesting a provisional decision to refuse registration. Around 500 providers applied for either Approved (57) or Approved (Fee Cap) (330) status, the latter offering 243 institutions the right to charge the full £9,250 annual tuition fee.

And here lies our first problem. The 2018 Regulatory Impact Assessment for the Higher Education and Research Act suggested 508 providers would be registered for the 2019-20 academic year in either the Approved (129) or Approved (Fee cap) (379) category – a shortfall of 121 against the register as it stands. Part of this may be explained by the overall poor quality of applications, but it does also look a bit like the demand for registration anticipated by the Act simply is not there (and potentially, there’s registration fee income missing from the budget as a result too).
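To put numbers on that gap, here’s a quick back-of-the-envelope tally – a sketch in Python, using only the figures quoted above:

```python
# Forecast registrations from the 2018 Regulatory Impact Assessment,
# against the register as it stood when the OfS analysis was published.
# All figures are those quoted above.
forecast = {"Approved": 129, "Approved (Fee cap)": 379}
registered = {"Approved": 57, "Approved (Fee cap)": 330}

for category, expected in forecast.items():
    actual = registered[category]
    print(f"{category}: forecast {expected}, registered {actual}, gap {expected - actual}")

total_forecast, total_registered = sum(forecast.values()), sum(registered.values())
print(f"Total: forecast {total_forecast}, registered {total_registered}, "
      f"gap {total_forecast - total_registered}")
# Approved: forecast 129, registered 57, gap 72
# Approved (Fee cap): forecast 379, registered 330, gap 49
# Total: forecast 508, registered 387, gap 121
```

Even if every one of the hundred-or-so applications still in the pipeline succeeds, the register falls some way short of the forecast.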

Cruel, unusual?

Registration decisions are based on compliance with a number of conditions set out in the OfS regulatory framework. What wasn’t initially in the framework – but which we have come to know and love via the OfS board papers – is the scale of possible negative responses. It goes as follows:

  • Formal communication – let us not mince words, this is a nice letter from OfS to the provider. It’s a bit like an advisory on your MOT or, in TEF parlance, a “provisional”.
  • Enhanced monitoring – a letter to the provider with an expectation of a response. We first got to know these during the Access and Participation planning process, where providers were asked to send the results of planned evaluations and suchlike. Bronze, if you will.
  • Specific ongoing condition – this is what was originally promoted as the most common expression of concern: a visible and public “black mark” against the provider’s registration. Silver, if you like.
  • Refusals/not approved – the nuclear option. A decision not to register an institution, or not to approve the access and participation plan. A solid gold intervention.

OfS can also fine providers – though there is no indication here that this has happened in any case. Not yet anyway.

Here’s how the punishments were applied for each regulatory crime – note in particular the heavy use of enhanced monitoring on condition B3.

[Interactive chart: interventions applied by registration condition]

Students, outcomes

The whole document opens with the assertion that OfS’ approach to regulation “puts students at its heart” because it aims to ensure that “providers are delivering positive outcomes for students”, and so there’s plenty of action in the “B” conditions, which riff off bits of the UK Quality Code.

To do this OfS started with data that showed the performance of the provider in relation to “three key indicators”, broken down to show outcomes by mode and level of study, and for students with different characteristics:

  • Student continuation and completion indicators.
  • Degree and other higher education outcomes, including differential outcomes for students with different characteristics.
  • Graduate employment and, in particular, progression to professional jobs and postgraduate study.

There’s a long section on process here – no doubt included to shore up OfS’ position ahead of legal challenges surrounding registration refusals in this area. B3 has also been the source of most of the public regulatory conditions and, behind the scenes, of plenty of enhanced monitoring. OfS considered a provider’s performance “in aggregate”, over a time series (of up to five years, depending on how far back indicators can be derived from the available student data), as well as across split indicators.

The mystery so far has been the baselines used in the analysis of whether condition B3 was satisfied, so alongside the narrative document OfS has published these, along with a technical explanation of how the indicators were constructed. We’ll delve into these separately – suffice to say that its starting point was that a provider was not likely to satisfy the condition if more than 75 per cent of its student population fell into a demographic group identified as experiencing an outcome “that may be of significant concern”. As we’ve pointed out before, this approach does seem to favour large providers that are able to balance student and provision type to produce helpful averages – and really hurts small providers doing focussed WP work.
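To make the mechanics concrete, here’s a minimal sketch of how a threshold test of that kind plays out. The function name, groupings and numbers are ours and purely illustrative – OfS’s actual methodology sits in the technical annex:

```python
# Illustrative sketch of the baseline test described above: a provider is
# unlikely to satisfy condition B3 if more than 75 per cent of its students
# fall into demographic groups whose outcomes are "of significant concern".
# Group names and figures are made up for illustration.
CONCERN_THRESHOLD = 0.75

def unlikely_to_satisfy_b3(group_sizes: dict[str, int], groups_of_concern: set[str]) -> bool:
    """True if the share of students in flagged groups exceeds the threshold."""
    total = sum(group_sizes.values())
    flagged = sum(n for group, n in group_sizes.items() if group in groups_of_concern)
    return total > 0 and flagged / total > CONCERN_THRESHOLD

flagged_groups = {"mature part-time"}

# A small, WP-focused provider: nearly all students sit in the flagged group.
small_provider = {"mature part-time": 180, "young full-time": 20}
# A large provider with the same 180 flagged students diluted by other intakes.
large_provider = {"mature part-time": 180, "young full-time": 1800}

print(unlikely_to_satisfy_b3(small_provider, flagged_groups))  # True  (90% flagged)
print(unlikely_to_satisfy_b3(large_provider, flagged_groups))  # False (9% flagged)
```

The same 180 students trip the threshold at the small provider but disappear into the average at the large one – which is precisely the dilution effect that hurts small, focused WP providers.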

You can take the view that small providers shouldn’t be recruiting students who can’t cope with a course. That’s fine – but that logic suggests that OfS should be delving deeper into groups of students within large providers – and anyway, the smaller ones may argue that behind the headline “continuation” figures lie a host of complex stories that don’t necessarily suggest careless recruitment. We’ll get more of this as the court case(s) play out.

There’s a short section on access and participation – here OfS was largely assessing 2019-20 plans before the new APP regime kicked in. Remarkably, only 12 providers received no regulatory intervention – 142 got a formal letter, 79 were subject to enhanced monitoring, and five received one or more specific ongoing conditions of registration. Two plans were not approved at all by the Director for Fair Access and Participation.

Away from outcomes and access, it’s the individual relationship with students that attracts the most interest. 147 providers (29 per cent) had to provide additional evidence or clarification on their self-assessment against the consumer law compliance condition – some made no reference to CMA guidance, many provided insufficient evidence of having paid due regard to it, and a large number simply stated that contractual terms and conditions for students were fair without demonstrating how they knew this to be true.

Some stated that policies were publicly available, but OfS was unable to find them. In other cases published policies and information were inconsistent with the guidance published by the CMA. And applications from providers operating in partnership with another higher education provider tended to demonstrate a reliance on the policies and approach of the partner provider. “Once registered, a provider is responsible for ensuring its own compliance with the ongoing conditions of registration. It is important that governing bodies understand this”, says the document.

Protection, racket

We’ve talked before about the inadequacy of student protection plans, and it’s almost a year since OfS Chair Michael Barber admitted at Wonkfest that most weren’t up to scratch. These were supposed to set out the risks of provider, course, campus or “material component” closure – and were so bad that OfS appears to have started off by bouncing most of them back, before giving up late last year in all but the most extreme cases. Everyone will be required to resubmit once new guidance emerges.

The problems were myriad. Plans often didn’t address all aspects of the guidance. They weren’t student facing. Some included risks that didn’t appear to be relevant, like referring to loss of university title where no title was held. And there was a plagiarism problem too. As more student protection plans were published, “we began to discern a pattern of provider application submissions which drew heavily on plans published by other registered providers”. Whilst OfS expects providers to seek out examples of good practice, “the extent of the replication meant that in some cases risk assessments were not specific to the provider”, and it was also not clear that the “provider’s governing body had properly engaged with the requirements and the importance of these to … students”.

Again, we’ve covered before the tension between being honest about risk internally and the way “risks” are presented in SPPs. OfS found that risk assessments in some plans “were overly optimistic, and in some cases contradicted other publicly available information about the financial position of the provider”. Some focused on an assessment of business risks rather than risks to the continuation of study for students. Refund and compensation policies were weak because they did not always make clear that refunds and compensation would actually be available to students. Details of compensation were also “limited in detail and scope”, and proposed mitigations in student protection plans often “lacked detail”.

Intriguingly, absent from the narrative is any mention of providers failing to address the shuttering of a “material component” of a course – ironic, given that probably the biggest threat to students when a provider tries to save money isn’t wholesale closure but major module rationalisation. OfS has now announced it will consult on new guidance before issuing it, which suggests that revisions to the regulatory framework on student protection may also be on the way.

Management, governance

The second set of issues covered concerns providers themselves – things like their finances, and how they’re governed. This is very much the flip side of the “student outcomes” focus, the idea being that the way providers are run acts as a set of prerequisites for delivering outcomes and using public (and student) money effectively.

Interestingly, before it even started assessing the financial picture, OfS had to ask 61 providers for additional evidence (or seek “clarification”) because the initial submission wasn’t up to scratch. Problems included errors in financial tables or inconsistencies between the tables and the audited accounts, a failure to provide cash flow statements and insufficient detail in the commentary to explain and support forecasts. New guidance on the annual financial return has been published to help ensure these problems are avoided next year.

The biggest reason for imposing interventions across all provider types was forecast financial performance – which OfS reckons is underpinned by optimistic growth in student numbers with little or no supporting evidence about how that growth would be achieved, along with “insufficient evidence of stress testing” of the underlying financial position in the event that forecasts were not realised. Most are assuming growth in student numbers, with 122 (out of 183) projecting increases of more than five per cent over the next four years. OfS concludes that whilst the majority aren’t reliant on growth to ensure viability and sustainability, they may need to “reduce their projected costs” if ambitions are not realised.

On management and governance, the guidance said that providers were to self-assess how their arrangements delivered OfS’ public interest governance principles, and where a provider followed a particular governance code the self-assessment was to explain how the provider ensured compliance with it. Here OfS had to ask 269 (54 per cent) for additional evidence or clarification because initial submissions didn’t state which governing documents upheld the principles. And in “many instances” providers had failed to recognise that an external code didn’t fully cover all of the principles.

On “value for money”, the majority (70 per cent) described mechanisms such as audit or finance committees and the publication of accounts as evidence that the principle was upheld – very much defining VFM in the old HEFCE groove. But they failed to address how transparency for students was achieved – a finding UUK has already pre-empted in its guidance on telling students where the fees go.

Proper, fit

Another area of weakness was the “fit and proper person” principle. Here providers described relying on declarations from members of the governing body, but in many cases “it was not clear whether any checks were conducted”. Crucially, there was “limited recognition” that OfS’ “fit and proper” indicators are wider than (for example) Charity Commission requirements in this area – and there was widespread non-disclosure on the application form of directorships and trusteeships held by individuals. OfS also spotted a number of providers with “very long serving” members on their governing bodies and no limits on terms of office.

On governance effectiveness, it also found a lack of evidence and a lack of external input and independent views. Many that were previously regulated by the Secretary of State “had no evidence of ever having reviewed the effectiveness of their arrangements” either internally or externally. And there was “limited information” about how providers approach “academic risk” within wider risk frameworks. Academic governance “often appeared to be a reporting protocol” rather than a “robust approach” of the governing body to testing the assurances it receives in this area – a challenge to the “dual mode” traditions of corporate and academic governance in universities that underlines the legal duties of the corporate end.

You’ll recall that, in the Sam Gyimah era, the “tough new regulator” was going to crack down on freedom of speech crimes like trigger warnings, safe spaces and no platforming. In the end, all OfS has found is a handful of further education colleges without a freedom of speech code of practice. And plenty of self-assessments from colleges didn’t distinguish between academic freedom and freedom of speech more broadly. It’s almost as if colleges view the moral panic over freedom of speech as some kind of made-up Russell Group myth – but OfS will see to that.

Amusingly, some privately owned providers (which might not have had “the public interest” and principles thereof in their veins) tried to assert that advisory boards called “governing bodies” counted, but OfS worked out that these didn’t actually have legal decision-making responsibility. “Providers appeared to have expected that this would meet the requirement for independent members of the governing body. This was not the case”, says one memorable passage.

Interestingly, just about the only “public interest governance principle” that isn’t mentioned is the one on student engagement, where the governing body has to ensure that “all students have opportunities to engage with the governance of the provider” and that “this allows for a range of perspectives to have influence”. There are plenty of SUs who may regard that as an area of weakness – but it’s not one that was picked up by the registration team.

I didn’t mean this

So where does all this get us? On the one hand, if you accept the central logic of it all, OfS has a good story to tell here. It’s bearing down on weak outcomes, improving protections for students, and improving the management and governance of the sector. Trebles all round.

But we’re not so sure. First, there’s the nagging fear that much of the exercise is more about testing providers’ evidence-writing skills than it is about actual practice. Then there’s its calibration – the “level playing field” sees some crimes (like established universities’ failures on SPPs) go unpunished whilst others (smaller providers on continuation) get hammered. The focus on outcomes rather than provision looks weirder by the day. It’s also quite hard to trace your finger through it all and imagine real students who benefit – but maybe it’s too early, and we’ll see that more clearly later.

Perhaps the biggest problem is tone. We get that the cosy club of HEFCE provider interest had to end, and there’s no doubt that OfS has taken to its regulator-not-funder role with aplomb. But through our work with SUs we spend an awful lot of time looking at Charity Commission stuff too. It’s principles based. It too stresses outcomes over process. It’s often tough, focuses clearly on beneficiaries, and also has the hard job of regulating a level playing field. But you also get the clear sense that, through its interventions, it really wants charities to succeed. Wherever we are, we’re really not there yet with OfS.

2 responses to “Registering a complaint? The OfS registration process, analysed”

  1. The focus on outcomes rather than provision looks weirder by the day… Especially when Ofsted have said that with their new education inspection framework, inspectors will be spending less time looking at performance data, and more time considering how providers are making sure their learners are developing…

  2. “But you also get the clear sense that through its interventions, it really wants charities to succeed.” Let’s be clear, guys, the OfS wants one thing and one thing only to succeed: the OfS.
