David Kernohan is Deputy Editor of Wonkhe

One of the fascinating things to watch as the Office for Students has settled into its regulatory identity is the extent to which it sees itself as a guarantor of high standards rather than a pure market regulator.

In the latter role, it has traditionally handed off the health of the market to the Competition and Markets Authority. So I was surprised to see the Herfindahl-Hirschman Index make an appearance in the analysis of progress on Key Performance Measure 8, released yesterday.

KPM8 (and the closely linked KPM9, which puts things in a regional perspective) looks at the diversity of provider choice within a subject area – as such it looks very much like an anti-monopoly measure that feels quite out of place alongside measures of student satisfaction (KPM10) and fantasy metrics on the impact of poor learning and teaching (KPM13) and learning gain (KPM12).

Market concentration

OfS has measured the number of providers offering courses in a given subject, and the number of students on those courses. This is done at a very broad level (top-level CAH subject groups), with splits by student domicile and mode of study – the level of study is first degree for the headline measure, though the associated data covers other levels too. There’s also detail on the Herfindahl-Hirschman Index (HHI), but we’re not encouraged to look beyond the “big numbers are bad” level.

The HHI is the sum of the squared percentage market shares of every provider in the market. If there were only one provider offering a subject, the HHI would equal 10,000 (the highest possible score), indicating a monopoly. If a large number of providers were offering a course, the HHI would be closer to zero – with zero indicating perfect competition (an infinite number of providers, each with a negligible market share).
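If you want to check the sums yourself, here’s a minimal sketch of the calculation in Python – the provider headcounts are made up for illustration, and this is not the OfS’s own code.

```python
def hhi(student_counts):
    """Herfindahl-Hirschman Index from per-provider student counts.

    Market shares are expressed as percentages, so a single provider
    scores 100^2 = 10,000 and a highly fragmented market tends towards zero.
    """
    total = sum(student_counts)
    shares = [100 * n / total for n in student_counts]
    return sum(s ** 2 for s in shares)

# Illustrative (made-up) headcounts for one subject area
print(hhi([1200]))                # one provider: 10000.0 – a monopoly
print(hhi([300, 280, 310, 290]))  # four similar providers: roughly 2,500
print(hhi([50] * 100))            # a hundred small providers: 100.0
```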

[Interactive visualisation – full screen]

I’m as up for this kind of thing as anyone – but it doesn’t really tell us much about the state of the sector. Only one subject area – veterinary medicine – is ever above about 700 for full-time provision, and that number has dropped over time as more vet schools have opened. As proper competition-focused market regulators only get interested when the HHI gets over about 1,500, it would appear that there is little to worry about. Part-time and distance learning are more interesting: there are fewer students, but also fewer providers serving them, which pushes the index up.

So what?

Though OfS says that:

Students on part-time or distance learning courses may have fewer providers to choose from in their subject area compared to full-time students

this is very much something we already know – there is no indication that the KPM will drive action, and no indication as to what OfS might do with the information. Reading between the lines, it is possible that the regulator could take a need to expand part-time provision into account in the delayed review of funding – but a supply-side incentive would do nothing to address low demand for such courses, which is itself evidence of policy rather than market failure.

There might be an issue regionally, but all KPM9 tells us is that there may not be enough competition in London. This measure purportedly applies the subject offer and market size calculation regionally, but we’re given so little data that it is difficult to say, for instance, whether the North East is low on languages provision – something that a regulator may want to know about. KPM9 also tells us about “local” and “mobile” students – again, it is difficult to disentangle this from regional availability and wider trends, so there’s not much we can learn.

Unistats to the rescue!

What would be interesting would be a look at actual courses by regional distribution and size. After all, if a student can’t study ophthalmics in the East Midlands, then that is surely a matter for regulatory concern. So, as always in the spirit of helpfulness, I’ve built a visualisation.

[Interactive visualisation – full screen]

As with much Unistats-derived work there are a bunch of caveats. The data isn’t perfect (hats off to Plymouth College for tagging a foundation degree in international tourism as being concerned primarily with ophthalmics!). Most notably, I’ve used the population underlying the continuation data as a proxy for course size, so some courses will look bigger than they are (look at the aggregation value: course level is good, CAH2 is passable, and anything else – including where it is an aggregation of two years – is only really indicative of the general size of the wider subject area).
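For the curious, the proxy works roughly like the sketch below – the column names are placeholders rather than the real Unistats field names, so read it as an outline of the approach rather than something that runs against the actual files.

```python
import pandas as pd

# Placeholder column names – the real Unistats/Discover Uni extracts use
# their own field names, so these are stand-ins for the approach only.
continuation = pd.read_csv("continuation.csv")  # one row per course

# How trustworthy is the continuation population as a course-size proxy?
# Course-level figures are good, CAH2 aggregations passable, and anything
# else (including two-year aggregations) only indicative of the wider area.
quality = {"course": "good", "CAH2": "passable"}
continuation["size_proxy_quality"] = (
    continuation["aggregation_level"].map(quality).fillna("indicative only")
)

# Approximate course size = population used in the continuation measure
course_size = (
    continuation
    .groupby(["provider", "course_title", "size_proxy_quality"])["population"]
    .sum()
    .sort_values(ascending=False)
)
print(course_size.head())
```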

For a student who is unable or unwilling to move from their home, a provider offering broadly the right subject in broadly their region doesn’t make much difference. There should really be some sector-wide way of monitoring where such demand is likely to exist, but it needs to be at a fine granularity. Obviously there can’t be a medical school on everyone’s doorstep, but maybe additional support (or, whisper it, a grant) could help someone who would otherwise be a loss to the profession.

On regulation

During the pandemic OfS has taken time to reflect on how it regulates – we saw this in the annual review and an associated event, summarised here by Nicola Dandridge, but the kick-off was really an insight brief released back in October that did not include the words “market” or “competition” anywhere other than in the aim.

OfS sees itself predominantly as a principles-based regulator – as we said at the top of this piece, one concerned primarily with monitoring (and to an extent driving up) the quality of provision. Through this lens, KPMs 8 and 9 look like measures for a completely different body, one with an interest in the shape and scale of the market.

This isn’t to say that OfS does not have a legitimate interest in this area – it absolutely does, and considered thought about “place” within applicant decision-making (remember, Discover Uni lets you filter by distance from home!) could see a sector that is more accessible in terms of entry and continuation. We’ve started to see regional considerations for HE funding in Scotland, so more action on this issue in England would be very much on-trend.

Ministerial attention has focused on quality and value as a way, arguably, of driving down participation – the “low-quality courses” agenda is framed as “saving” students from wasting time, effort, and money. The fact that many of these courses serve communities where opportunities and investment are few and far between means that this impulse directly contradicts the levelling up agenda – it seems that FE colleges and free online training are meant to do that instead.

As a regulator OfS should really be pushing back strongly, making the case for the transformative value of higher education – sure, it could be seen as a producer interest, but the analogue would be Ofwat arguing against a government decision to end mains water provision in deprived areas. Opposition to government agendas is a difficult move for OfS, but with the right data behind it such opposition could do a lot of good.

One response to “The OfS’ performance on place”

  1. There is a suggestion here that financial support aids geographic mobility, but my analysis shows no such systemic impact from the PGT loan scheme, so that should be filed under assumption.

    Some ethnic segments are relatively immobile and this might impair access to some niche categories.

    Surely there is a trade-off between having courses of every type on every doorstep and the quality and depth of provision? And given 50% of courses have unviable numbers… financial sustainability, anyone?

    And mapping distance learning provision… well, surely that ought not be confined by UK borders, as choice clearly is not – or am I missing something?

    OfS needs to be assured that if there is a viable gap in the market (geographic-wise or not) my team helps providers to locate it. It’s what happens in a competitive market.

    But any analysis MUST be at course level, otherwise it’s pretty much useless.
