So how are we going to do outcomes based regulation for modular provision?

OfS doesn't know.

David Kernohan is Deputy Editor of Wonkhe

We were really pleased to see how early the Office for Students started thinking about regulating modular (lifelong learning entitlement) provision.

It was a “call for evidence” rather than a formal consultation, and it appeared as early as July 2023 – and if you think there is still not a lot of detail about the way the LLE would work, there was even less back then.

So you might have suspected that these responses would feed both into the design of regulation (something entirely within the remit of OfS) and the design of the overall initiative, which is led from the Department for Education. The fact that a digest of responses has only appeared in 2025 doesn’t really support that interpretation.

Recently OfS has been farming out consultation response digests to external companies, and it is Pye Tait Consulting that does the honours this time round, in a report dated July 2025 and commissioned in October 2023. Again, I mention this because it doesn’t say a lot about the urgency with which this information has been treated.

Part of this may have been because of the low number of responses – just 39, with fewer than two thirds coming from people at higher education providers. As the OfS register hovers around 400-450 institutions, this is a concerningly low response rate.

So what have we learned? The vanishingly low numbers mean we need to be careful about drawing conclusions, but it is notable that just under half of those who responded were nervous about the administrative burden that the LLE would bring, and a third were concerned about student demand.

And, given the topic of the consultation, there should be some concern that a third of respondents felt that tracking student outcomes from modular provision would be a difficult problem. It was noted that the LLE would naturally support less traditional higher education, such as reskilling or upskilling, and a positive outcome would be less easily measured from such provision.

That’s not to say that the people who responded were against regulation altogether – people were on board with the need to ensure high quality provision and value for taxpayer funds. But there was a fair amount of concern that the current definitions of a “positive outcome” (the OfS trifecta of continuation, completion, and progression) couldn’t just be rolled across to 30 credit chunks of provision, almost by definition. As one response had it:

The area which is particularly challenging relates to measuring positive outcomes from modular study. If the positive outcomes mirror those for graduates, i.e. graduate employment, those completing modules rather than programmes are not qualified to the same level, and will not be treated in the same way by employers, so what would be measured?

If you were looking for clues as to what might happen instead, I’m sorry to say that there are not many. Some 46 per cent agreed that completion was a reasonable measure, but opinions about progression were less positive – 70 per cent highlighted potential challenges, with a third arguing to remove the measure entirely. Why? Difficulty in collecting data, difficulty in tracking progression from a specific module (as opposed to anything else a student may be doing), small cohort sizes – even the definition of progression. We get one illuminating quote:

Arguably, progression to another module could be deemed positive but so could progression into employment. The current Individualised Learner Record […] will need to be updated to allow accurate tracking of student destinations which would imply huge burden on institutions (especially smaller institutions and Further Education Colleges).

For me, it increasingly feels like we need to regulate the quality assurance process of the provider rather than the outcomes experienced by students. As long as a university is setting and measuring against meaningful targets for outcomes, alongside doing more practical things like keeping course materials up to date and meeting student support needs, that should be enough.

But that’s before we get into the wider questions of student intentions. Providers are concerned that students would be less committed to a single module than to an entire course, suggesting that non-continuation rates would be higher, and that an intention (perhaps to progress in a career, or to start a new one) may be linked to multiple modules at multiple providers (and other opportunities like placements or work experience). Modular (like part-time) students will be more likely to already be in employment, and others may not be qualified for their employment goal after just 30 credits.

It wasn’t all negative – many responses suggested measuring students’ own views (perhaps via a survey that could gather qualitative data) would help, and argued for a more nuanced and personalised approach to monitoring progression. There is also scope for closer work with employers (who may recommend students take individual courses, or even fund them). But completion rates remained the most popular approach, even though just 13 responses said so.

That last point neatly illustrates the low numbers problem. I’d have expected more consultation responses than there were learners on the short courses pilot. It seems I was wrong. And that feels like a big problem for OfS.
