David Kernohan is Deputy Editor of Wonkhe


Jim is an Associate Editor at Wonkhe

If you’re a higher education provider with a “pocket” of “poor quality” provision, are you prevented from closing it to avoid regulation? Or are you encouraged to do so to avoid students experiencing those outcomes?

That’s one of the many puzzles that remains following the publication by the Office for Students (OfS) of its decisions on its consultations on regulating student outcomes (ie the B3 bear) and the Teaching Excellence Framework.

For some time now, OfS has been saying that opportunities for study are not meaningful if students are able to choose or continue on “low quality courses delivering weak outcomes” because the regulatory system has endorsed such performance. But does that mean working over time to improve the quality and outcomes, or killing off those courses – especially if they’re a component of your partnership portfolio?

Minister for skills, further and higher education Andrea Jenkyns says:

Following on from the first wave of OfS inspections, this consultation response is an important next step to halting dead-end courses.

But the OfS documentation says that it recognises that providers may choose to close courses rather than take steps to improve student outcomes, and if it suspects that’s been happening it will “interrogate whether a provider had taken action to improve its performance” and whether it sought to “evade regulatory action by closing courses with weak performance”. How about if we sleep a little bit longer and forget all this nonsense?

What are we looking at?

Weighing in at over 200,000 words (more than The Fellowship of the Ring, to use our traditional metric) of high grade OfSplaining, there’s an external summary of responses to and a separate responses and decision document on regulating student outcomes (that’s the B3 bear to you and me), a revised condition of registration on B3, a formal notice of determination of initial and general ongoing conditions of registration for B3, and some formal regulatory advice on it – as well as an external summary of responses to and a separate responses and decision document on the Teaching Excellence Framework, some YouthSight polling on TEF nomenclature (these are, it seems, our salad days), an analysis of responses to the consultation on and decisions on constructing student outcome and experience indicators for use in OfS regulation, a press release that reminds us that providers who recruit students from disadvantaged backgrounds “must support those students to succeed during their studies and into their life and career beyond graduation”, and a blog from OfS’ Director of Quality Jean Arnold.

In reality the publications don’t deviate significantly from the emerging rule of OfS regulation – the more words in a consultation summary and decisions document, the less has actually changed as a result of that consultation. As such, if you’re new to all of this you’re safer reading our summaries of the B3 outcomes and Teaching Excellence Framework proposals from when they came out – but here are the major announced changes.

Outcomes

Save the date – Monday 3 October 2022 is when the new Condition of Registration B3 will come into force, which will be shortly after OfS makes and publishes decisions on the minimum thresholds that will apply to continuation, completion and progression. They won’t be any higher than the drafts we saw in January, but could be lower. The dashboards showing data for those metrics with a whole bunch of splits will also appear around that time, once OfS has completed its consultation on publishing provider information. Providers will be thrilled to learn that those dashboards may include a “below threshold” filter, which the press and MPs will have a field day with.

As expected, for B3 both sides had to play to the gallery even though that ship sailed a long time ago. So a significant chunk of respondents (around two thirds) argued that OfS’ chosen measures of positive outcomes – metrics on continuation, completion and progression – are too narrow to reflect the wider benefits of higher education for students and society, and are not fully within providers’ control. And in response, OfS’ view remains that there are wider benefits of higher education for individuals or cohorts of students and society, but that all students, regardless of their background or what or where they are studying, are entitled to the same student protection in relation to the quality of their course.

Down in the weeds, some had asked that OfS set out a list of contextual factors that it might apply when letting a provider off for poor outcomes – but it won’t be limiting its discretion by adopting a prescriptive list – one of the many times “principles-based” is used as a positive way of saying “we dunno yet”. We’re also going to get a separate consultation on the sorts of thresholds that might apply for courses that may be LLE funded.

Partnerships and place

In another cake-ist section on partnership arrangements, OfS will worry about providers ditching dodgy franchise provision with poor outcomes – but if they do close it, OfS does not consider that to be adversely limiting student choice, “because courses that do not meet the OfS’s minimum expectations for quality cannot be considered a meaningful choice and their continuing provision would not be in the student or taxpayer interest”.

It’s an interesting confection – and we’re looking forward to James Wharton trying to explain to an MP whose constituent can’t move house for study that the closure of their local college’s English provision isn’t a reduction in choice. Nevertheless, providers will be pleased that there will be no “partnerships” view on the data dashboard in the first year of implementation – partly because OfS hasn’t quite figured out yet how to collect all the data it needs.

Lots of the feedback was about how OfS would prioritise providers for further investigation and action once the data comes in – and we do get some text on a prioritisation approach, although it wouldn’t really allow you to actually predict whether you’re going to get a knock on the door in September. Crucially, it remains the case that a newish provider in franchise partnerships with six universities won’t have a meaningful dashboard of its own (despite students only really seeing the name of that college) and looks unlikely to be targeted by OfS intervention (saving David Willetts any blushes over the early 2010s new providers regime).

Teaching Excellence Framework

As with B3, there are very few changes from the consultation we saw in January, other than a welcome if brief extension to the timeline:

| Event | Timing proposed in the consultation | Revised timing |
| --- | --- | --- |
| The provider and student submission window opens | Early September 2022 | By the end of September 2022 |
| Submission deadline | Mid November 2022 | Mid January 2023 |
| The TEF panel carries out the assessments | Late November 2022 to March 2023 | Late January to June 2023 |
| Universities and colleges notified of the panel’s provisional decisions about their ratings | April to May 2023 | July to August 2023 |
| Outcomes published for universities and colleges that do not contest the provisional decision | May 2023 | September 2023 |

Also worth noting:

  • Educational gains claims made by a provider will have to go beyond the TEF indicators, and be relevant to the mix of students and courses.
  • To take part, providers will now need to have two (not one) indicators with a denominator of 500 or above in the same mode of study – a sketch of that test follows this list.
  • Apprenticeship provision will only be in scope if a provider says it should be.
  • The submission from providers can now be 25 pages in length – up from 20.
  • Student submissions will be allowed to focus on current (or presumably at least, recent) students and it will be optional to include students who study elsewhere.
  • Assessment will only be split by “taught at provider” or “subcontracted out by provider” (the original proposal also included “registered” and “subcontracted in”).
  • No decisions have been made on publication just yet pending that other consultation on publication of information.
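
On that denominator point, here’s a minimal sketch of how the new eligibility test could be expressed – our own illustration with invented field names, not anything from the OfS documentation:

```python
# A hypothetical sketch of the revised TEF eligibility rule: a provider
# needs at least two indicators with a denominator of 500 or above in
# the same mode of study. Field names here are our invention.
from collections import defaultdict

def tef_eligible(indicators: list[dict]) -> bool:
    """indicators: e.g. [{"mode": "full-time", "measure": "continuation",
    "denominator": 812}, ...]"""
    counts = defaultdict(int)
    for ind in indicators:
        if ind["denominator"] >= 500:
            counts[ind["mode"]] += 1
    # Eligible if any single mode of study has two or more
    # large-enough indicators
    return any(n >= 2 for n in counts.values())

print(tef_eligible([
    {"mode": "full-time", "measure": "continuation", "denominator": 812},
    {"mode": "full-time", "measure": "progression", "denominator": 640},
    {"mode": "part-time", "measure": "continuation", "denominator": 120},
]))  # True - full-time has two indicators above the threshold
```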

The relationship between the B3 bear and TEF? If a provider is in breach of one or more B conditions (as identified by OfS, rather than just having some performance below one of the thresholds) it won’t be eligible for TEF.

Scope is kind of interesting, too. Indicators won’t separately show students taught at each partner provider, and students registered by a provider but taught elsewhere will be in scope of the TEF assessment – but unlike for B3, courses that are only validated by a provider (that is, where the students are neither taught nor registered by that provider) are not included in the scope of the assessment, unless the validating provider chooses to include information about this within its submission. And while TNE may one day be in scope for TEF, it won’t be this time around.

Belonging and names

There are some interesting aspects to the commentary. There are emphatic rejections of proposals for interim exercises – you get one chance every four years, and there’s no respite even if you end up with a “requires improvement” badge. The nomenclature of that badge is staying too, even though many pointed out that a provider rated “requires improvement” won’t actually be required to improve by the Office for Students.

One astonishing – and revealing – section concerns belonging. Some respondents suggested that the scope of the TEF “could go beyond teaching, learning, assessment and the educational environment” and include “the extent to which students have a sense of belonging and feel part of a community”, evidenced by NSS questions on learning community. OfS rejects that proposal – tortuously attempting to argue that belonging and community are factors beyond the teaching and learning environment, and trying to prise apart the “student educational experience and the outcomes of that experience” and “the wider higher education experience”, despite the obvious and significant ways in which those categories are linked – and the ways in which belonging relates to mental health and outcomes.

Most responses were concerned with the nomenclature of ratings – with a lot of ire directed at the “requires improvement” one. To justify rejecting any change, OfS points to YouthSight polling commissioned on the naming of the TEF and its awards – a classic case of asking applicants what to call the sausages without telling them what’s in those sausages or how they’re made:

Almost nine out of ten applicants and first year undergraduates surveyed as part of research we commissioned from YouthSight felt that the TEF scheme will help to inform students when deciding where to study. Over four-fifths thought that the TEF would have a positive impact on quality in the higher education sector.

Survey design corner

So what’s this YouthSight study? We’ve got a classic two phase survey design, starting with a qualitative analysis of 20 in-depth (30 minute) interviews with students. This was used to develop propositions to put to 1,112 applicants and first year undergraduates over the back end of May this year. Responses were weighted only by gender – by eye the sample looks reasonably representative, though students from advantaged backgrounds may be overrepresented and those who are first in family may be underrepresented.

There are two weaknesses that we should be concerned about. The first is a quirk of survey design at the quantitative stage which means the number of responses feeding back on each naming choice is small (around 300 – so we are looking at around a 6 per cent margin of error; see the quick sum below). As OfS warns:

please note that the findings from the qualitative stage come from a small group of respondents and should be treated with caution
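
For the record, the six per cent figure for the quantitative stage is just the standard survey margin of error at a 95 per cent confidence level, assuming a worst-case 50/50 split on each naming question:

$$
\text{MoE} = z\sqrt{\frac{p(1-p)}{n}} = 1.96\sqrt{\frac{0.5 \times 0.5}{300}} \approx 0.057 \approx 6\%
$$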

The second is more fundamental – why bother to go to the trouble of running a survey of students and applicants if you are only cognitively testing what are, in effect, brand names? Students and applicants tend to be fairly bright. What something is called is fairly important, but we have missed a golden opportunity to get student input into what we are actually measuring – on the importance of belonging, for instance – and why.

Constructing indicators

Back in February DK was on a panel with DataHE’s Mark Corver, and suggested there were probably only about 150 people in the sector with the range of knowledge to respond to this beast of a consultation (Mark’s response? “Who are the other 148?”). In fact just 142 responses were made.

Despite the sterling efforts of these brave souls there’s not been a great shift in OfS plans, and you may want to see our write-up of the consultation for the full picture of what is going on.

Once again, decisions on publication are deferred until we get the results of the consultation on that. The hints we get point to a simplification of dashboard design (and thus we assume data design, for those conducting serious analysis) with more contextual information around the size and shape of provision and more thoughtful labelling of areas where regulatory intervention is underway (though not caveats from providers).

With the recent year zero in access and participation, the publication of that dashboard is subject to a number of pressures – there will be another update this autumn that will expand time series from four to six years, and the promise of additional equalities data (on socio-economic class, parental HE experience, household income, IDACI, estrangement, and care experience) will be fulfilled only at sector level until 2025.

One thing you might expect given recent APP movements is data on higher education contributions to improved school attainment – no luck there. OfS can’t find any. The new-style APP dashboard, meanwhile, will begin from spring 2023. Elsewhere dashboards are out in autumn of each year starting in 2022 – offering us “OfS autumn” as a counterpart to “HESA spring”.

Cohort or compound? And what makes positive positive?

The consultation left a dangling issue around the construction of completion metrics – should we use cohort-based (actual) measures a bit like KPI table 3, or compound (estimated) measures a bit like KPI table 5? The latter would give us more immediate data, but OfS has come down on the side of tracking what actually happens to a cohort rather than baking in assumptions about how current students will progress (the reason? Easier to understand) – a sketch of the difference follows below. Also on completion, a tweak means that awards made to postgraduate research students during a reporting year (rather than just by the deadline) count as a positive outcome.
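
For the unfamiliar, here is a minimal sketch of the difference between the two approaches – our own illustration with made-up numbers and function names, not the OfS algorithm:

```python
# Our illustration of cohort-based vs compound completion measures -
# made-up numbers, not the OfS methodology.

# Cohort-based (actual): follow one entry cohort to the end of the
# tracking window and count who actually completed. Accurate, but you
# wait years for the answer.
def cohort_completion(cohort_outcomes: list[bool]) -> float:
    """One True/False per student in a single entry cohort: True if
    they qualified within the tracking window."""
    return sum(cohort_outcomes) / len(cohort_outcomes)

# Compound (estimated): chain together the year-on-year progression
# rates observed in the most recent year, assuming they will hold for
# the cohort in question. Immediate, but bakes in assumptions.
def compound_completion(yearly_progression_rates: list[float]) -> float:
    """e.g. [year 1 -> year 2, year 2 -> year 3, year 3 -> qualify]."""
    estimate = 1.0
    for rate in yearly_progression_rates:
        estimate *= rate
    return estimate

# A three-year course where 92%, 95% and 97% of students progress at
# each stage gives an estimated completion of ~84.8 per cent.
print(compound_completion([0.92, 0.95, 0.97]))  # 0.84778...
```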

On progression we’re still using the full set of positive outcomes from the consultation (travelling, caring, or retiring, building a portfolio, entrepreneurship, studying, or working in a professional job – all only as “main activity”) which is broadly good. There have been a number of concerns about the definition of a “professional” job – we get an actual boxed-out list of professional jobs in the arts as a response, and other pleas (dental nursing, veterinary nursing, agriculture…) will be dealt with contextually in regulation rather than within the data itself. And there’s no hope offered for those suggesting that graduates’ own reflections on their activities should become a part of this analysis.

Original proposals offered four populations per provider – taught students, registered students, taught or registered (ToR) students, and partnership (contracted out/validated) students. This has been simplified – we’re now just going to get a choice of “taught” and “partnership”, with the latter not disaggregated by delivery provider – meaning that we are deprived of oversight of the experiences of students studying at those unregistered providers.

On that, we do learn that OfS are aware that it can sometimes be difficult to get quality data out of delivery partners (data which will inform B3 decisions).

We also accept that we will need to introduce additional data collection to produce comprehensive information on student outcomes for validation-only arrangements that involve students registered at providers who are not registered with the OfS. We consider this additional collection is likely to be necessary to ensure that our regulation can protect all relevant students; we will set out proposals for how we will collect this data in a future consultation.

Don’t touch that dial!

We heard you like consultations, so we found some more consultations inside this consultation. If you have thoughts on the use of NSS metrics there’s another consultation next year after we’ve sorted out the future of NSS (itself subject to a consultation). There’ll be more on LLE-style provision when decisions are made on that. And one “in due course” on a new approach to higher education funding. There’s not a consultation on the use of Associations Between Characteristics of Students (ABCS), or “geography of employment” quintiles, in regulation – there should be. And here’s something to keep you awake:

Given our role as a regulator, unless otherwise stated, any measures we develop may have a role in our regulation.

It all demonstrates that this voluminous release is not, and can never be, the last word on higher education data.

But let’s race through what we have. OfS don’t feel that the pandemic is something they need to take account of:

We consider that the absence of consistent and widespread impacts of the pandemic, as evidenced within relevant higher education data, and more recent indications of recovery, mean that it is not appropriate to introduce any delays or adjustments to our proposed approaches.

And the idea that smaller providers may not have the capacity to muck about with all this data gets equally short shrift.

It is a requirement of OfS registration that providers have the resources needed to meet our regulatory requirements, including the submission of, and engagement with, accurate data returns

Perhaps least convincing are the arguments ranged against the idea that focusing on un-caveated outcomes will make providers less likely to recruit students from groups that are less likely to go on to positive graduate outcomes. There’s a lot of faith placed in the APP process as a mitigation, and some concerning use of the idea of an expectation of equal outcomes as a way to avoid considering the additional work needed to make that happen.

As we noted when the original consultation came out this is a bit of a grab-bag of issues, and to cover them all in depth would make this article unreadably long. We will note, in closing, that indicators on published dashboards will not be changed as a result of investigations that do not result in regulatory action, that APP intersectional data splits will be provided based on the priorities of the moment in univariate form (no rolling your own intersections here), and that there’s no intention to look at outcomes or continuation for “blended courses” so we’ll just have to take that particular set of interventions as being based on ministerial whim only.

And we can look forward – thinking back to the original goals of OfS – to the publication of detailed data on credit transfer. One day. Maybe the OfS will disappear first.

4 responses to “OfS responds on B3, TEF, and indicator construction”

  1. “But the OfS documentation says that it recognises that providers may choose to close courses rather than take steps to improve student outcomes”

    Every university up and down the country has a ‘red’ list of courses they plan to close – we know this, university staff know this, Govt knows this.

    I was approached only last week by a headhunter to go and work in a northern University to “Reconfigure and close courses that fail PROCEED” – that’s the language that is being used.

    It’s really only the timeline that is at stake, i.e. how long can we generate revenue from this course before we need to close it?

  2. ABCS and GEE are interesting because when they were quietly introduced last year I suspect many (myself included) assumed they were just the latest analyses OfS appeared to be churning out for fun. We might have had more to say at the time had we known of their future role as both splits and benchmarking factors in the upcoming quality work.

  3. “On progression we’re still using the full set of positive outcomes from the consultation (travelling, caring, or retiring, building a portfolio, entrepreneurship, studying, or working in a professional job – all only as “main activity”) which is broadly good.”

    This statement is untrue. The OfS’ core algorithms that accompanied the consultations show that a graduate will count as positive if they have any one of these activities, regardless of which (if any) they select as the most important. Perhaps the authors were not among the 150 with the knowledge to respond to this consultation?
