David Kernohan is Deputy Editor of Wonkhe

The Office for Students has updated some of its data dashboards.

Specifically, we get an extra year of student data in the outcomes (two views), TEF, and access and participation official datasets – while TEF also gets results from the 2024 national student survey.

Dashboard fans will note the absence of an update to the size and shape dashboard – this is scheduled for 3 September.

About the data

If you didn’t experience the pain of Data Futures first hand, you may need reminding that the collection of the 2022-23 student data by HESA wasn’t as straightforward as in a regular year. There have been a few stories about data quality issues, so the OfS’ “data quality notes” are even more interesting than usual. As the regulator puts it:

Our data quality assessment has identified some quality issues in the 2022-23 student data. The issues tend to concern elevated proportions of data categorised as either ‘missing’ or recorded by providers as ‘unknown’. The proportions of ‘unknown’ values in certain data fields has increased overall at sector level but also varies greatly by provider.

“Unknown” is a “none of the other options fit” option rather than an actual absence of knowledge about students at providers, suggesting that some parts of the data validation system within the submission process were causing issues. OfS (and, one assumes, Jisc/HESA) has used 2021-22 data for the same students to fill in the gaps where possible.

In particular, substantial problems have been identified with data on student ethnicity. The big issue here will be with the access and participation data, but ethnicity is also used in developing benchmarks via the “association between characteristics” (ABCs) quintiles. We’ll see more on this when we get the HESA Student data in August.

There are moderate issues reported with parental education data and other entry profile data, with small issues related to domicile, sex, mode of study, and the status of dormant students.

However, we also get details of a range of provider-level issues:

  • Bedfordshire (multiple)
  • Birmingham City (PT)
  • Brighton (inactive)
  • UCLAN (partners)
  • Chester (partners)
  • Chichester (partners)
  • Hertfordshire (multiple)
  • Loughborough (partners)
  • Navitas (partners)
  • Solent (short courses)
  • Teesside (partners)
  • Wolverhampton (UG with PG)

Official statistics?

Regular readers may also remember that the 2022-23 data has not been published in the more usual form of HESA Open Data tables – we’re expecting this on 8 August. HESA put this very delicately in a note published today:

This data would normally be published after the release of Jisc’s statistical bulletin Higher Education Student Statistics: UK, 2022/23 on the HESA website. The statistical bulletin is scheduled to be published on 8 August 2024.

Less delicately, it is highly irregular that materials including unpublished official statistics should be published ahead of their scheduled annual release. While OfS reassures us:

We are taking this approach because a delay in publication would have an impact on the primary users of our data, and who need to be able to engage with the data before the next academic year

We are not reminded that the “primary user” of this data is, er, OfS itself. The regulator notes the overlap between the “size and shape” data and the scheduled HESA release (which is why the whole “size and shape” release is delayed till September), but the rule of thumb is that you don’t publish anything based on official stats until the official stats themselves are published.

This is compounded by the fairly open secret that the data is already of a suitable quality for open release (with caveats similar to the one above), and has been since April. It was the Office for Students that insisted on a further sector consultation period, and thus the August open data release date.

Franchising

One reason we were waiting patiently for these dashboards to be released was because the Office for Students had promised more detail on franchise and partnership arrangements. Wonkhe has been making noise about the risk and comparative lack of oversight that partnership arrangements can bring, and the Office for Students itself is now firmly on board with this (prioritising partnership and franchise arrangements in the next round of quality assessments).

We do get data (as you’ll see below) on franchise and partnership students based on their registered providers. What we wanted was data based on the teaching provider – a number of unregistered higher education providers are teaching a large number of students, and it makes sense that the regulator should be able to take a view on this. If one unregistered provider is doing a bad job of teaching or looking after students from numerous registering partners, the regulator (and, indeed, everyone) needs to be able to know this.

Alas, this is not our year. OfS tells us:

In the autumn, we plan to share indicative dashboards with providers, showing student outcomes separately for each of their subcontractual partnerships. This data will show any aspects of partnership provision that may need attention. We also plan to publish some of these dashboards as a pilot this year, with a view to publishing student outcomes data for all subcontractual partnerships next year.

With joint representative body action on franchises and partnerships only this week, OfS is beginning to look embarrassingly behind the curve here. To be clear, it would be using data from the 2022-23 student collection here, which it apparently already has and is willing to publish parts of.

Outcomes

With this week’s release of eleven (of twelve) investigation reports, most attention is likely to be paid to the release of the latest year of outcomes data. However, by default, data is presented in a four-year aggregation – meaning that new issues are quite difficult to spot.

The format is broadly familiar from previous releases (there are some minor data design changes that I won’t bore you with) so I’ve put together two dashboards for your perusal.

The first “ranks” providers based on a chosen indicator (top right, radio buttons) and split metric (“category” and “split” drop downs, immediately below, pink background), across a range of population types, modes, and levels (three drop downs with blue backgrounds, immediately below). The main chart shows values for the indicator (the actual measured value), the benchmark (the expected value calculated based on student characteristics), and the numerical threshold (the value below which OfS would consider a B3 investigation). You can use the highlighter at the bottom to find the provider you are interested in.

[Full screen]
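If it helps to see how those three values relate, here is a rough sketch in Python – purely illustrative, with made-up figures rather than anything taken from the dashboards or from OfS methodology – of reading an indicator against its benchmark and the numerical threshold:

```python
# Illustrative sketch only – made-up figures, not OfS data or OfS decision rules.
def read_indicator(indicator: float, benchmark: float, threshold: float) -> str:
    """Describe how a measured value sits against the numerical threshold
    (below which OfS would consider a B3 investigation) and the benchmark
    (the expected value given student characteristics)."""
    vs_threshold = "below" if indicator < threshold else "at or above"
    vs_benchmark = "below" if indicator < benchmark else "at or above"
    return (
        f"Indicator {indicator:.1%} is {vs_threshold} the threshold ({threshold:.1%}) "
        f"and {vs_benchmark} the benchmark ({benchmark:.1%})."
    )

# Hypothetical continuation figures for a single provider and split.
print(read_indicator(indicator=0.82, benchmark=0.88, threshold=0.80))
```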

The second allows you to see values for all split metrics within a provider. The controls otherwise work in the same way as above. On the main chart, data points are grouped from the top down by population (taught, taught and registered, partnership) and category (the type of characteristic), with the actual characteristics along the bottom.

[Full screen]

TEF and access

We also get TEF data, as we do annually for some reason – all ready for 2027 I suppose, though it would take a Conservative parliamentary private secretary to feel confident betting that either TEF as a whole or (indeed) the Office for Students will still exist in a recognisable form by then.

One notable factor is that the NSS redesign means that student experience gets two more indicators – the “learning opportunities” and “organisation and management” scales.

As a special treat, I’ve knocked up my own homebrew subject TEF using the subject splits in the data. It’s an updated recipe and I hope you enjoy it.

Finally there’s access and participation data, which I will save for a quiet day over the summer. OfS no longer uses this data directly in regulation, though it is handy to have about the place. There used to be a “findings” report from OfS, though we don’t get one of those this time (the last version was published in March 2023).
