A step change for access and participation?

David Kernohan analyses OfS consultation documents on its approach to regulating access and participation, and explains why it is the biggest change in the realm of widening access since the 2004 genesis of the Office for Fair Access.

The latest consultation from OfS, “A new approach to regulating access and participation in English higher education”, heralds the biggest change in what we used to call “widening participation” since the genesis of the Office for Fair Access (OFFA) in 2004.

A cursory glance at UCAS clearing statistics suggests that we’ve seen slow and steady progress in driving up the proportion of students from a range of under-represented groups. Both as a proportion and in absolute terms there are more disabled students, more students from a non-white background, and more students from under-represented areas (POLAR4Q1) than ever before.

Chris Millward, the Director for Fair Access and Participation at OfS, a role specified in the Higher Education and Research Act 2017, is not afraid to demonstrate the scope of his ambition – to “drive transformational change rather than incremental progress”.

“Deliverology” redux

Surprisingly, within the scope of the document, there’s no firm target attached to that ambition. Neither is there a clear connection between the activities of the regulator and the impact on the end user. A few readers will be rolling their eyes at this point – to bring the rest of you into the secret, these are ideas straight from Michael Barber’s “deliverology” conception of public service delivery.

To quote the sage himself, (one aspect of) “deliverology” works as follows:

“You know where the data is now on your chosen metric; and you’d know where you’d like it to be because you’ve set your aspiration. Go away and draw the line that connects the two points; that, at its simplest, is what a trajectory is”.

In the Prime Minister’s Delivery Unit, back in the Blair years, Barber would – in partnership with the government department where the target was held – work to plot the actual progress of change against these trajectories. A dip in actual against projected performance would need to be explained, and possibly investigated. More intensive monitoring would be brought in for particularly problematic cases.
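The trajectory-plus-monitoring idea above is simple enough to sketch in a few lines of code. This is purely illustrative: the functions and every figure below are invented, not anything specified by OfS or Barber.

```python
# A minimal sketch of the "deliverology" trajectory: interpolate linearly
# between a baseline figure and an aspiration, then flag the years where
# actual performance dips below the projected line. All figures are
# invented for illustration only.

def trajectory(baseline: float, target: float, years: int) -> list[float]:
    """Evenly spaced milestones from baseline to target, inclusive."""
    step = (target - baseline) / years
    return [baseline + step * y for y in range(years + 1)]

def flag_dips(projected: list[float], actual: list[float]) -> list[int]:
    """Return the years where actual performance falls short of the line."""
    return [y for y, (p, a) in enumerate(zip(projected, actual)) if a < p]

# Hypothetical plan: raise a POLAR4 Q1 entry rate from 10% to 15% over 5 years.
plan = trajectory(10.0, 15.0, 5)      # [10.0, 11.0, 12.0, 13.0, 14.0, 15.0]
observed = [10.0, 10.8, 12.1, 12.5]   # four years of (invented) actuals
print(flag_dips(plan, observed))      # years needing explanation → [1, 3]
```

Years flagged in this way are, in the Delivery Unit model, the ones that attract an explanation, an investigation, or more intensive monitoring.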

The OfS agenda

And this is what we see here. Each institution is invited to set its own aims and, taking into account these and the sector-wide aims of the OfS, specify targets as part of a strategic planning process. These five-year plans (I’m sure the OfS press team love that terminology!) are monitored via data from UCAS and HESA alongside institutionally submitted “transparency data”. Annual qualitative updates against the strategy would also be required from institutions.

This is a big shift in practice. Previously, the annual approval (or not) of the access and participation plan was the one policy lever that OFFA had to engender compliance – no plan, no higher-level fee cap. Staff at OFFA would take a longer view of institutional planning and seek evidence of evaluation – but this, though important, was a secondary activity.

These changes put A&P in line with OfS’s other regulatory processes – light-touch data-driven measurement, low-impact contact where risk is minimal, and the ability to ramp this up where needed. The other leg of the stool, transparency (or “inclusion of data in league tables” if you want to be cynical), is supported by the development of a new database to include up-to-date figures at sector and institutional level.

Widening widening access

There’s a lot to be pleased about. The clearer, wider definition of under-represented groups takes in POLAR alongside socio-economic status, ethnicity, disability, age (with mature students a clear priority) and care leavers.

OfS will be requesting data from institutions covering every stage of the student journey – application, offer, acceptance, registration, completion and award – against a backdrop of information on gender, ethnicity and socio-economic background. It’ll be published by OfS too. This data will be used alongside other sources (UCAS, HESA, etc.) to measure progress against targets outlined by OfS and set by the institution itself. There’s a list of likely areas:

  • Entry rates for POLAR4 quintile 1 students compared to quintile 5 students, or an alternative measure using POLAR4 where appropriate, or mature-student equivalent.
  • Non-continuation for POLAR4 quintile 1 students compared to quintile 5 students.
  • Entry rates for students entitled to free school meals compared to those not entitled.
  • Degree attainment gap (firsts and 2:1s) between ethnic minority and white students.
  • Degree attainment gap (firsts and 2:1s) between disabled and non-disabled students.
  • At least one outcome-focused target to raise attainment in schools and colleges, though the OfS will not specify the measure.

In these areas, OfS specifies the population and, for the most part, the relevant measure – it is for the institutions themselves to set the target and thus the trajectory.
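Most of the measures listed above boil down to a gap or ratio between two group-level rates. A quick sketch of the arithmetic for the first one – how entry rates for POLAR4 quintile 1 and quintile 5 compare – might look like this, using wholly invented cohort counts:

```python
# Sketch of a gap measure: the ratio of quintile 5 to quintile 1 entry
# rates. The cohort counts below are invented for illustration only.

def entry_rate(entrants: int, population: int) -> float:
    """Share of a cohort that enters higher education."""
    return entrants / population

def q5_q1_ratio(q1_rate: float, q5_rate: float) -> float:
    """How many times more likely a Q5 student is to enter than a Q1 student."""
    return q5_rate / q1_rate

q1 = entry_rate(2_000, 10_000)   # 20% of a hypothetical Q1 cohort enters HE
q5 = entry_rate(5_000, 10_000)   # 50% of a hypothetical Q5 cohort enters HE
print(q5_q1_ratio(q1, q5))       # → 2.5
```

An institution setting a target on this measure would be committing to driving that ratio down towards parity over the life of its plan.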


Elsewhere in the document the commitment to reversing the decline in mature student enrolments is made clear. As UCAS does not generally handle mature student (often part-time) applications, this data would be gathered directly from institutions as a component of the transparency information.

There’s room for specific institutional aims too – OfS will suggest possible measures but institutions will plot their own courses and report back. However, OfS will “challenge providers’ assessments of performance, strategy and associated targets if they do not, in our view, address areas where we have identified concerns” and will form their own view on how a provider has performed against these self-selected targets.

So there will be an element of strategy in choosing institutional aims – they are expected to be stretching (those that are not will not be approved) but an institution would be foolish to specify targets it realistically could not achieve.

All this hard work is balanced against a loosening of the reins on reporting against the spending of allocated funding for access and participation. In the past, expenditure was carefully reported as part of the access plan monitoring statement – this now migrates to the general OfS monitoring process, and there is no longer a requirement to report expenditure on student success and progression, which, as providers have been keen to point out, is difficult to disaggregate.

There’s obviously plenty OfS can do with its own funding to support the access and participation agenda. We know that the National Collaborative Outreach Programme (NCOP) will be evaluated by the board this autumn, and the consultation outlines plans for student premiums with the usual collection of buzzwords: directly linked to outcomes, targeted, evidence-based, adding value to provider investment, and providing sector-wide benefits not offered by the market. A slightly more standardised evaluation strategy will provide some of this evidence.

The consultation closes on 12 October, so there’s plenty of time to respond.

One response to “A step change for access and participation?”

  1. The OfS – and the wider sector – need to be very cautious about making inferences about trends in access using UCAS data.

    First, it excludes all part-time students – looking at all students it is clearly not true that there are more disabled students, more students from a non-white background and more students from under-represented areas: there are far fewer (e.g. there are 17% fewer entrants from POLAR Q1 than there were in 2011/12).

    Second, UCAS data refers to the number of students accepted on courses; it does not refer to the number who actually take those up and participate in HE for any length of time. Comparing UCAS data with HEIPR data shows that the UCAS numbers exaggerate participation at age 18 by around 20% – I suspect a disproportionate share of those non-participants who were accepted for entry will be from disadvantaged backgrounds.
