Resetting the dial on access and participation

Revamped access and participation plans must continue to follow the evidence, even where it feels counterintuitive, say Lizzy Woodfield and Liz Moores

Lizzy Woodfield is Policy Advisor at Aston University.


Liz Moores is Professor and Deputy Dean in the College of Health and Life Sciences at Aston University.

Access and participation plans (previously access agreements) have evolved a great deal since their original conception.

In the past, they were more concerned with spend, input measures, and access to higher education. Now APPs are much more outcomes-oriented, and their scope is broader: covering issues of access, success, and progression to employment.

They must set out a systematic examination of data on access, success, and progression gaps, and they must put forward a credible evaluation strategy. Until recently, a new plan was negotiated each year.

Time for a rewrite

A different approach announced in late 2018 ushered in a more strategic timescale – a five-year cycle – for institutions considered least at risk of breaching conditions, with delivery of plans starting in 2020 and running to 2025.

On 24 November 2021, just over a year into this delivery cycle, Higher and Further Education Minister Michelle Donelan announced a significant change in direction for universities on access and participation, along with a commitment to reduce bureaucracy, “ending the need for novel-like plans”.

Staff in universities will need to rewrite their access and participation plans and renegotiate their targets.

Despite a substantial financial commitment – upwards of £800 million – to access and success activities, until very recently, robust evidence demonstrating “what works” in supporting learners to access and succeed in higher education has been remarkably sparse. Indeed, the Augar report expressed surprise that:

there has been no overall assessment of the effectiveness of spend on different approaches to recruiting and supporting disadvantaged students.

In particular, evidence of a causal relationship between intervention and the student outcome has been lacking.

Policy makers might therefore be forgiven for questioning whether these funds are well spent, or whether they might achieve greater impact and value for money elsewhere in the education ecosystem.

However, great strides have been taken in introducing a much more robust approach (as befits the higher education sector and its research credentials), under the leadership of the outgoing Director for Fair Access and Participation, Chris Millward – not least the establishment in 2019 of the affiliate “what works” centre, the Centre for Transforming Access and Student Outcomes in Higher Education (TASO).

OfS has recently published a report evaluating this investment, with the first main finding suggesting that the use of evidence by the sector has increased between 2020 and 2021 – a step in the right direction.

Whether the fresh approach and new requirements for APPs provoke frustration or are seen as a welcome opportunity to reset and refocus the agenda (after all, so much has changed since the current plans were first conceived), what is certain is that we must continue to develop and follow the evidence on “what works”.

Donelan sets out that there will be:

a shift away from marketing activities, and a far greater focus needs to be placed on activities which benefit students, including summer schools, programmes of intervention in schools and targeted bursaries to assist with living costs. Measures could include… supporting curriculum development or offering students and lecturers to tutor pupils.

Evaluating the impact

What does the emerging evidence tell us about the effectiveness of these types of preferred interventions? Evidence for the impact of financial support on student access and success is mixed, with much of the research conducted in the US (see Kaye, 2020 for a review). This is a relatively high-cost intervention, which universities have to some extent been discouraged from using in recent years.

However, emerging evidence from our own university, Aston University, does suggest that scholarships may indeed have a positive impact on student retention for students from households with lower incomes.

Much of the evidence on outreach and tutoring activities shows associations between participation in interventions and HE enrolment, but rarely provides causal evidence.

In a review commissioned by TASO, David Robinson and Viola Salvestrini from the Education Policy Institute identified only one UK-based study of multi-intervention outreach programmes that they judged to provide causal evidence; that study found a positive impact.

This year, researchers Adrian Burgess, Matthew Horton, and Liz Moores provided an evaluation of the Uni Connect programme (a multi-intervention outreach programme) operating in the West Midlands. They found that the activities most strongly linked to university acceptance were summer schools, campus visits, and information and guidance.

Tutoring (the type of intervention which you might imagine is more “demonstrably aimed at helping students achieve the highest possible grades”) appeared to offer no significant benefit when it came to university acceptance.

There could be different reasons for this: the tutoring may have been offered to students who needed it more, delivered by the wrong people, or not for long enough. But the assumption that universities – through their staff or students – are better placed than schools to deliver this activity is not currently supported by evidence.

Given the imperative to raise attainment, and the central role that universities will now play in this important agenda, it is crucial that researchers continue to uncover evidence on what works, so that universities can add the most value.

In contrast, information, advice and guidance activities and campus visits have generally been found to have positive effects (e.g. TASO, 2021). Reading between the lines of the minister’s speech, it’s quite possible that these are the types of interventions that are characterised as “marketing activities” and that we are encouraged to deprioritise.

Many will embrace the opportunity to reset the dial on their access and participation plans, and a less bureaucratic approach feels instinctively attractive.

Similarly, many will embrace and surely rise to the challenge of deepening their engagement with local schools and colleges to increase attainment. But, to sound a note of caution, let us ensure that in our eagerness to refresh and simplify the approach to access and participation plans, we don’t let robust evidence of what works become a casualty.
