Here’s what happened when I read every Access and Participation Plan

As providers prepare monitoring returns and impact reports, Jim Dickinson reads every APP in England and has thoughts on student success in a post-pandemic context.

Jim is an Associate Editor at Wonkhe

In the aftermath of the publication of the Commission on Race and Ethnic Disparities report, I was tweeting about the persistence of the Black attainment gap, and someone slid into my DMs to ask a question.

“Out of interest”, said the correspondent, “how much of the apparent narrowing of the awarding gap remains if you normalise for the change in overall distribution of degree classifications?”

Now there’s an intriguing question. Clearly, if everyone got a 1st or 2:1, the gap would be 0 per cent – so the question was whether the apparent improvement that the sector congratulates itself upon reflects a systemic improvement in the lot of black students, or is instead a statistical side effect of attainment improving in general.
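The correspondent’s point can be made with a toy calculation. The rates below are hypothetical (not the HESA figures), and the sketch assumes a uniform “grade inflation” that multiplies every group’s odds of a good degree by the same factor – i.e. relative disadvantage is held exactly constant – to show how the percentage-point gap shrinks mechanically as overall attainment rises:

```python
def inflate(p, factor):
    """Raise an attainment rate by multiplying its odds by `factor`
    (uniform 'grade inflation' that leaves relative odds intact)."""
    odds = p / (1 - p)
    odds *= factor
    return odds / (1 + odds)

# Hypothetical 'good degree' (1st/2:1) rates -- illustrative only
black, white = 0.60, 0.80

for factor in (1, 2, 4):
    b, w = inflate(black, factor), inflate(white, factor)
    gap_pp = (w - b) * 100
    odds_ratio = (w / (1 - w)) / (b / (1 - b))
    print(f"inflation x{factor}: gap = {gap_pp:.1f}pp, odds ratio = {odds_ratio:.2f}")
```

The gap falls from 20.0 to 13.9 to 8.4 percentage points while the odds ratio stays fixed at 2.67 throughout – a “narrowing” produced entirely by attainment improving in general.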

What you measure is what you get

So I had a look. In the latest 2019-20 HESA figures (UK domiciled first degree qualifiers, all modes) there’s a 17.9 percentage point awarding gap between black and white students. That’s down from 20.7 percentage points in 2018-19, and from 23.1 percentage points in 2014-15. So it’s reducing, and that reduction accelerated in the last, partially Covid-struck academic year for reasons that need to be interrogated.

That gap was based on those getting firsts or upper seconds, so I thought I ought to interrogate firsts to see if the pattern was consistent. There the gap was also 17.9 percentage points in 2019-20, and given Deliveroo had just appeared, I didn’t think much more of it.

But a week or so later, having skim-read all of England’s access and participation plans for an analysis for our SU subscribers, I came back to the spreadsheet just to double check on the historical trajectory. And it’s remarkable. That gap in 19-20 was up from 16.0 percentage points in 18-19, and that was up from 13.7 percentage points in 14-15. When it comes to firsts, the gap is getting worse.

There will be plenty of ways to explain that finding that I’ll not go into here – suffice to say that in the context of both reducing disparities and effective regulation, it’s important that we don’t kid ourselves, and that we keep an eye on what we’re measuring. In an interview reflecting on the accomplishments of the Office for Students (OfS) recently, outgoing chair Michael Barber said that the thing he’s most proud of is progress on access and participation. I’m hoping his new book on accomplishment doesn’t make the fatal mistake of confusing intention and target setting (hope) with actual human accomplishment.

Every APP in England

I mention all of this because having done that skim read, it’s clearly true that England’s approved access and participation plans demonstrate an ambition not previously seen in our sector to both address disparities within universities’ control, and correct problems bequeathed to us by schools, colleges and society – even if some of that “ambition” had to be extracted with a crowbar by OfS officials whose “tone” was the subject of provider “feedback”.

But the plans weren’t perfect – and various things did strike me that I thought I’d share as providers complete their monitoring returns and impact reports on year one of the operation of the plans in the context of the impacts of, and lessons from, the pandemic.

A couple of caveats. Plans approved a couple of years ago may not reflect the latest thinking or understanding, and also may not reflect deep thought and understanding that’s gone into the crafting of the interventions and initiatives on offer in the text. I’ve also not had the time to look at APPs outside of the university sector, and nor have I looked at the shorter access and participation statements that are required for HE providers in the “Approved” category. And I’ve focused here not so much on the “getting in” aspect, but the “getting on” of student success measures.

And yes, I’ve read both of Nous’ excellent evaluation reports for OfS and the regulator’s own response.

Tick ToC

Every APP needs a decent theory of change or two, and mine is focused on capitals. I neither claim this as my own, nor am I blind to the possibility that none of these thoughts are original and instead are just a mashup of things I’ve heard, read and seen. It may also be that this whole ToC already exists somewhere and I’ve just taken it in by osmosis. Anyway, it’s not the model that matters so much as my thoughts on the contents of plans compared against it.

What I’m suggesting is that to succeed, students need three types of capital – and investing in boosting and bolstering those capitals is key to improving student success and performance both in general, and in relation to underachieving and disadvantaged groups specifically.

Money and stuff

Financial capital is well understood, and as the report from Nous notes, the general move across five year APPs has been to be more critical when evaluating the efficacy of student financial support schemes like bursaries and scholarships.

One worry would have to be the extent to which the arrangements (and associated budgets) as described will turn out to be sufficient for a student body whose families will be struggling with the longer-term impacts of a post-pandemic recession. Another will need to be DfE’s ongoing insistence that what we used to think of as a central component of APP funding (the student premium) continues to be held up as the catch-all funding pot for anything and everything impacting students in HE.

What really surprised me was that while there was plenty of discussion on student financial support, there was much less on solving the issue from the other end of the telescope. The costs of participation faced by students are significant and have a disproportionate impact on the poorest – and universities really ought to be playing an important role in both reducing those costs they control, and using their influence to bring those they don’t under control.

I’m talking everything from the cost of a coffee on campus, to the length of reading lists, to the costs of campus facilities now transferred into bedrooms, to collectively tackling the rigged market of student accommodation – which is increasingly parcelling students off into gated communities of haves and have-nots.

As I said back in 2019:

The issue is that “pandering to the poor” is left to the WP unit – whilst those that make reading lists ever longer, or set accommodation fees, or design field trips, or price up a latte, or promote years abroad, or set a price for gym membership are all free to design a student experience that the existing student body can afford. They’re under pressure not to get that cost down, but to maximise the revenue from it to fund the “student experience” displayed on the website. And that cycle of privilege ratchets ever further up.

When it comes to the digital divide, I’d put the tech “kit” issues of laptops and the like, and wider stuff like a chair that doesn’t break your back and decent broadband in here too.

Study skills

Academic capital should also be pretty easy to imagine. The breadth of student success programmes – personal interventions like mentoring, counselling, coaching and advising, and cohort interventions designed to improve retention and success among students from disadvantaged and underrepresented groups – was fascinating to finally get the time to get across.

Again, in the context of the pandemic and almost two years of disrupted prior learning, it strikes me that an assumption of preparedness in September would be faulty generally and problematic for target groups specifically – and that something like Michelle Morgan’s pre-arrival focus on needs and setting students up for success would really help.

Aside from all that, what struck me here wasn’t so much the lack of evaluation and evidence of efficacy that (for example) TASO points out, it’s that what is there is focused on supporting students in deficit to their achieving peers rather than pedagogy and curriculum – and crucially there’s almost nothing on assessment, or organisation and management issues like timetabling.

We only need to take a glance at the OfS APP data dashboard to see that something happened at the end of last academic year that was almost certainly related to these aspects of how we run institutions, safety nets and no detriment policies, and how and why we assess students. Assessment is almost certainly a major A&P issue, but you’d never know it from APPs.

And we only need to look at this national analysis of NSS differences by student characteristics to know that these other issues could make a big difference to the experience that students have of being on a course and therefore their outcomes.

It’s here, by the way, that I’d put the “digital skills” issue.

Getting friendly

Next, I looked across the plans from the perspective of social capital. I’ve written about this on the site many times before, but I was taken aback at the relative lack of ambition surrounding students’ lateral social networks and communities of support.

The extent to which keeping people on courses and helping them to do well is assumed to be improvable via vertical (directly delivered) support rather than horizontal (ensuring students have a strong network of peers to rely on) support is really quite extraordinary.

Again, this is a major issue coming out of the pandemic after 18 months of isolation – and providers ought to be thinking about how to pump-prime the kinds of friendships, networks and social activity that sustain students through their studies and reinforce their confidence to achieve.

To be fair, there’s barely a plan that doesn’t identify a particular group – black students, or care leavers, or student parents or whatever – and have a light stab at what Robert Putnam would call “bonding” social capital through dedicated events, networks or student societies. Commuters seem to be well served by breakfasts, for example. What’s astonishing is the almost complete absence of “bridging” social capital, and the need to get students from disadvantaged backgrounds involved in wider student life.

There’s a huge but currently missed opportunity for much deeper integration with SU efforts in this space, too. I’m also reminded of this observation from social mobility expert Duncan Exley on the site a year or so ago:

The belief that a degree is the key to a good career also makes it less likely that first-generation students will take part in extra-curricular activities. (This is also partly because they can’t afford the cost or work part-time because their family can’t supplement an inadequate maintenance loan). Many have been told by parents and others not to waste time on sports and other activities when they could be studying. The consequences of this often don’t become apparent until the student has graduated and is being turned down for jobs because they don’t have a rounded CV.

Until it’s over and we’re looking back

While I’m on, a note on student input and engagement. The Nous evaluation picks up and notes the classic list of issues – the struggle to get students beyond SU officers involved, the difficulty in demonstrating what changed as a result of the input, and even this hardy perennial, a version of which seems to end up in every OfS evaluation of every project ever:

The annual cycle of inducting a new cohort of SU officers into the world of APP is challenging, and each year’s engagement can depend on the people voted into the officer roles.

More interesting was the role that “lived experience” appears to be starting to play in plenty of the plans. Hilary Cottam’s terrific book on collaborative design in social change has plenty to say on targets, saviour syndrome and the design of interventions – but this bit in particular I think is helpful here:

What’s wrong with the focus group? Focus groups are efficient: they can garner rapid reactions to a set of given ideas. But participants in focus groups are known to have a tendency to talk about what they already know, and new ideas are rarely produced. Instead, focus groups often suggest improvements to the existing system: more youth centres, better meals on wheels, rather than radical alternatives. We are all susceptible to opinions we receive from friends and the media, and in a focus group we are inclined to repeat these back. The focus group is like a fishing expedition that continually sets out to fish in the same pond. We find nothing new and gradually the stock dwindles. Listening to stories takes time, but stories are abundant and in this rich, unruly mix we find the seeds of ideas that can be grown.

Any university going around its SU to set up focus groups of disadvantaged or underrepresented students to give their views on a dry set of well-rehearsed APP interventions is probably missing two tricks. Those doing work like this at York – working with their SU to build understanding and power amongst students – are likely to get better ideas into APP evaluations, and a more representative SU outside of the context of APP work while they’re at it.