In John Blake’s blog on updates to his guidance and advice on producing access and participation plans, he’s stressed three things.
First, the Office for Students’ (OfS) Director for Fair Access and Participation wants to see more evidence of collaboration between universities and colleges, third sector organisations, schools and employers to address risks to equality of opportunity.
That suggests he didn’t see enough in the summer pilot exercise. The official message is – why would you be doing all of this on your own? The unofficial one is – you ain’t gonna pull this off on your own, lads.
The second is a desire to see more ambitious work to raise the attainment of students before they reach higher education. When he says “I recognise that there are barriers to … these ambitions”, he’s effectively saying that the excuses he presumably read in the summer pilots won’t wash. “There are excellent examples of good practice in these areas across the sector”, he says, “and I would like to see these more widely taken up.”
His third is about bravery – where he says that he’s seen “nervousness” from providers around setting out targets and activity where the success of the activity undertaken is not necessarily entirely in their control, particularly in relation to both that pre-16 attainment stuff above and what’s going on within collaborative partnerships.
Having banged the drum for evaluation, “what works” and proving it, he’s now very much encouraging providers to take calculated risks, and to know that where expected progress is not being made, OfS will give providers both a chance to explain and help plans to get back on track.
There’s also something interesting about the “risks to equality of opportunity” that he’s seen pilot providers highlight and tackle – and ones that aren’t really surfacing. If I ran an access and participation team in a university, I’d be keen to home in on the things I think my team might make the most difference on – but those are not necessarily, objectively, the right risks to focus on.
In other words, in all three cases Blake is saying – the idea here is that you identify the biggest issues, not just the “low hanging fruit” – and yes, you may have to work with academic departments, schools, SUs, the NHS and other voluntary sector partners who may feel flakier than if the whole plan were full of stuff that you (as in that tightly managed APP or student success team) could do yourselves.
It’s all very much deleting the cheat codes, part two. He says the regulation is not designed to catch anyone out who is doing the hard work – even where that work does not always lead to the outcomes we all want. What he means is that the regulation is designed to catch out anyone who is writing a plan to please John Blake rather than writing a plan whose ambitions and actions look like they have percolated and embedded throughout the provider and the wider community.
One thing that’s fairly major but not highlighted in Blake’s blog is mental health. The previous version of Regulatory notice 1: Access and participation plan guidance did say that it expected most providers to consider how they can improve the mental health of their students – but it looks like providers in the pilot took that to mean “look at gaps for those with a declared condition” rather than addressing mental health more broadly.
As a result, the section on mental health in the new version of Regulatory advice 6: How to prepare your access and participation plan (effective practice advice) has been souped up – there are now explicit references to addressing known intersectional gaps in the likelihood of reporting mental health conditions, and addressing how poor mental health outside of a declared condition may affect outcomes for specific groups of students differently.
There’s also new material on providers identifying the most suitable support and pathways for particular groups of students, how access can be supported and any barriers addressed through targeted prevention and intervention strategies, and how targeted support builds on general wellbeing and mental health provision and sector activity – for example, via participation in the University Mental Health Charter.
This is all welcome stuff – and underlines the irony of the minister insisting that the time isn’t right for one-size-fits-all OfS regulation in this space when his own access Tsar is pretty much delivering it anyway. It also brings both Risk 7: Insufficient personal support and Risk 8: Mental health to the fore for providers that may have been deprioritising them in their own analysis.
The official one
In an attempt to reverse engineer what’s now expected that wasn’t previously (cheat codes on deleting the cheat codes, if you will) we’ve gone through the other changes to RN1 (the official legalish one) with a fine-tooth comb:
- In RN1, the wording on what to do with schools has changed from “the risk posed to fair access and successful participation by knowledge, skill and attainment gaps emerging across childhood” to “supporting schools to raise pre-16 attainment for students who do not have equal opportunity to develop the knowledge and skills required”. Who knows why.
- For the avoidance of doubt, RN1 now points out that “qualifying persons” on “qualifying courses” include students studying under sub-contractual arrangements. There’s nothing on using sub-contractual arrangements to make a provider’s overall numbers look good – but for those hoping otherwise, they are covered.
- In terms of using the Equality of Opportunity Risk Register, providers were told to identify risks to equality of opportunity “having regard” to the EORR. That’s now changed to providers being “expected to consider the EORR” when identifying their own risks to equality of opportunity. This is code for “don’t just look at your gaps like last time and retrofit some references to the EORR, use it properly”. You can hear the schoolteacher in Mr Blake in that one.
- References to the student submission now say “this should be submitted by students or student representatives”, which is presumably a response to some providers asking – as some did in the equivalent TEF process – whether they could write it instead. Obviously not, but nice try. It’s to be independent – and if I were a provider I’d be asking the SU now what sorts of partnership activity on consultation and mitigation delivery might result in a thumbs up from the SU, rather than applying the thumbscrews to tone down the criticism the night before it’s due in, as we saw at some providers over the TEF process.
- There’s a new clarification that asks providers to outline how prospective students will be provided with clear and accessible information about the plan, which does suggest that right now OfS thinks it’s both buried and baffling.
- To drive home that point about the EORR, providers can’t just say “we’ll keep an eye on the outcomes”, and are told they must identify the indications of risk used to set targets and measure progress towards eliminating those outcomes issues.
- Stressing that thing about stuff that might feel less easy to control, providers are told that identified risks should relate both to a provider’s own context as well as relevant sector-level risks.
- A previous acceptance that statistical uncertainty might cause problems is now reserved for “smaller providers or those with limited data”.
- And each intervention strategy now has to have an expected outcome or outcomes – Blake and his team need to see all the threads, folks.
The nicer one
Then in a further attempt to reverse engineer what’s now expected that wasn’t there previously (more cheat codes on deleting the cheat codes), we’ve also gone through changes to RA6 (the guidancey one) with a fine-tooth comb too:
- Providers are told that OfS will append the following information to a plan for each provider after the approval of a plan – a summary of fees for year one of the provider’s plan, a summary of the proposed investment (access, financial support, and research and evaluation) for the duration of the plan, and the targets and milestones set by the provider.
- When summarising the most pressing or significant risks to equality of opportunity, providers are now told explicitly that they “may use charts and graphs to aid communication”, which suggests Blake’s team got a bit sick of walls of text. DK will do you some pretty charts for a fee.
- That stuff on using the EORR in RN1 is addressed in more detail here – so on assessment of performance, providers are told not just to look at those gaps in the dashboard, but to “identify the greatest risks to equality of opportunity that certain groups may face”, identify “indications of risk through an analysis of data and insights”, and then “consider what underlying risks these indications may relate to”. No cheat codes, remember – because OfS wants providers to “demonstrate understanding” of why the numbers are the way they are.
- Because not all student characteristics are recorded in datasets, providers “may wish” to consider using a suitable proxy (for example, data on household income is not commonly available to match to an individual learner record or student application, so providers “may consider” free school meals eligibility instead).
- There’s quite a bit of new material in here on intersectionality – because single characteristic data or aggregated data can mask the student group that is most affected by a risk. Providers are encouraged to interrogate accordingly.
- There’s a new bit on not just having a pizza party with the SU sabbs to get them to nod at your intervention ideas, but instead a nudge towards co-creation with students, in particular those who are intended to benefit from the intervention strategies “to better understand the risks that they face, and how these can best be mitigated”. Creativity is allowed – don’t just present a fait accompli, and ask them about your misguided food vouchers scheme before you deploy it, in other words.
- That thing about partners is covered off with “where we establish that a provider is not making expected progress in relation to any collaborative targets, we may contact the provider to understand both the reasons for this, which may be beyond the direct control of the provider, and the steps it is taking to get back on track where this is appropriate.” That removes an excuse they must have been reading over the summer pilot.
- And providers now have to ensure that they have ways of checking how clear and accessible the information they provide on fees and financial support is, which SUs will welcome (and again, no, that doesn’t mean sending it to the SU’s welfare sabb five minutes before publication to ask if it looks OK).
Dates for the diary
There’s some helpful process stuff in here too. If a plan is knocked back, providers will normally be allowed up to two opportunities to make amendments and resubmit before refusal.
“Early-recruiters” (those with an application deadline in or before October) have to submit their plans by the end of May 2024, and everyone else has a deadline of late July 2024.
One important thing to note – a Data Futures problem – is that OfS isn’t intending to update its access and participation data dashboard prior to May 2024 at the earliest. Providers are asked to use the data and insights that are currently available.
What is coming sooner is some minor updates to the EORR early in the New Year, but again, OfS says these should not affect any work providers are already doing to prepare their APP.
It’s also planning some further support for providers and students to prepare their plan, details of which are coming in January 2024 – as well as a “collaboration and partnerships” beauty parade event on 18 January, which all providers submitting a plan in 2024-25 are “invited” to attend, in the same way that students are “invited” to attend a disciplinary hearing.
So, all sensible stuff. Some providers will still moan about all of this – but John Blake is a formidable opponent for those used to hanging on to excuses. What’s especially tricky is the difficulty that some APP teams are clearly having in making all of this stick beyond their own teams – but that’s a wider issue of leadership and management in undoubtedly more challenging circumstances, not least because “inconsistency” between student characteristic groups and subject areas is a major theme emerging out of the TEF panel statements. Playing the averages isn’t really allowed any more.
To PVCs sponsoring the process, Blake is basically saying – it’s on you to help that APP team of yours embed the thinking, on you to get intervention efforts expanded beyond that team, and on you to help your SU engage students as partners at every level. Fair enough.