Jim is an Associate Editor at Wonkhe

When something as big (literally in word count terms) as OfS’ B3 and Teaching Excellence Framework proposals drops, it can be difficult to know whether to over-do or under-do the “reaction” internally.

Do we treat it as a critical problem – convening emergency strategy groups, ripping up agendas for the next round of committees and bringing in expertise to assess our tactics? Or would that spread panic – better to treat it as tame, project a steady-as-she-goes calm, take some time to understand the proposals as a senior team, and then adjust our approach accordingly? Or is this a wicked one that needs deeper thinking – all with the clock ticking?

I can’t answer that question – and it partly depends on the actual position of the university on all of those split indicators – but what I do know is that there’s been quite a bit of confusion around since publication over precisely what is and isn’t in the proposals.

Context

I’ve seen, for example, commentary that suggests that if OfS is going to regulate according to how many of a university’s graduates get “managerial or professional jobs”, then there are some questions over disincentives to offer courses in key areas like social care which may lead to jobs that don’t count in the ONS codes.

I’ve also seen arguments that say that requiring 60 per cent of creative arts graduates to go into managerial or professional roles “ignores the reality” of creative careers.

The good news is that after looking at the raw metrics, the second part of an OfS judgement on “context” explicitly has that covered. OfS says that when considering a provider’s context it will give consideration to factors where a provider can evidence the reasons for its performance – it can consider outcomes below a numerical threshold that may otherwise be considered positive outcomes. For example, a provider:

“May have courses designed to provide access to a particular profession, but this is not classified as managerial or professional in the way the indicator has been constructed. We may consider this positively where graduates report through the Graduate Outcomes survey that they are using the skills developed on their course, or where graduates are demonstrating above average earnings in Longitudinal Education Outcomes data.”

It makes lots of sense to lobby OfS to be clearer about how these qualitative or contextual factors might apply to judgements – but less sense to pretend they’re not there.

Flashing red lights

Similarly, I’ve seen some commentary that seeks to justify performance that is below the threshold in “pockets” as OK because a whole university’s averages are fine. What I would say is that once all the split indicators are published – and the threshold applies both to the university as a whole and to every characteristic and subject area separately – there is unlikely to be a single university in the country that is above the thresholds on every split.

What will then matter is whether context can explain (justify) the performance, the basis on which OfS “prioritises” which pockets to focus on in a given year, and the severity of regulatory interventions applied to the size, shape and distribution of what we might call “the flashing red lights on the dashboard”.

Many senior people in HE who have built a career on developing a broad portfolio of provision, each aspect of which does different things for different pressures (regulatory, financial, etc) are now going to have to shift their approach – worrying much less about averages, and more about sending in a SWAT team to fix a pocket of poor performance.

Whole university approach

I’m struck by how much data is going to matter across a university. There are still plenty of universities where below senior level, access to data on characteristics or outcomes or access or even satisfaction is heavily restricted. That isn’t going to wash if it becomes everyone’s job to worry about continuation, completion and progression – and if it doesn’t become everyone’s job, there’s even more trouble ahead.

Partnership with students and the students’ union will have to evolve too. There’s a version of what happens next where a university is cheeky enough to propose that it’s not the SU, but their own student voice system, that handles the new student submission (despite all the warm words about “partnership”). There’s also a scenario where the draft workbook of indicators and outcomes shared with the university hasn’t already been shared with the SU.

Smart universities will have already sat down to start work with their SU on targeted improvement initiatives for continuation, completion and progression (or, in SU language, belonging, confidence and skills) which might not turn the flashing red light green in time for September, but at least might represent a “contextual factor” that a provider could use to explain its scores to avoid regulatory action.

Opportunities and threats

But the other important thing I want to focus on here is disadvantaged students and risk – insofar as I think that the simplicity with which some are dismissing or criticising universal baselines for continuation, completion and progression is worrying, and is in danger of being tone deaf.

In my ideal world it would be less personally risky to enrol on a degree. But as things stand, it is risky.

You end up in “debt” (or at least with the responsibility to shell out a chunk of your salary for a long time), you waste a chunk of time, and you use up a finite entitlement to access financial support for HE – and failing in higher education can harm you in other ways too.

Much of the critique that has rightly focussed on the way that OfS’ proposals might harm opportunity is fair play. But we have to think about risk too – and asking higher education providers to think more and more carefully about the likelihood of a student being able to get to year two, complete or progress to a graduate job has to be legitimate in principle. It’s just not fair to say to students (for example) “Well you’re disadvantaged – good luck but you’ll probably fail”.

Are there consequences for who we take risks on? You betcha. Do we need to worry about courses closing and that restricting opportunity for students who can’t or won’t travel? We do. But the way to respond to that is to consider how we can improve student support and the assessment of potential – not complain that those things are being judged at all.

Three options

In the end, disadvantaged students are disadvantaged students and there will always be some who find achieving the outcomes harder than others through no fault of a university. But from their point of view, there are then three ways to reduce the risks they take by enrolling:

  1. You can give them more money, time and opportunities to try again if it doesn’t work out.
  2. You can work to improve the support available to enable them to continue, complete and progress.
  3. You can decide that on balance, they probably shouldn’t enrol on a programme.

The reality is that this government isn’t hurtling towards Option 1 any time soon, and a crude implementation of this agenda would push providers towards restricting opportunity under Option 3. I just don’t think it’s legitimate to argue that we shouldn’t worry too much about those risks to start with.

Put another way, yes, it’s true that when you have a lot of disadvantaged students, it can feel unfair that all the pressure is on HE to fix that. But we can’t just say “don’t blame me”. We’re admitting them, consigning them to a lifetime of repayments and eating up finite HE entitlement.

If driving is risky you can make fewer journeys, wear better seatbelts or help people become better drivers against all the odds. If we have a group of people who find it harder to learn to drive well, the answer isn’t to say “not our fault”, blame the government and plough on in the name of opportunity. The answer is to double down and pull together to do everything we can to contribute to risk reduction.

Belonging, confidence and skills

The good news is that this should not be beyond us. In this study, it’s true that “learner context” aspects accounted for almost a quarter of the main reasons that students dropped out. In fact only 1 in 10 of the dropout reasons related to administrative issues or module design and delivery.

Yet the study found universities’ retention efforts often focussed primarily on those aspects, with “personal circumstance” factors seen as “uncontrollable” and beyond a university’s influence. And when the study actually asked students what might have helped, a much brighter picture emerged of ways in which small changes could help students combine those complex personal pressures with study:

  • Many wanted universities to help develop students’ study skills and build their resilience before they started their courses.
  • Many wanted university policies and processes to take account of personal challenges like a parent looking after a sick child, or an employee covering for a sick colleague.
  • Students with sudden temporary increases in personal demands needed simpler processes for requesting extensions and adjusting hand-in dates.
  • And assessment design mattered too – alternative assessment options and more flexible submission dates were cited as ways to allow for and account for students’ changing personal circumstances.

Students had all sorts of suggestions. Providing students with personal motivation support, including improving their time management and online study skills, was the most common intervention theme. Dealing with administrative delays that stall learning while students wait for answers was another. Improving academics’ response times to student queries was high on the list. And here’s a killer comment if ever I’ve seen one:

Group assessments don’t work. The reason for doing online study is to fit in with my schedule. Group-work means I have to fit in with everyone else’s.

It reminds us that the more diverse the students we recruit, the further the circumstances we are presented with sit from our own or others’ perceptions of “norms”. Some of the challenges that presents really are outside of our control. Some involve us reassessing what our “standards” mean in the context of correcting for prior disadvantage by offering extra attempts and time. And with some partnership working, some active listening and a commitment to understanding students’ lives rather than just their opinions, some are just as addressable as OfS makes out.
