Publishing three major – and highly complex – consultations at the same time, with only an eight week consultation period, has placed significant pressure on the higher education sector as a whole and specifically disadvantaged smaller providers.
That’s one of the “overarching” views in Guild HE’s responses to the Teaching Excellence Framework and B3 minimum outcomes consultations – and in many ways sums up the problem with both its own and Universities UK’s submissions to the Office for Students.
If you view what’s in the OfS proposals as a long time coming, where in reality the ship sailed a long time ago on some of the key issues like using raw benchmarked “outcomes” as a way to judge quality, eight weeks was probably too long. After a few years of ministerial guidance letters, I can hear ministers and their SpAds now saying “oh, just get on with it, will you”.
On the other hand, if you view consultations like this as one last throw of the dice on making the points you’ve been making since the passage of the Higher Education and Research Act, and a chance to signal to your own members that you still agree that’s all so unfair, the responses make lots of sense – especially if you have one eye on the post-legislative scrutiny process that’s about to kick off across the two houses of Parliament.
Elsewhere on the site Guild HE’s Alex Bols sets out some of the key concerns from small(er and more) specialist providers, which doubtless won’t be a million miles away from the concerns of Independent HE. The Russell Group’s “look over there” gambits are to call for more emphasis on absolute values when considering TEF indicators, and to say that those institutions who exhibit “the most serious breaches of B3 outcomes thresholds” should be prioritised for intervention. Here I’ve looked at Universities UK’s submissions, and reflected in particular on how the wealth of B3 data and the TEF student submission process has been “going down” on the ground.
Well that’s excessive
As UUK staff tweeted yesterday, there are three ways to read what’s been submitted, depending on your interest level – for people with one minute there’s a press release, for people with five minutes there’s a briefing, and for people with… longer… there are all three responses in full. And for people who want some moderately snarky critique of that material, there’s this blog.
In the context of an ongoing industrial dispute that’s partly about workloads, UUK didn’t really do its responses any favours when its corporate twitter account highlighted one of the key press release lines as follows:
NEW: Excessive admin will take focus away from teaching, universities warn regulator.
Cue disgruntled academic staff – a major complaint from many of whom is that excessive admin already takes focus away from teaching – signalling everything from disappointment to outrage on social media. Maybe it was a tactic to get the consultation responses noticed, but I doubt it.
If you go down to the woods today
For the B3 consultation on minimum outcomes, it shouldn’t surprise us that UUK wants to see a more “well-rounded” approach to measuring quality and value:
“Graduate jobs” are difficult to define and the OfS should also reflect graduate views of their own success. We have recently published a new framework to assess the value of university courses to students and society.
It’s one of the many moments where you can’t quite work out whether UUK knows that the key decision has already been taken here, or whether it genuinely thinks it will change OfS’ mind – it certainly paints a picture of a sector stuck on the left-hand side of the Kübler-Ross grief curve.
Either way, we can pretty much guarantee that in a couple of months an OfS response will tell the sector that it’s wrong in principle and, anyway, hasn’t read the proposals – which, to be fair, when taken in their totality along with the rest of the B conditions, do measure quality both quantitatively (via outcomes) and qualitatively (through proposals the sector isn’t too keen on either, with a kind of be-careful-what-you-wish-for vibe).
As designed, OfS’ B3 metrics for continuation (getting to the second year), completion (of the whole qualification) and progression (to a graduate job or further study) will be published not just at provider level, but for a whole range of splits – by subject, ethnicity, mode, level of study and so on. This is designed to stop providers playing the averages and hiding “pockets” of poor performance – the idea is that the published thresholds of minimum performance for each outcome will apply to every split.
Naturally UUK has concerns about data reliability where the numbers are small – and, more importantly, because OfS isn’t going to fine or deregister every provider with a flashing red light, about which splits, what level of performance below the minimum and what number of students might trigger intervention:
We need greater transparency on how the OfS intends to prioritise its assessments of regulatory compliance.
And so UUK wants to see:
Committing to publishing early the annual approach to prioritisation and the rationale behind this. This may include drawing on an independent panel (with student input) to determine the prioritisation approach.
It’s a nice try – but you can guess what OfS is going to say. Folks might not like it, but the regulatory design here is a kind of split-metrics panopticon – the whole point is that not knowing which red lights might cause you to come a cropper is the equivalent of a school not knowing if Ofsted is coming tomorrow. It’s the threat of monitoring – with the odd provider made an example of – that should be causing people both to work on improving outcomes where the red lights are, and to have “contextual” action plans ready that show that work off if OfS phones you up in September.
On that outcomes-being-unfair thing, UUK has gone for calling on OfS to do an Equality Impact Assessment, which it presumably figures would highlight the potential loss of opportunities if providers were to shutter courses with “poor” absolute but good “value added” outcomes.
Again, this is a nice try, but again, I can hear the reply already – a version of this has been in countless documents throughout the process, and will appear again:
Our position is that if providers are to recruit students from underrepresented groups, they must do so having understood the commitment they are making to support their students to succeed, irrespective of their backgrounds. This will include removing or minimising disadvantages suffered by students from underrepresented groups, and to take steps to meet the needs of students from underrepresented groups that are different from other students. Where providers fail to do this, and we see low continuation and/or completion rates, or disappointing levels of progression to relevant employment or further study, even where those providers may offer opportunities for students to enter higher education, we do not consider that this represents genuine and meaningful choice and opportunities for students which promotes equality of opportunity in connection with access to and participation in higher education.
In other words, the sector won’t win an argument that says poor outcomes are somehow good because of the “value added” in the context of a student’s prior attainment, background or geography. The right questions arguably should have been what this kind of regulation will do to opportunities for students from disadvantaged backgrounds who would or could do really well – and how a busy and financially stretched sector will be supported to determine “what works” when it comes to good outcomes for these sorts of students.
In addition on B3 there’s some sensible stuff on the “contextual” factors that will be used to judge if a provider has outcomes that look bad on the numbers used but are otherwise good – where the obvious puzzle and need for reassurance around the sector is centred on creative arts courses. Given ministerial signals in this area you can understand why folks might want to know if being self-employed for a few years, or unemployed but happy, will count. There’s also some standard concerns about the uses (and abuses) that the data will be put to once it goes public.
Oh – and one “sharp intake of breath” moment in the consultation for many providers was the news that the outcomes for provision that is both validated and franchised will be taken into account, with splits for it calculated. On validated provision, the UUK response is that access to student data across validation arrangements is “not always present” for degree awarding bodies. You can see what UUK is getting at, but that may not be the defence that it thinks it is if you think a body awarding degrees ought to have been thinking about outcomes all along.
On the Teaching Excellence Framework, the headline issue in the response is the implementation timeframe – it will, says UUK, be “challenging” for providers. But hold on – surely UUK members are looking at outcomes and experience routinely anyway? UUK squares that circle on B3 as follows:
While providers already review outcomes data internally, navigating the new construction and level of splits will add a burden in the short term. In adjusting to the new condition, the OfS itself notes providers will only have four weeks to review, check and understand their data before being subject to this revised condition.
And on TEF, like this:
The proposed timeline for submissions to the TEF must be extended. We would support a spring window for submissions as this will allow universities time to review their data and for staff and students to engage in the process. The window for submissions should be a minimum of three months.
From ministers’ point of view, the exercise is already very late – some of that is DfE’s own making and some of that is pandemic related, but even so, with previous TEF banners in the bin because the metrics underpinning the awards are now so old, there will be pressure to keep the current timeline. Maybe there will be a few weeks or even a few months’ grace, but the basic OfS line that these are things that a provider ought to have been doing anyway is pretty hard to argue with.
One area where UUK has a decent consistency point is on the new, below-Bronze “requires improvement” rating. It rightly notes that that category risks confusion with the baseline quality measure – in the B3 proposals you either meet the minimum or you don’t. Effectively, UUK is asking what “requires” means – how much, and required by whom? I expect, however, that we’ll end up with a rebranding of “requires improvement” rather than its abolition.
Neutralise the threat
The section on the student submission is very frustrating. The big change proposed is that students get to submit their views directly to TEF panels rather than via a few words in the provider submission – and while UUK doesn’t oppose that, some of its feedback is just daft. One bit says that submissions should focus on existing evidence – but the whole point is that the process ought to leverage better support for student voice work, and going to students enables some of the evidence to be contemporary.
There’s a bit where UUK says it’s concerned that there is no requirement to provide sources or verification of evidence in the student submission, and argues that there needs to be some detail “on how the student submission has been developed and students’ views collected.” It’s also worried about moments where there might be “tension” between an SU submission and the wider student view that will need to be carefully managed, with “variation in student union engagement” across types of SU and student.
UUK may not have read the actual consultation document, which proposes that when considering how compelling the evidence in a student submission is, and how much weight to place on it, the panel should consider the extent to which the evidence reflects the views of students within the scope of the TEF assessment:
Evidence would be more compelling, and greater weight placed on it, where it clearly articulates the views of students, and is broadly representative of all student groups and courses within the scope of the TEF assessment.
And there’s an especially daft moment where the UUK response says that OfS should consider how well the timing of the submission window will work for students:
Annex D suggests that a sabbatical officer might be an ideal candidate for the nominated student TEF candidate and yet the period to which the TEF assessment relates and the submission window will involve an overlap in student officers. There needs to be time for new sabbatical officers to understand the process and purpose of TEF and to meaningfully engage in the process. From this perspective, we would recommend a late winter or spring start date.
Maybe that’s a decent concern in a micro-provider. But if there are seriously any providers in membership of UUK that think that the named sabbatical officer in the SU is going to personally research and draft the submission rather than its voice and research staff, they really need to have a long hard think about that SU’s block grant, and the role that the university supports the union to play in student engagement more generally.
If you’re a university that, for example, has invited SU officers to a meeting to talk about this in recent weeks without inviting the key SU staff along too, that’s a problem. And if you think the submission quality is about the capacity of the elected student officer rather than the wider resource in the SU, that’s a problem too.
Support the SU to submit evidence rather than opinion, and they’ll submit evidence rather than opinion. Who knew?
For the reasons we’ve identified previously, it’s hard to believe that the responses will generate any major changes in OfS’ approach. But we should also consider for a moment whether the regulation as designed will change providers’ approach(es) to both experience and outcomes.
I say that because I have a relatively good grasp of how the proposals, and the draft TEF and B3 dashboards, have been landing so far inside universities – and I’ve been pretty surprised. The regulatory assumption inside OfS is that providers will take their outcomes data in particular and do something about it – discussing it within the academic and corporate governance, interrogating the causes of poor performance beyond “the wrong kind of student”, and putting in place action plans (that it could demand at any minute) to improve.
The trouble is that that need – to get the whole university thinking about outcomes performance data and responding appropriately – rubs up against the need for internal capacity to interpret it properly, and the need to manage bad news. There are plenty of providers whose school or faculty-based bodies have tended to be protected from this sort of “OfS thing” in the past, but will now need to engage with it properly.
Whether you regard a head of school having their feet held to the fire over poor continuation rates as an administrative burden that takes away from teaching, or as a necessary intervention into a pocket of poor performance, is what it is – but nobody can say they didn’t see this coming, and anyway the time is probably best spent talking to students about why students in that school are dropping out at a higher rate than elsewhere, and what might be done about it.
The picture is trickier inside the corporate governance. If you’re a senior team that’s been telling your governing body that everything is just fine for a few years, when in reality you now have a dashboard full of flashing red lights, that will represent something of a… presentation challenge. But even in the best providers, I’ve been surprised at the lack of focus so far on empowering governing bodies to really understand the regimes being introduced so that they know which interrogation questions to put.
That need to control and manage the “narrative” is an important leadership skill in a distributed leadership environment, but it’s one that can be abused – and the independent student submission in particular represents a direct potential challenge that senior types in both universities and their SUs are obviously thinking about a lot.
You got the power
To some extent the new independent student submission represents an interesting clash of base instincts. Universities will want to use the TEF process to get a better medal, or at least maintain the one they have. SUs will want to use the process to change things in the student interest. Now clearly SUs will also cosplay wanting the university to do well, and universities will cosplay making things better for students. But it’s a base instinct thing.
As such, what’s really interesting here is the stuff that SUs might want to highlight in their submission that they regard as bad. The subtle pressure on them will be not to do so. But if that’s what their evidence says, doing so is a duty – and will make things better. This isn’t dissimilar to how the QAA student written submission used to work, but the stakes are higher. The ego and reputation issues involved in falling from Gold to Silver or Silver to Bronze guarantee that.
Of course in reality an SU raising a failing personal tutor system or assessment feedback that’s systematically poor is a good thing – and if the final TEF submission says it was raised in a draft and a decent action plan was agreed, that looks great. But universities will be nervous about whether the TEF panel will agree that that looks great.
It’s certainly the case, on the evidence I’ve heard so far, that some universities will need to be careful about undue influence. Others will need to do better on supporting their SU with access to data. Some will need to improve student engagement generally, and supporting the SU to do it specifically – an area capable of providing both evidence for “Gold” and evidence of an “absence of excellence” in the draft descriptions.
But one thing I’d remind people of is that in many cases, in the face of defensiveness and endless kneejerk hypothesising, the SU will be able to get at why some of the metrics are the way they are. That’s very valuable for a university – if the SU is supported to do that, and then says in its submission that it was.
Overall, the new student submission process is a pretty good bit of regulatory design work from OfS that builds on QAA traditions for a change. Credit where it’s due.