The publication of HEFCE’s Revised Operating Model for Quality Assessment is the latest move in a long and complex tale that Wonkhe has been tracking since the autumn of 2014. The Funding Council – now with its own future, or at least its name, in some doubt – has been clear throughout that its primary driver is securing the best deal for the sector.
And so it was decided that the Council’s contract with QAA would not be automatically renewed, sparking a fevered period of review, consultation, rumour and high (and low) politics among agencies. Today, HEFCE announces a raft of tenders for the different component elements of its new QA system.
The cost of the QAA has only ever been a small component of the wider network of HE quality assurance activity – in-house institutional processes and the demands of PSRBs are far greater contributors. For context, the current HEFCE-QAA arrangement costs a little over £5m a year – just over 0.4% of the total annual cost of QA, according to HEFCE. KPMG estimated that an average saving of 0.1% of institutional expenditure would be possible through the complete abolition of external regulatory requirements.
Needless to say, the HEFCE plans do not quite go that far.
We must applaud HEFCE for its attempts to drive efficiency in the areas it directly controls, but we can’t lose sight of the wider implications of these proposed changes for the costs borne by institutions. The Revised Operating Model for QA does little to change the frequency of the much-grumbled-about institutional engagements. Now bearing the friendlier name of “independent peer review”, these still take place on a five-year cycle, but to a new specification that may require institutions to alter their internal processes to meet developing requirements.
The new model also adds an annual requirement for institutions to report on QA at a high level, alongside existing financial assurance data, at governing body level. This new governing body obligation is supported by a new contract that will be issued to develop advice and support for governors – a response to concerns expressed at question 11 of the QA consultation.
Looking across the full range of activity, from sector entry (the Gateway) to the Established Provider processes, there are, by our count, four separate roles for “independent peer review” activity. Though the detail of each differs somewhat, it is difficult to imagine a situation where multiple organisations developing and running peer review processes would be more efficient than just one.
But it is also possible to argue that there is too little, rather than too much change. The Green Paper proposed an ambitious combination of sector entry processes and the granting of Degree Awarding Powers (DAPs). HEFCE’s proposals do not deal with the latter at all, leaving it with the Privy Council – who commission assurance around that service from the QAA. If there is one place in the entire QA process where there is the opportunity for consolidation, it is the burden placed on new sector entrants who want both HEFCE recognition (thus allowing their students to access fee loans) and DAPs.
The plans are not future-proof: the only mention of online delivery, a rapidly growing practice with huge implications for franchise arrangements and transnational learning, is in creating an online course for external examiners. The document also remains silent on managing institutional (financial) failure in ways that do not compromise academic quality.
A UK-wide standing committee that would redevelop the famed Code of Practice does not appear to take into account the read-across between this process and a (separate) contract that will be issued to carry out international activities. Surely the easiest way to cut the costs of international activity is to build interoperability with international standards in at the point of regulatory development? As the plans stand, the job of the ‘international activity’ contract would be to fix issues as they emerge after the event, rather than to build resilience into the overall code.
Incidentally, the QAA – which does have the expertise required to do this but may not be selected to hold the contract in question – is a named participant in this group. Even across the UK, different components of the new plan apply to different devolved nations – some aspects are delivered by funding councils, others by sector representative bodies. Such a piecemeal system may be fashionable, but just managing the various required collaborations looks like a full-time job in itself.
The gestation of the quality plans has been long, the high agency politics alternately bewildering and disappointing. Others will surely comment at length on these more abstract issues. As we digest the implications of the new system, here at Wonkhe we simply ask whether these plans will meet the goals that HEFCE has set for itself on behalf of the sector.
As things stand, no one can be confident that these changes will reduce either the money or the effort that goes into running the current quality system.
The changes could be positive; they might not be – the devil is in the detail of the successful tenders. I think the last sentence sums it up nicely. However, given that the transaction costs of moving to the new system will be significant, it is a very high-stakes game.
Really good to see that HEFCE intends to consider an institution’s mission and strategy within its baseline regulatory requirements but that’s possibly about the only thing that’s really good. Well, maybe the stuff on new providers and TNE is good, too.
So, we’re still talking ‘risk-based approaches’ without a clear definition of what is meant by this – risk to what, exactly? A methodology considering risk to financial stability would look very different to one considering risk to student learning outcomes… Similarly, what’s the ‘student academic experience’, exactly…?
The one-off verification of a provider’s approach to its own review processes makes little sense; what happens if we amend those processes the following year? It is not at all clear how capturing student views for the Annual Provider Review will differ from the NSS, or how it will add value on top of the NSS.
When did HEFCE get given responsibility for standards?! Fairly sure the legislation relates to quality! Is HEFCE just allowed to nick the FHEQ etc?
Allowing something to be reported directly to the relevant funding body where there’s a problem? Why aren’t we allowed to have a go at putting it right first? Shouldn’t this work as with OIAHE, that our internal procedures need to be exhausted first?
The external examining system – do we still need one?
Calibration – good to see the Australian system hauled out again; it’s obviously directly comparable to the UK, what with them having around 40 institutions and us having 150-ish (not including alternative providers or FE in HE). Will this really scale up in this way? And why nothing on describing, even roughly, what a First looks like as guidance – we really are lacking anything in this area.
Good to see the HEA report also dragged out, reporting that 47% of institutions surveyed had changed their degree classification algorithms; the report actually states that “However, 40% of quality officers report that their institutions have changed their award algorithm(s) in the last five years to ensure that it does not disadvantage students in comparison with those in similar institutions, compared with 43% who report that they have not made such changes… Responses were received from quality officers employed at 98 of the 159 institutions with degree-awarding powers (62%).” This, then, is not 40% of all institutions but 40% of those that responded – roughly 39 of the 159 with degree-awarding powers, or approximately 25%. So, a far less bleak picture than the one painted.
The too little is too much…