Like colleagues around the sector, we are working on our OfS registration.
How much of a burden is this task? It is striking that the new “light touch” OfS looks significantly more expensive for many institutions than HEFCE did. Established institutions face the cost of making a fresh case for registration after some years of regulatory flux: Institutional Review was removed for all but alternative providers, and a short-lived national quality framework disappeared before HE governors had time to blink.
There are also the demands on small, specialist institutions, some of which are calculating whether to decline registration and work instead under a validating partner.
The deadlines are tight. We are working towards 23 May to have registration in place for September, with an earlier deadline of 30 April for some institutions. The OfS method, with its emphasis on metrics, is linked to the move towards HESA Data Futures, and we have to become more effective in our use of data, including the tracking of student progression and outcomes.
There is the risk that the complexities of monitoring volatile student populations may be underestimated in the assessment of institutional statutory returns and in student support.
Student protection plans
The issue that has sparked much correspondence on “registrarial” email lists is the requirement to have Student Protection Plans – a seemingly Kafkaesque part of the registration process. It is a struggle to produce documents that are meaningful to students: we have to write plans that will be acceptable to the OfS, but that do not conjure up risks highly unlikely to be realised. Institutions could place themselves in an awkward position, entrenched as responsible for factors that are really beyond their control.
We are already careful to protect student interests by teaching out course provision when it has to close or change substantially and to consider compensation when things go wrong. We have to outline our arrangements for consumer protection in the self-assessment under OfS condition C1.
We aim to be clear about what we are offering to students and to meet our commitments. It is not to students’ advantage, however, if institutions are inhibited from a regular cycle of course revalidation: we have to be fleet-footed, keeping our courses current and enhancing them in partnership with students.
Access and WP – from inputs to outcomes
We are preparing our Access and Widening Participation Plan, with a short timescale for institution-wide discussion before the application deadline. We are reflecting on how to balance priorities for investment. We are conscious of the shift in the national assessment of the impact of learning and teaching from input to output measures, as discussed elsewhere on Wonkhe.
As with TEF, this can lead to a cul-de-sac of unsound assumptions about causal relationships between factors, such as the over-attribution of student employment outcomes to institutions rather than to the influence of social background.
The OfS is, however, making much use of existing data, particularly for the criteria on quality and standards, management and governance, and financial viability and sustainability – although we are required to repeat some information already available in annual accounts.
A richer view of quality and governance
What wider benefits does this frenetic registration activity bring towards the construction of a sound framework for governance and quality? Within institutions, we need checks and balances to avoid “groupthink” and the failure to detect underlying risks.
We have to satisfy multiple accountabilities, including to students. We need internal and external stimuli to help us address fundamental questions about quality: what are we trying to do, how are we doing it, how do we know it works, and how can we improve?
The OfS is relying on metrics to account for a vastly diverse sector: indicators that something is already going wrong (and it will probably have to go seriously wrong to trigger scrutiny), supplemented by the national lottery of random sampling of institutions.
There needs to be a richer view of quality and governance across the sector. Recently, we had an accreditation visit from the Nursing and Midwifery Council, which is uncompromisingly serious about public protection. The exercise was transparent, thorough, proportionate and developmental, with the reviewer actually visiting the institution to see what was going on.
There appears to be something missing from the new national framework – a developmental periodic institutional review, conducted by peer reviewers. Some are glad to be rid of that system but its loss may be a disadvantage masquerading as an advantage.
Nobody wants unnecessary bureaucracy, but it is worthwhile to invest in quality, and that involves systematic observation and conversation between many interest groups. We welcome the resources of Advance HE, bodies such as Jisc, the AUA and the Academic Registrars’ Council, and the QAA, with its albeit truncated Quality Code. Perhaps those valuable networks can help the sector to assert a more thoughtful and developmental approach to quality.