I’ve been asking people to show me a low quality course in the data for years now, but the definition remains unclear. Such efforts are hampered by the difficulty of finding and using accurate course-level information in public data – Unistats (my very favourite HESA release) gives us data at a range of resolutions in order to reflect the scope of the underlying data and provide at least some statistical reliability.
Providers, however, collect all kinds of data at a course level – so are perhaps better positioned to root out their own “low quality courses”. And indeed they do – this morning’s release takes pains to remind us that
Institutions already monitor and review their courses regularly and have robust processes in place to uphold quality and standards
These internal reviews regularly lead to the closure of underperforming (and under-recruiting) courses – but each university uses a different approach and different metrics to decide where to take action. To this end, UUK have established an advisory group with terms of reference empowering it to develop a “statement of intent”, and to highlight best institutional practices (in terms of approaches used and action taken). This will be followed by the publication of guidance which universities “will be expected to follow”.
Politics and practice
This is a sensible move by the representative body – it ties a dangerous floating issue that practically invites DfE and Ministers to pick on courses they, personally, dislike to an existing and established practice within providers. It allows universities to take some ownership of an agenda that could very easily have resulted in courses and departments shuttered at a whim.
The longer term goal of developing a charter to enhance what are, effectively, quality assurance practices – by “proactively strengthening” internal course reviews, and taking steps to give the appearance of a sector-wide and metrics-aware approach – plunges UUK back into a “quality wars” position of exercising control over such processes in preference to a regulatory approach. There’s even an offer of “independent review”.
This is an England-only approach, at least initially, with a crop of notable vice chancellors making the running within the advisory group. There is a commitment to consider UK-wide implications and take soundings from other nations.
Is it to code?
The UK Quality Code includes a common practice that:
“The provider reviews its core practices for quality regularly and uses the outcomes to drive improvement and enhancement.”
There’s also a big chunk of guidance on course design and development and monitoring and evaluation, the latter of which is clear in setting out the expectation that courses are reviewed regularly and consistently, and that findings are acted upon.
The Quality Code, as excellent as it is, sits in a strange netherworld where compliance is not quite a requirement of registration in England. Its ideas underpin pretty much everything the QAA does, and if the OfS properly got behind it (it currently underpins the “Quality and Standards Review”, but how many of these have happened to existing providers since 2018?) we would already have the intervention that UUK are proffering in place.