Telling people off is not a quality enhancement tool
David Kernohan is Deputy Editor of Wonkhe
It turns out that a more constructive regulatory approach – one that is less focused on deficiencies, more transparent in design, more explicitly engaged with context, and inclusive of provider responses – would make for better and more effective regulation.
That’s the (perhaps unsurprising) key finding from an externally commissioned (York Consulting) review of the Office for Students’ investigation reports. These are the ones (B3, quality compliance) that appear before regulatory outcomes are decided, and the evidence suggests that these are largely read from behind closed fingers by providers who fear that they may be next in line.
I say “read” advisedly:
Most participants mentioned Wonkhe as a key additional source of awareness. For many, Wonkhe served as a filter or interpreter of the OfS’s publications, offering summaries, analysis, and commentary that made the reports more accessible. This was particularly important for participants who lacked the time to read full reports. Wonkhe was also valued for its critical perspective and sector-wide framing.
Other than a welcome reminder of what good value a Wonkhe subscription truly is (and these comments appear to have been completely unprompted!), we can surmise here that report literacy has suffered because each report seemed to emerge as a methodological surprise. And where providers are reading defensively (to avoid something similar happening to them, or at least to inform how they would manage if it did), it is difficult for the hoped-for benefits of continuous and reflective improvement to emerge. If we are triggering a threat response, there is unlikely to be much reflection.
While many participants welcomed the publication of assessment reports as a positive step toward greater transparency, this appreciation was tempered by ongoing frustration over the lack of clarity surrounding how assessments were conducted. Questions remained about how providers are selected, how assessments are conducted, and what the outcomes truly signify – particularly in cases where reports concluded with no regulatory action. This ambiguity left some providers unsure about the implications of being assessed and how best to respond.
While B3 and other quality assessments sit squarely on the assurance end of the quality spectrum, OfS have displayed a touching faith in their use in enhancement (beyond, I guess, panicked queries as to whether our university is doing the thing the investigators called out). And the often combative language actively works against that.
Participants noted that the reports often emphasised what had gone wrong, rather than offering a balanced view that included examples of effective or innovative practice. This approach was seen as discouraging open dialogue and learning, especially among providers who might otherwise be willing to share their experiences.
The sector is clearly engaged with the OfS’s regulatory approach in the same way that antelopes are engaged with the activity of large predators. While there was some evidence of use in benchmarking and internal validation, much of this may well be attempts to internally replicate the OfS methodology to ensure that the provider in question is complying with whatever the tea-leaves (one provider has an honest-to-god “OfS monitoring group that looks at everything that comes out and then summarises key messages”) suggest the regulator currently wants or expects. The sheer cognitive load of keeping abreast of this stuff can be seen in the finding that larger and more established providers are more likely to report high awareness of this arm of regulation than smaller or more specialist peers.
None of this is to suggest that the reports themselves are poor quality: they are seen as thorough, detailed, and surprisingly granular. Wonderfully, we hear that focus group participants were complimentary, with “some likening the level of detail to the QAA audit reports.” The real issue is that a methodology perceived as arbitrary, and a process that feels unclear, make for an occasion of anxiety rather than learning – and caution rather than inspiration or innovation will always be the outcome while that persists.
The other arm here is openness among providers – it is possible to imagine an effective approach to teaching quality that is collaborative, practice-focused, and enhancement-led (there is a very successful example of this in Scotland), but you are not going to get an environment that fosters development and investigation if you feel that a candid admission that things are not quite right yet could lead to a specific condition of registration.
We’ve lived through a bunch of “resets” in the way OfS relates to the sector it regulates, and the language we’ve been hearing from Edward Peck suggests that under his leadership real change will come. For the sake of actual teaching quality, it can’t come fast enough.