Improving the quality of assessments of quality

For Buckinghamshire New University's Nick Braisby, the Office for Students' investigation process could be improved

Nick Braisby is Vice Chancellor of Buckinghamshire New University

Recently, the Office for Students (OfS) published its quality assessment report into business and management at Buckinghamshire New University (BNU).

The process which led to the report was far from straightforward. I’ve identified a number of ways in which these quality assessments can be improved so that in future they command greater support from providers.

Despite the often trenchant criticism levelled at OfS, I want to start from the premise that providers have a responsibility to help improve the regulation of our sector.

Timing and choice

The investigative process takes too long. The assessment was announced in May 2022 and references NSS data from as early as 2019-20, yet it was only in 2024 that we learned how the assessment team judged its significance. Where genuine concerns about quality exist, all involved need to work together to remedy them speedily in the interests of students. The current process militates against the speed required.

For most of the recent OfS investigations, it is relevant that the B conditions changed on 1 May 2022. Data time-stamped prior to May 2022 therefore has the most direct relevance to whether a provider was complying with the old B conditions. The regulator, in responding to its consultation on the change in B conditions, made it clear that it would need to consider such timing issues carefully in any future investigations.

To give an obvious example, how should NSS data be interpreted? Given that only NSS data from 2023 onwards fall under the new conditions, the relevance of older NSS data to an assessment under the new conditions needs to be carefully argued and articulated. Yet none of the recent assessment reports appears even to recognise that the change in B conditions is an issue.

Overall, greater transparency is needed. Regulatory Advice 15 explains the regulator’s methods, including a suggestion that engagement with a provider should happen before the decision to investigate is made. My institution was made aware neither of why it had been specifically selected for investigation nor of why engagement had not been chosen as a first step.

Thresholds and process

It has been argued elsewhere that the OfS B conditions are expressed in general terms and lack detailed benchmarks or thresholds. Yet assessors are making judgments that go beyond these conditions. For example, in relation to teaching methods or the provision of digital resources, the B conditions require the former to be “effectively delivered” while the latter must be “adequate and deployed effectively”. But without guidance as to what might constitute benchmarks or thresholds, how can assessors come to reliable and consistent judgments? Either they should avoid making such judgments, or they risk compromising the reliability and validity of the process.

Consistency would also be improved by adopting a standard approach, or framework, albeit one with the requisite flexibility to pursue lines of enquiry. For the inspections Ofsted runs, it publishes a detailed framework and even more detailed handbooks that explain what to expect from an inspection visit (for example, the Further Education and Skills Inspection Handbook). Our experience of the Ofsted framework gave us great confidence in its outcomes, and I hope OfS will work with providers to develop a similar approach.

Use of evidence

Within the investigative process, as far as I can tell, there is no mechanism for OfS to intercede where an assessment appears to have gone wrong, perhaps where the assessment team has over-interpreted some key sources or disregarded others. I think it would help for OfS to seek providers’ views on the balance and rigour with which an assessment has been conducted.

From its inception, OfS has adopted a welcome focus on data and outcomes. Yet in making their assessments, assessors need to make inferences from data and evidence about the wider quality regime. These inferences are inherently defeasible: further evidence can always overturn them. There is an obvious risk that assessors, pursuing a line of enquiry, may focus excessively on confirmatory evidence; they need to be guided actively to consider possible disconfirmatory evidence.

The regulator’s risk-based approach also suffers from a form of confirmation bias: by focusing only on cases where regulatory intelligence suggests there could be a quality concern, it fails to test its own approach. Disconfirmation means investigating at least some cases where regulatory intelligence does not suggest quality concerns.

The impact

At BNU, some staff closely associated with the subject and/or aspects of educational provision have suffered from the pressure of the ongoing investigation for more than 20 months, and now from the public reporting of concerns with no right of reply, since any reply is possible only through the media, which is often not appropriate. Some staff feel their professionalism has been unfairly and unreasonably impugned. The Ofsted inspection regime has recently been modified in response to similar criticism, and I hope that in future OfS will likewise pay close attention to the experience of staff and consider how better to safeguard their well-being.

In this article I have focused on just a few of the things that would improve the OfS assessment process. While the process could clearly benefit from some improvements, there is much good within it too. Anything that increases the sector’s focus on teaching quality, and that can bring the regulator and providers together to improve the way it is assessed, can only be good for the sector.

2 responses to “Improving the quality of assessments of quality”

  1. I read the initial inspectors’ report, and the points raised above are important mitigation which I hope will be taken on board by OfS. It is not really fair to be judged against new criteria when the evidence offered refers to a prior period, or vice versa.

    Words, phrases and sentences in documents will always be interpreted and understood in different ways by different people, particularly if the original document was ambiguous.

    The time between the date notice of an investigation is given, the date of the visit, the date of the report, and the decisions on what action will be taken should be as short as possible. Unnecessary delay must be avoided. In the reports I have read, the investigators have acknowledged that changes may already have been made to rectify some of the original critical comments. Unfair commentary is not in the interest of either partner, given the mutual desire to improve things for all involved.

    There are differences between OfS and Ofsted in both objectives and process, but shared best practice should be identified and, where relevant, implemented. As more universities diversify from degrees to apprenticeships and other qualifications and courses with different regulators, things will become more complex, and universities may be faced with contradictory directives relating to the same issues.

    The Vice Chancellor’s commentary is a balanced contribution aimed at achieving improvement for all parties involved.

  2. I welcome the Vice Chancellor’s comments on the complexities of the review process and his point about the timeliness of reviews. Furthermore, the additional burden and pressure placed on staff in supporting OfS reviews and other regulatory requests is a matter that needs to be further amplified.