David Kernohan is Deputy Editor of Wonkhe

England’s higher education regulator first summarised the state of business and management education back in 1995.

This was based on detailed submissions made two years earlier by 105 providers, and on 47 subsequent assessment visits made by HEFCE during 1994.

Just 19 of these examples of provision were judged to be “excellent” – with all but one of the remainder judged satisfactory (the exception, North East Worcester College, made improvements based on the initial findings and was judged satisfactory the following year).

A report was published for every visit, with findings summarised across the sector in a “subject overview report”.

One eternity later

For most of the intervening 28 years, this work remained the state of the art for our understanding of the quality of business and management provision.

There was a further round of business and management reviews in 2001 – rounding up new provision that had been established since 1994 (though under slightly less intense circumstances). But in terms of properly understanding the academic experience of business students, 1995 marks the last time we tried.

Last week, the Office for Students released a “subjects in profile” publication for business and management studies, alongside two (of a rumoured eight) provider reviews. This is an entirely data-driven examination of provision – though there are some questionable omissions (I would have liked to know, for instance, how many providers had qualifying provision – we dive straight into tariff point groups).

The majority of what is there will be familiar to anyone who knows the various Office for Students data dashboards – 54 per cent of undergraduate students are male (up from 50 per cent in 1993), and the median completion rate for full-time undergraduates is 87.1 per cent (compared to “generally sound, with some reaching 100 per cent” in the nineties). Some 78 per cent of graduates have a good graduate destination (a “good” job or further study) – nearly 30 years ago we were told that in “most places” this was nearer 75 per cent, rising to 90 per cent among the very best.

Was business studies any good?

In the early nineties we were told that:

The overall picture is of a thriving, developing provision in business and management studies, against a background of significant increases in student numbers at a time of resource constraints. Prospective students, employers and others with an interest in the subject area should be confident of finding at least satisfactory quality.

Even a rudimentary comb through the summary document will yield insights – for instance, 40 per cent of providers focused teaching around interdisciplinary problem-solving approaches, while 48 per cent used a more theoretical, research-based approach. We get commentary on what is taught as well as how – assessors noted that IT and international business applications were among the areas they would have liked to see given more emphasis in the places they visited.

The nineties assessors observed more than 1,000 individual classes – with the proportion in each provider rated “excellent” ranging from 10 per cent to more than 66 per cent. An “unsatisfactory” class (between 0 and 10 per cent, depending on the provider) was usually characterised by “students doing little beyond note taking” and an “inappropriate” pace.

Is business studies any good?

The nineties report is characterised by commentary on processes – consideration is given, and recommendations made, on everything from course design to staff training and development. In contrast, the 2023 version is focused entirely on inputs and outputs. This is a deliberate regulatory choice – the advent of the Office for Students has seen what happens at providers treated largely as a black box (except when ministers express concerns, obviously) with applicant characteristics at one end and graduate outcomes at the other.

To get any sense of what actually happens in a university business school, you’d really need to look at the provider-specific reports – we currently have just two examples of these. Both show instances where the Office for Students is broadly happy – though they’d quite like the University of Bolton to have more staff.

Though in one of the two recent cases (South Bank) Office for Students assessors observed teaching sessions, we don’t get much information about what they saw – employability modules were apparently “well planned”, and featured “appropriately pitched and well focused theoretical input”, with students appearing “engaged”. All of which feels like it might be, well, satisfactory.

One important difference between these two reports and the state of the art back in the nineties is that they focus on just a handful of courses each – previously reports would cover all teaching activity within a named subject area. In both instances that we have reports for this specialisation was done during the assessment process on the basis of size only – we don’t have reliable data at course level (as much as I love the unistats dataset, that doesn’t really count).

It is commendable to attempt to look at courses – these, after all, are what applicants apply to – but the reports themselves veer alarmingly between course and subject level. You’ll often see statistics that are true of the subject area (the majority of Institute of Management students at Bolton apply without tariffed qualifications) but not of the courses in question. Though OfS likes to position this activity as being primarily data driven (remember all the fuss about the numerical thresholds!) what we see here is a much more human approach.

Again, this would all be well and good (the old Subject Review teams at HEFCE and then QAA had the ability to peek into anything that seemed interesting) if the process, or the reasoning behind decisions, were clear and consistent. In not fully documenting how these things are supposed to work, the regulator has left itself open to all kinds of challenge.

What to look for?

It’s hard to escape the feeling that a deficit model of quality is not serving us well at this point. Reading both reports, and to a lesser extent the language of the subject area report, you see a focus on problems rather than the totality of provision. It’s as if OfS expected assessment teams to catch providers in the act of deliberately conspiring to give students a poor academic experience. In the Bolton report particularly, the team appears impressed and surprised at the scope and value of current and planned interventions – leaving the report with the embarrassing conclusion that, were the business school properly resourced and able to employ the number of staff it needs, there would be no issue.

In responding here, the Office for Students has two choices – it could either (effectively) align itself with UCU in criticising poor management and wasteful spending, or align itself with UUK and the wider sector in criticism of the current availability of funding. All institutions are facing the need to find even more ways to do more with less after years of similar pressures, and “efficiency savings” are now cutting directly into bone.

As more reports – and regulatory responses – emerge, we’ll better understand how this will play out. The challenges made to the regulator about undue alignment with government will certainly play a part in defining a useful response – this administration, like many before it, seems concerningly keen to “punish” universities rather than improve them, and we’ve seen how OfS’s instincts align with that mindset already.
