Are the inspectors coming for your business school?

As the regulator launches a string of investigations into poor quality higher education courses, Jim Dickinson wonders why we're shutting students out of the process

Jim is an Associate Editor at Wonkhe

The cosmic ballet goes on. The Office for Students (OfS) has launched “eight new investigations” into “poor quality” business and management courses (subject areas) across the sector.

Regular readers will know that these have been coming for some time – and have been the subject of guidance from ministers on what and who should be looked at.

As such, it’s no surprise that investigations will examine whether “poor quality online learning” has replaced face to face teaching to the detriment of students’ academic experience.

Other factors under consideration include whether the delivery of courses and assessment is effective, the contact hours that students receive, and whether the learning resources and academic support available to students are sufficient – covering various aspects of OfS’ newly minted “B” conditions on quality.

The why bird stop

The immediate questions surround the choice of subject area, and the choice of providers.

On the former, OfS says that given its risk-based approach to regulation (where the risk often seems to be “getting a telling off from the minister”) it is focusing on business and management courses – on the basis that it is a large subject area where there is significant variation in performance across the sector, something that it says shows up in intelligence drawn from student outcomes data and National Student Survey responses.

In other words, while it has accepted the in-principle guidance from ministers (“a set of investigations focused on a major subject grouping with large numbers of students and high variation in outcomes, with the intention to drive up the quality of those courses across the sector as a whole”) it hasn’t gone for ministers’ initial suggestions of “Computer Science or Law”.

The theory, remember, is that focusing on a single subject area with a large student population will enable OfS to snowball – it will understand patterns of provider behaviour that might extend to other subjects in the same provider (!) as well as patterns of behaviour that might be replicated in other providers that deliver business and management courses.

We don’t know the identity of the lucky eight providers at this stage – not least because officially OfS is still consulting on whether to make the names public in this scenario – but we do know that it is focusing on universities and colleges with larger student populations because any interventions made to improve quality will have a positive impact on a significant number of students.

That doesn’t seem like a very level playing field to me, but maybe I’ve got the wrong end of the stick and am worrying too much about students enrolled in parts of the long tail of providers on the register – where I can make a decent argument about risk if not student number volume. Put another way – why are we suggesting that a student at Bobbleton University faces more risks than a student at Britannia Rules the Waves College, situated above a vape shop in Ilford?

The clipboards are coming

In terms of process, OfS confirms that each investigation will involve an onsite visit – and it’s inviting applications from “academic experts” to help undertake this onsite assessment work.

That’s where things get interesting. In the consultations on this regime, there was some concern that the “designated quality body” – ie the QAA – would not be the body being called on to do this sort of work. DK has reflected on the story of how we got here a while ago on the site, but broadly for whatever reason OfS is determined to recruit its own pool of experts to advise on the issues under investigation here.

That, though, raises a very specific difference between QAA reviews (which routinely involve expert students as well as academics as reviewers) and this new in-house expertise regime. When OfS announced the identities of the members of its blended learning review panel last week, I was taken aback to find that none of them were students.

I had thought that it was now so well established that students can and should be expert partners in their own learning and the evaluation of quality that it would be completely unthinkable for such a panel to not include any student expertise. Wrong again.

OfS came back and assured us that members of its student panel are fully integrated in the review process, advised on the scope and approach of the review, will join all meetings with providers and students, ask questions and provide advice – “their views will shape the outcome of the review”.

But being consulted is not the same as being on the panel. Students are not framed as experts in their own right. It’s a world of difference.

It would be like a governing body reviewing provision where students were fully integrated in the review process, advised on the scope and approach of the review, joined all meetings, asked questions and provided advice – but there was no actual student member of the body.

I had wondered whether that would be a one-off – I certainly hadn’t responded to the consultation on the B conditions to stress the importance of students as experts, because I stupidly thought it was just a given these days.

But the press release surrounding this investigations exercise confirms that a “pool of experienced academics” will lead the investigative work. No students in that pool.

Make it make sense

What’s most surreal about it all is the internal contradiction within OfS. In its proposals for the TEF, as well as students making an independent submission, “expert review” will be carried out and ratings determined by a panel of academics and students who are “experts” in learning and teaching:

We consider that the best way to ensure the assessments are robust and credible is for those with expertise in the student experience and student outcomes to evaluate the evidence and make judgements about levels of excellence for providers with different mixes of courses and student groups.

Why can students be expert TEF reviewers but not expert B conditions reviewers?

I know there are those reading this who will be rolling their eyes and thinking “they’ll speak to students when they turn up, surely, so what’s the difference”, and who might regard the idea of the student-as-expert in a role like this as pointless symbolism, or tokenism.

But this runs counter to decades of good practice and experience of watching students making a real difference to this kind of work, sends absolutely the wrong signals about the role of students in quality as providers are invited to streamline those processes, and goes to the very heart of what we think about students. They are either partners or not, and capable of making judgements about quality or not.

7 responses to “Are the inspectors coming for your business school?”

  1. I appreciate that this approach shuts out students from the process, but it also shuts out the DQB (expertly dissected by DK in the tagged article) and academic colleagues more widely. I am sure the QAA is digging away in the background, but why it is not more publicly challenging this position is a mystery, and I am unclear as to what it is actually now doing as the DQB. How on earth as a sector have we allowed it to come to this?

    1. The QAA knows its days are numbered in that sense, which is why it’s spending its time building up its consultancy services in the Middle East and elsewhere.

  2. The question is: do the eight institutions involved know who they are? The press release notes:

    “The OfS will write to the eight individual universities and colleges this week setting out details of the investigations…”

    This feels a little bit fear-mongering, as it implies it hasn’t yet – and that letter to your VC could drop into their inbox any time soon.

  3. For maximum absurdity, what are the odds that the OfS decides to visit over the summer, when there are fewer staff and students around? Much more convenient to judge an institution when you have comforting spreadsheets to cling to rather than inconvenient people to talk to, with their annoying counter-arguments and ‘facts’.

    Or perhaps I am being unfair? Anyone care to lay odds?

    1. Well…the other problem is that in the sorts of big schools they will tackle – it’s mainly going to be PGT students around.

      The schools will move quickly to avoid the inspectors talking to students who do not seem to speak any English but do well on essays.

  4. Doesn’t look like these are going to get off the ground any time soon – they are only recruiting the assessors now, a process they expect to take two to three months. Looks like it will then take a couple more months to actually come to fruition before any OfS-issued jackboots hit the ground. Hardly a particularly efficient process, and if it’s anything like their botched registration system it will probably be another couple of years before they actually get any output. Certainly looks like something they are more likely to have put out to keep the minister happy rather than because their plans have any substance.

    I pity any poor academic colleague who actually puts themselves forward for this.

    1. I concur with this assessment, and this sentence “Certainly looks like something they are more likely to have put out to keep the minister happy rather than because their plans have any substance.” presumably also helps explain why we’re targeting bigger institutions rather than worse smaller ones.

      And re student involvement (none in this vs TEF), I think there is a growing disconnect between OfS on TEF and OfS on outcomes more generally. With the latter, you have Susan Lapworth’s constant criticism of institutional bureaucracy as a waste of time (the classic interpretation of bureaucracy as something other people do, while what you do is effective and necessary practice) vs TEF, which repeatedly references the expectation that our submission will be driven by the internal processes which Susan wants abolished.
