The cosmic ballet goes on. The Office for Students (OfS) has launched “eight new investigations” into “poor quality” business and management courses (subject areas) across the sector.
Regular readers will know that these have been coming for some time – and have been the subject of guidance from ministers on what and who should be looked at.
As such, it’s no surprise that investigations will examine whether “poor quality online learning” has replaced face-to-face teaching to the detriment of students’ academic experience.
Other factors under consideration include whether the delivery of courses and assessment is effective, the contact hours that students receive, and whether the learning resources and academic support available to students are sufficient – covering various aspects of OfS’ newly minted “B” conditions on quality.
The why bird stop
The immediate questions surround the choice of subject area, and the choice of providers.
On the former, OfS says that given its risk-based approach to regulation (where the risk often seems to be “getting a telling off from the minister”) it is focusing on business and management courses – on the basis that it is a large subject area where there is significant variation in performance across the sector, something that it says shows up in intelligence drawn from student outcomes data and National Student Survey responses.
In other words, while it has accepted the in-principle guidance from ministers (“a set of investigations focused on a major subject grouping with large numbers of students and high variation in outcomes, with the intention to drive up the quality of those courses across the sector as a whole”) it hasn’t gone for ministers’ initial suggestions of “Computer Science or Law”.
The theory, remember, is that focusing on a single subject area with a large student population will enable OfS to snowball – it will understand patterns of provider behaviour that might extend to other subjects in the same provider (!) as well as patterns of behaviour that might be replicated in other providers that deliver business and management courses.
We don’t know the identity of the lucky eight providers at this stage – not least because officially OfS is still consulting on whether to make the names public in this scenario – but we do know that it is focusing on universities and colleges with larger student populations because any interventions made to improve quality will have a positive impact on a significant number of students.
That doesn’t seem like a very level playing field to me, but maybe I’ve got the wrong end of the stick and am worrying too much about students enrolled in parts of the long tail of providers on the register – where I can make a decent argument about risk if not student number volume. Put another way – why are we suggesting that a student at Bobbleton University faces more risks than a student at Britannia Rules the Waves College, situated above a vape shop in Ilford?
The clipboards are coming
In terms of process, OfS confirms that each investigation will involve an onsite visit – and it’s inviting applications from “academic experts” to help undertake this onsite assessment work.
That’s where things get interesting. In the consultations on this regime, there was some concern that the “designated quality body” – ie the QAA – would not be the body being called on to do this sort of work. DK has reflected on the story of how we got here a while ago on the site, but broadly for whatever reason OfS is determined to recruit its own pool of experts to advise on the issues under investigation here.
That, though, raises a very specific difference between QAA reviews (which routinely involve expert students as well as academics as reviewers) and this new in-house expertise regime. When OfS announced the identities of the members of its blended learning review panel last week, I was taken aback to find that none of them were students.
I had thought that it was now so well established that students can and should be expert partners in their own learning and the evaluation of quality that it would be completely unthinkable for such a panel to not include any student expertise. Wrong again.
OfS came back and assured us that members of its student panel are fully integrated in the review process, advised on the scope and approach of the review, will join all meetings with providers and students, ask questions and provide advice – “their views will shape the outcome of the review”.
But being consulted is not the same thing. They’re not framed as experts on the review panel. It’s a world of difference.
It would be like a governing body reviewing provision where students were fully integrated in the review process, advised on the scope and approach of the review, joined all meetings, asked questions and provided advice – but there was no actual student member of the body.
I had wondered whether that would be a one-off – I certainly hadn’t responded to the consultation on the B conditions to stress the importance of students as experts, because I stupidly thought it was just a given these days.
But the press release surrounding this investigations exercise confirms that a “pool of experienced academics” will lead the investigative work. No students in that pool.
Make it make sense
What’s most surreal about it all is the internal contradiction within OfS. In its proposals for the TEF, as well as students making an independent submission, “expert review” will be carried out and ratings determined by a panel of academics and students who are “experts” in learning and teaching:
We consider that the best way to ensure the assessments are robust and credible is for those with expertise in the student experience and student outcomes to evaluate the evidence and make judgements about levels of excellence for providers with different mixes of courses and student groups.
Why can students be expert TEF reviewers but not expert B conditions reviewers?
I know there are those reading this who are rolling their eyes and thinking “they’ll speak to students when they turn up, surely, so what’s the difference” and who might regard the idea of the student-as-expert in a role like this as pointless symbolism, or tokenism.
But this runs counter to decades of good practice and experience of watching students making a real difference to this kind of work, sends absolutely the wrong signals about the role of students in quality as providers are invited to streamline those processes, and goes to the very heart of what we think about students. They are either partners or not, and capable of making judgements about quality or not.