An eighteen-year-old looks for an alternative path on A level results day. A mature learner wants to change career. A careers adviser wants to point clients in the right direction. An employer wants to recruit a graduate. A faculty committee is designing a new course. How do any of them know whether a course is any good?
Providers vouch for the quality of their own provision; the Quality Assurance Agency, as it were, vouches for the quality of this vouching. But this is all institutionally focused. The other, less-discussed, end of the UK’s quality assurance system focuses on the subject area. And if you think about it, isn’t this what people are really interested in?
Back in the nineties the QAA focused specifically on reviewing subject provision, but successive regulatory changes – coupled with some powerful lobbying – mean that this is no longer the case. The QAA’s subject focus is now manifest in its system of subject benchmark statements which, in the words of the agency’s Alison Felce, “lay out the nature of often quite technical subjects in an easy to understand way for both academics and quality staff”.
In setting out the recommended subject content for particular courses, the QAA draws on the expertise of professional, statutory, and regulatory bodies (PSRBs). These range from bodies that act as a gateway to defined careers, to those that simply draw together industry and professional interest in defining the required content of a course.
How can we tell?
Employers, obviously, have an interest in ensuring that graduates meet “minimum skills requirements” – the idea being that any graduate of a course accredited by their PSRB would be able to start work, safely, in their chosen career.
Providers, you’d think, would be proud to say that their course meets the standards set by a professional body. And prospective students would want the imprimatur of an august society – in some cases it is a requirement – to ensure their smooth entry into the workplace.
But you’ll look in vain for the ability to filter by accrediting body on the main UCAS course search. Some, but not all, course summaries will note these accreditations within the text. The much-maligned Unistats (soon to become “Discover Uni”) lets the intrepid searcher filter by whether a course has an accreditation or not – which is less than useful in subject areas with one or more active PSRBs.
Both these approaches broadly abide by the CMA advice that accreditation status and source should be communicated to prospective students if the provider wants to stay on the right side of the Consumer Protection from Unfair Trading Regulations 2008. But they hardly give the course-seeker any agency in using accreditation as a distinguishing factor.
Wonkhe to the rescue
So, as a part of my summer odyssey into the nightmare that is Unistats data, I’m happy to present two visualisations that may help.
On “PSRBs by subject” you can use the subject drop downs on the right to select a general or specific subject of interest. You can highlight individual providers in the graph using the highlighter box. Click on a course in the main graph to see details of all accreditations linked to that course.
And on “Accreditation by subject” (an alternative view of the first graph) I simply show the gap between accredited and non-accredited courses, by subject. You can also filter by group and region, and clicking on an institutional entry in the graph shows accreditation status for all courses in that subject group.
Data note: This data is presented as published in early July by Unistats. Any omissions or inaccuracies are present in the underlying Unistats data.
The state of accreditation
What we learn from plotting the data in this way is the wide variation, even in quite small subject areas, in accreditation practice. There are many courses where you would expect to see an accreditation but it isn’t there – and many courses that may touch on a popular subject of study, but are secondary to a main course.
Overall, the majority of courses offered by UK providers are not externally accredited. This is as expected; not all courses are vocational enough or subject-specific enough to need it.
The administrative overhead of accreditation is a factor. Some PSRBs manage this process very well, either by streamlining their own processes to take account of the need to avoid additional work from academic and support staff, or by contracting out their accreditation operation to specialists.
But there are still horror stories – requirements for vast amounts of information (sometimes demanded in hard copy), lengthy accreditation visits, and of competing PSRBs setting deliberately antagonistic requirements that result in work being duplicated. There are also many who feel that the precise prescription of course curricula cuts across academic freedom, and makes courses less responsive to developments in the field.
There has been a movement among many accreditors to link their external reviews with the internal course review process, or to use publicly available materials where possible. Many are becoming less prescriptive as to course content, setting themes and skills expectations rather than stipulating methods and approaches.
Some PSRBs are starting to move away from the accredited course model altogether – the Solicitors Regulation Authority will, from 2021, introduce a Solicitors Qualifying Examination which will eventually replace the accredited course route for those planning a legal career. They’re doing this for good reason, opening the practice of law to those unable or unwilling to take an academic course when they may already have other (workplace) experience. At that point law schools may choose to compete on their ability to teach to that test, or on their ability to give a thorough grounding in specific areas of the law.
Does accreditation attract a grade premium?
This prompts the other big question – whether accreditation actually attracts students. As a way into this I’ve plotted the average tariff points of current students against the accreditation status of the course. If one purpose of accreditation is to convince the more discerning student to sign up for your course, it doesn’t appear to be working. In the majority of cases there doesn’t appear to be any relationship.
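As a rough sketch of the comparison behind that plot – with made-up column layout and sample rows, not the actual Unistats schema or data – grouping courses by accreditation status and averaging entrant tariff points might look like:

```python
from statistics import mean

# Illustrative rows only - not real Unistats records.
# Each course: (subject, accredited?, average tariff points of entrants)
courses = [
    ("Law", True, 144), ("Law", True, 128), ("Law", False, 120),
    ("Biology", False, 112), ("Biology", True, 116), ("Biology", False, 118),
]

def tariff_by_accreditation(rows):
    """Mean tariff points per (subject, accreditation status) group."""
    groups = {}
    for subject, accredited, tariff in rows:
        groups.setdefault((subject, accredited), []).append(tariff)
    return {key: round(mean(vals), 1) for key, vals in groups.items()}

summary = tariff_by_accreditation(courses)
# e.g. summary[("Law", True)] is 136.0 on the sample rows above
```

Any "premium" would show up as a consistent gap between the accredited and non-accredited means within a subject – which, in the real data, largely fails to appear.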
Part of this may be down to the fact that level three of the Common Aggregation Hierarchy (CAH) is still pretty broad, part may be due to Unistats inaccuracies (I note that both the CMA and HEFCE have complained about this in the past), and part due to cynical subject coding by providers (check out biomedical/healthcare sciences for an example of that). But even within fairly homogeneous areas there is a huge variation in practice.
These are the qualifications actually held by current students, not the “average offer” or other such marketing material. Not all courses on Unistats have this information attached to them, and many that do use wider aggregations within a subject area to make up for smaller numbers.
For some courses (medicine, nursing, (currently) law…) accreditation is pretty much a requirement – the few unaccredited courses are likely either new and seeking accreditation, or simply data errors. What I was hoping to find, and didn’t, was evidence of subject areas where a tariff premium attached to accredited courses. Instead, the old in-built assumed hierarchy of providers still holds true. So I’ve plotted that as well.
As excitement about “Discover Uni” builds to fever pitch, and as another UCAS cycle draws to a close, we really need to take a moment to ponder the significance of that last graph. The policy direction of the last decade has been towards giving students better information about what the course they are borrowing money to study is actually like. There’s still no indication of whether students are using it.