As providers consider OfS’ instruction to consider a “variation” to their access and participation plans, there will be plenty of wonks in the sector wondering what their revamped APP targets should look like.
When we first put together our current plans (which were to cover a five-year period to cut down on bureaucratic processes, only now to be told the plans are too long and too bureaucratic), we were asked to engage with targets covering all three stages of the student life-cycle – access, participation and progression.
The progression target was something of a misnomer – the Destination of Leavers of Higher Education (DLHE) survey had been scrapped, yet its data was all we had to go on. However, institutions now have two years of Graduate Outcomes Survey (GOS) data at their disposal, so it’s timely to analyse this in detail in advance of agreeing new targets in rebooted APPs – particularly in view of the recently published OfS consultation on regulating quality and standards.
When metrics are published there can be a tendency to see parts of a university as a “problem” and then to hastily throw scrutiny and resource at “fixing it” – but it’s also important to understand what’s going on before applying changes, interventions or solutions.
Let’s imagine that your graduate outcomes survey results reveal that:
- A lower proportion of your female graduates progressed to highly skilled employment than male graduates
- A lower proportion of your BAME graduates (and black graduates in particular) progressed to highly skilled employment than white graduates
- A lower proportion of your graduates who entered your institution with BTEC qualifications progressed to highly skilled employment than those coming in with A-Levels
- A lower proportion of your widening participation graduates (however you define these) progressed to highly skilled employment than your non-WP graduates
- A higher proportion of your international graduates progressed to highly skilled employment than your UK domiciled graduates.
The above five statements are correct for my institution, and I doubt we’re alone. But what does this really mean? Are those groups of students with lower levels of highly skilled employment 15 months after graduation unfairly disadvantaged in the job market? Should we be ploughing our finite APP resources for employability into UK domiciled, female, BAME, WP, BTEC entrants? Or should we rewind a little and try to understand what’s going on here? I suspect the latter is the better option. Things may not be as they seem.
Let’s restrict the analysis to full-time undergraduates to keep things relatively simple, and let’s start with our international students. According to our 2018/19 GOS data (i.e. those who graduated in 2018/19 and completed the survey circa 15 months after graduating), the highly skilled employment rate of our overseas students was some 10 percentage points higher than our UK domiciled graduates. And yet, on average, our overseas students did less well in their final degree classifications.
Now let’s look at the relative response rates to the survey. Non-EU international students comprise around 6% of our full-time undergraduate intake, yet only 2% of respondents to the GOS. I’m no expert on GOS data collection or indeed visa restrictions on overseas students, but I’d guess that those responding to the survey were highly likely to have remained in the country 15 months after graduation – and the fact that they have remained tells us that they have more than likely acquired “good jobs” that required a visa extension.
If so, this renders GOS comparisons between international and UK graduates meaningless. Curiously, it also provides a distinct disadvantage in GOS league tables (and perhaps the mooted minimum requirements for graduate outcomes) to those institutions that recruit a low proportion of their undergraduates from overseas.
Closer to home
So instead let’s stick with UK domiciled graduates for the rest of the analysis, to eliminate any overseas bias. Our female graduates’ highly skilled employment rate was some 9 percentage points lower than our male graduates’. The gap is clearly statistically significant. But females were more likely to study courses that apparently lead to “lower quality” graduate outcomes (that is, “quality” according to the OfS consultation, at least).
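For anyone who wants to sanity-check that kind of percentage-point gap in their own data, a two-proportion z-test is a quick first pass. Here’s a minimal sketch in Python using only the standard library – the cohort sizes and counts below are invented for illustration, not our actual figures:

```python
import math

def two_prop_ztest(x1, n1, x2, n2):
    """Two-sided z-test for a difference between two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                     # pooled proportion
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    pval = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, pval

# Hypothetical counts: 420 of 700 female graduates in highly skilled
# employment versus 310 of 430 male graduates (roughly 60% vs 72%)
z, p = two_prop_ztest(420, 700, 310, 430)
print(z, p)
```

Of course, a tiny p-value only tells you the gap is unlikely to be chance – it says nothing about *why* the gap exists, which is the whole point of digging further.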
Our black graduates’ highly skilled employment rate was 8 percentage points lower than their white counterparts’. Again, highly statistically significant. However, as we find across the sector, there is a large degree awarding gap between our black and white students. Crucially, we find that degree classification is one of the strongest predictors of highly skilled employment – the higher the degree classification, the greater the probability of success in the job market. We find the same applies to our BTEC entrants – it is their lower average degree classifications that appear to account for their lower rates of highly skilled employment.
To test these assumptions further, all of the known and available influencing variables were put into a logistic regression model, which found that, when statistically controlling for other confounding factors, there was no evidence of any difference in highly skilled employment rates based on gender, ethnicity or pre-entry qualification route.
So it may be factors other than employment and employability that are leading to differential outcomes. Indeed, the strongest predictors of highly skilled employment according to the statistical modelling were subject of study, degree classification and whether or not the student undertook a sandwich placement.
There was, however, statistically significant evidence that our WP graduates (WP in this instance based on Indices of Multiple Deprivation, derived from entry postcode) had lower rates of highly skilled employment, even after controlling for the other known influencing factors. And this makes sense. They come from deprived geographical areas – many will go back to these same communities upon graduation. And we wouldn’t necessarily expect a land of employment opportunity in these deprived neighbourhoods. But this has worrying implications according to the new regulatory approach. Institutions could be financially penalised for taking in large numbers of disadvantaged students. Perhaps we can add this to Jim Dickinson’s list of unintended consequences.
It is only through going beyond the aggregated analysis and the headlines and properly interrogating the data that we can begin to understand more about the factors influencing the success of different groups of students in the graduate job market.
Perhaps different support is required for different groups at different stages of the student life-cycle. But it’s clear that more multivariate analysis is required before we can understand what’s going on and tailor our interventions accordingly. Aggregated data – such as that provided in OfS’ APP dataset (as welcome as it was) – does not give us sufficient nuance to support institutions as they strive to meet the new regulatory expectations.