
Would PQA help disadvantaged students?

It's taken as read that post-qualification admissions will help applicants from disadvantaged backgrounds. But, says David Kernohan, the question is still open.

David Kernohan is Deputy Editor of Wonkhe

Applicants from backgrounds where progression to university is less likely are not necessarily disadvantaged by the current system of predicted grades.

UCAS’ own data shows that, overall, POLAR4 Quintile 1 students see a benefit to their applications from over-optimistic prediction. Around 85 per cent of applicants from POLAR4 Q1 undershoot their predicted grades by at least 1 point, compared to around 78 per cent of POLAR4 Q5 applicants.

[Embedded visualisation: predicted and achieved grades by POLAR4 quintile]
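
As a minimal sketch of the comparison behind the chart above, the Python below works out the share of applicants in each POLAR4 quintile whose achieved points come in below their predicted points. The data frame, column names, and toy figures are assumptions for illustration – they are not the structure of the actual UCAS dataset.

```python
# Illustrative sketch only: column names and figures are assumed, not the
# real UCAS applicant-level extract.
import pandas as pd

# Hypothetical applicant-level data: predicted and achieved A level points,
# plus the applicant's POLAR4 quintile (1 = lowest participation areas).
applicants = pd.DataFrame({
    "polar4_quintile":  [1, 1, 1, 5, 5, 5],
    "predicted_points": [120, 104, 136, 144, 128, 120],
    "achieved_points":  [112, 104, 120, 144, 120, 112],
})

# An applicant "undershoots" if achieved points fall below the prediction.
applicants["undershot"] = (
    applicants["achieved_points"] < applicants["predicted_points"]
)

# Share of applicants in each quintile whose results came in below prediction.
print(applicants.groupby("polar4_quintile")["undershot"].mean())
```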

Standard model

The gold standard research in this field – and highly likely to be cited in the many reports on admissions keenly anticipated over the next 12 months – is Gill Wyness’ 2017 report for The Sutton Trust, “Rules of the Game”. One key finding illustrates the phenomenon of “undermatching” – where high-attaining students from low socioeconomic status backgrounds are less likely to attend universities where students in general have high A level attainment. An important point – but one that may also reflect how welcome students from non-traditional backgrounds feel at such providers.

“Rules of the Game” cites Wyness’ 2016 report for UCU, “Predicted grades: accuracy and impact” – its Figure 4 shows a similar pattern to my visualisation above for 2015, though it uses measures of disadvantage (not available as public data) rather than POLAR. The tendency to over-predict results for disadvantaged students appears to have intensified in the years since – although part of this may be down to a decrease in prediction accuracy overall.

Famously, Wyness finds evidence that:

“applicants who are under-predicted are more likely to apply to, and attend, a university that they are over-qualified for”

and

“there is also a correlation between being over-predicted and an applicant’s likeliness to apply to a university that they are under-qualified for”

I would like to see this data broken down by course rather than by provider, and I would also like to see qualitative research that helps us understand the wider system of influences on application and acceptance choices.

Will this be in the exam?

If this summer of examnishambles has taught us anything, it is that A levels are not a gold standard that accurately measures student potential. Every year, results are arbitrarily raised or lowered to adhere to a predetermined distribution – based on, among other things, prior performance at an exam centre or school. A level predictions (and indeed, A levels in 2020) are not scaled in that way.
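
To make that contrast concrete, here is a toy sketch of what fitting results to a predetermined distribution can look like: rank a centre’s current students, then hand out grades in the proportions seen in that centre’s past results. This assumes a simple rank-and-allocate rule for illustration only – it is not Ofqual’s actual 2020 standardisation model.

```python
# Toy illustration of scaling a cohort to a predetermined grade distribution.
# A deliberate simplification, not the real 2020 standardisation algorithm.

def scale_to_distribution(ranked_students, historical_shares):
    """ranked_students: names ordered best-first by the centre.
    historical_shares: mapping of grade -> share of past cohorts."""
    n = len(ranked_students)
    grades, allocated = {}, 0
    for grade, share in historical_shares.items():
        take = round(share * n)
        for name in ranked_students[allocated:allocated + take]:
            grades[name] = grade
        allocated += take
    # Anyone left over through rounding gets the lowest grade listed.
    lowest = list(historical_shares)[-1]
    for name in ranked_students[allocated:]:
        grades[name] = lowest
    return grades

# A centre ranks five current students; its history says 20% A, 40% B, 40% C.
print(scale_to_distribution(
    ["P1", "P2", "P3", "P4", "P5"],
    {"A": 0.2, "B": 0.4, "C": 0.4},
))
```

However strong this year’s cohort actually is, only one of the five students in this toy centre can receive an A – which is precisely the limiting of current students by their predecessors described below.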

Certainly the court of public opinion saw more merit in the grade-referenced mechanism that became the Centre Assessed Grades, where teachers used their understanding of the performance of individual pupils, informed – in part – by experience in previous years. Grade referencing has been part of the landscape for around a decade, but it has never been less clear that it should remain: limiting the attainment of current students based on the attainment of their predecessors was, and remains, unfair.

Moving away from making offers based on predicted grades would therefore require providers to dig deeper into the kind of contextual information that sits beneath grades if current plans to widen access and participation are to be fulfilled. A PQA-style system would mean that entry requirements (as expressed in a prospectus) would need to be de-emphasised. This, arguably, should be happening anyway, though it is questionable whether a change to the application system (rather than, say, targeted use of unconditional offers) would be needed to make it happen.

Circularity

This is a new slant on a plot I’ve shown before, allowing us to compare predicted and actual results for all applicants against accepted applicants. What we’re seeing is that applicants who do less well at A level are less likely to be accepted, and that applicants who are predicted to do less well at A level are less likely to be accepted. I’m leaving the “selectivity” or otherwise of individual universities to one side here – there is a lot more than A levels involved in matching an applicant to the course that is right for them.

[Embedded visualisation: predicted and actual results for all applicants and for accepted applicants]
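
Again as a hedged sketch rather than the analysis behind the chart itself: given an applicant-level table with predicted points, achieved points, and an acceptance flag (all assumed field names), the comparison reduces to a pair of grouped averages.

```python
# Illustrative sketch: compare predicted and achieved points for all
# applicants against accepted applicants. Column names are assumptions.
import pandas as pd

applicants = pd.DataFrame({
    "predicted_points": [144, 128, 112, 96, 136, 104],
    "achieved_points":  [136, 120, 96, 88, 136, 96],
    "accepted":         [True, True, False, False, True, False],
})

points = ["predicted_points", "achieved_points"]
summary = pd.DataFrame({
    "all applicants":      applicants[points].mean(),
    "accepted applicants": applicants.loc[applicants["accepted"], points].mean(),
})
print(summary)
```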

There is, clearly, a lot more to be done to promote fairness in university admissions beyond fiddling with when providers can make offers. A levels themselves have always been the most middle-class of qualifications, and the recent move towards staking everything on a final exam has made results both more difficult to predict and more likely to benefit those with the resources to undertake intensive revision or additional tuition.

The actual issues with university entry qualifications are a growing difficulty in predicting grades – most likely due to an increasing reliance on traditional exams rather than coursework – and the tendency for students from more advantaged backgrounds to achieve better actual results.

Shifting to a post-qualification application or offer system would solve neither of these issues, and would remove the small advantage that disadvantaged applicants get from teachers predicting based on aptitude rather than likely exam performance. There is hard work to be done on educational inequality, and PQA is not a shortcut.
