As a sector, we collect a lot of valuable data on applicants and students through various routes, such as UCAS applications and HESA returns.
As a student experience practitioner, I use data to identify patterns of participation and engagement at undergraduate and postgraduate level in order to develop student experience initiatives. However, I am seeing critical gaps in the data that would help us investigate and support major issues that are upon us or on the horizon.
It is now time to adapt and evolve our acquisition of applicant and student data to provide the intelligence that will help us create a higher education landscape fit for the 21st century. I provide four examples here of what data we need to collect – and how it can help us.
Non-participation through attrition
The UK HE sector gets participation data from two main sources: the Universities and Colleges Admissions Service (UCAS) and the Higher Education Statistics Agency (HESA). Importantly, HESA provides data on undergraduate and postgraduate students who are in study on 1 December of the current academic year, through the compulsory institutional data returns.
However, there is a twilight zone. No sector-wide data is available on what happens between confirmation and enrolment, and no data is published on those who withdraw between enrolment and the HESA return. HESA collects this information but only publishes withdrawals that occur after the return, arguing that there “may be reasons for this, which are unconnected with the course or the HE provider”. For postgraduate study, the lack of information is even more problematic, as there is no compulsory UK PGT admissions process.
As a result, it is hard to ascertain who is actually participating, who withdraws, and when and why. This lack of data has consequences further down the line when patterns in the data are used to create strategy and policy, whether at national or institutional level, on recruitment, retention, attainment, widening participation and support requirements.
UK HE is said to have one of the highest retention rates in the world, but this figure is potentially skewed without consideration of this data. Firstly, the data could provide more clarity on changing behaviour, such as the increase in applicants coming into the system through clearing and those who change their mind about going to university even with a confirmed or unconditional place.
Secondly, if the HESA dataset of those who withdraw between enrolment and 1 December were compared with the UCAS undergraduate acceptance data after clearing, we could examine demographic characteristics to determine whether there is a pattern in our increasingly marketised HE landscape. The benefit of having this data is that institutions would not have to look at their September to December withdrawals in isolation, and sector-wide issues could be identified. If there were a compulsory UK PGT admissions system, we could do the same at postgraduate level – one of the reasons why I and many others have long argued for it.
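To make this concrete, here is a minimal sketch (in Python, using pandas) of the kind of comparison I have in mind. The file layout, column names and example rows are all hypothetical placeholders, not real data – the point is simply that once an acceptance extract and an early-withdrawal extract exist, linking them and breaking the result down by entry route and demographic group is not technically difficult.

```python
# Sketch only: compares a hypothetical UCAS acceptance extract with a
# hypothetical HESA-style extract of withdrawals before 1 December.
# All identifiers, column names and rows below are illustrative placeholders.
import pandas as pd

# Hypothetical post-clearing acceptance extract: one row per accepted applicant.
ucas_acceptances = pd.DataFrame({
    "applicant_id": ["A1", "A2", "A3", "A4"],
    "entry_route": ["main_scheme", "clearing", "clearing", "main_scheme"],
    "polar_quintile": [1, 1, 5, 3],
    "ethnicity": ["White", "Black", "Asian", "White"],
})

# Hypothetical extract of withdrawals between enrolment and 1 December.
early_withdrawals = pd.DataFrame({
    "applicant_id": ["A2", "A4"],
    "withdrawal_date": ["2018-10-15", "2018-11-02"],
})

# Flag which acceptances withdrew before the census date, then compare
# early-withdrawal rates across entry routes and POLAR quintiles
# (the same grouping works for ethnicity or any other characteristic).
merged = ucas_acceptances.merge(early_withdrawals, on="applicant_id", how="left")
merged["withdrew_early"] = merged["withdrawal_date"].notna()

rates = (
    merged.groupby(["entry_route", "polar_quintile"])["withdrew_early"]
    .mean()
    .rename("early_withdrawal_rate")
)
print(rates)
```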
Recording of withdrawals
There is also a need for the sector to record withdrawals more accurately. Currently, institutions record reasons for withdrawal very differently and the level of detail varies. It is challenging at institutional level to collect this data because often a student has already left before it is formally recognised that they have. Having a standard template (sketched below) that all institutions are required to complete would enable sector comparison as well as informing institutional and national strategy and policy.
To ensure compliance, for UK/EU students it could be tied to the student loan withdrawal process. If designed well, it could also collect data relating to the withdrawal, such as debt levels incurred through tuition fees and accommodation costs. We know that debt burden is a contributor to student anxiety and stress, and using this data we could identify issues and provide targeted advice throughout the study journey, from first contact through to graduation.
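As an illustration of what a standard template might contain, here is a sketch of a possible withdrawal record. The reason codes and field names are my own assumptions rather than any agreed standard, and the debt-related fields would only be completed where consent and data-sharing arrangements allow.

```python
# Sketch of a possible sector-wide withdrawal record.
# Reason codes and field names are illustrative assumptions, not a standard.
from dataclasses import dataclass
from datetime import date
from enum import Enum
from typing import Optional


class WithdrawalReason(Enum):
    FINANCIAL = "financial"
    HEALTH_OR_WELLBEING = "health_or_wellbeing"
    ACADEMIC = "academic"
    PERSONAL_OR_FAMILY = "personal_or_family"
    TRANSFER_TO_OTHER_PROVIDER = "transfer"
    EMPLOYMENT = "employment"
    OTHER = "other"


@dataclass
class WithdrawalRecord:
    provider_ukprn: str                  # provider identifier
    anonymised_student_id: str           # pseudonymised, never the real student ID
    level_of_study: str                  # e.g. "UG" or "PGT"
    enrolment_date: date
    withdrawal_date: date
    primary_reason: WithdrawalReason
    secondary_reason: Optional[WithdrawalReason] = None
    free_text_detail: Optional[str] = None
    # Optional debt-related fields, as suggested above, completed only with consent.
    tuition_fee_debt_gbp: Optional[float] = None
    accommodation_debt_gbp: Optional[float] = None
```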
Mental health
There is a flurry of activity to improve wellbeing in HE, such as the Mental Health Charter and the recent call for Challenge Fund proposals by the Office for Students. These are excellent steps. But what we are lacking is longitudinal data to fully understand the scale of the issue and where we need to target resources.
If undergraduate applicants were required to complete a short anonymous questionnaire that could be linked to their post-confirmation UCAS application, it would provide a more accurate picture of the state of student mental health and wellbeing on entry to study, by student demographics, nationally, regionally and institutionally. This would help with accurately targeting resources not just within higher education but across our health and social service provision. In the absence of a UCAS PGT system, a national survey could be run by an educational body such as AdvanceHE.
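As a rough illustration, the linkage could work along these lines: responses are tied to the application only through a pseudonymised identifier, and anything published is aggregated, with small groups suppressed. The field names, scoring and example rows below are all hypothetical.

```python
# Sketch only: links hypothetical anonymised questionnaire responses to
# post-confirmation application records via a pseudonymised ID.
# All fields, scores and rows are illustrative placeholders.
import pandas as pd

responses = pd.DataFrame({
    "pseudonymised_id": ["X1", "X2", "X3"],
    "wellbeing_score": [7, 4, 8],              # e.g. a short validated scale
    "declared_mh_condition": [False, True, False],
})

applications = pd.DataFrame({
    "pseudonymised_id": ["X1", "X2", "X3"],
    "provider": ["P1", "P1", "P2"],
    "region": ["North West", "North West", "London"],
    "age_group": ["under_21", "under_21", "21_and_over"],
})

linked = responses.merge(applications, on="pseudonymised_id")

# Aggregate at regional level (the same works nationally or per provider);
# in practice small groups would be suppressed before publication.
summary = linked.groupby(["region", "age_group"]).agg(
    mean_wellbeing=("wellbeing_score", "mean"),
    declared_condition_rate=("declared_mh_condition", "mean"),
    respondents=("pseudonymised_id", "count"),
)
print(summary)
```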
Rise of unconditional offers
This year, there has been much coverage of the increase in the use of unconditional offers. Most of the debate about their value has been based on limited and anecdotal evidence: the notion that applicants with unconditional offers won’t bother working hard for their pre-university exams, or that a student entering university with an unconditional offer won’t have experienced studying for exams under pressure, which will affect their ability to succeed.
UCAS has just published its End of Cycle and Unconditional Offer reports, which will hopefully correct misconceptions about unconditional offers across the UK and contribute to a more positive debate. It has also developed a range of good practice materials that have received support from Universities UK.
Graham Galbraith, vice chancellor of the University of Portsmouth, contributed to the debate when he spoke logically and pragmatically about unconditional offers in his Wonkhe article, saying: “Judge us on our ability to transform lives, not prior attainment”.
The reality is that we just do not know the impact of the rise in unconditional offers without tracking students in, through and out of study, looking at retention, attainment and success. The discussion that has dominated the headlines has been about the irresponsibility of HE providers, rather than the responsibility of all stakeholders and the benefits of unconditional offers to applicants.
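The analysis itself would be straightforward once the tracking data existed – something along these lines, with hypothetical field names and placeholder rows, comparing continuation and attainment by offer type.

```python
# Sketch only: compares outcomes for entrants who held unconditional versus
# conditional offers. Field names and rows are illustrative placeholders.
import pandas as pd

cohort = pd.DataFrame({
    "student_id": ["S1", "S2", "S3", "S4"],
    "offer_type": ["unconditional", "conditional", "unconditional", "conditional"],
    "continued_to_year_2": [True, True, False, True],
    "achieved_good_honours": [True, True, False, False],
})

# Continuation and attainment rates by offer type, with cohort sizes.
outcomes = cohort.groupby("offer_type").agg(
    continuation_rate=("continued_to_year_2", "mean"),
    good_honours_rate=("achieved_good_honours", "mean"),
    students=("student_id", "count"),
)
print(outcomes)
```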
Without a proper discussion based on accurate data on the impact of unconditional offers, there is a danger that the baby will be thrown out with the bathwater, with universities under pressure to abandon this type of offer.
What is clear, though, is that with the shifting landscape in higher education and the increasing cost of study to the individual student, it has never been more important to collect the right data – data that will give us the knowledge to evolve our provision and deliver a high-quality student experience that benefits the individual, society, business and industry.
Some insight into these gaps in the data would be really valuable to policy makers and regulators. For example, understanding how the UCAS acceptance data differed from the HESA statistics by POLAR quintile and ethnicity would have been really helpful to me as Director of Fair Access.
Les Ebdon