This article is more than 7 years old

It’s time for bursaries to demonstrate their impact

Whether bursaries actually work at widening access has long been a controversial topic. Les Ebdon explains how OFFA has introduced a new tool to help universities evaluate the impact of their bursaries; such evaluation will be required in future access agreements.

Les Ebdon is a consultant with Applied Inspiration. He was the Director for Fair Access and before that vice chancellor of the University of Bedfordshire.

Financial support delivered through access agreements has been something of a hot topic in higher education recently. It’s a complex and sometimes contentious issue.

It is no secret that I have raised questions about the amount of money institutions are spending in this area – big investment, but little understanding of what works. OFFA is now tackling this head on.

OFFA’s research to date has been unable to find any link between bursaries and students’ choice of where to study, or their likelihood of continuing with their studies. However, I do, of course, regularly speak with staff working on the frontline of widening participation. Many have told me they have evidence suggesting that bursaries can have an impact for some students, in some circumstances.

This has not fallen on deaf ears, and OFFA has been working to help universities properly assess the impact of their financial support. In doing so, we realise this is a complex issue. There is no blanket definition of ‘effectiveness’, and no single answer to the question “do bursaries work?”. When we look at the impact of financial support, we need to consider both the context in which it is given, and how it is combined with other forms of student support.

So last year, we commissioned a team at Sheffield Hallam University to carry out a new project, with a new research question. Moving away from the national-level approach of previous research, we asked the team to take an institution-level approach, focusing on how individual universities can better evaluate whether the bursaries they award are having an impact on their students.

It is important to explain what this research is not trying to do. It wasn’t the aim to prove whether or not financial support is effective. So, I’m afraid anyone hoping for evidence that financial support provides a silver bullet to solve all fair access issues is going to be disappointed.

However, the team have developed a really interesting set of evaluation tools. A statistical model tracks students who receive bursaries from enrolment to graduation, comparing their outcomes with non-recipients. This data can then be integrated with students’ responses to short surveys and more in-depth interviews. Taken together, this information can provide institutions with a holistic picture of the impact of financial support – at an institutional level. The report and tools have been published on OFFA’s website today, ready for institutions to use as they start thinking about drawing up their 2018-19 access agreements.
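The published report sets out the statistical model in full. Purely as an illustration of the kind of comparison involved, and not the team’s actual methodology, a sketch with invented student records and field names might track continuation for bursary recipients against non-recipients like this:

```python
# Hypothetical sketch: all records and field names are invented for
# illustration; the real model uses institutional data from enrolment
# to graduation.

def continuation_rate(students, bursary):
    """Share of students in the given group who continued past year one."""
    group = [s for s in students if s["bursary"] == bursary]
    if not group:
        return None
    return sum(s["continued"] for s in group) / len(group)

students = [
    {"bursary": True,  "continued": True},
    {"bursary": True,  "continued": True},
    {"bursary": True,  "continued": False},
    {"bursary": False, "continued": True},
    {"bursary": False, "continued": False},
]

# Difference in continuation rates: recipients minus non-recipients.
gap = continuation_rate(students, True) - continuation_rate(students, False)
print(f"continuation gap: {gap:+.2f}")
```

In practice a difference like this on its own proves nothing about the bursary, which is why the toolkit pairs the statistical comparison with surveys and interviews to build the holistic picture described above.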

A lot of work has gone into making sure that the model can be easily implemented in institutions across the board. It is flexible enough to be adapted for different contexts and uses data which is readily available or easily collected. OFFA has also been working with HESA to help provide institutions with data which will streamline the process for staff who are using the model.

We have said for some time that good evaluation is vital to ensuring investment is focused where it has the greatest impact in improving fair access. There are challenges involved in improving and embedding evaluation, and we are committed to supporting the sector to meet these challenges. This research has the potential to help institutions in their efforts.

It is also very timely. Earlier this year, OFFA took an especially close look at how institutions are evaluating the impact of their access agreement spend. Financial support came under a spotlight: over 20% of institutions had not carried out any evaluation in this area at all – around £50 million invested without a clear idea of the impact. A similarly large proportion only evaluated their financial support packages by gathering feedback from students, rather than looking at changes in behaviour. Of course, it is important to listen to students’ views, and this new tool encourages institutions to continue collecting this feedback. But for evaluation to be really meaningful, it must go deeper, measuring impact as well as attitudes.

I accept that some institutions will be further along than others in evaluating their work. But let’s be clear: with this tool now available, I expect to see a step change in practice. I will no longer accept an access agreement with significant levels of spend on financial support unless I can also see real commitment to robust evaluation of its impact.

At the same time, I will be keeping the pressure up to ensure a greater focus on outreach work. To support students from disadvantaged backgrounds to get in and get on, it is crucial that universities and colleges are working closely with schools to raise aspirations and attainment. I remain convinced that this sustained, targeted work which begins early, especially in communities with limited experience of higher education, is key if we are to make further progress in narrowing the shocking participation gap in higher education.

This research is a great step forward. Universities invest significantly in financial support, and I am confident that adopting these methodologies will allow them to understand the results of their investment. Where financial support is working well, I will support it. But where the evidence shows there is a lack of impact, I will be expecting institutions to reconsider their spending, targeting activities and support which make a demonstrable difference across every stage of the student lifecycle.
