The uncomfortable truth of access evaluation

Lee Elliott Major, the Chief Executive of the Sutton Trust, recently noted that with an estimated £800 million a year spent by UK universities on outreach activities, the low level of evaluation that takes place is “criminal”.

The heightened need to focus on outcomes rather than just inputs in the new Office for Students access and participation regime underlines this view.

The pursuit of truth

The Cambridge-Somerville Youth Study (1939-45) provides a vivid example of why the pursuit of truth through robust evaluation is so important. Indeed, I would recommend that every university outreach practitioner listen to the Freakonomics podcast produced by Stephen Dubner entitled: “When Helping Hurts: Good intentions are nice, but with so many resources poured into social programs, wouldn’t it be even nicer to know what actually works?”. The study itself was pioneering in its use of a control group to see the effect of mentoring on 250 young boys deemed at risk of delinquency.

As Dubner notes, this is important “because whenever you’re trying to establish cause and effect, in anything — whether it’s a chemical reaction or a mentorship program — you need to be able to isolate the inputs and reliably measure the outputs. That’s a lot easier to do in chemistry than it is with something as complicated as people’s lives.”

Through the use of a control group and a steadfast commitment to the pursuit of truth over a significant period of time, the study revealed that the intervention produced negative outcomes on all seven of its key measures, including life expectancy, mental and physical health, alcoholism and likelihood of a criminal record.

What appeared to be an incredibly positive source of support for the boys involved (and this was echoed by the views of the participants themselves) was revealed to have had a detrimental effect.

Shifting the focus

What does this mean for universities and access and outreach practitioners? I would suggest that like those involved with the Cambridge-Somerville Youth Study, we must maintain a determined focus on establishing the true impact of our work. Major is right – not enough time or resource is dedicated to evaluation of outreach and widening participation activity led by universities.

On more than one occasion I have heard outreach practitioners say that they do not believe in randomised controlled trials because some young people will miss out. I would suggest that this fundamentally misses the point. The position assumes from the outset that the outreach work in question will make a positive impact; we should never assume this. Programmes should be delivered not on the basis of the good intentions of their designers and deliverers, but on the proven outcomes they achieve.

What should focus the mind further is the persistent gap in HE access between poorer students and their more affluent peers, which has endured since the Second World War. The number of poorer students progressing to HE has increased, particularly since the expansion of university places in 1992, and more children eligible for free school meals are progressing to HE than ever before. But the gap in HE progression between free school meal and non-free school meal pupils has remained stubbornly fixed for the past decade at around 17-18%. This includes a period of significant investment in this area in the form of Aimhigher.

The jury is out

In his article, “Proud to Be Wrong”, Ben Thomas describes how determined the great biologist Charles Darwin was in the pursuit of truth. In one vivid illustration, Thomas recounts a conversation between Darwin, his wife Emma and a close friend about their happiest memories, in which Darwin recalled visiting a scenic park. After going to bed, he later burst back into the room in his nightshirt to correct himself, stating: “I was wrong! That wasn’t my happiest memory! The happiest day of my life was one summer in Cambridge”.

This was an anecdote that members of Darwin’s family would retell with humour and pride. Thomas notes: “It’s easy to hear this story and laugh, and picture Darwin as an ageing bookworm, too flustered by details to follow a normal conversation. But even that picture hints at something deeper: what mattered most to him wasn’t keeping the conversation flowing, or even being right, but providing only the most accurate information possible to his loved ones”.

A jury assembles

We should aspire to something similar in our access work. In my view the jury is still firmly out on how effective university access and outreach provision has been, over a sustained period, in narrowing the HE progression gap. Indeed, in the many cases where evaluation is severely lacking, we could say that the jury has not yet even been assembled. This is not acceptable.

The Sutton Trust’s 2015 report, Evaluating Access, found no UK-based studies that evaluated access strategies and approaches using robust designs to establish effectiveness. It cannot remain this way.

My personal commitment is that through the National Collaborative Outreach Programme consortium, Aspire to HE, and the wider outreach provision delivered by the University of Wolverhampton, we will determine the true impact of our work. This involves being bold in adjusting our provision when required to ensure we can always answer two key questions: are we making the positive difference we set out to make, and how do we know? The young people and local communities our work seeks to support deserve this from us.

It’s too early to say if our approach is working. But we are committed to finding out.

4 responses to “The uncomfortable truth of access evaluation”

  1. The Sutton Trust might practise what they preach when releasing reports to the media, as the reporting typically highlights selective statistics. Features on Wonkhe also tend to be lacking in robustness, as evidenced by the recent proposal to use lotteries to assign university places.

  2. I have stated to the OfS that they focus on institutional-level data for access evaluation, and that they do not appear to have any data on, or understanding of, all the (England) access/outreach participants (the cohort) who have engaged with access/outreach interventions (across all HEIs) in any one academic year, such that they could know what impact has occurred.

    My suggestion to them is that they longitudinally track the national cohort to understand: their demographics; the number from disadvantaged groups engaged; the progression of disadvantaged groups; and the success of disadvantaged groups following progression, which they could do annually. It is important for the OfS to ask HEIs to evaluate their access work, but should they take responsibility for evaluating the outcomes/impact of the access work delivered nationally by all HEIs? It will be interesting to see if the OfS considers this approach…

  3. The HEFCE guidance for the national Aimhigher Programme was not that specific in terms of targeting, and only requested that programmes target disadvantaged pupils with the potential to progress to HE. Programmes developed their own targeting models, and these included POLAR YPR (Q1-2), IMD, IDACI, NS-SEC, FSM and no parental HE. Therefore, the example of FSM HE entry rates not increasing during this time is not surprising, as pupils were not always targeted in this way.

  4. So, in terms of my comment above, the important point is that HEFCE required Aimhigher programmes to target pupils with the potential to progress to HE. In turn this approach tended to exclude many FSM students due to their lower rates of attainment.
