Anyone working in the widening access space will be aware of the renewed focus on evaluation.
Indeed, one of John Blake’s earliest speeches as Director of Fair Access and Participation at OfS is best remembered for his line on “evaluation, evaluation, evaluation”. And recent plans to overhaul the APP process have continued this theme.
Effective evaluation is crucial and creates a healthy culture of accountability and learning while enabling institutions to focus resources and hone their interventions to achieve more. At IntoUniversity, we have long believed in the value of robust evaluation and have prioritised stringent and effective measurement of our programme for many years. So, we welcomed this new emphasis.
The reward for work well done is the opportunity to do more
But it’s also important we do not lose sight of the good work that has already happened in recent years or create an overly risk-averse approach to tackling the challenges we face.
The work we do at IntoUniversity is long-term and recognises the complex needs of the young people we work with. We work with young people from seven years old until they are 18, and provide a multi-stranded programme of support which factors in a number of different types of interventions. Evaluating these in a vacuum is misleading.
As an example: we run residential trips with young people as part of a prolonged series of interventions over several years. While available data might suggest that residential trips have limited impact on access as isolated interventions, we strongly believe they form an important and effective part of our wider long-term programme. Separating them and judging their success in a vacuum fails to recognise how multi-pronged strategic approaches can and do work towards tackling HE inequalities.
Nuance, nuance, nuance
Young people face challenges at every stage of the student lifecycle, so interventions need to be sophisticated and varied, drawing on a range of academic and non-academic skill sets. An overly simplistic approach to evaluation risks losing this nuance in exchange for efforts that are easy to evaluate. We cannot afford to overlook interventions that are not neatly measured but which are just as necessary – if not more so.
Providers should be encouraged to approach complex issues with creativity, innovation, and a long-term lens, and to strike a balance between risk-taking and risk aversion. Otherwise, restrictive evaluation requirements will provoke a “race to the bottom” in which providers become risk averse and innovation is dampened. The knock-on impact will fall on young people and their outcomes, as effective interventions – existing or new – are lost or avoided because institutions fear being unable to provide sufficient proof of their efficacy.
We are privileged to be part of a sector in which there is so much passion and a genuine desire to improve the lives of young people. While we can all acknowledge that many challenges remain in our efforts to make higher education truly equitable, a great deal of good work has already been done and continues to be done.
We wholeheartedly support the OfS in making evaluation a priority – now, as a sector, we need to come together to establish ways to evaluate our efforts effectively without oversimplifying the nature of the work we do and the variables at play.