We’re not in Kansas anymore: the new standards of evidence

As a WP practitioner, Julian Crockford of the University of Sheffield reviews the new standards for evidence set by the Office for Students in their latest suite of publications.

Christmas came either a bit late or very early for university-based outreach practitioners, with a smorgasbord of publications from the Office for Students over the last couple of months.

Alongside the long-awaited, and much trailed Access and Participation Plan guidance, evaluators had their own special presents in the form of two interlocking sets of guidance, expanding on OFFA’s 2017 Proposed Standards of Evaluation Practice, and a rather complicated self-assessment evaluation toolkit in February. OfS rounded it all off with a mammoth release of access and participation data at a provider level.

No amount of parcel squeezing could have prepared us for the radical nature of these publications. Indeed, this particular HEI-based WP evaluator is feeling unexpectedly enthused and taken aback by their scale and ambition. They are long, intensive, thorough, and really, really good. I would go so far as to suggest that this new guidance has the potential to be a real game-changer and whisk us off into an entirely new evaluation landscape.

Methodological myopia

The first thing the new OfS position does is to break the methodological nelson hold that has long divided the sector along the lines of C.P. Snow’s two cultures – between, in this case, the proponents of Randomised Controlled Trials and… er… everyone else. In both directions, this division has encouraged a form of methodological myopia, miring us in epistemological struggles that often serve to distract us from the real business of establishing impact.

The 2017 OFFA Proposed Standards of Evaluation Practice were criticised for implicitly, or inadvertently, feeding the “gold standard” narrative. The new guidance is unexpectedly, and refreshingly, pragmatic in this regard – sagely advising that outreach evaluators should use the best methods available to them to answer the searching questions they have set themselves. Rather than implying a methodological hierarchy with RCTs at the peak, the new approach rests on an interconnected and iterative series of evaluation types (narrative > empirical > causal), from which evaluators are to select the most appropriate to their situation and resourcing.

Hefty gauntlet

The real treasure here, however, comes in the form of a hefty gauntlet thrown down to the sector, challenging us to make a massive and radical leap in our approach to evaluation. In the first place, there is a new insistence on the need to take a whole institution approach, prioritising and resourcing evaluation from the very top, and inextricably interweaving it into the design, implementation, delivery and review of WP outreach interventions. This is a significant challenge for a sector which has, albeit with notable exceptions, tended to regard evaluation as – at best – a bit player in the real business of delivery, and at worst a tick-box irritant, to be grudgingly accommodated as a necessary but distracting evil.

Perhaps the most revolutionary aspect is the instruction to “question everything” at all levels. By insisting that HEIs develop a detailed theory of change for widening participation practice, from strategic approaches down and coalface practices up, the OfS challenges us to question all our assumptions about what we do and what works, and then to evidence and test them. This could, and should, give universities the courage to take a long look at outreach activities, to think long and hard, and to rethink where necessary. Progress in social mobility and in ensuring fair access to the many opportunities that HE offers has been, as many have suggested, painfully slow, and as a sector we have sometimes bumped along lacking the courage to ask ourselves really challenging questions.

Elephant in the room

But this guidance asks us to ask and answer uncomfortable questions, to spotlight the elephant in the room, and to focus only on those activities we have effective evidence for. The guidance, perhaps for the first time, and in a very un-regulatory way, dispenses with the need for frameworks and tables of expectations, and acknowledges the highly complex nature of this kind of work (cough – Teaching Excellence). Even more importantly, it provides us with a highly detailed, expansive, ambitiously wide-ranging, and admirably thoughtful set of suggestions for how we can think our way to more effective outcomes.

The ball is now firmly in the sector’s court, and it is up to us to rise to the challenges posed by this guidance and the self-assessment tool, and to engage in the deep-thinking, interrogative approach that the OfS is calling for. We can take heart that, if we engage fully as a sector and grapple authentically with the challenges with which we have been presented, we stand a real hope of making the kind of progress towards the fair and inclusive HE system that has underpinned our outreach efforts to date.

One response to “We’re not in Kansas anymore: the new standards of evidence”

  1. This actually feels very familiar to me. In 2010 I was asked to collate all previous WP policies my institute had implemented and try to find some common measures and policy. What I found was a very consistent pattern of behaviour by institutes and ‘funders’. Occasionally some benevolent bunch of suits would go “HERE! Take this cash and go help the poor and disenfranchised get a university degree. Aren’t we amazing! We must be, because I’ve mentioned this cash to the press about a billion times, mostly so they’ll get off our backs”…

    The institute then asks the question “How should we use the money? I mean what WP aspects should we focus on?”. The benevolent overlords would respond “I don’t know? It’s free money, don’t look a gift horse in the mouth. We trust you to figure it out. We’re not going to spend MORE time and money doing that when really we just want to look like we’re being charitable”.

    So the institute dutifully doffs its cap and tries to figure out what to do with the money. Like many haphazard, almost ad-hoc projects, there is mixed success and an even more mixed ability to measure that success. It’s at this point the overlords say “Of course, we will be auditing you on the money we sent”… “How can you audit us when you never gave us any rules?” the institutes respond. “We gave you loads of money! Don’t be irresponsible with it”.

    I found 16 separate projects, all with the label of Widening Participation, prior to 2010. I’m aware of two more since then. All of them were completely different in approach, measurement and amount of funding, and nearly all of them were developed by the institute itself. So to read that OfS and OFFA are doing the same thing again comes as no surprise. The difference this time is they’re looking to capitalise on the institutes’ hard work.
