
Is the shift to outcomes as simple as it sounds?

Julian Crockford looks in detail at OfS' shift to measuring by outcomes, asking if it's really as simple as it sounds

Julian Crockford is a researcher and evaluator in the Student Experience Evaluation and Research team at Sheffield Hallam University.

When the Office for Students released an outline of its future regulatory position for widening participation before Christmas, many of us were, I suspect, too distracted by the raging Twitter debate about whether Die Hard really counts as a Christmas film to give it our full attention.

But now we’re back, we can return to our consideration of OfS’ aspirational A new approach to regulating access and participation in English higher education. While its “bold new approach to supporting social mobility, equality and diversity through higher education, and … desire to be radical and ambitious in reducing the gaps in equality of opportunity” sounds like something to be welcomed, some of its broader implications make for disconcerting reading.

One of the outcomes of OfS’ recent consultation is the removal of formal regulatory expectations about institutional spending. This, coupled with leaks from the Augar review hinting at a potential reduction in tuition fees (and thus a need for institutions to rebalance financial priorities), is a worry for widening participation colleagues. The counterweight to this more liberal approach to institutional spending regimes is provided by the regulator’s much-heralded approach, in which institutions will be judged by their outcomes rather than (financial) inputs, measured by their progress against quantitative targets of their own and OfS’ devising.

Little attention, however, is given in the publication to how these targets (or OfS’ admirably ambitious key performance indicators) are to be met.

Letting nature take its course

Setting quantitative targets, while a cornerstone of the Deliverology school of New Public Management, is really only half the story. Crudely put, such approaches assume that the policy makers’ skill lies in selecting appropriate objectives and expertly coupling them with suitable quantitative measures before imposing them on public services – and then letting nature take its course. Responsibility for determining exactly how these goals are to be accomplished is delegated elsewhere, and policy subjects are presumed to glide frictionlessly from stated problem to quantified solution, like a dancer in Swan Lake on Ice.

Unsurprisingly, given chief Deliverologist Michael Barber’s helmsmanship of OfS, this strategy explicitly underwrites OfS’ new approach. Guidance indicates that institutions will be expected to submit an annual impact report which, where targets have not been hit, is to be accompanied by “an action plan setting out any steps that need to be taken to make improvements to their current plan”. This effectively shifts all the heavy lifting of working out just how to achieve OfS’ key performance indicators onto HE providers.

There’s the rub

The challenges that OfS has set itself, and by extension the sector, are the most stubborn and wicked of widening participation objectives: reducing the marked gaps in participation, retention and attainment that exist between different groups of students, particularly along lines of socio-economic disadvantage, ethnicity and disability. That sector-wide progress in closing these gaps is so slow is, at least in part, down to their complexity and wickedness.

Take the sector-wide disparity in degree outcomes between BME students and their white counterparts. OfS acknowledges the complex nature of the problem when it distinguishes between structural factors (“such as entry qualification, subject of study, age of students, and the provider at which a student studies”) and other issues, over which HE providers are assumed to have more control. Significantly, it concludes, “some providers have already made significant progress in closing unexplained gaps in non-continuation and attainment, and so we are setting our target to eliminate the unexplained gaps over a shorter period of time”. The term “unexplained” here, and on a number of other occasions in the guidance, carries a disproportionate amount of ideological weight and lies at the heart of the fundamental tensions in this approach.

Notwithstanding work already being done across the sector in reducing disparities in degree outcomes by ethnicity (e.g. at Kingston University and Wolverhampton University), addressing and resolving these deeply troubling differences will be a complicated, highly context-dependent process.

Even a cursory and very partial selection of potential causal or contributory factors throws up a range of issues that need to be considered. These include students’ engagement with their curriculum, their sense of fit with an institution, the match between student and institutional social and cultural capital, broader challenges around the provision of inclusive learning and teaching, and so on.

Research into these areas tends to be high level, and there is currently much less practically orientated work to demonstrate to HE providers how these issues can be addressed and resolved in their own contexts. If, as OfS suggests, the provider-specific contextual impacts of these factors remain “unexplained” (under-researched and undiscussed), then HE providers have no practical levers or established techniques for responding to them.

The recently announced tender for a WP Evidence and Impact Exchange (EIX) may be intended to respond to these concerns and result in the building of an evidence base of what works, giving providers the levers they need to change the facts on the ground.

But the deployment of the What Works Network as a potential model and aspiration for the EIX rings alarm bells. The model draws heavily on a quasi-scientific approach to evaluation: the EIX will be seeking to provide “evidence of impact”, “assessing how effective policies and practices are against an agreed set of outcomes”.

While a dedicated resource to generate and collate evidence is much needed, there are fears that it may rely on black box approaches to evaluation and evidence, such as randomised controlled trials, which promote the measurement of outcome changes over white box investigations of how those changes came about – not just what works, but how, why, when and for whom.

If this is the case, then the EIX threatens to be more of the same, cast in a deliverology mould, while ducking the awkward question of what, practically, a diverse range of providers with diverse student bodies can do to close the gaps with which we have been tasked.

Tough choices

Lacking the kinds of explanatory factors that a white box approach might generate, HE providers are effectively being asked to achieve the impossible: to navigate to a far-off destination across a hugely complex environment with no road map. Nonetheless, with regulatory pressure piling up to get to the designated destination as quickly as possible, providers might decide that they have only a limited range of options from which to choose.

They could throw everything they can marshal from limited resources at the problem, scattergun style, in the hope that something will work. They might look for ways to game the numbers to achieve the desired quantitative outcome, without necessarily having to change the situation on the ground. Alternatively, an HE provider could decide to delay setting off on its quest until it has its hands on a map of the landscape to guide it. But an institution trying to navigate a highly complex social reality by relying only on unexplained contributory or causal factors might quickly find itself lost and tempted to resort to option one.

A theory-driven future

Theory-driven approaches are concerned not just with whether an intervention works or does not work, but also with how and why it does so. From this perspective, it’s important to develop change models which sketch out practitioners’ understandings of the causal processes assumed to lead to outcome goals (such as equalising the outcomes of different student groups) and action models which outline the practical activities that must be taken to produce the desired changes.

We should prioritise time and resources to develop a theory-driven approach, investing more in our efforts to investigate and understand both the problems that confront us and potential solutions.

Ideally we would want OfS to actively incentivise and reward higher education providers for wide-ranging knowledge production in this area. This might include thinking about, researching, understanding, and communicating the complex nature of the problems that we face, so that we can collectively make faster, more radical changes and progress. Rather than focusing solely on outcome targets, the regulator would be equally interested in how we close the unexplained knowledge gaps, creating the conditions through which we can develop effective, sustainable and permanent solutions and reach the laudable goals that we all share.
