As a long-time evaluation enthusiast, I was pleased to see the intensified focus on it proposed in the current Office for Students (OfS) consultation on a new approach to regulating equality of opportunity.
Key reasons to party are the proposals to elevate the status of evaluation to the purview of institutional senior leadership and to “encourage” providers to increase the volume and quality of evaluation.
I’m concerned, however, about proposals that Access and Participation Plans include a four-year project plan for the evaluation process and a commitment to publishing outputs.
While this is clearly intended to hold institutional feet to the fire, it also threatens to lock in evaluators (and their limited resources) to a fixed programme of activity – and threatens to limit responsiveness to events (a sudden and rapid cost of living crisis, anyone?), sector changes, or interesting developments and possibilities.
The emphasis on publishing outcomes also threatens to set an impossibly high bar for outcome sharing of all types. Could promising, partial or merely potential outcomes be tossed out with the failure-to-publish bathwater?
Pump up the volume
These reservations aside, the proposal that providers include a comprehensive intervention strategy and a theory of change seems a perfect excuse to bust out the dancing shoes.
While we might be forgiven a nervous gulp when thinking about the potential resource implications, we might also want to raise a glass to the ambitions underwriting this. At first glance, the proposals look like a positive step towards a sector-wide theory-driven approach to addressing the equality of opportunity risks.
But disappointment sets in when considering the theory of change structure provided in Appendix D. A purist might complain that it’s a logic model rather than a theory of change proper, and as such it just doesn’t go far enough to realise OfS’s admirable ambitions.
The model provided jumps from “inputs” to “activities” to “outputs” to “outcomes” without asking us to consider how activities actually generate the desired outcomes. Logic models are great for mapping implementation and resourcing issues.
But they don’t require us to invest any serious reflection in how our interventions cause the changes we want to see.
In contrast, a more comprehensive theory of change approach encourages and supports a detailed consideration of the mechanics of our interventions. In short, it forces us to consider exactly how our interventions work, how they create the equality of opportunity changes we want to see.
And this detailed consideration of change mechanisms is essential if we want to understand more about how and why our activities work and how we might go about applying successful activities to other contexts. Without this serious engineering work, logic models can make widening participation look like a form of arcane magic as activities lead seamlessly to desired outcomes.
We’re gonna have a good time tonight (celebration)
In contrast, the current TASO (Centre for Transforming Access and Student Outcomes) project to explore and pilot evaluation methodologies designed for interventions with small numbers of participants promotes the value of a more “thinky” approach to developing theories of change.
These methods were originally of interest because they suggest ways of conducting robust evaluation with small numbers of participants, in contrast to the higher statistical power required by trial-based and quantitative designs. But an interesting corollary has emerged as the project has progressed. (Full disclosure: I have a minor role in the project, helping “translate” the methods into a WP sector context.)
Many of the methods discussed in the TASO guidance, including Contribution Analysis, Process Tracing and General Elimination Methodology, actively employ theory of change as part of a rich, exploratory process. In these approaches, TOCs are designed with a focus on the “mechanisms” or contributory factors of change – the actions or contexts that cause the change to happen.
Even more tantalisingly, many of these methods actively enrich, deepen and develop these TOCs as part of the evaluation process, often by inviting participants, and other key stakeholders, to explore their experience, understanding and perceptions of how these interventions actually work.
This is where the party’s at
What emerges from this process is a rich understanding of how and why interventions do what they do, and which contextual factors, or alternative hypotheses, contribute to or generate the changes we want to see.
As a result, these methods are often better at grappling with the necessary complexity of work in this space: activities which often involve heterogeneous groups of students with different backgrounds, interests and capacities, or outcomes generated by a range of different causal mechanisms which interact with and influence each other.
Perhaps then, as a sector, we should accept that once you go beneath the surface, things are a lot more complicated than they initially seem. But to break out the theoretical can opener and take a peek beneath the hood of our interventions, we need tools that enable and encourage us to do this – and I’m concerned that the logic model approach proposed in the latest APP consultation just doesn’t go far enough. It encourages us to continue riding on rails towards neat, simple, linear causal models, and to ignore the more complex and complicated gubbins going on beneath.
I’m aware that the OfS could, quite legitimately, complain that this forces them into a damned if you do / damned if you don’t paradox. Either they push the sector and its thinking hard and get slammed for raising expectations that will be difficult for providers to meet (especially small and specialist providers and others with limited resources), or they get moaned at for not going far enough. I’m tempted to lean towards the harder road – with an awareness of provider limitations.
If we don’t take a disruptive approach and force a paradigm shift in the way we as a sector think about and evaluate our equality and widening participation activities, we’re just not going to be able to meet the ambitious equality of opportunity challenges that OfS has set us. Complicated problems require complex thinking, and a determination to grapple with this would be a real cause for celebration.