On Tuesday I chaired a conference on the Knowledge Exchange Framework (KEF).
It’s been nearly six months since Jo Johnson introduced us to “the third leg of the HE stool” – KEF – received by many academics at the time as “yet another bloody league table”.
However, speakers at the event were quick to highlight how much work on knowledge exchange happened before the October announcement, how much has happened since, and how much is still to do. Although it’s useful to give REF, TEF, and KEF equal profile, in reality the three are very different. As one speaker put it, the latter should be a formative rather than summative assessment, and is a less “mature”, still-evolving field.
The diverse and holistic nature of knowledge exchange activities, including but not limited to commercial ones, contributes to this agenda being harder to articulate. However, not only are there billions of pounds to be had, but also an opportunity for institutions to honestly self-evaluate, and then to communicate their own unique strengths. Much “KE” activity already takes place, but it needs appropriately recording, analysing, and retelling.
KE’s third-place image problem
Knowledge exchange has an image problem. It’s been called technology transfer, technology exchange, knowledge transfer, engagement, and a range of other names over time, yet the concepts and definitions remain opaque and contested. It’s everything beyond research and teaching, but also includes a bit of both. See what I mean?
Technology transfer is the more visible and more transactional “tip of the KE iceberg” (as one delegate called it), and one that ministers tend to grasp quickly: “start-ups and spin-outs you say, we like those, don’t we?” UK universities are sometimes seen as relatively bad at technology transfer or commercialising IP, though the evidence for that doesn’t appear to stack up.
However, wider and potentially more transformative conceptions of knowledge exchange – that include everything from outreach, student volunteering, staff development, external training, and public engagement – are harder to measure, especially across diverse institutions and regions.
KEF shouldn’t just be about money and commercialisation, but that’s the only message some audiences – such as HMT – care about. As one well-placed speaker noted, the Treasury wonks are getting more demanding, more sophisticated, and more sceptical. The sector and its representatives need to “get cleverer” at showing the precise impact caused by investments.
It’s looking like KEF will involve a leadership statement (or concordat) – for institutions to self-evaluate, define, and communicate their own KE strategy. And, of course, some metrics. The latter will probably see institutions clustered and benchmarked. Speakers had good intentions about “proportionate” reporting burdens.
A one-size-fits-all approach clearly won’t work; as another speaker said, this “isn’t a simple league table with one institution best at knowledge exchange”. Also, real life is messy: how can everything universities do be measured? In reality, much KE activity is already happening; it’s just not being captured, synthesised, or described. But nobody wants every academic completing a timesheet.
Technology could help, with a HEFCE/UKRI-funded project underway to see if metadata can be used to link datasets about impact. But KE data is not always robust, for instance around public engagement. One speaker felt it was actually better to get on and do this stuff rather than talk about it: “we’re all just examining our navels with this KEF stuff”.
There’s also the question of how KEF relates to REF impact case studies. The consensus seems to be that KEF is about an institution’s overall capabilities, not a peer-reviewed example of a single research impact at a point in time.
One promise of KEF is that it should support better self-evaluation so that providers can work on their weaknesses honestly without finger-pointing, and crow about their strengths. It should provide institutions with valuable new information they can then tailor to different audiences. However, it should also be formative, open, and flexible – rather than box-ticking. Universities will always need to answer questions about how their impact compares (with other institutions and internationally), and how they can improve. KEF could help with that.
The general public is still remarkably unclear about what universities actually do, even among those who studied at one. For example, Oxford residents apparently don’t know that the university based there does research, while Brightonians are largely unaware that there are two universities in the town.
However, as one speaker pointed out, in many ways “knowledge exchange” predates new-fangled innovations like “research”, and goes to the heart of why many universities were founded. And it’s not the only framework to have brand/naming/accuracy/interpretation issues.
Intrinsic motivation or funding carrots?
It looks like there’s serious money to be had around this agenda, £7bn via UKRI and another £7bn via the industrial strategy, in this spending period. But, will it be concentrated in courses, regions, and institutions that are the “usual suspects”, as is increasingly the case with research funding?
Also, will the private sector be able to offer double the amount of public funding (as required by current matching policy), and if so, what strings will come attached? Speakers pointed out that international comparisons with Silicon Valley and blockbuster drug patents aren’t that helpful. We don’t have “unicorns” popping up, whose founders are keen to give back to their alma mater. And our early-stage investment scene is more The Office than Wolf of Wall Street.
Some speakers echoed my concerns that OfS and Research England (and their parent departments) will eventually pull institutions in different directions. Research England’s parent organisation, UKRI, is a “federation” of nine councils – the seven research councils, Innovate UK, and Research England itself – working alongside the devolved nations’ funders. The ac.uk emails have been replaced by .org.
There’s also an outstanding question about the “right balance” (to quote Johnson) between HEIF funding and mainstream QR funding. Apparently, changes to the formulas for the latter are not sexy enough to interest most ministers, whereas HEIF is more headline-friendly.
A key recurring question was whether institutions should care about doing this stuff better for their own sakes (to focus on their core missions), or should behaviour be incentivised by the promise of funding? Those from institutions with more resources seemed to lean towards the former, and vice versa.
Tensions between competition and collaboration were also raised by delegates, though this is something the sector seems to handle well in most other fields. Back in October Johnson talked of a “constructive competitive dynamic”, though there are apparently competing visions (about that vision) at UKRI, as well as from the devolved nations who reject the competition element.
Other delegates wondered whether employers and businesses have sufficient influence over KEF, with responses to that again seeming to vary by institution. In my view, it shouldn’t be either/or; there are plenty of examples where businesses and HEIs are working together well, though KEF has the potential to highlight and incentivise that work better. One speaker described how KE teams can be the one-stop entry point to the university for businesses, with their own institution working towards customer service accreditation.
Human beans count
Another theme from the conference was about the importance of those pesky little variables that lie at the nexus between funding, excellent research, teaching, and impact – or people, as they’re sometimes known. And it’s not just about academics.
The whole knowledge exchange field is professionalising in new ways, for instance with communities of KE practice emerging through organisations such as PraxisAuril. Research shows that the varied group of “KE practitioners” are committed to helping academic colleagues translate their research.
Vitae – the association for early career researchers – was picked out as an effective role model, quietly bolstering perceptions of the once “lost tribe” of post-doctoral researchers.
I arrived at the conference feeling a bit confused and sceptical of KEF but left a little clearer and more optimistic. This stuff is tricky, but it matters.
2 responses to “What’s the latest with the knowledge exchange framework (KEF)?”
As UKRI proceeds with its definitions of KEF, it might be worth looking at http://www.umultirank.org – an international HE comparison tool that allows users to compare universities around the world on the basis of a huge number of data indicators bunched into five ‘dimensions’: teaching, research, internationalisation, regional engagement and, you guessed it… knowledge exchange.
It’s far from perfect, but it may be sensible to consider international benchmarking as well as national and this offers an existing baseline framework and data source.
There are some interesting measures in umultirank. Indeed, we are evaluating most of them as part of the ongoing KEF analysis. Some oddities though (e.g. normalising patents by student FTEs). Income from private sources is an interesting one: whilst it can give a useful impression of the scale of activity, it can be quite discipline-specific. We are exploring some more nuanced ways of taking into account the assets and underlying capacity a university has, including discipline mix, size, income from various sources, location, etc.
I do agree the international dimension is interesting – many English HEPs will consider their peers to be international. However, I’d be interested to see how rigorously some metrics are gathered in some countries. When we’ve done international comparisons, the devil is often in the detail of the data definitions.