
KEF consultation closer

Louis Coiffait analyses the latest KEF documents, recaps what KEF and knowledge exchange even are, and poses four key questions.


Today, Research England publishes three documents to help institutions prepare for the consultation later this month on the Knowledge Exchange Framework (KEF).

One of those documents summarises the 106 responses to the December 2017 call for evidence on what data are relevant to knowledge exchange and how they can be used appropriately. In particular, it looks at how institutions can be compared in a “fair and meaningful” way. The majority of responses are read as having a “cautiously positive tone, essentially this could be useful if done well”.

Another document outlines which existing datasets will be used to develop KEF, including the Higher Education Business & Community Interaction (HE-BCI) survey, the Higher Education Innovation Fund (HEIF) process, and other HESA and UK Research and Innovation (UKRI) data collections. Research council grant funding to institutions, and research outcomes as disclosed by researchers to the research impact assessment platform Researchfish, have apparently been useful but are deemed unsuitable for KEF “at this time”.

The third document is a report by Tomas Coates Ulrichsen of the University of Cambridge. It highlights the diversity of the sector and proposes initial clusters of institutions with “similar sets of knowledge and physical assets” which could be benchmarked against each other. The clusters are backed by a conceptual framework that considers each university’s capabilities along three key dimensions: existing knowledge base, knowledge generation, and physical assets. Research England will build on the clustering approach and may manually assign some universities to groupings before the consultation, for instance where clusters have relatively few institutions or contain uneasy bedfellows. Elsewhere on Wonkhe today, Hamish McAlpine of Research England explains the thinking behind the clusters further.

What’s KEF again?

You may be forgiven for being a little hazy about KEF, launched by Jo “framework” Johnson back in October 2017, his parting gift to the sector, as it were.

In a 24 November 2017 letter to David Sweeney, Johnson tasked Research England with developing KEF to:

Evaluate the contribution our universities make to the exploitation of knowledge … providing comparable, benchmarked and publicly available performance information about universities’ knowledge exchange activities.

KEF is intended to give universities new tools to better understand, benchmark and improve their own performance. The aim is also to provide businesses and others with more information to see, understand and access the knowledge and expertise embedded in different English universities. And of course, it’s also a way of making universities more accountable for their knowledge exchange activities.

And what’s knowledge exchange?

As a concept, “knowledge exchange” has a trickier story to tell than universities’ other big missions, teaching and research. Johnson described it as the “third leg of the stool”, and some just say it’s “everything else” that universities do. And yet, none of this quite captures it, as it’s typically closely intertwined with both teaching and research.

At its heart, knowledge exchange is about getting the new knowledge created through research and transmitted through teaching out to other people, so they can use it in some way too. That can include commercialising research into new enterprises or intellectual property, transferring new technologies from lab experiments into practice, or the wider (and harder to measure) impacts of knowledge such as public-facing campaigns, professional development, or policy influence. No wonder it’s a slippery concept.

Four questions for KEF

One key question is whether (to quote Research England) KEF will successfully evaluate the exploitation of knowledge “in its broader sense, not solely commercial”. It’s looking like the harder things to measure – such as public engagement and cultural impact – will be considered later, as existing data sources are of little use. Perhaps future iterations of KEF will capture such things even if they don’t fit easily into metrics.

Another, related concern is whether KEF’s metrics and processes have sufficient “regard to the burden and cost of collection”, to quote Johnson. It’s not as if universities are short of other regulations, metrics and accountability measures to adapt to at the moment, and that burden falls disproportionately on smaller institutions.

And third, will the proposed clusters get the balance right between a level playing field for all institutions and meaningful comparisons for groups of benchmarked peers? It remains to be seen whether the approach will highlight (and make more accessible) the full range of excellent knowledge exchange activity across a highly diverse sector, as well as help to incentivise improvement.

Finally, some people might say the KEF development timescales reflect careful and consultative policymaking. A technical advisory group, chaired by Richard Jones of the University of Sheffield, has provided advice. It has had to account for the work of the knowledge exchange steering group and concordat led by Trevor McMillan of Keele University. Others involved include the Office for Students, universities, learned societies, PraxisAuril, the National Centre for Universities and Business, the devolved funding councils and executive bodies, and other UKRI councils. Others still might say the timelines have slipped from the spring 2018 consultation and autumn 2018 implementation requested by Johnson: the KEF consultation is now expected to run from November 2018 to “early 2019”.
