The Government has tasked HEFCE, and in the longer term Research England within UK Research and Innovation (UKRI), to construct a new Knowledge Exchange Framework (KEF) system. HEFCE has announced that it is establishing a Technical Advisory Group to advise it on design and delivery of this new system.
The essence of this task is to make fair comparisons between higher education institutions on their performance in knowledge exchange, particularly their contribution to delivering the Industrial Strategy.
These fair comparisons would enable HEIs to understand their knowledge exchange (KE) effectiveness against their peers, leading to improvements in performance and efficiency. They would provide better information for the public, businesses, local bodies and other enterprises on strengths and weaknesses in the HE sector, and on opportunities for partnering. They would also add to the evidence on the successes of KE policy and funding overall, improving transparency and accountability and shaping future strategy.
This development will attract much comment and discussion about the policy purpose and use of KEF metrics, details of which have been set out in a letter from the Universities Minister to David Sweeney, the RE Executive Chair Designate. However, the immediate task for HEFCE is to develop a sound approach to implementation.
Making it happen
There are clear challenges in implementing a fair institutional KE comparisons system. However, there are also important opportunities to learn more from this exercise, and to develop our wider understanding of, and evidence on, KE and commercialisation. This is particularly important given the challenges Government has set for universities. Universities will benefit from the additional funding announced in the Industrial Strategy, with the Higher Education Innovation Fund rising to £250 million by 2020-21, but much is expected of them in return. Universities are flagged in the Industrial Strategy as a key part of its delivery, both in working with business and in local economic development. Universities – and their strong relationships with businesses and investors – will also be vital to meeting the target, set out in the Strategy, of raising R&D investment to 2.4% of GDP, which requires a significant increase in public and private investment in R&D.
So what will we learn from developing the KEF metrics system?
What could we learn?
First, we shall learn how to share knowledge and expertise about KE and commercialisation across the new agencies within UKRI and the Office for Students, wider partners (such as the Devolved Administrations) and experts from universities and the wider KE ecosystem.
The group assembled to advise HEFCE and then Research England brings together analytical minds from different disciplines and different contexts. The group will also benefit from advice and support from HESA, HEFCE Analytical Services (and in future from the Office for Students) and HEFCE/Research England, UKRI and Research Council KE analysts. The work will build on established sources like the Higher Education Business and Community Interaction (HEBCI) survey, but will reach out for wider data from across the UKRI Councils – for example from ResearchFish or Innovate UK grant data – and elsewhere.
Second, we will think more intensively and thoroughly about the gaps in our collective data, and the potential avenues to fill those gaps. This has always been an important issue for HEFCE, but this new exercise puts added impetus and effort behind the quest. There may be short-term implications for the work HESA will do to review the HEBCI survey, but there is also likely to be a long-term issue for UKRI. For example, the McMillan review of good practice in technology transfer, referenced in the Government’s Industrial Strategy White Paper, noted the significant challenges in producing better metrics on the spin-out and licensing performance of universities, a critical area for improving commercialisation policy and practice.
Third, to make fair comparisons, the technical group will need to investigate thoroughly the characteristics of universities that influence their KE performance. This is important to enable us to compare universities against their “peers”, as set out in the Minister’s letter. It is also important for setting “benchmarks”: benchmarks need to represent reasonable levels of expected performance, factoring in underlying features of the institution that are not directly related to KE performance.
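One simple way to think about such a benchmark – purely as an illustration, since the actual KEF methodology has yet to be decided – is as the performance we would expect from an institution given its underlying characteristics. The sketch below fits a basic regression of a KE outcome on hypothetical institutional features and treats the fitted value as the benchmark; all variable names and figures are invented for illustration.

```python
# Illustrative only: benchmark = expected KE performance given
# institutional characteristics (hypothetical data, not the KEF method).
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical characteristics: research income (£m), academic FTE --
# features assumed not to reflect KE effectiveness itself.
X = np.array([[120, 900], [45, 400], [300, 2100], [80, 650]])
# Hypothetical KE outcome, e.g. annual KE income (£m).
y = np.array([30, 10, 70, 22])

model = LinearRegression().fit(X, y)
benchmark = model.predict(X)   # expected performance for each institution
for actual, expected in zip(y, benchmark):
    # A ratio above 1 would indicate above-benchmark performance.
    print(f"actual {actual:5.1f}  benchmark {expected:5.1f}  ratio {actual / expected:.2f}")
```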
Variables of scale
One example of the challenge in making fair comparisons is taking account of scale. How do we take account of relevant dimensions of institutional size – such as the number of academics (or research volume) or students – when comparing performance in, for example, creating new spin-out or graduate start-up companies? This is an important factor often forgotten in international comparisons.
For example, UK policy makers focus on the success of Stanford University in the United States in commercialisation. However, Stanford’s success needs to be set in the context of its resources and research concentration. A 2014 MIT Skolkovo report highlighted that Stanford had an annual revenue budget of $4.1 billion and 6,980 undergraduate students. Oxford University, the largest UK university discussed in the report, had an annual budget of $1.8 billion and 16,745 students. The effectiveness of Oxford in producing spin-out companies (which is impressive) needs to be judged against overseas comparators in the context of input capacity, research volume and resources.
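As a minimal illustration of scale normalisation, the sketch below divides an annual spin-out count by budget and by student numbers. The budget and student figures are those quoted from the report above; the spin-out counts are hypothetical placeholders, not real data.

```python
# Illustrative scale normalisation. Budget and student figures are those
# quoted above; spin-out counts are hypothetical placeholders.
institutions = {
    # name: (annual budget $bn, students, spin-outs per year -- hypothetical)
    "Stanford": (4.1, 6980, 10),
    "Oxford":   (1.8, 16745, 8),
}

for name, (budget_bn, students, spinouts) in institutions.items():
    per_billion = spinouts / budget_bn              # spin-outs per $bn of budget
    per_10k_students = spinouts / students * 10_000  # spin-outs per 10,000 students
    print(f"{name}: {per_billion:.1f} per $bn budget, "
          f"{per_10k_students:.1f} per 10,000 students")
```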
HEFCE has published a series of expert reviews on the issues of university KE characteristics, and on options to cluster universities with their peers, from KEF metrics advisory group member Tomas Coates-Ulrichsen of the Centre for Science, Technology and Innovation (CSTI) at the University of Cambridge. Other factors that may affect KE performance include research intensity, discipline mix and place.
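As a purely illustrative sketch of what peer clustering might look like – not the CSTI methodology itself – the snippet below groups hypothetical institutions by standardised characteristics such as research income, discipline mix and staff numbers, so that comparisons are made within a cluster of peers rather than across the whole sector.

```python
# Illustrative peer clustering on institutional characteristics
# (features and figures are hypothetical).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Columns: research income (£m), STEM share of activity, academic FTE.
features = np.array([
    [300, 0.65, 2100],
    [120, 0.55,  900],
    [ 45, 0.30,  400],
    [ 20, 0.25,  250],
    [250, 0.70, 1800],
    [ 60, 0.40,  500],
])

scaled = StandardScaler().fit_transform(features)  # put features on a common scale
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scaled)
print(clusters)  # institutions sharing a label would be compared as "peers"
```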
These will be exciting and interesting times for KE and commercialisation wonks. We will share more data, and reduce burden through multiple uses of our data. We will explore opportunities to expand our datasets, to improve our descriptions of KE performance and hence effectiveness of our policy and funding instruments. And we will learn and promote better understanding of the varied characteristics of universities, and how these characteristics shape the contributions of higher education to the economy and society.