This article is more than 5 years old

Open science, research indicators and reproducibility

Neil Jacobs of Jisc ponders the latest developments in open science, here and in the EU.
Neil is Head of Scholarly Communications Support at Jisc and also a Jisc representative on the Open Access Implementation Group.

There’s been a flurry of reports and announcements on open science over the last few weeks.

Plenty of good holiday reading – though, like the UK's early summer, perhaps as much heat as light. Certainly, UKRI's policy teams are likely to have heavy suitcases in August.

At the Coalition for Networked Information (CNI)/Jisc event held in Oxford in July, Adam Tickell of the University of Sussex outlined four challenges for universities in the UK: a crisis of confidence in them as public good bodies; the onset of more serious financial challenges; increased regulation; and the need to demonstrate measurable societal benefits. All four of these are enmeshed in questions of open research, incentives and indicators.

While Tickell argued that open research enables universities to play a vital role in countering fake news and bolstering the legitimacy of science, others at the event questioned whether academics have fully taken on board the change in the public settlement on research funding – especially marked in the UK. If research is not reproducible, or in other ways verifiable, then openness will be counter-productive, industry will not trust scientific findings, and the value taxpayers expect from higher research funding will be elusive. The potential damage to universities could be considerable. There is, therefore, an urgent call on universities and researchers to actively encourage an environment that promotes openness and verifiable research, and a positive research culture. All the signs are that we can expect UKRI to be very active on this.

A busy summer at UKRI

UKRI’s leadership role in the related areas of open science and research integrity has been given a little more form through the recommendations of the House of Commons Science and Technology Committee – one of which asks it to convene a national committee on research integrity. No doubt its early priorities will revolve around the Committee’s criticisms of the sector’s performance so far in implementing the Research Integrity Concordat, but there are also notable recommendations around access to, and reuse of, research data. These may get woven into the universities minister’s reflections and guidance to UKRI following the delivery – by a group led by Pam Thomas of Warwick University – of a roadmap for open research data infrastructure, which is due to land on his desk shortly. That report will also be required reading for the UKRI team working in parallel on a UK “research and innovation infrastructure” roadmap.

And, of course, there is the review of UKRI’s policy on open access to research publications, and the gathering momentum behind the Forum for Responsible Research Metrics (FRRM). Last month this Forum released both a progress report and recommendations to Research Excellence Framework (REF) panels on the use of research indicators in the “environment” and “impact” sections, complementing previous advice on their use in the “output” section of REF. The overall message is to take care, to pick from a menu of indicators that might be suitable for a particular institution’s research environment, and to avoid indicators that are not robust or transparent, or that impose an undue burden.

The right indicators

The list of indicators failing those tests is long and, disappointingly for open science advocates, includes indicators for the uptake of ORCID, of permissive licences for research outputs, and measurements of the financial commitment of the institution to various aspects of open science. The FRRM guidance gets a mention in the consultation on the draft REF panel criteria, though it “should not be regarded as mandatory nor a ‘checklist’ of additional requirements”.

The guidance and criteria have been cautious to avoid, as far as possible, any chance that they will inadvertently promote uses of indicators that could damage research integrity – though, of course, much will depend on what universities think the panels will really pay attention to. There remains a need for us as a community to make the responsible use of research indicators as everyday as the responsible use of any other data in research. But, of course, the recent Science and Technology Committee report on research integrity notes some shortcomings there too.

Europe’s “Plan S”?

Things are no quieter in Europe. While sceptics might suspect the EU of trying to exert as much influence on the UK as possible while it still can, the pace of activity is much more likely to be related to the change in Commissioners in 2019. Robert-Jan Smits has been appointed as a special “open access envoy” by the European Commission (EC), charged with introducing some impatience into opening up Europe’s research outputs at a reasonable cost. A recent high-level meeting with senior figures across Europe has promised us “Plan S”, as opaque as it is intriguing.

More openly, perhaps, the EC’s Open Science Policy Platform (OSPP) has released its “integrated advice”, making recommendations to a wide range of players on an equally wide range of topics, including research rewards and incentives, indicators and metrics, research integrity, scholarly communication and the EC’s flagship initiative, the European Open Science Cloud (EOSC). Notable among those recommendations, and highly relevant to UKRI and the REF, is one that states:

“The data, metadata and methods that are relevant to research evaluation, including but not limited to citations, downloads and other potential indicators of academic re-use, should be publicly available for independent scrutiny and analysis by researchers, institutions, funders and other stakeholders”.

This could be taken as support for initiatives to make citation and other data openly available. That would certainly be welcomed by many of those pressing the case for UKRI to join the Wellcome Trust in recruiting academics themselves into meta-research projects to increase the sector’s levels of self-awareness. In addition, those awaiting Research England’s decision on which service will provide citation data to some of the REF panels may watch for any implicit reference to the OSPP recommendation.
