Letter from Australia: ERA, EI add up to not much

ERA as an acronym is just too cute. It stands for Excellence in Research for Australia and is our equivalent of the UK’s REF. It begs headlines such as Beginning of a new ERA, which is why I’ve avoided the trap and stuck to something entirely more dull.

This was the fourth iteration of the ERA – the first was in 2010. As usual there has been a mountain of braying and bragging from institutions and peak groups, and it has all been pretty well justified. From what I can see, every single university did better than in the previous iteration in 2015, with a staggering twelve universities scoring at least one 5* (well above world standard) for the first time.

Before I get into it, here’s a beginner’s guide: the Australian Research Council judges the quality of research in 22 broad disciplines and 157 subject areas against world standards and then awards that research one of five ratings, from 5* (well above world standard) to 1* (well below world standard). Universities might conduct research in some areas that is not assessed because there is not enough volume to meet certain thresholds. Unlike in the UK, there is no funding attached to performance on the ERA – that was the original intention, but it was subsequently abandoned.


Some observations

Before I try to make some sense out of this, let me make a number of observations.

Universities Australia pointed out that more than 90% of Australian university research assessed by the ARC is rated as world class or higher — “putting Australia at the forefront of the global research effort”.

As mentioned, twelve universities scored their first ever 5*. These include Newcastle and the University of Tasmania, which received four 5*s each. Western Sydney, University of the Sunshine Coast and the University of Southern Queensland each received three 5*s. That’s impressive.

Flinders University is the only public comprehensive university not to receive a single 5*. Although it has improved its overall performance, moving from one 4* in 2015 to six this year and reducing the number of 3*s from eighteen to eleven, I imagine there has been a fair amount of heartache and introspection going on at Flinders over the past few days.

Four universities received only 5* and 4* ratings – Melbourne, Queensland, Sydney and UNSW – up from two in 2015.

UNSW made the greatest gains, increasing its number of 5*s from 10 to 17 over three years. That is phenomenal.

At the other end of the spectrum, only eight universities were assessed as having produced research well below world standard (1*), and only one of these was in a STEM area (Federation). Two universities – Bond and Notre Dame – received two 1*s each.

Rankers rights


It’s a journalist’s inclination to reduce everything to a ranking. I know it’s not particularly sophisticated, but using the same methodology we used at The Australian for each ERA iteration since 2010, we can start seeing some trends.

The Go8 continue to dominate, and understandably so. However, both Queensland University of Technology and Deakin are contenders. The top 10 show strength across their entire research outputs.

There appears to be a lot of mobility among some of the smaller research players such as Southern Cross University and the University of Southern Queensland. I suspect individual research groups play a huge role in their overall performance, which can in turn shape their overall position. However, a number of smaller and regional universities have obviously put in place good long-term strategies to improve the quality of their research, concentrating on certain areas. Given this was the original aim of the ERA, that aim has been met with gusto across the sector.

Some thoughts

These stellar results come against a backdrop of recent funding cuts. Last December, $330m was cut from a university research stream – which wouldn’t have any impact on the current ERA results. However, a series of other cuts estimated to be around $2.1bn will be felt because of the way universities cross subsidise research from student teaching grants. Can universities continue to trend upwards in spite of the cuts? We will have to wait and see, but I suspect the answer is yes.

Obviously, the upward trajectory of every university in the country suggests a couple of things. First, they are actually getting better at conducting high-quality research, and the ERA has provided an effective means of focusing minds on what research is conducted and why. That is a good thing.

Secondly, universities are getting better at submitting their ERA data and reports. It’s hard to know how much of the improvement reflects the quality of the actual research and how much reflects the quality of the data and reporting.

Indeed, the ERA has spawned an entire new industry, with every university now having teams whose entire job is to collect and submit ERA data to ensure their institution’s performance is seen in the best possible light. The Go8 estimates it costs its institutions around $8 million a year. I’ve seen one estimate of the cost to the entire sector as upwards of $100m, although I suspect that is a highly inflated figure. Either way, it’s still a lot of money.

As one insider noted, preparing for the ERA is like teaching to the test. Universities know what they have to do to perform well, having had three previous iterations to learn from.

Prestige is a huge motivator – it attracts international research collaborations as well as international students. Maybe at least some of the results we see are driven by what research universities put forward. They might, for example, feel an obligation to continue research in obscure and unfashionable areas – let’s say medieval studies – but produce just little enough to stay under the threshold for inclusion in the ERA. Is this gaming the system? Probably. Does it matter? Probably not.

The huge disparity between performance in the STEM subjects and the HASS areas likely has much more to do with the data that is collected and the methodologies applied than with any true indication of research performance. In the broad field of education, for example, Melbourne University is the only institution to score a 5* – although all public universities submitted data and were judged in that area. Monash and Melbourne are the only institutions to score 5* in economics. UNSW is the only one in creative arts and writing. Yet 21 universities were rated 5* in environmental sciences. You see my point.

The impact factor

Two days after the ERA was released the ARC also released the first Engagement and Impact Assessment. I think we could aptly describe it as all cheese and no chalk. It’s just an embarrassment for both the ARC and the universities that spent so much time and money submitting case studies.

As one person said to me: “It’s nothing more than an essay writing competition”. And that is being generous.

Another said: “It’s just a load of s#@t.” And that is still being generous.

Is it all worth it? Some universities certainly are questioning the time and expense required to submit to these exercises. With no carrot (funding) and no stick (loss of it), there is no real motivation other than prestige and how the results can be applied to marketing campaigns.

In the end, it’s a huge bureaucratic exercise that is lost on the average taxpayer. And while the ERA has had significant impacts on institutional behaviour and approaches, the engagement and impact report seems to lack any clout whatsoever. Can it be redeemed? We will have to see what happens after the May election.


One response to “Letter from Australia: ERA, EI add up to not much”

  1. I don’t understand the point below – the top 10 on the various “journalistic” rankings are Go8 plus QUT and Macquarie. This point seems a bit harsh on Macquarie. Is this an error?

    “The Go8 understandably continue to dominate and understandably so. However, both Queensland University of Technology and Deakin are contenders. The top 10 show strength across their entire research outputs.”
