A rather obscure world university ranking
The URAP ranking has been going for a few years now, and indeed I wrote about it here back in 2013. It has still yet to gain any purchase, though, in the rough, tough battle for the top of the ranking of rankings.
The University Ranking by Academic Performance (URAP) Research Laboratory was established at the Informatics Institute of Middle East Technical University in 2009. The main objective of URAP is to develop a ranking system for world universities based on academic performance, as determined by the quality and quantity of scholarly publications. In line with this objective, a yearly World Ranking of 2,000 Higher Education Institutions has been released since 2010.
Things really haven’t moved on in recent years, then. And given that the ranking is largely based on publications and citations, both of which are covered fully in other league tables, this perhaps explains something. Let’s have a closer look at the methodology (which includes a couple of exciting formulae):
The URAP 2014-2015 World Ranking is based on six academic performance indicators. Since URAP is an academic-performance-based ranking, publications constitute the basis of the ranking methodology. Both the quality and quantity of publications and international research collaboration performance are used as indicators. The indicators, the data sources and the duration of coverage are summarized in the table below.
A detailed description of each indicator is provided below:
Article: a measure of current scientific productivity, covering articles published in 2013 that are indexed by Web of Science and listed in InCites. The article count covers articles, reviews and notes. The weight of this indicator in the overall ranking is 21%.
Citation: a measure of research impact, scored according to the total number of citations received in 2011-2013 for articles published in 2011-2013 and indexed by Web of Science. The effect of citations on the overall ranking is 21%.
Total Document: a measure of the sustainability and continuity of scientific productivity, represented by the total document count, which covers all scholarly literature (conference papers, reviews, letters, discussions and scripts, in addition to journal articles) published in the 2011-2013 period. The weight of this indicator is 10%.
Article Impact Total (AIT): a measure of scientific productivity corrected by the institution’s normalized CPP(1) with respect to the world CPP in 23 subject areas between 2011 and 2013. The ratio of the institution’s CPP to the world CPP indicates whether the institution is performing above or below the world average in that field. This ratio is multiplied by the number of publications in that field and then summed across the 23 fields, which is summarized in the following formula:

AIT = Σ_i (CPP_institution,i / CPP_world,i) × P_i, summed over the 23 fields i, where P_i is the institution’s number of publications in field i.
This indicator aims to balance the institution’s scientific productivity with the field-normalized impact generated by those publications in each field. The weight of this indicator is 18%.
Citation Impact Total (CIT): a measure of research impact corrected by the institution’s normalized CPP with respect to the world CPP in 23 subject areas between 2011 and 2013. The ratio of the institution’s CPP to the world CPP indicates whether the institution is performing above or below the world average in that field. This ratio is multiplied by the number of citations in that field and then summed across the 23 fields. This indicator aims to balance the institution’s scientific impact with the field-normalized impact generated by the publications in each field, which is summarized in the following formula:

CIT = Σ_i (CPP_institution,i / CPP_world,i) × C_i, summed over the 23 fields i, where C_i is the institution’s number of citations in field i.

The contribution of this indicator to the overall ranking is 15%.
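As described, AIT and CIT are the same field-normalized sum applied to publication counts and citation counts respectively. A minimal sketch in Python of that calculation; the field data and CPP figures below are entirely invented for illustration, since URAP does not publish per-field numbers:

```python
# Field-normalized impact total as described by URAP: for each subject
# area, the institution's citations-per-publication (CPP) is divided by
# the world CPP for that field, the ratio is multiplied by the
# institution's count in that field (publications for AIT, citations for
# CIT), and the products are summed across all fields.

def impact_total(fields):
    """Sum of (institution CPP / world CPP) * count over all fields."""
    return sum(f["cpp_inst"] / f["cpp_world"] * f["count"] for f in fields)

# Hypothetical data for just three of the 23 fields.
publication_fields = [
    {"cpp_inst": 4.0, "cpp_world": 5.0, "count": 120},  # below world average
    {"cpp_inst": 6.0, "cpp_world": 4.0, "count": 80},   # above world average
    {"cpp_inst": 2.0, "cpp_world": 2.5, "count": 50},
]

ait = impact_total(publication_fields)
print(ait)  # 96 + 120 + 40 = 256.0
```

Passing per-field citation counts instead of publication counts to the same function would give CIT.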
International Collaboration: a measure of the global acceptance of a university. International collaboration data, based on the total number of publications produced in collaboration with foreign universities, is obtained from InCites for the years 2011-2013. The weight of this indicator is 15% in the overall ranking.

For the 2014 URAP World Ranking, bibliometric data is obtained through Thomson Reuters’ InCites research analytics service, which provides an interface to the Web of Science database. The 23 subject areas used in the ranking are based on the discipline classification matrix developed by the Australian Research Council for journals indexed in Web of Science.
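The six stated weights (21% + 21% + 10% + 18% + 15% + 15%) sum to exactly 100%, so the overall score is a straightforward weighted combination of the indicators. A sketch of that combination, assuming each indicator has already been converted to a common 0-100 scale (URAP does not publish its exact normalization, so the input scores here are illustrative):

```python
# Weights of the six URAP indicators as stated in the methodology.
WEIGHTS = {
    "article": 0.21,
    "citation": 0.21,
    "total_document": 0.10,
    "ait": 0.18,
    "cit": 0.15,
    "international_collaboration": 0.15,
}

def overall_score(indicator_scores):
    """Weighted sum of the six indicator scores (each assumed 0-100)."""
    return sum(WEIGHTS[name] * score for name, score in indicator_scores.items())

# Hypothetical institution scoring 80 on every indicator; since the
# weights sum to 1.0, the overall score is also 80.
scores = {name: 80.0 for name in WEIGHTS}
print(overall_score(scores))
```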
All pretty clear then. However, one of the reasons for creating a new world ranking surely has to be a desire to improve the representation of your university and your country in the league tables. This really doesn’t seem to have happened here though. Although 76 Turkish universities do appear in the top 2,000 (yes, 2,000), and the originating institution, the Middle East Technical University, tops this national list, it is still only 433rd in the world (which might suggest to them that they could find some more helpful criteria).
But, I hear you ask, how did UK universities fare in this challenging environment? Not bad, actually. The UK top 20 is virtually unchanged from three years ago, with just a few minor positional swaps and Queen Mary replacing Aberdeen in the top set.
Nevertheless, it’s hard to be too critical because, as back in 2013, they do send out these very impressive looking certificates to confirm the ranking position. Splendid!