Academic Ranking of World Universities

The Academic Ranking of World Universities (ARWU), commonly known as the Shanghai Ranking, is a publication founded and compiled by Shanghai Jiao Tong University to rank universities globally. The rankings have been compiled since 2003 and updated annually; since 2009 they have been published by the Shanghai Ranking Consultancy. ARWU was the first global university ranking to be published. It was initially designed to provide Chinese universities with a global benchmark against which to assess their progress.

The Academic Ranking of World Universities is regarded as one of the three most influential and widely observed international university rankings, alongside the QS World University Rankings and the Times Higher Education World University Rankings. Its methodology is praised as consistent and objective compared with other rankings, but it has also been criticized for its heavy focus on the natural sciences over the social sciences and humanities, and for neglecting the quality of instruction.

Methodology
The ranking compares 1200 higher education institutions worldwide according to a formula that takes into account alumni winning Nobel Prizes and Fields Medals (10 percent), staff winning Nobel Prizes and Fields Medals (20 percent), highly cited researchers in 21 broad subject categories (20 percent), articles published in the journals Nature and Science (20 percent), articles indexed in the Science Citation Index and Social Sciences Citation Index (20 percent), and the per capita academic performance (on the indicators above) of an institution (10 percent). The methodology is set out in an academic article by its originators, N.C. Liu and Y. Cheng.
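The weighted aggregation described above can be sketched as follows. This is an illustrative sketch only: the weights come from the percentages stated in the text, but the indicator scores are hypothetical, and ARWU's actual procedure of scaling each indicator so that the top-scoring institution receives 100 is not reproduced here.

```python
# Weights for the six ARWU indicators, as stated in the methodology above.
WEIGHTS = {
    "alumni_awards": 0.10,   # alumni winning Nobel Prizes / Fields Medals
    "staff_awards": 0.20,    # staff winning Nobel Prizes / Fields Medals
    "hici": 0.20,            # highly cited researchers in 21 subject categories
    "nature_science": 0.20,  # articles published in Nature and Science
    "indexed_papers": 0.20,  # articles indexed in SCI / SSCI
    "per_capita": 0.10,      # per capita performance on the indicators above
}

def arwu_score(indicators: dict) -> float:
    """Weighted sum of indicator scores (each assumed on a 0-100 scale)."""
    return sum(WEIGHTS[name] * indicators[name] for name in WEIGHTS)

# Hypothetical indicator scores for an example institution.
example = {
    "alumni_awards": 40.0,
    "staff_awards": 55.0,
    "hici": 60.0,
    "nature_science": 70.0,
    "indexed_papers": 80.0,
    "per_capita": 50.0,
}
print(arwu_score(example))  # -> 62.0
```

Because the weights sum to 1.0, the overall score stays on the same 0-100 scale as the individual indicators.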

The methodology used by the Shanghai Rankings is largely academic and research-oriented.

Influence
As the first multi-indicator ranking of global universities, ARWU has attracted a great deal of attention from universities, governments and media. A survey on higher education published by The Economist in 2005 described ARWU as "the most widely used annual ranking of the world's research universities." In 2010, The Chronicle of Higher Education called ARWU "the best-known and most influential global ranking of universities".

One factor in ARWU's significant influence is that its methodology is widely seen as sound and transparent. EU Research Headlines reported on ARWU's work on 31 December 2003: "The universities were carefully evaluated using several indicators of research performance." The Chancellor of Oxford University, Chris Patten, said "the methodology looks fairly solid ... it looks like a pretty good stab at a fair comparison." The Vice-Chancellor of the Australian National University, Ian Chubb, said "The SJTU rankings were reported quickly and widely around the world ... (and they) offer an important comparative view of research performance and reputation." Marginson (2007) commented that one of the strengths of "the academically rigorous and globally inclusive Jiao Tong approach" is that it is "constantly tuning its rankings and invites open collaboration in that." Philip G. Altbach named ARWU's "consistency, clarity of purpose, and transparency" as significant strengths.

The ARWU ranking and its content have been widely cited and used as a starting point for identifying national strengths and weaknesses, as well as for facilitating reform and launching new initiatives. Bill Destler (2008), president of the Rochester Institute of Technology, drew on the ARWU ranking in a publication in the journal Nature to analyze the comparative advantages Western Europe and the US hold in intellectual talent and creativity.

The European Commissioner for Education, Ján Figeľ, pointed out in a 2007 interview that "if you look at the Shanghai index, we are the strongest continent in terms of numbers and potential but we are also shifting into a secondary position in terms of quality and attractiveness. If we don't act we will see an uptake or overtake by Chinese or Indian universities." Similarly, Enserink (2007), in a paper published in Science, referred to ARWU and argued that "France's poor showing in the Shanghai ranking ... helped trigger a national debate about higher education that resulted in a new law ... giving universities more freedom." The RAND Corporation, a leading think tank, cited the ARWU ranking as evidence in its consultancy paper for the European Institute of Innovation and Technology.

In two research papers published in Academic Leadership (2009), and subsequently in an article in Times Higher Education (2009), Paul Z. Jambor of Korea University linked any unfavorable image or reputation universities may develop (or their association, by country, with universities linked to wrongdoing) to a halt in their climb, or even a drop, in the THE-QS World University Rankings. This is because 40% and 10% of the THE-QS methodology are based on Academic Peer Review and Employer Review respectively: an unfavorable image developed by a group of universities associated by country tends to harm their collective rankings. For this reason, Jambor argues, universities worldwide should adhere to internationally accepted standards so that they do not risk sliding in the international rankings. A number of critics consequently consider this aspect of the THE-QS World University Rankings unfair, or even biased.

The new Times Higher Education World University Rankings (THE-Reuters), published since 2010, is based on a revised methodology. In the THE-Reuters methodology, the 'Papers per research and academic staff' indicator (at 4.5%) and the 'Citation impact (normalised average citations per paper)' indicator (at 32.5%) make it evident that a university's ranking relies heavily on the number and quality of research papers written by its faculty. With 95% of research papers written in English, the relationship between English-language use and a university's subsequent ranking becomes ever clearer. Jambor highlights the connection between actual English use and university rankings in a pair of research papers published respectively by the US Department of Education's ERIC and by Academic Leadership.

Criticism
College and university rankings often stimulate controversy (see Criticism of college and university rankings (North America) and Criticism of college and university rankings (2007 United States)) and the ARWU is no exception. A 2007 paper published in the journal Scientometrics found that the results from the Shanghai rankings could not be reproduced from raw data using the method described by Liu and Cheng.

In an April 2009 report, J-C. Billaut, D. Bouyssou and Ph. Vincke analyzed how the ARWU works, using their insights as specialists in Multiple Criteria Decision Making (MCDM). Their main conclusions are that the criteria used are not relevant, that the aggregation methodology has a number of major problems, and that insufficient attention has been paid to fundamental choices of criteria.

The ARWU researchers themselves, N.C. Liu and Y. Cheng, hold that the quality of universities cannot be precisely measured by numbers alone and that any ranking must be controversial. They suggest that university and college rankings should be used with caution, and that their methodologies must be understood clearly before the results are reported or used.

Others have pointed out that the ARWU is known for "relying solely on research indicators", and that "the ranking is heavily weighted toward institutions whose faculty or alumni have won Nobel Prizes": it does not measure "the quality of teaching or the quality of humanities."

Ioannidis et al. suggested that, in common with all ranking systems they reviewed, the ranking lacked construct validity, and they found only modest concordance between the Shanghai and Times rankings. They highlighted measurement precision and transparent methodology as important issues.

Like the Times Higher Education's rankings, the ARWU has been criticized by the European Commission as well as some EU member states for "favour[ing] Anglo-Saxon higher education institutions".

Rankings
The table below contains the overall rankings, as ordinal numbers (i.e., 1 is best, 2 is second best, etc.), from 2003 to 2012 for every university that ranked in the top 100 in at least one of those years. The ranking is omitted for years in which a school did not place in the top 100. Note that the full ranking contains over 500 universities; if a university is not listed in this table, it did not rank in the top 100 in any of the years tabulated.