A call by a group of US colleges earlier this month to boycott the most influential university ranking in the United States has turned a spotlight on the problems with institutional rankings. Experts argue that the rankings rest on dubious methodology and spurious data, yet wield huge influence. But help is at hand: European academics are putting some rigour into rankings by tackling the problem themselves.

On 5 May, Douglas Bennett, president of Earlham College in Richmond, Indiana, and 11 other college presidents asked colleagues to refuse to fill out the reputation survey behind the U.S. News & World Report rankings. The survey, they argued, “implies a false precision and authority that is not warranted by the data they use”. Another 17 colleges have since signed up.

“All current university rankings are flawed to some extent; most, fundamentally,” says Alan Gilbert, president and vice-chancellor of the University of Manchester in Britain. “But rankings are here to stay, and it is therefore worth the time and effort to get them right.”

The rankings in the U.S. News & World Report and those published by the British Times Higher Education Supplement (THES) depend heavily on surveys of thousands of experts, a methodology that some contest. A third popular ranking, compiled by Shanghai Jiao Tong University in China, rests instead on quantitative measures such as citations, numbers of Nobel prizewinners and publications in Nature and Science. But even these measures are not straightforward.

Thomson Scientific's ISI citation data are notoriously poor for use in rankings: names of institutions are spelled differently from one article to the next, and university affiliations are sometimes omitted altogether. After cleaning up the ISI data on all UK papers for such effects, the Leeds-based consultancy Evidence Ltd found the true number of papers from the University of Oxford, for example, to be 40% higher than listed by ISI, says director Jonathan Adams.
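
To see the kind of clean-up involved, here is a minimal sketch in Python with an invented alias table and invented records. It does not reflect Evidence Ltd's actual pipeline; it only illustrates the general idea of mapping spelling variants onto one canonical institution name before counting papers.

```python
# Hypothetical illustration of unifying institution-name variants in
# citation records. The alias table and records are invented and do not
# reflect Evidence Ltd's actual methods.
from collections import Counter

# Hand-built map from observed spellings to a canonical name.
ALIASES = {
    "univ oxford": "University of Oxford",
    "oxford univ": "University of Oxford",
    "university of oxford": "University of Oxford",
    "u oxford": "University of Oxford",
}

def canonical(affiliation: str) -> str:
    """Return the canonical institution name, or the input if unknown."""
    key = " ".join(affiliation.lower().replace(".", "").split())
    return ALIASES.get(key, affiliation)

# Four spellings of the same institution collapse to a single count.
records = ["Univ Oxford", "Oxford Univ.", "University of Oxford", "U. Oxford"]
print(Counter(canonical(r) for r in records))
# Counter({'University of Oxford': 4})
```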

Researchers at Leiden University in the Netherlands have similarly recompiled the ISI database for 400 universities, covering half a million papers per year. Their system produces various rankings based on different indicators. One, for example, weights citations according to scientific field, so that a university working in a heavily cited field does not get an artificial extra boost (see table).

Table 1 European rankings: one way or another
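
The field weighting can be illustrated with a toy calculation: divide each paper's citation count by the average citation rate of its field, then average the ratios, so that a score of 1.0 means world-average impact regardless of field. The baselines and counts below are invented, and this is only a simplified stand-in for the Leiden group's indicators.

```python
# Toy field-normalised citation score: each paper's citations are divided
# by the average citation rate of its field, then averaged per university.
# Field baselines and paper data are invented for illustration only.
from statistics import mean

# World-average citations per paper in each field (hypothetical).
FIELD_BASELINE = {"molecular biology": 25.0, "mathematics": 4.0}

# (field, citations) for one university's papers (hypothetical).
papers = [
    ("molecular biology", 30),  # 1.2x its field's average
    ("mathematics", 6),         # 1.5x its field's average
]

score = mean(cites / FIELD_BASELINE[field] for field, cites in papers)
print(f"field-normalised impact: {score:.2f}")  # 1.35; 1.0 = world average
```

On raw counts the molecular biologist would dwarf the mathematician; after normalisation the less-cited field contributes on equal terms.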

The German Center for Higher Education Development (CHE) also offers rankings à la carte for almost 300 German, Austrian and Swiss universities: users can query its online database by individual discipline, number of publications or student outcomes, for example. With European Union support, the CHE is expanding the system to cover the whole of Europe. Adams says that more complex systems such as these are better than oversimplified summary tables, which policy-makers tend to take at face value.
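
A rough sketch of what such an à la carte query amounts to, using an invented in-memory dataset rather than the CHE's real database and field names: the user picks the discipline and the indicator instead of accepting a single composite score.

```python
# Sketch of an "a la carte" ranking query over an invented dataset;
# the CHE's real database, disciplines and indicators differ.
universities = [
    {"name": "Uni A", "discipline": "physics",
     "publications": 820, "graduation_rate": 0.71},
    {"name": "Uni B", "discipline": "physics",
     "publications": 640, "graduation_rate": 0.84},
    {"name": "Uni C", "discipline": "chemistry",
     "publications": 910, "graduation_rate": 0.77},
]

def rank(discipline: str, indicator: str) -> list[str]:
    """Rank universities in one discipline by a user-chosen indicator."""
    pool = [u for u in universities if u["discipline"] == discipline]
    pool.sort(key=lambda u: u[indicator], reverse=True)
    return [u["name"] for u in pool]

# The "best" physics department depends on which indicator you ask about.
print(rank("physics", "publications"))     # ['Uni A', 'Uni B']
print(rank("physics", "graduation_rate"))  # ['Uni B', 'Uni A']
```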

The US Commission on the Future of Higher Education is considering creating a similar public database, which would offer competition to the U.S. News & World Report. Bennett says he will work with groups that use “professional standards of measurement”.