A government-funded study that has ranked Spain's 44 public universities by giving each of them a score between one and ten has stirred up a fierce controversy.
Older universities came out best — seven of the top ten are at least 100 years old — and many of their officials say that the study confirms the quality of their work and shows the usefulness of comparative evaluation.
But younger universities, particularly those that are more technologically oriented, complain that the survey is based on outdated criteria. Young institutions accounted for eight of the ten universities that scored less than five.
The study was funded by the Ministry of Education and Culture. Scores were based on a survey by the National Statistics Institute. Six criteria were measured: educational development, organizational structure, teaching resources, participation of women, PhD activity and student success rate.
Carles Solà, rector of the top-scoring institution, the Autonomous University of Barcelona, says that his university's score of 8.5 “is congruent with previous data from diverse Spanish and European sources”. He says that the evaluation process “may well open the doors to external observers in order to start objectives-based programmes”.
Rafael Puyol, rector of the Complutense University of Madrid, which came second in the ranking, says that the result shows that, despite an “explosion” in student demand in recent years, “we have never overlooked quality”. His university has more academic centres and students than any other in Spain.
The report has also been welcomed by Darío Villanueva, rector of the University of Santiago de Compostela, who says its results, “including our seventh position”, are similar to those of an annual quality evaluation produced by a private company. The implication, he says, is that an evaluation culture already familiar in other countries “is beginning to emerge in Spain”.
But Jaume Pagès, rector of the Polytechnic University of Catalonia in Barcelona, says the report is “inaccurate, unfinished, rash and lacking methodological rigour”. According to Pagès, the quality of a university should be measured by whether “a series of internal and external objectives previously defined by the institution” have been reached — not through the use of indicators “arbitrarily established regardless of the university's goals”.
He is particularly critical of the lack of external factors, such as those related to the social and economic environment or the number of students enrolled in international exchange programmes. Pagès says the report merely measures how close universities come to a “predefined, conventional, generalized and obsolete pattern”, and adds that he is determined “to officially clarify problems involving quality in our universities”.
Low-scoring universities have been especially critical of the failure to take account of links with private companies, for example through research contracts or patents.
Although the authors of the study say they were unable to include such a measure because of a lack of data, it appears to be a major reason why the ‘technological’ universities scored worse than expected.
The senior author of the study, Jesús de Miguel, director of the department of sociology and analysis of organizations at the University of Barcelona, and a social sciences consultant to the European Commission, has come under pressure from academics and the media. The rector of his university, Antoni Caparrós, has described the report as “not the work of my university”.
Gemma Rauret, director of the public agency responsible for universities in Catalonia, points out that the report has not been submitted for expert scrutiny “but has sought media publicity”. However, says Rauret, “its most positive aspect is that it has triggered a rethinking of the need to supply more and better information”.
In response to the study, Saturnino de la Plaza, president of the Council of Rectors of the Spanish Universities and rector of the Polytechnic University of Madrid, says that the council is to set up a commission that will evaluate the quality of universities but will not provide any type of ranking.
De la Plaza says the study was biased against technological institutions. “You can only compare like with like.”