All academics will be familiar with the phrase 'publish or perish'. Usually it is meant loosely: a recognition of the impact that scholarly publications can have on career progression, and in particular on the likelihood of success in applications for research funding and job promotion. But it now seems that administrators at the University of Sydney, Australia, are taking the idea to its literal extreme, threatening with dismissal or demotion more than a hundred academics who have published fewer than three papers in the past two years [1,2].

The most obvious consequence of such a crude measure is that it could force academics to publish trivial work in low-ranking journals, to ensure their 'outputs' are well above the point at which their jobs are at risk. This might make for a healthy bean-count to impress government bureaucrats, but it will do little to improve the standing of an institution that aspires to be counted among the world's best.

No one disputes the need for universities to set standards that they expect their academics and researchers to live up to, and it is right and proper for elite institutions to make those standards challenging. The key is to set standards that motivate an institution's stakeholders to achieve its aims, without falling foul of unintended consequences. How do you do this? It depends on whom you ask.

In the methodology used by The Times newspaper to calculate its latest ranking of the world's universities [3], the number of papers published constitutes only 6% of a university's absolute score [4]. Requiring that your academics publish more won't move your institution far up the list. This is not to say that a high Times ranking is the standard to which universities should aspire, but a focus on producing more is a distraction from achieving better. And it encourages researchers to salami-slice coherent bodies of work into many different papers, a practice that benefits no one but publishers, and one that Nature Physics and its sister titles strongly discourage [5].


In the United Kingdom, there has been heated debate over the government's introduction of the Research Excellence Framework, which replaced the Research Assessment Exercise as a means of deciding how to distribute funding among universities [6]. The guiding principle of the new framework is expressed in one word: impact. But impact means different things to different people, and can be measured in many ways. It is often assumed to mean economic impact, which in turn leads many to fear that politicians don't appreciate the importance of fundamental research to scientific progress, or that they consider it a luxury that can be justified only in times of plenty. This needn't be the case, and science ministers from successive governments and from both sides of politics have insisted that it is not: impact will be defined in the broadest sense. How this will work in practice, however, remains to be seen.

Another worrying aspect of the measures being carried out at the University of Sydney is the implication that research, rather than education, is the most important function of a world-class university. The university has reportedly experienced a sharp drop in income from student fees, a trend that is expected to continue. It is therefore sensible for it to broaden its income base and to increase the contribution from research. But although the university is targeting those who it feels are underperforming in research, it isn't clear whether it has similar concerns about, or will take similar action against, those who may be underperforming in education.

Excellence in research and excellence in education go hand in hand. There will always be a tension between university teaching and university research, and so there should be. Educators who are also engaged in pushing back the frontiers of human knowledge are better able to explain the context, relevance and urgency of the subject they teach. But a solid grounding in the fundamentals is important too. Moreover, if we don't expect the same standards from university educators as we do from university researchers, where is the next generation of scientists going to come from? This is often overlooked: in the United States, for example, a report prepared by the National Task Force on Teacher Education in Physics found little interest among university physics departments in preparing their graduates for teaching at any level [7].

There are no shortcuts to excellence. Neither are there simple ways to measure it. We should demand more from those who try.