Suppose you are devising a technique to transfer proteins from a gel to a plastic substrate for easier analysis. Useful, maybe — but will you gain kudos for it? A notable finding of last year’s survey of the 100 most cited papers on the Web of Science (see Nature 514, 550; 2014) was how many of them reported such apparently mundane methodological research (this protein-transfer method came in at number six).
Not all prosaic work reaches such bibliometric heights, but that does not diminish its value. Overcoming the hurdles of nanoparticle drug delivery, for example, requires the painstaking characterization of pathways and rates of breakdown and loss in the body — work that is not only unglamorous but probably unpublishable. One can cite comparable demands for detail to get just about any bright idea to work in practice — but it is usually the initial idea, not the hard grind, that garners the praise. The incentives for such boring but essential collection of fine-grained data to solve a specific problem are vanishing in a publish-or-perish culture.
Meanwhile, a recent analysis of discovery and innovation in biomedicine, using the molecules studied as value markers, finds that the choice of research problems is becoming more conservative and risk-averse (A. Rzhetsky et al. Proc. Natl Acad. Sci. USA 112, 14569–14574; 2015). One might quibble with the scope of the study, but its general conclusions — that current norms discourage risk and therefore slow down scientific advance, and that the problem is worsening — ring true.
Attempts to hit the publishable ‘sweet spot’ by avoiding both the prosaic and the risky are likely to reduce the efficiency of scientific discovery. But a fashionably despairing cry of ‘Science is broken!’ is not the way forward. The wider virtue of Rzhetsky et al.’s study is that it floats the notion of tuning practices and institutions to accelerate the process of scientific discovery. The researchers conclude, for example, that publication of experimental failures would assist this goal by avoiding wasteful repetition. Journals chasing impact factors might not welcome that, but they are no longer the sole repositories of scientific findings. Rzhetsky et al. also suggest some shifts in institutional structures that might help promote riskier, but potentially more groundbreaking, research — for example, spreading both risk and credit among teams or organizations.
The danger is that efforts to streamline discovery simply become codified into another set of guidelines and procedures, creating yet more hoops for grant applicants to jump through.
A better first step would be to recognize the message that research on complex systems has emphasized: efficiencies are much more likely to come from the bottom up. The aim is to design systems with basic rules of engagement for participating agents that best enable an optimal state to emerge. Such principles typically confer adaptability, diversity and robustness. There could be, say, a wider mix of grant sources and sizes, less rigid disciplinary boundaries, and wider acceptance that citation records are not the only measure of worth.
But perhaps more than anything, the current narrowing of objectives, opportunities and strategies in science reflects an erosion of trust. Obsessive focus on ‘impact’ and regular scrutiny of bibliometric data betray a lack of trust that would have sunk many discoveries and discoverers of the past. Bibliometrics might sometimes be hard to avoid as a first-pass filter for appointments, but a steady stream of publications is not the only, or even the best, measure of potential.
Attempts to tackle these widely acknowledged problems are typically little more than a timid rearranging of deckchairs. Partly that’s because they are seen as someone else’s problem: the culprits are never the complainants, but the referees, grant agencies and tenure committees who oppress them. Yet oddly enough, these obstructive folk are, almost without exception, scientists too (or at least, they once were). Inefficiencies can exact a huge price. It is time to oil the gears.