In July, the House of Commons Science and Technology Committee published a comprehensive report, “Peer review in scientific publication”, based on input from researchers, funding bodies and publishers. Parliament's undertaking to understand and assess the value of the peer review process, which remains fundamental to ensuring quality in life science publications, should be applauded. The committee concludes that despite its flaws, pre-publication peer review is vital and cannot be dismantled. However, the report also highlights much-needed improvements to the process, including the training of early-career scientists in peer review. The committee recognizes the crucial role of reviewers and their tremendous efforts and calls for better recognition of this work, but does not make any concrete proposals to this end. As we have previously discussed in these pages (“Reviewing refereeing”), we agree that this fundamental contribution should be appropriately acknowledged, not just by journal editors and publishers, but also by tenure-granting committees, funding agencies and other bodies that evaluate researcher performance and their contributions to a field.

The report also discusses avenues for reducing the burden on reviewers, such as editorial pre-screening, efforts to increase a journal's 'reviewer pool', and the possibility of transferring manuscripts between journals together with the referee reports. These approaches are indeed employed by Nature journals to facilitate a constructive and efficient peer review process for authors and referees alike. The committee rightly notes the important role of post-publication review through online commentary, and of social media tools in communicating published work and discussing its merits and weaknesses. To this end, we have recently begun highlighting Faculty of 1000 coverage of our papers on our homepage.

Of course, no analysis of peer review is complete without a discussion of impact factors or the pressures on researchers to publish in high-impact journals. The committee warns against using impact factors as a proxy for the quality of a publication, and exhorts funders and research institutions to assess individual works. We agree: journal impact factors are an imperfect proxy for the significance of a study, and there is no substitute for evaluating an individual publication on its own merit.

Finally, the report discusses research integrity. Journals, including this one, have some means to detect data manipulation (see “Guide to Authors: calling all authors!” and “Combating scientific misconduct”). However, the committee finds the oversight of research integrity at other levels unsatisfactory. Its call for the establishment of an oversight body in the UK, and its recommendation that research institutions have a formal process to deal with ethical issues, should both be implemented.

In conclusion, the pre-publication peer review system, in spite of its deficiencies, is likely to remain an integral part of the dissemination of research findings. Not only does the scientific community rely on it; scientific advice to governments and information provided to the public must also be based on robust, vetted data.