Abstract
This material was presented as a poster at the Seventh International Congress on Peer Review and Biomedical Publication, 8–10 September 2013, Chicago, IL, USA.
Main
Reviewing manuscripts is an important activity for academic scientists, but few systematic training mechanisms exist. Since mid-2009, Laboratory Investigation (LI) has offered promising trainees an opportunity to review manuscripts for LI as Editorial Interns (EIs). The goal of the program is in line with the educational mission of the journal's owner, the United States and Canadian Academy of Pathology. In addition, we felt it was valuable to determine whether a select group of young, inexperienced reviewers could provide quality reviews that help authors improve their manuscripts.
Previously developed programs for training young scientists in peer review have met with mixed success. Houry et al.1 developed a mentoring system to train new reviewers for the journal Annals of Emergency Medicine. The new reviewers were either trained using the new system or given the standard written material traditionally presented to reviewers. The authors found that pairing the new reviewers with senior reviewers for discussion of the manuscript reviews did not result in higher-quality reviews.
Callaham et al.2 examined the relationship between variables that are indications of academic status and the quality of reviews received and found that the variables held little or no predictive power as to the quality of reviews provided. The variables included academic rank, formal training in critical appraisal or statistics, being principal investigator of a grant, working in a university hospital, youth, being on an editorial board, and being on grant study sections.
As the training and the status of reviewers do not guarantee good reviews, what makes for a good reviewer? A study by Black et al.3 revealed that younger age was a predictor of quality reviews and that review quality increased as more time was spent on the review (up to a point). With these studies in mind, we wished to measure the usefulness of the reviews provided by a restricted subset of referees: young investigators who did not often get the opportunity to review but had the interest in doing so and were willing to take the time to try. The question, of course, is how best to do this and whether others have created a model that we could use rather than ‘re-invent the wheel’. The answer, we postulated, might be analogous to how the Central Intelligence Agency identified and recruited intelligence officers, ie, spies, or at least how America did it during the latter half of the 20th century. Even the casual reader of the espionage literature knows the broad outline of how it was done. Recruiters, often former government employees working as academics, served as talent spotters on the campuses of America’s great universities and colleges. We elected to follow a similar path.
EIs (98) were recruited on the basis of the recommendations of our talent spotters—department chairs (68 EIs, 69.4%) and editors of the journal (11 EIs, 11.2%)—although some EIs were self-referred (19 EIs, 19.4%). The EIs were trainees who were not yet qualified to be bona fide reviewers: they were generally graduate students, post-docs, medical students, residents, or fellows, ie, trainees in basic biomedical research programs at major universities. Manuscripts were sent to EIs on the basis of their areas of expertise. It was often difficult to match EIs to manuscripts because the scope of LI is so wide. A conscious decision was made to provide the EIs with no training or mentoring. They were given the opportunity to act as peer reviewers without advice or intervention. These young people were expected to judge their own performance by comparing their comments with those of the senior reviewers. Manuscripts that were evaluated by the EIs were also assessed by two senior reviewers. EIs submitted their reviews under the same guidelines as these senior reviewers.
Manuscripts were assigned to Associate Editors (AEs) on the basis of the subject matter, as an individual AE is responsible for overseeing peer review of each manuscript.
Thus, it was the AE who decided whether an EI’s review was included in the decision letter. An EI review was deemed fit to share with an author if it had a succinct recap of the manuscript to show that the reviewer understood the hypothesis of the study, described the deficiencies of the study, discussed whether the data justified the conclusions, offered reasonable suggestions to improve the manuscript, and commented on the novelty of the work. The criteria for an effective manuscript review are described in more detail in Ref. 4. Thus, the rating scale of EI reviews had only two scores: acceptable reviews that were sent to the authors and unacceptable reviews that were not. When an AE chose to include an EI review in a decision letter, it was not revealed to the authors that the review was from a junior scientist; it simply appeared as one of three or more reviews.
The LI editors used a three-point rating scale for the EIs: ‘good reviewer’ (provided four or more acceptable reviews), ‘poor reviewer’ (did not respond to one or more invitations to review, declined more than half of the invitations to review, failed to turn in one or more promised reviews, or had an area of expertise that did not fit the journal’s needs), and ‘not enough information to judge’. On rare occasions, AEs communicated with the EIs.
So what happened? Seventy-six EIs were asked to evaluate 290 of the 1643 manuscripts that underwent review, or roughly 18%. The total number of submissions during the timeframe of the study was 2611. In response to requests to review, EIs accepted assignments 77% of the time (222/290), which compares favorably with senior reviewers (67%; 3618/5432). EIs returned over 99% of reviews promised (221/222), and review time was 11 days on average. Senior reviewers provided promised reviews 92% of the time (3317/3618) and took an average of 13 days. Ninety percent of the EI reviews were included in decision letters (200/221). Overall, 41% of EIs (40/98) promptly responded to requests to review, reviewed four or more manuscripts, and provided reviews that the editors shared with the authors. Eleven percent (11/98) of the reviewers either did not respond to multiple requests to review, consistently declined to review without offering a reason, provided multiple reviews that the editors deemed unfit to send to the authors, or did not receive any assignments because their area of expertise was too narrow and the opportunity did not present itself. There was not enough information to reach a conclusion on the remainder. Overall, interns recruited by a journal editor were by far the most likely (73.7%) to be consistently good reviewers, whereas the ‘self-referred’ interns led all groups in being a poor fit for the journal.
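As a quick arithmetic check, the rates quoted above can be recomputed from the raw counts given in the same paragraph (a minimal Python sketch; the `pct` helper and variable names are ours, not part of the journal's workflow):

```python
# Recompute the reviewer acceptance and return rates reported in the text
# from the raw counts given in the paragraph above.

def pct(numerator, denominator):
    """Percentage to one decimal place."""
    return round(100 * numerator / denominator, 1)

ei_accept     = pct(222, 290)    # EIs accepting review invitations
senior_accept = pct(3618, 5432)  # senior reviewers accepting
ei_return     = pct(221, 222)    # EI reviews delivered once promised
senior_return = pct(3317, 3618)  # senior reviews delivered once promised
ei_shared     = pct(200, 221)    # EI reviews included in decision letters
ei_good       = pct(40, 98)      # EIs rated 'good reviewer'

print(ei_accept, senior_accept)  # ~77% vs ~67% acceptance
print(ei_return, senior_return)  # >99% vs ~92% follow-through
print(ei_shared, ei_good)        # ~90% shared; ~41% rated good
```

The one-decimal values (76.6, 66.6, 99.5, 91.7, 90.5, 40.8) match the whole-number percentages reported in the text after rounding.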
From this, we concluded that the EIs agreed to review at a high rate and quickly turned in quality reviews. These results indicate that it might be worthwhile to engage and cultivate reviewers early in their careers, when they may be more enthusiastic about reviewing and have more time than a senior principal investigator. We judge the Laboratory Investigation Editorial Internships to be a successful program, as we were able to identify promising young reviewers. The latest list of EIs is found in this issue of LI, and we thank them all. We got a great deal out of the program, and we hope they did too. The first group is now being licensed to review.
The editors of Laboratory Investigation would like to thank our Editorial Interns for 2013:
Brian Abe
Lingbao Ai
Hakan Aydin
Stefano Bacci
Kathryn Behling
Jennifer Chou
Yi Ding
Nicole Draper
Xiaoli Du
Gong Feng
Heather Francis
Paul Furmanczyk
Joy Gibson
Jennifer Giltnane
Zihua Gong
Arvin Gouw
Yu Han
Eldad Hod
Heng Hong
Austin Jackson
Ruirui Ji
Kun Jiang
C. Dirk Keene
Daniel Kleven
James Kohler
Ravindra Kolhe
Melissa Landek-Salgado
Ang Li
Yiting Lim
Lin Lin
Michael Linden
Haiyun Ling
Qin Liu
Lawrence Low
Xin Lu
Sreeharsha Masineni
KimGreg Mayhall
Prashanthi Menon
Eugen Minca
Haitao Niu
Maja Oktay
Olorunseun Ogunwobi
Sarah Ondrejka
Rish Pai
Deepa Patil
Bryan Patonay
Alexandros Polydorides
William Puszyk
Michael Rivera
Toni Roberts
Cory Robinson
Joseph Sailors
Natasha Savage
Lisa Senzel
Deborah Sevilla
Aaron Shaver
Wei Shen
Lynette Sholl
Hai Song
Douglas Stairs
Heather Stevenson
Yi Tang
Matthew Titmus
SriHariKrishna Vellanki
Girish Venkataraman
Kimberly Walter
Aibing Wang
Qinhong Wang
Beibei Wu
Chang Xiao
Ping Xie
Yutao Yan
Yisheng Yang
Xianping Yang
Bu Yin
Peng Yu
Jingsong Yuan
Bodi Zhang
Jun Zhang
Juhua Zhou
References
1. Houry D, Green S, Callaham M. Does mentoring new peer reviewers improve review quality? A randomized trial. BMC Med Educ 2012;12:83.
2. Callaham ML, Tercier J. The relationship of previous training and experience of journal peer reviewers to subsequent review quality. PLoS Med 2007;4:e40.
3. Black N, van Rooyen S, Godlee F, et al. What makes a good reviewer and a good review for a general medical journal? JAMA 1998;280:231.
4. Neill US. How to write an effective referee report. J Clin Invest 2009;119:1058.
Ethics declarations
Competing interests
Catherine M Ketcham is the Managing Editor of Laboratory Investigation and an employee of Ketcham Solutions Inc., which receives payment from the United States and Canadian Academy of Pathology for journal management. Robert W Hardy is a Senior Associate Editor of Laboratory Investigation and receives a stipend from the United States and Canadian Academy of Pathology. Brian P Rubin is a Senior Associate Editor of Laboratory Investigation and receives a stipend from the United States and Canadian Academy of Pathology. Gene P Siegal is the Editor-in-Chief of Laboratory Investigation and receives a stipend from the United States and Canadian Academy of Pathology.
Cite this article
Ketcham, C., Hardy, R., Rubin, B. et al. Finding a new generation of spies and manuscript reviewers. Lab Invest 93, 1262–1264 (2013). https://doi.org/10.1038/labinvest.2013.125