The Nature Index allows us to track contributions by countries and by research institutions — academic, government and commercial — to selective scientific journals, independently chosen by active scientists. Analysis of this database provides insight into global hotspots for high-quality research.
The aim of the Nature Index is to provide an indicator of patterns of high-quality research output across the globe. At its core are 68 journals, independently chosen by researchers as being where they would want to publish their most significant research. We identify author affiliations on each paper, as well as tease out the relationships between research organizations, to allow us to track scientific output for institutions and countries. Snapshot data from the Nature Index are openly available under a Creative Commons licence, so that users can analyse scientific research outputs themselves.
The group of journals at the heart of the Nature Index was chosen according to the following fundamental principles:
The journals included are selected by a panel of active scientists, independently of the Nature Publishing Group.
The choices reflect researchers' perception of the journals' content, rather than measures such as impact factor.
We believe that, at the time of selection, the list represented a reasonable consensus on the upper echelon of journals in the natural sciences. It includes some of the most highly selective journals within the main disciplines of the natural sciences, as well as highly selective multidisciplinary journals.
The list of 68 journals used in this initial version of the Nature Index was compiled in 2011. The journals included, and their number, will be reviewed before the next edition.
We gave prime responsibility for the selection of journals to two panels of scientists, one drawn from the physical sciences, the other from the life sciences — each headed by a chairperson. Preliminary suggestions for the chairs of the panels were made by the editorial staff of Nature who are involved in the peer review and selection of submitted research papers. The Editor-in-Chief of Nature signed off on the choice of chairs:
Chair of Physical Sciences Panel: John Morton, then at the University of Oxford, now at the London Centre for Nanotechnology and Department of Electronic and Electrical Engineering, University College London
Chair of Life Sciences Panel: Yin-Biao Sun, Randall Division of Cell and Molecular Biophysics, King's College London
At the chairs' request, an initial list of panel members was proposed by the editorial staff of Nature journals. The criteria for panel members were: they be established and fully active in research (therefore more likely to be mid-career than late-career); they should be drawn from the main disciplines of natural science; they should represent all active science regions worldwide; and there should be a gender balance.
The chairs signed off on the ultimate choice of panel members. The panels comprise 68 scientists in all, not counting the chairs.
We asked each panellist to list the journals in which they would most like to publish their best work, to a maximum of 10. They were asked to list these journals in order of preference.
To aggregate these responses, each panellist's first journal was awarded ten points, the second journal nine points, and so forth. We recorded both the total number of points that a journal accumulated and the number of panellists who voted for a journal. The chairs used these scores to analyse the popularity of each journal identified.
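The scoring scheme described above is essentially a Borda count. A minimal sketch in Python, using entirely hypothetical ballots (the journal names and data are illustrative, not drawn from the actual panel responses):

```python
from collections import defaultdict

def aggregate_ballots(ballots):
    """Score ranked ballots: 1st choice = 10 points, 2nd = 9, ..., 10th = 1.

    Returns, per journal, the total points accumulated and the number of
    panellists who listed the journal anywhere on their ballot.
    """
    points = defaultdict(int)
    votes = defaultdict(int)
    for ballot in ballots:  # each ballot: up to 10 journals, most preferred first
        for rank, journal in enumerate(ballot[:10]):
            points[journal] += 10 - rank
            votes[journal] += 1
    return dict(points), dict(votes)

# Hypothetical ballots from three panellists
ballots = [
    ["Journal A", "Journal B", "Journal C"],
    ["Journal B", "Journal A"],
    ["Journal A", "Journal C", "Journal B"],
]
points, votes = aggregate_ballots(ballots)
# Journal A: 10 + 9 + 10 = 29 points, listed by all 3 panellists
```

Both outputs matter here: total points capture strength of preference, while the vote count captures breadth of support across panellists.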
We felt it important to obtain a broader degree of input and validation, so we conducted a large-scale survey of researchers.
We emailed 100,000 scientists in the life, physical and medical sciences with an online questionnaire. We targeted a broad geographical mix of scientists across Europe, North America, Asia and the rest of the world, receiving more than 2,800 responses from across the major disciplines of the natural sciences. To ensure we polled active scientists, only respondents who indicated that they had published in the past two years were included in the survey results.
There was a high degree of convergence between the panel and survey outputs for the most popular journals, with more divergence further down the list. But this process was not a number-crunching exercise. The purpose was to assist our panellists and especially the chairs in producing a final list of journals, taking all qualitative judgements and quantitative inputs into account. The final selection was entirely the responsibility of the panel chairs.
The final step in this exercise was to compare our selected journals against the total output of research papers. Our aim was that the ratio of disciplines within the Nature Index should be roughly in line with annual scientific output, with no single discipline contributing to the Nature Index to an inequitable degree. If there are any gross imbalances, these should be acknowledged so that users can take them into account when assessing the patterns of high-quality research output.
For this reason we provide three measures within the index: the raw article count (AC); the fractional count (FC), which apportions article count for each contributor; and a weighted-fractional count (WFC), which applies a weighting to correct for the one gross imbalance we identified in terms of discipline representation in the Nature Index (for more about how these are calculated, see 'A guide to the Nature Index', page S94).
That one striking imbalance was in the field of astronomy and astrophysics, where our selected journals represent about 50% of all papers published in international journals in this discipline.
This proportion was approximately five times the equivalent figure for other fields. Therefore, although the data for astronomy and astrophysics are compiled in exactly the same way as for all other disciplines, in the WFC articles from these journals are assigned one-fifth the weight of other articles. Or to put it another way: the fractional count from those journals is multiplied by 0.2 to derive the WFC. This is intended to provide due representation to all fields covered by the Nature Index in any multi-disciplinary analysis of institutional outputs.
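The three measures can be sketched as follows. This is a simplified illustration with made-up paper records; in particular, the assumption that each paper's count of 1 is shared equally among its affiliated institutions is our reading of the fractional-count scheme, and the details of the real apportionment are given in 'A guide to the Nature Index':

```python
ASTRO_WEIGHT = 0.2  # astronomy & astrophysics articles carry one-fifth weight in WFC

def index_counts(papers, institution):
    """Compute AC, FC and WFC for one institution over a list of papers.

    Each paper is a dict listing the institutions credited on it and a flag
    marking astronomy/astrophysics journals.  FC assumes equal sharing of
    each paper's unit count among its institutions (a simplifying assumption).
    """
    ac = fc = wfc = 0.0
    for p in papers:
        insts = p["institutions"]
        if institution not in insts:
            continue
        ac += 1                                    # raw article count
        share = insts.count(institution) / len(insts)
        fc += share                                # fractional count
        wfc += share * (ASTRO_WEIGHT if p["astro"] else 1.0)
    return ac, fc, wfc

# Hypothetical records: one shared paper, one single-institution astronomy paper
papers = [
    {"institutions": ["X", "Y"], "astro": False},  # X shares credit with Y
    {"institutions": ["X"], "astro": True},        # down-weighted to 0.2 in WFC
]
ac, fc, wfc = index_counts(papers, "X")
# AC = 2, FC = 0.5 + 1.0 = 1.5, WFC = 0.5 + 1.0 * 0.2 = 0.7
```

The example makes the contrast concrete: the astronomy paper contributes fully to AC and FC but only 0.2 to the WFC.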
In the freely available Nature Index dataset, we leave it to the user to decide how best to use the three measures.
Clearly different measures and different subsets of the index will be more appropriate depending on the particular interests of the user. However, for the purposes of analysing and assessing global patterns within this supplement, we have, in general, focused on the WFC and the AC because they provide interesting and complementary information.
We recognize that the weighting we have applied as a broad-brush correction for the over-representation of astronomy articles is only one of many that might be applied.
More intricate and complicated weightings aimed at normalising the data for factors such as funding levels, numbers of researchers and so on, are just some of the measures that might be taken into account in assessing the patterns within the Nature Index. We encourage users to suggest and apply their own ideas for weighting the data.
The approach applied to the weighted-fractional count and all decisions regarding the selection of journals in the Nature Index were signed off by the chairs of the journal-selection panels.
Significance of the Nature Index
The journals at the heart of the Nature Index were selected for this purpose alone. They reflect the judgment of the panellists and, ultimately, the panel chairs. This exercise was not intended to provide any absolute comparison between journals within disciplines. The process is founded on a pragmatically minded aggregation of judgments, and the lower cut-off point is entirely subjective. There is no implication that a journal lying below that threshold is in any way inferior to all of those above it. Indeed, there is every chance that journals below the threshold will be included in future iterations of the index.
We hope that the Nature Index will find a niche among the tools that research organizations use to track and quantify research outputs and to develop comparisons across peer institutions.