It is much touted that research output — as defined by the number of published papers — is growing exponentially1. This scientific fecundity can perhaps be interpreted as a positive indicator of societal attitudes towards science and the resources available to support it. However, those working on the many and varied environmental problems that stem from human activities tend to feel that the sizes of the problems at hand dwarf the resources available to tackle them. Even so, this enormous and rapidly growing body of scientific evidence presents real opportunities to expand and synthesize scientific knowledge and address pressing socio-environmental challenges. In the climate change domain, timeliness is often key, and decisions need to be made using the best available information, rather than being continually delayed by a desire for more data and analysis. Synthesis of the existing evidence is one way to meet this requirement.

There is of course no such thing as a free lunch, and capitalizing on the opportunities offered by an extensive research corpus comes with some real challenges. In the first instance, a voluminous and often disparate literature represents a significant logistical difficulty for researchers. Just keeping abreast of a field of study is now a very significant task in its own right. It may be that technology has some of the answers to managing this task, and software tools are likely to play a growing role in making sense of the literature2,3.

Aiming to go further and synthesize the available evidence requires a systematic review (see the Comment by Neal Haddaway and Biljana Macura), which typically serves one of two goals: to assess the efficacy of specific interventions – such as payment for carbon storage – or to reach some broader generalizations across studies – for example, changing species distributions in response to warming. Where suitable quantitative data are available, subsequent meta-analysis can also be employed4. Far from being new, these methods have been around for some 40 years4 and have been applied to environmental research questions for over a decade5. However, the proliferation of synthesis research has not always followed best-practice methods and reporting, as often occurs in a field undergoing rapid growth. In many cases this undermines the value of the work and its utility for policy and practical applications (see Comment). These issues are not unique to synthesis research and reflect wider concerns regarding research transparency and reporting standards, which some have referred to as a ‘reproducibility crisis’ in science6. To improve transparency, Nature research journals, including Nature Climate Change, have instituted a series of reporting documents that are published alongside research papers to maintain reporting standards7.

Synthesis has perhaps often been seen as something that researchers do as an ‘add-on’ to their main research, and consequently is not always perceived as requiring as much methodological care as core scientific research activities. However, as the literature grows and the need for evidence-informed policies becomes ever more pressing, that attitude is increasingly problematic. Rather, synthesis needs to be treated as an important part of the research endeavour, requiring just as much rigour and transparency as any other aspect of science. A Comment in this issue outlines the main types of literature synthesis, some of the typical biases and limitations that occur, and how reporting standards can help to improve the situation.

At Nature Climate Change there are currently no specific reporting standards for synthesis research. However, synthesis clearly requires rigorous methods. For systematic reviews and submissions incorporating meta-analysis — we currently receive many more of the latter — we seek a reviewer to comment specifically on the synthesis methods, and we strongly recommend that authors consult some of the guidelines for synthesis that are now available8,9 and provide one of the associated reporting documents, such as ROSES (https://www.roses-reporting.com) or PRISMA (http://prisma-statement.org/Default.aspx), as Supplementary Information along with their submission. The absence of a reporting checklist would not prevent us from seeking expert review of a paper that meets our editorial criteria for novelty and interest; however, in all such cases, if the paper is invited for resubmission after review, we request that an appropriate reporting document be included with the revised manuscript. Although this is not compulsory, we believe that it will aid rapid review of synthesis research and ensure that the findings are as robust and useful as possible.

It seems clear that demand for evidence synthesis is only going to increase. The size of this task is already large, and while we hope that technological developments will partially come to the rescue, useful synthesis research will always hinge on the application of well-conceived, rigorous and clearly documented methods.