We present a consensus-based checklist to improve and document the transparency of research reports in social and behavioural research. An accompanying online application allows users to complete the form and generate a report that they can submit with their manuscript or post to a public repository.
Good science requires transparency
Ideally, science is characterized by a ‘show me’ norm, meaning that claims should be based on observations that are reported transparently, honestly and completely [1]. When parts of the scientific process remain hidden, the trustworthiness of the associated conclusions is eroded. This erosion of trust affects the credibility not only of specific articles, but—when a lack of transparency is the norm—perhaps even entire disciplines. Transparency is required not only for evaluating and reproducing results (from the same data), but also for research synthesis and meta-analysis from the raw data and for effective replication and extension of that work. Particularly when the research is funded by public resources, transparency and openness constitute a societal obligation.
In recent years many social and behavioural scientists have expressed a lack of confidence in some past findings [2], partly due to unsuccessful replications. Among the causes for this low replication rate are underspecified methods, analyses and reporting practices. These research practices can be difficult to detect and can easily produce unjustifiably optimistic research reports. Such lack of transparency need not be intentional or deliberately deceptive. Human reasoning is vulnerable to a host of pernicious and often subtle biases, such as hindsight bias, confirmation bias and motivated reasoning, all of which can drive researchers to unwittingly present a distorted picture of their results.
The practical side of transparency
How can scientists increase the transparency of their work? To begin with, they could adopt open research practices such as study preregistration and data sharing [3–5]. Many journals, institutions and funders now encourage or require researchers to adopt these practices. Some scientific subfields have seen broad initiatives to promote transparency standards for reporting and summarizing research findings, such as STARD, SPIRIT, PRISMA, STROBE and CONSORT (see https://www.equator-network.org). A few journals ask authors to answer checklist questions about statistical and methodological practices (for example, the Nature Life Sciences Reporting Summary) [6] and transparency (for example, Psychological Science). Journals can signal that they value open practices by offering ‘badges’ that acknowledge open data, code and materials [7]. The Transparency and Openness Promotion (TOP) guidelines [8], endorsed by many journals, promote the availability of all research items, including data, materials and code. Authors can declare their adherence to these TOP standards by adding a transparency statement in their articles (TOP Statement) [9]. Collectively, these somewhat piecemeal innovations illustrate a science-wide shift toward greater transparency in research reports.
We provide a consensus-based, comprehensive transparency checklist that behavioural and social science researchers can use to improve and document the transparency of their research, especially for confirmatory work. The checklist reinforces the norm of transparency by identifying concrete actions that researchers can take to enhance transparency at all the major stages of the research process. Responses to the checklist items can be submitted along with a manuscript, providing reviewers, editors and, eventually, readers with critical information about the research process necessary to evaluate the robustness of a finding. Journals could adopt this checklist as a standard part of the submission process, thereby improving documentation of the transparency of the research that they publish.
We developed the checklist contents using a preregistered ‘reactive-Delphi’ expert consensus process [10], with the goal of ensuring that the contents cover most of the elements relevant to transparency and accountability in behavioural research. The initial set of items was evaluated by 45 behavioural and social science journal editors-in-chief and associate editors, as well as 18 open-science advocates. The Transparency Checklist was iteratively modified by deleting, adding and rewording items until a sufficiently high level of acceptability and consensus was reached and no strong counterarguments against individual items remained (for the selection of the participants and the details of the consensus procedure, see Supplementary Information). As a result, the checklist represents a consensus among these experts.
The final version of the Transparency Checklist 1.0 contains 36 items that cover four components of a study: preregistration; methods; results and discussion; and data, code and materials availability. For each item, authors select the appropriate answer from prespecified options. It is important to emphasize that none of the responses on the checklist is a priori good or bad and that the transparency report provides researchers the opportunity to explain their choices at the end of each section.
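The structure described above—items with prespecified answer options, grouped into four sections, each followed by a free-text explanation—can be sketched as a simple data model. This is an illustrative sketch only: the section names come from the checklist, but the example item wording, the answer options and all function names here are hypothetical, not the actual checklist content.

```python
from dataclasses import dataclass, field

# Section names as given in the checklist; everything else below is illustrative.
SECTIONS = (
    "Preregistration",
    "Methods",
    "Results and discussion",
    "Data, code and materials availability",
)

@dataclass
class Item:
    question: str
    options: tuple   # prespecified answers; ("Yes", "No", "N/A") is an assumption
    answer: str = ""

@dataclass
class Section:
    name: str
    items: list = field(default_factory=list)
    explanation: str = ""  # free-text justification at the end of each section

def build_report(sections):
    """Collect answered items into a simple report dict for submission."""
    return {
        s.name: {
            "answers": {i.question: i.answer for i in s.items},
            "explanation": s.explanation,
        }
        for s in sections
    }

# Minimal usage with one made-up item.
prereg = Section(SECTIONS[0])
prereg.items.append(
    Item("A time-stamped preregistration was posted before data analysis.",
         ("Yes", "No", "N/A")))
prereg.items[0].answer = "Yes"
prereg.explanation = "Preregistered on a public repository."
report = build_report([prereg])
```

A model along these lines makes the checklist's key design choice explicit: answers are constrained to prespecified options, while each section still ends with an open field where researchers can explain their choices.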
In addition to the full checklist, we provide a shortened 12-item version (Fig. 1). By reducing the demands on researchers’ time to a minimum, the shortened list may facilitate broader adoption, especially among journals that intend to promote transparency but are reluctant to ask authors to complete a 36-item list. We created online applications for the two checklists that allow users to complete the form and generate a report that they can submit with their manuscript and/or post to a public repository (Box 1). The checklist is subject to continual improvement, and users can always access the most current version on the checklist website; access to previous versions will be provided on a subpage.
This checklist presents a consensus-based solution to a difficult task: identifying the most important steps needed for achieving transparent research in the social and behavioural sciences. Although this checklist was developed for social and behavioural researchers who conduct and report confirmatory research on primary data, other research approaches and disciplines might find value in it and adapt it to their field’s needs. We believe that consensus-based solutions and user-friendly tools are necessary to achieve meaningful change in scientific practice. While important topics may certainly remain that the current version fails to cover, we trust that this version provides a useful starting point to facilitate transparency reporting. We encourage researchers, funding agencies and journals to provide feedback and recommendations for future versions, and we encourage meta-researchers to assess the use of the checklist and its impact on the transparency of research.
All anonymized raw and processed data, as well as the survey materials, are publicly shared on the Open Science Framework page of the project: https://osf.io/v5p2r/. Our methodology and data-analysis plan were preregistered before the start of the project. The preregistration document can be accessed at: https://osf.io/v5p2r/registrations.
1. Merton, R. The Sociology of Science: Theoretical and Empirical Investigations (University of Chicago Press, 1973).
2. Baker, M. Nature 533, 452–454 (2016).
3. Chambers, C. D. Cortex 49, 609–610 (2013).
4. Gernsbacher, M. A. Adv. Methods Pract. Psychol. Sci. 1, 403–414 (2018).
5. Munafò, M. R. et al. Nat. Hum. Behav. 1, 0021 (2017).
6. Campbell, P. Nature 496, 398 (2013).
7. Kidwell, M. C. et al. PLoS Biol. 14, e1002456 (2016).
8. Nosek, B. A. et al. Science 348, 1422–1425 (2015).
9. Aalbersberg, I. J. et al. Making Science Transparent By Default; Introducing the TOP Statement. Preprint at OSF https://osf.io/preprints/sm78t/ (2018).
10. McKenna, H. P. J. Adv. Nurs. 19, 1221–1225 (1994).
We thank F. Schönbrodt and A. T. Foldes for their technical help with the application.
S.K. is Chief Editor of the journal Nature Human Behaviour. S.K. has recused herself from any aspect of decision-making on this manuscript and played no part in the assignment of this manuscript to in-house editors or peer reviewers. She was also separated and blinded from the editorial process from submission inception to decision. The other authors declared no competing interests.
Aczel, B., Szaszi, B., Sarafoglou, A. et al. A consensus-based transparency checklist. Nat Hum Behav 4, 4–6 (2020). https://doi.org/10.1038/s41562-019-0772-6