The recent revisions to the Common Rule are potentially important for behavioural and social scientists. Yet they are not far-reaching enough for those of us faced with a sea change in the nature and source of data on human beings. For research in the social and behavioural sciences to move forward, we must develop our own standards and data infrastructures.

Here's the challenge. The recommendations on informed consent and identifiability simply will not translate to a world in which large-scale data are generated from the administration of government programmes or from the digital exhaust of mobile phones, social media or commercial transactions. In particular, the guidance on informed consent is framed very much in the context of clinical trials. And the Common Rule description of what it means for data to be identifiable does not explicitly recognize how vast new amounts of data on individuals’ digital behavioural habits can be used to identify people readily, in ways never before possible. It is therefore difficult to see how institutional review boards (IRBs) can be charged with ensuring that “appropriate privacy and security safeguards are in place to protect research subjects” or how “the secretary of health and human services (HHS) will issue guidance to assist IRBs in appropriately protecting subjects’ privacy and confidentiality” (Federal Register 82, 7149–7274; 2017). If the US Census Bureau, which has vast experience and expertise in collecting and disseminating social science data, has essentially declared it impossible to release secure public-use microdata files using current approaches, it is not reasonable to expect either IRBs or the secretary of HHS to be able to do so.

It is not surprising that the Department of HHS provides insufficient guidance to social and behavioural scientists. Its goal of developing uniform regulations across federal agencies is a noble one, but our sciences are not central to the mission of many agencies. Yet it is imperative that there are better standards for social and behavioural scientists, lest important and high-quality research go undone. We must actively foster common approaches that allow data to be accessed and linked for research. As David Ellwood, former Assistant Secretary for Planning and Evaluation in the Department of HHS, reports, the experience described by one large-city public health commissioner is too common in both the policy and the research sectors: “We commissioners meet periodically to discuss specific childhood deaths in the city. In most cases, we each have a thick file on the child or family. But the only time we compare notes is after the child is dead” (personal communication).

Hope is not a plan.

Rather than hoping for better approaches in the future, we should move as a community to develop our own standards that can be used by IRBs looking for guidance. We have an unmatched opportunity to move from fragile, artisanal, one-off approaches to privacy and confidentiality towards established national standards. A new data infrastructure could evolve that enables datasets to be joined across federal and local agencies and improves decision-making at all levels. At the local level, mayors, governors, agency managers and citizens alike have increasingly demanded new technology tools and approaches to accelerate policy improvements. Local governments have created a new job title, chief data officer, built dashboards, initiated predictive analytics and smart sensor projects in the name of better efficiency and accountability, and even improved community engagement. At the federal level, the Evidence-Based Policymaking Commission Act of 2016, otherwise known as the Ryan-Murray Act, was signed into law on 30 March 2016, and the commission is identifying approaches for accessing data for programme evaluation.

We should also work as a community to make a series of investments in data accessibility, designed to address the specific needs of social and behavioural scientists. The following could transform our research capacity. A combination of state-of-the-art technical strategies and thoughtful human oversight and screening could dramatically improve privacy and usage protections. A variety of standardized mechanisms could be developed for different confidentiality situations, ranging from de-identification, to mathematical mechanisms that add noise to the data, to secure enclaves (ultimately in the cloud) with mechanisms for certifying safe users, safe analyses, and safe products. Legal hurdles could be reduced through both training and the development of more up-to-date templates.
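To make the noise-addition idea concrete, the sketch below shows a Laplace mechanism of the kind used in differential privacy. It is illustrative only: the function name, the counting query and the parameter values are assumptions for the example, not part of any proposed standard.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Return a noisy version of a statistic.

    Adds Laplace noise with scale sensitivity / epsilon, the standard
    construction used in differential privacy. Smaller epsilon means
    stronger privacy protection but a noisier released statistic.
    """
    rng = np.random.default_rng() if rng is None else rng
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)

# Illustrative use: releasing a count from a hypothetical linked dataset.
# A counting query has sensitivity 1, because adding or removing one
# person changes the count by at most 1.
noisy_count = laplace_mechanism(true_value=1234, sensitivity=1.0, epsilon=0.5)
print(round(noisy_count))
```

The central design choice in any such mechanism is the privacy parameter epsilon: smaller values give stronger protection at the cost of noisier answers, which is exactly the kind of trade-off a standardized confidentiality regime would need to specify.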

Funding interventions have transformed other fields. The Wellcome Trust supported the Bermuda Accord in the case of the Human Genome Project; the Sloan Foundation supported the Sloan Digital Sky Survey in the case of astrophysics. The same approach could work here. A consortium of private foundations could be formed to (1) find innovative and powerful new solutions to important social problems, and (2) develop standards and technologies that dramatically simplify, routinize, and expand the capacity to access and use data in creating those more effective solutions.