European policymakers have been discussing new rules on data protection for years, and scientists and universities — like everyone else across the continent — are about to see the results. Entering into force on 25 May, a new law known as the General Data Protection Regulation (GDPR) is designed to protect the personal privacy of citizens and will overhaul how personal data are collected, handled, processed and stored. It’s a welcome move to safeguard individuals and is the biggest shake-up of data protection in more than 20 years.
However, as this journal has noted before, earlier drafts of the law posed a problem for science and the research community. Of particular concern was the issue of consent — the draft language suggested researchers would be required to seek renewed consent to reuse data collected for a different purpose, which could have introduced delays and made some research impractical. But many in the research community worked relentlessly to warn policymakers of the potential harm. In response, officials put in place rules that exempt research from some of the requirements, provided the proper safeguards are in place. Universities and organizations have introduced plans to make sure they are. The bulk of the work should be done.
The passage of the final GDPR rules is, therefore, a good example of political engagement by researchers and their advocates, and of a sensible and informed reaction from policymakers. Those involved, on both sides, deserve great credit. Harmonization of how data can be sourced, stored and used would, in theory, be good for research. It could smooth the difficulties that scientists face when they try to pool analysis of genomic data and tissue samples across national borders. Such sharing could help scientists to organize powerful trials with large numbers of participants.
But although there is some cause for celebration, there are still outstanding issues. And that means that the same researchers and advocates must remain vigilant.
The problem is that individual European countries have been left to decide some issues for themselves — for example, how scientific data can be processed. This flexibility is intended to allow countries to fit the rules around existing systems and different cultures, but it might leave nations out of step. Researchers who work under different systems could struggle to share data with each other. That could lead to delays in negotiations between institutions wanting to create collaborative contracts that enable data sharing.
To help prevent this and to offer a unified approach, academics, industry representatives and patients have been meeting over the past year to distil the complex regulation into a user-friendly guide. This planned code of conduct aims to provide a simple ‘how-to’ guide for scientists, for example, by explaining differences in the way countries such as Germany and the United Kingdom define ‘anonymized’ data. The resulting Code of Conduct for Health Research, overseen by the biobank network BBMRI-ERIC (see J.-E. Litton Nature 541, 437; 2017), is almost ready for consultation. But meanwhile, medical research remains vulnerable to unintended consequences of the new law.
That’s because, until the code of conduct is in place to offer clear guidance about how to comply with the GDPR, day-to-day decisions on how to interpret the law will be left to individual institutions’ legal departments. It would be understandable if they chose to err on the side of caution and place restrictions on sharing data for fear of breaking the law.
Even when the code is finalized, it must still be approved by the European Data Protection Board (EDPB), which has not yet said how organizations can submit such codes for evaluation, or how long the process will take.
Some have argued that delays in making the code available could be beneficial, because they would give the research community time to thrash out the details of this complicated area of the law. But others worry that if the process drags on too long, medical research will suffer. What starts as a cautious position on how best to share data in line with the law could drift into normal practice.
That would be a missed opportunity and could risk undermining the good work done so far. Officials on the EDPB must not allow that to happen. The code must be approved and put into practice as soon as possible. It’s important to protect people’s personal data; but it’s also important to ensure data can be used with integrity to support valuable research.