Experts and regulators mull how to foster the technology without posing undue risk to patients.
Genetic testing has entered a new realm, with the ability to read a person's genetic code and predict how it will affect his or her health. But US regulators are struggling to work out how the tests should be governed, with a particular sticking point being who decides what the genetic read-outs mean in terms of health and disease.
So far, the US Food and Drug Administration (FDA) has approved genetic tests only for specific conditions. This includes the approval on 19 February of a test developed by 23andMe of Mountain View, California, to determine whether people carry a gene variant that could lead to their offspring developing Bloom syndrome, a rare disorder characterized by small stature and multiple health problems.
With the massive number of genome-based diagnostics that are possible, the agency cannot practically continue with the painstaking approach it has taken in approving these tests. So on 20 February, the FDA is running a workshop at which scientists, doctors and regulators will discuss a strategy put forward by the agency in December that aims to allow the technology to flourish but clamp down on a 'wild west' atmosphere in which some companies are making unproven claims about how well the tests can predict health patterns.
Two main questions need to be considered, says Joshua Sharfstein, former deputy commissioner of the FDA: "Is the sequence what the machine says it is, and what does it mean?" For a doctor to recommend a therapy on the basis of test results, he or she must have confidence that the test results accurately predict which therapy is going to work best, Sharfstein says.
Part of the problem is that some of the genetic aberrations picked up by the tests will be linked to a wide array of diseases, and scientists often do not have enough data to determine which of these associations are medically relevant. They also struggle to predict the medical effects of variants that occur in only a few people. That means that it is currently impossible to evaluate in advance how well a test would perform for every possible data point or disease risk, or how the information should guide treatment.
However, companies have already started marketing the tests, and the FDA's attempts to ensure that test providers report accurate information have been criticized as being overly cautious. "Labs need to continue to make the technology better, and we can't require such steep regulatory steps for every iteration of a test that it prevents the innovation that needs to happen," says Heidi Rehm, director of the Laboratory for Molecular Medicine at Partners HealthCare Personalized Medicine in Cambridge, Massachusetts.
The FDA wants to certify the technical accuracy of the technologies by evaluating how well the sequencing machines read a subset of genetic data. Determining whether the tests are being interpreted correctly is trickier, and for this, the agency has proposed relying heavily on public databases that associate health traits with genetic data.
However, public databases do not yet hold enough information to make these decisions. The US National Institutes of Health has funded the construction of a database called ClinVar, which collects information on genetic variants linked to disease, and is funding experts to curate the data in a project known as ClinGen. But that effort is still getting off the ground. "There is not a lot of ClinGen expert-curated data to come out of this yet," says Rehm, who hopes that money requested by President Barack Obama will help to speed up that work. This month Obama asked US lawmakers to give the FDA US$10 million to help it to execute the strategy as part of a 'Precision Medicine Initiative' that he announced in his State of the Union address.
It is also not clear how such an approach would deal with tests based on information in proprietary databases.
"The FDA is concerned that people will be using secret formulas that tell people the clinical significance of a test without any real review, versus an open scientific process," says Sharfstein, who is now at the Johns Hopkins Bloomberg School of Public Health in Baltimore, Maryland.
Michael Watson, executive director of the American College of Medical Genetics and Genomics in Bethesda, Maryland, suggests that regulation of sequencing-based tests should work similarly to that of radiological devices: the agency reviews the technical performance of machines, but leaves the interpretation of the images up to professional societies.
"My sense is that a lot of this is going to end up looking like imaging looks," Watson says. "The FDA basically says, 'we're going to make sure this machine is manufactured correctly so you can see everything that you need to see', and there's a whole radiology community that interprets what the radiogram says when the machine is in use."
Related external links
FDA meeting information: Optimizing FDA’s Regulatory Oversight of Next Generation Sequencing Diagnostic Tests
FDA discussion paper: Optimizing FDA’s Regulatory Oversight of Next Generation Sequencing Diagnostic Tests—Preliminary Discussion Paper
Check Hayden, E. US regulators try to tame 'wild west' of DNA testing. Nature (2015). https://doi.org/10.1038/nature.2015.16962