“One hundred years ago no one imagined the impact that electrical energy would have on every aspect of daily life. Similarly, it is impossible to predict how the science of genetics will alter opportunities to advance individual, family, and community health.”1

Crystal-balling is a very precarious enterprise because, as Yogi Berra said, “It’s tough to make predictions, especially about the future… The future ain’t what it used to be.”2 Nevertheless, I believe that it is incumbent upon us as an organization and as a profession to look ahead, if not a hundred years, at least into the next ten or twenty, to get a sense of where genetics and, particularly, medical genetics are likely to be.

There appears to be a pervasive belief in both scientific and public circles that genetic testing is going to be the cornerstone of much, if not all, of what medicine holds for the future. And, what is generally meant is the wide-scale testing for susceptibility to common diseases and for responsiveness to drugs—what has come to be described as genetic profiling. However, there are many who think that there is more hype than hope in what is being predicted, and I think that it is important for us as geneticists to decide which it is. We need to do this to be able to define where we fit into the medical and health care system—what might be called our scope of practice—so that we can establish meaningful interactions with other medical specialists and professionals and with the public. And, we need to do this so that we can design our training programs to meet future needs. Therefore, I want to share with you my own personal and unofficial attempt to come to grips with the role that genetic testing will play in the future and with how we as medical geneticists should relate to it.

However, before tackling these issues, I need to acknowledge that genetic testing and screening of many types are currently being done and will continue to be done in the future. Some are performed on an individual basis. These include pre- and postnatal diagnostic testing for monogenic and chromosomal disorders by cytogenetic, biochemical, and DNA mutation analysis, presymptomatic diagnosis of Huntington disease and other high penetrance late-onset neurodegenerative disorders, and diagnostic and presymptomatic testing for high penetrance familial cancer mutations. Others are performed on a population basis: maternal serum screening, newborn screening, and heterozygote detection for a variety of conditions.

I present this list to make four points. First, the indications and targets for the current forms of testing will expand as technologies change (as, for example, in newborn screening with tandem mass spectrometry)3 and as new disease-causing mutations are discovered. Second, medical geneticists and genetic counselors are not the only parties involved with genetic testing, and there are many interactions between them and primary care providers, other medical specialists, and public health programs. Third, virtually all of the current forms of genetic testing have raised a large number of social, legal, and ethical issues. And, fourth, as the program of this meeting reveals, the American College of Medical Genetics is very much concerned with genetic testing in the present.

Having acknowledged the present, I now turn to the perception that genetic testing and profiling are the wave of the future. Consider this quotation from Time magazine: “Even more genetic gee-wizardry lies just down the road. Using bio-chips… scientists should be able to identify genetic errors almost as quickly as a supermarket scanner prices a load of groceries… Genetic researchers are already talking about using ‘FISH… and chips’… to look for any number of genetic characteristics, including the more elusive web of genes that may lurk behind familial patterns of heart disease and stroke, cancer, diabetes, Alzheimer’s, various kinds of mental disorders and even gingivitis… ‘We’ll soon be governed by a new paradigm—genomic medicine—with tests and ultimately treatment for every disease linked to the human genome’… With the prestidigitation of gene amplification, only a single drop of blood or snippet of hair or a scraping of skin can reveal the full length of the human genome, including its myriad flaws.”4

The writing is a little florid, but the message is one that the scientific community has itself been promulgating. Here are but a few examples: “…[W]ithin the next 2 decades it will be technically feasible to sequence the genome of every new baby—providing them with a rundown of each and every one of their genes and their associated risk of developing certain diseases. This will enable them to seek preventable measures and adopt healthier lifestyles… There are benefits to having the ability to examine our genetic make up. Genetic technology could lead to an era of personalized medicine and better-tailored preventive treatment.”5 “The great potential of the genomic era is the development of interventions to prevent or better manage costly, chronic diseases… Medical interventions could include drugs and preventive measures that are tailored to a person’s genetic profile… Health professionals will increasingly use tests and family histories to assess risk for disease in individual patients, families, and populations. Once clinicians and public health professionals identify increased risk, they can recommend preventive measures…”1 “The potential is enormous for pharmacogenomics to yield a powerful set of molecular diagnostic methods that will become routine tools with which clinicians will select medications and drug doses for individual patients. A patient’s genotype needs to be determined only once for any given gene, because except for rare somatic mutations, it does not change. Genotyping methods are improving so rapidly that it will soon be simple to test for thousands of single-nucleotide polymorphisms in one assay.”6

The same vision has been put forward by others closer to us. For example, Muin Khoury and Linda and Ed McCabe recently wrote that: “Over the next decade or two, it seems likely that we will screen entire populations or specific subgroups for genetic information in order to target interventions to individual patients that will improve their health and prevent disease… In the future, genetic information will increasingly be used to determine individual susceptibility to common disorders such as heart disease, diabetes, and cancer. Such screening will identify groups at risk so that primary-prevention efforts (such as diet and exercise) and secondary-prevention efforts (early detection or pharmacologic intervention) can be initiated. Such information could lead to the modification of screening recommendations, which are currently based on population averages.”7

Francis Collins, in his frequently quoted 1999 Shattuck lecture,8 has a figure in which he shows the steps involved in the “genetic revolution.” The pathway goes from “disease with genetic component” through “map” to “clone gene.” From there it is on to “diagnostics,” which bifurcates to “preventive medicine” and “pharmacogenomics.” The same type of thinking is echoed in the “vision for the future of genomics research” recently promulgated by the National Human Genome Research Institute.9 This vision is cast in terms of 15 Grand Challenges, two of which speak to the issue of genetic testing: Grand Challenge II-3: Develop genome-based approaches to prediction of disease susceptibility and drug response… Grand Challenge II-5: Investigate how genetic risk information is conveyed in clinical settings, how that information influences health strategies and behaviors, and how these affect health outcomes and costs.

Several steps are proposed for responding to the first of these challenges. These include the unbiased determination of risks associated with variants, reduction in the cost of genotyping, research on whether information will change health behaviors, oversight for clinical validity, and education.

The overall assumption for the goals encompassed by both challenges is that predictive risk information will be used by individuals to “develop an individualized prevention or treatment plan.” Indeed, there are a few terms that recur in many of the statements about genetic testing in the future: “genetic profiling,” “tailoring,” and “personalized” or “individualized.” So, the mantra goes something like the following: genetic profiling will permit the tailoring of health care, prevention strategies, treatments, and/or interventions and will thereby make personalized or individualized medicine possible.

The quotations I have presented are but a brief sampling of the loud and steady drum beat of predictions and assertions that genetic testing for predispositions to common diseases and for the prevention of drug toxicity and promotion of drug efficacy are what genetics holds for the future. However, as I have already noted, there are many who do not accept this vision, and, therefore, the first question that I set out to answer for myself is the following: How credible is the promise of an all-encompassing personalized medicine based on wide-scale genetic testing? I shall consider testing for disease susceptibility first.

I have just presented some of the positive views asserting the likelihood of this eventuality, but Holtzman and Marteau see things differently. “Statements like these clothe medicine in a genetics mantle. The result of efforts to identify genes that have a role in common diseases suggests a different picture: the genetic mantle may prove to be like the emperor’s new clothes… Although we do not contend that the genetic mantle is as imperceptible as the emperor’s new clothes were, it is not made of the silks and ermines that some claim it to be. Those who make medical and science policies in the next decade would do well to see beyond the hype.”10

The arguments behind this statement and others like it fall into two very broad categories: whether genetic testing for complex traits can be done and whether it should be done. Starting with the can it be done, the first thing that has to be said is that complex traits are indeed complex. Arthur Beaudet summarized this very nicely in his 1998 presidential address to the American Society of Human Genetics: “Most genetic traits of interest in populations of humans and other organisms are determined by many factors, including genetic and environmental components, which interact in often unpredictable ways. For such complex traits, the whole is not only greater than the sum of its parts, it may be different from the sum of its parts. Thus complex traits have a genetic architecture that consists of the genetic and environmental factors that contribute to the trait, as well as their magnitude and their interactions.”11

Given this complexity, can testing for susceptibility to common diseases actually be done? The issues here are principally genetic and epidemiological and are concerned with our ability to identify susceptibility alleles and with their frequency in the population, penetrance, absolute and relative risks, and positive and negative predictive values. To date, as Joel Hirschhorn and colleagues have pointed out,12 most of the reported associations have not been robust. In their analysis, only 6 of 166 associations that were studied three or more times were consistently replicated. This variability among reports has been attributed to a variety of factors. Some, such as the alleles that are being looked at and the size and selection of the population, are essentially technical in nature and are potentially correctable, although perhaps not easily. However, two factors—weak genetic effects with low relative risks and gene-gene and gene-environment interactions—are inherent in what is being looked at, and they cannot be dismissed. Because of this, Angus Clarke has made the following sweeping assertion: “The scientific rationale for carrying out genetic research into these [complex] diseases has never included the development of tests that will identify the risk for healthy individuals that they will develop the disease. The fact that so many genes and nongenetic factors are involved in the etiology of these common diseases means that the identification of inherited predisposition is of little use at the individual level; it will never be possible to predict those who will be affected nor to know when an individual will develop a disease if he does so at all.”13

I would agree, based on what has been accomplished thus far and is likely to be done in the near future, that Clarke is correct about both the who and the when. However, it seems likely to me that, with the passage of time, alleles that confer susceptibility or protection and that affect penetrance, timing, and severity will be identified at many different loci and for many different conditions. Precedents for this exist in the analysis of modifier genes in the mouse. Furthermore, although we already have reasonably good information about many of the lifestyle and environmental factors that influence the development of common diseases, it is likely that more will be recognized. The implication of all of this for me, and others as well,14–16 is that genetic risk assessment will attain sufficient predictive power to be of use only if the analyses of many genetic loci are combined with the evaluations of nongenetic lifestyle and environmental factors. In other words, it will be necessary to multiplex and combine the analyses, and this is what I think will happen. Testing one or a few genes at a time will not be the route to follow.

Therefore, although I concur with Hirschhorn and colleagues that “a ‘DNA chip’ that can determine crucial genotypes and accurately predict future health is unlikely to become a widespread and useful screening tool in the near future,”12 I can visualize, in the not too distant future, the development of computer-based algorithms that will combine the output of such chips or their equivalents with a broad environmental assessment to produce better estimates of risk for the development of common diseases than would otherwise be possible. Given that the genetic factors remain constant, these estimates will, of course, be highly sensitive to the nongenetic factors acting at the time of the evaluation and could change significantly if these factors were altered.
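To make the notion of such an algorithm concrete, here is a minimal sketch in Python of how the output of a multiplexed genotyping assay might be combined with lifestyle factors on the log-odds scale to yield a single probabilistic risk estimate. Every locus name, odds ratio, and baseline risk in it is hypothetical, and it assumes independent, multiplicative effects, which is precisely the simplification that real gene-gene and gene-environment interactions will complicate.

```python
import math

# Entirely hypothetical per-factor odds ratios; none of these values come from a real study.
GENETIC_ODDS_RATIOS = {
    "locus_A_risk_allele": 1.3,
    "locus_B_risk_allele": 1.2,
    "locus_C_protective_allele": 0.8,
}
LIFESTYLE_ODDS_RATIOS = {
    "current_smoker": 2.0,
    "sedentary": 1.5,
    "high_saturated_fat_diet": 1.4,
}

BASELINE_RISK = 0.05  # assumed population risk for the illustrative disease


def combined_risk(baseline_risk, factors_present, odds_ratio_tables):
    """Combine a baseline risk with the odds ratios of the factors a person carries.

    Works on the log-odds scale and assumes the factors act independently
    (multiplicative odds); real interactions would require a richer model.
    """
    log_odds = math.log(baseline_risk / (1.0 - baseline_risk))
    for table in odds_ratio_tables:
        for factor, odds_ratio in table.items():
            if factor in factors_present:
                log_odds += math.log(odds_ratio)
    odds = math.exp(log_odds)
    return odds / (1.0 + odds)


if __name__ == "__main__":
    person = {"locus_A_risk_allele", "current_smoker"}
    risk = combined_risk(BASELINE_RISK, person,
                         [GENETIC_ODDS_RATIOS, LIFESTYLE_ODDS_RATIOS])
    print(f"Estimated risk: {risk:.1%}")  # a probabilistic estimate, not a prediction
```

Note that removing a single nongenetic entry, such as "current_smoker", changes the estimate immediately, which is the sense in which such estimates remain sensitive to the nongenetic factors acting at the time of the evaluation.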

I do not think that any of this is going to be easy to accomplish or expect that it will occur very rapidly, but I do not doubt that it will ultimately happen. What we are likely to see along the way is a series of intermediate stages of multiplexed testing as the numbers of loci being entered into the prediction algorithms increase. An interesting theoretical example of this has already been provided by Yang and colleagues,15 who have shown that, in assessing the risk for venous thrombosis, a combination of three tests—factor V Leiden, prothrombin G20210A, and protein C deficiency—increases the positive predictive value of testing 8-fold.
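The arithmetic behind gains of this kind is worth making explicit. The short Python calculation below uses entirely hypothetical numbers, not those of Yang and colleagues, and assumes the tests behave independently; it simply shows, via Bayes' rule, how requiring several positive results at once can raise the positive predictive value dramatically even though each individual test is unimpressive.

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value from Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)


# Hypothetical single test: 60% sensitivity, 90% specificity, 1% disease prevalence.
single = ppv(0.60, 0.90, 0.01)

# Require all three of three independent such tests to be positive: sensitivity
# falls to 0.6**3 (~22%), but the false-positive rate falls to 0.1**3 (0.1%).
joint = ppv(0.60 ** 3, 1.0 - 0.10 ** 3, 0.01)

print(f"single-test PPV: {single:.1%}")  # ~5.7%
print(f"three-test PPV:  {joint:.1%}")   # ~68.6%, roughly a 12-fold gain
```

The gain in predictive value is bought at the price of sensitivity, which is one reason why the clinical utility of any such combination still has to be demonstrated rather than assumed.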

When I say that I think that a highly multiplexed form of risk assessment that incorporates genetic testing will eventually become possible and will provide better estimates of risk than we now have, I do not mean to imply that the results will be absolutely definitive. The information will be probabilistic, and we shall always be dealing with estimates of risk, not predictions of certain outcomes. For some, such uncertain information is without value,13 whereas for others, it suffers from being too “abstract.”17 However, life is filled with probabilities and risks rather than certainties. When we cross the street or fly in an airplane, we do not have an absolute certainty that we will survive the experience. In fact, there are definite, albeit low, probabilities that we will not. Probabilistic information is not new to the practice of medicine, and many, if not most, diagnostic and therapeutic decisions are made on the basis of probabilities. What is important is that, even though the results of testing are probabilities and not certainties, they can still form the basis on which people make decisions.

However, merely stating the belief that susceptibility testing or profiling can be done does not mean that it actually will or, indeed, should be done. This will ultimately be determined by whether testing really offers more than currently available forms of risk assessment in terms of predictive power and securing compliance; in other words, by whether it will have greater clinical utility. In discussing whether genetic testing for cardiovascular disease will be useful, Humphries and colleagues18 have asked whether the genetic test or tests will have additional predictive power over and above the accepted risk factors that can already be easily, and usually inexpensively, measured with high reproducibility and replicability. These accepted risk factors would include family history, for which Maren Scheuner and others have made a strong case,19 a number of lifestyle and environmental factors, and relevant markers of disease and disease predisposition, such as blood pressure and serum cholesterol, lipids, and C-reactive protein.18,20 The short answer to the Humphries et al. question is that in the short run there is no evidence that genetic testing will add significantly to currently available modalities for risk assessment. My personal feeling is that in the future it will, but this remains to be shown.

A related matter is whether the results of genetic testing would enhance a person’s motivation to comply with recommendations for therapy and changes in lifestyle. As Marteau and Lerman put it, “Providing people with personalized information is not new. The question is whether responses will be any different if the information is based on DNA.”21 Haga and colleagues are not very optimistic: “If all would benefit from a healthy diet, exercise, smoking cessation or prudent alcohol intake, regardless of genotype, the added value of the test is unclear unless it can be shown to motivate compliance in those who test positive without reducing compliance in those who test negative. Unfortunately, current data suggest little reason for optimism concerning the potential for genetic tests to motivate behavioral change.”22

We already know that changing behavior is difficult, and providing people with genetic information on risk may not increase their motivation to change behavior unless effective therapies are available and beliefs in them are reinforced. In some cases, genetic information could conceivably decrease motivation if it engendered a deterministic belief in some pre-ordained outcome.22 It seems clear, therefore, that enhancing motivation and securing compliance with whatever interventions are indicated by the results of the overall risk assessment will determine whether the testing will be worthwhile, assuming, of course, that effective interventions actually exist. In their absence, or if they result in more harm than good, there is really no justification for the testing.23,24

Because I have been speaking about the effectiveness of genetic testing to motivate changes in lifestyle and environment, I need to point out that perhaps the most extreme argument used against genetic testing is that it just is not necessary. The basis for this, the argument goes, is that because most, if not all, of the common diseases are products of adverse environmental factors, they are therefore best approached on a population-wide basis by environmental and lifestyle modification and by improvement of economic and social conditions.25,26 The importance of such nongenetic factors is certainly undeniable, and Walter Willett points out that “We have been able to identify modifiable behavioral factors, including specific aspects of diet, overweight, inactivity, and smoking that account for over 70% of stroke and colon cancer, over 80% of coronary heart disease, and over 90% of adult onset diabetes….”14

The importance of nongenetic factors is also substantiated by the fact that when groups migrate from regions of low risk to ones of high risk for common diseases such as cancer, the incidence of the disease rapidly increases to match that of the new location. The same message has been derived from the marked increase in the incidence of asthma over the past several decades, and it has been suggested that “Although the causes are disputed, environmental or lifestyle factors—outdoors or at home—must underlie the increase.”27

The real danger, the critics believe, is that attempts to identify genetic risk factors will detract from efforts to eliminate the environmental and lifestyle ones. Therefore, some think that we should forget about testing altogether and get on with modifying the environment for everyone.26 The risk factors, they say, are “mass phenomena” and therefore require “mass preventive approaches.”28

I certainly do not take exception to the goal of eliminating environmental and lifestyle risk factors, but I do not see it as the only approach to disease prevention. Population-based programs aimed at smoking cessation and at the detection and treatment of hypertension are considered to be quite effective,29 and it has been estimated that each of these, if universally implemented, could increase the life expectancy of the entire population of 35-year-old males in the United States by about a year.30 However, the results of several population-based multiple risk factor intervention studies for the prevention of coronary heart disease have shown only modest benefits, if any.31,32 Less than optimal compliance with the intervention program is undoubtedly one of the major reasons for the poor outcome, and compliance may be difficult unless there is a high degree of motivation. Hopefully, a more comprehensive approach based on a combined genetic and nongenetic risk assessment might provide this.

I now want to turn briefly to pharmacogenetic testing, for which the purpose seems to be more straightforward than susceptibility testing—to look for variants that might make a person more or less sensitive to the therapeutic and toxic effects of drugs. There has certainly been enormous press on the subject, and even some of the strongest critics of susceptibility testing have seen promise in this approach.10 However, to get directly to the point, I quote from a recent review: “Most drugs are metabolized by several different enzymes, can be transported by other types of proteins, and ultimately interact with one or more targets. If several steps in this type of pathway were to display genetic variation—that is, if the effects were polygenic—clear multimodal frequency distributions … would quickly be replaced by multiple overlapping distributions. Therefore, even if inheritance influenced the effect of a drug, the relatively simple, one-to-one relationship observed for [the cytochrome P450 isoform] and TPMT [thiopurine S-methyl-transferase] would not be obvious.”33 And, as I shall mention shortly, even CYP2D6 is problematic.

Given this complexity, the same approach that I discussed for susceptibility testing will undoubtedly be required—the multiplexing of tests to look at multiple genes that simultaneously affect the metabolism and effectiveness of individual drugs.34 And, in further analogy with testing for common disease susceptibility, it is clear that the handling and action of drugs are not only polygenic but also multifactorial. Drugs are taken by people, and there are other factors beyond the genetic ones that have an influence. These include other illnesses, liver and kidney function, diet, exposure to other drugs and chemicals, and compliance with dosage and timing schedules,35,36 and not all of these can easily be worked into a prediction algorithm. Therefore, when all is said and done, the predictions that are likely to be possible on the basis of pharmacogenetic testing, even when genes of major effect are in play, are again destined in most instances to be highly probabilistic and not absolute.37 I think that the Food and Drug Administration summarized the situation very well in its recently released draft document on Guidance for Industry [for] Pharmacogenomic Data Submissions: “Much of the concern about FDA actions in this area is based on the perception that pharmacogenomics testing is likely to give very definitive answers about safety and effectiveness in subpopulations. This may happen sometimes (as in oncology) and in such cases, rapid development of a diagnostic test is highly encouraged. However, this is unlikely to be the ordinary case. In most instances, genotype or gene expression profile is likely to be one of a number of factors, so that probability of an adverse event or a favorable response would be increased, but the outcome not inevitable. For this reason, genetic markers can ordinarily be handled like other predictive markers in the clinical arena.”35

Of the large number of known or potentially clinically relevant genetic polymorphisms that influence the metabolism and effects of drugs,33,34,38,39 testing for only one, the TPMT variant, is commercially available and used somewhat routinely in clinical practice.40 Roche Diagnostics recently tried to bring its AmpliChip P450, a microarray-based test for two cytochrome P450 family members, CYP2D6 and CYP2C19, to market, but their application was put on hold by the Food and Drug Administration because they had not submitted the test for premarket review.41 However, it is of some interest to look at Roche’s public background information statement about the test: “MICROARRAY (“DNA CHIP”) AND ROCHE AmpliChip CYP450 BACKGROUNDER: AmpliChip CYP450 can detect naturally occurring genetic variations or common mutations in two specific genes, CYP2D6 and CYP2C19… Common variations in these genes play a crucial role in determining how a person can process or metabolize many of the drugs used for the most common conditions including analgesics, antidepressants, antihistamines, heart and blood pressure medications, and antipsychotics …With the AmpliChip CYP450, drug metabolism genotyping in the future could be used in the clinic to assist in the therapeutic decision-making with the goal of prescribing drugs that are optimally effective and safe for the individual. Currently, Adverse Drug Reactions (ADRs) in US hospitals are responsible for more than 100,000 deaths nationwide each year making it [sic!] one of the leading causes of death. Each year, more than two million hospital patients experience a serious ADR in the US, representing a significant burden on health care systems and costs.”42

This quote promises a lot and raises many fears, but one reviewer suggests that things might not be quite as simple as suggested: “Although these enzymes contribute to the metabolism of a large number of drugs, the situations where prospective genotyping of CYP2D6 and 2C19 would be clearly beneficial are limited, and additional research is required to identify circumstances where prospective genotyping is most warranted. Interestingly, the value of CYP2D6 genotyping is likely to decrease over time, as many pharmaceutical companies will no longer pursue a compound whose primary metabolism is via CYP2D6.”34

And, the Nuffield Council on Bioethics, in its review of the ethical issues in pharmacogenetics, suggests another reason that testing for CYP2D6 might not be useful routinely: “…[T]he adverse reactions, while unpleasant, are rarely life-threatening and because alternative therapies exist…. it may be quicker and easier in many clinical settings simply to prescribe the medicines, observe any problems, and try a different medicine if necessary, rather than undertaking a pharmacogenetics test… The ability of a test to predict an outcome may be proven. But such clinical validity does not necessarily correspond with clinical utility, that is, the ability of the use of the test to improve treatment of patients.”36

Therefore, just as with susceptibility testing, the economics of the situation will ultimately require sound evidence of clinical utility if pharmacogenetic profiling is to be accepted.43 Nevertheless, as with susceptibility testing, I think that multiplexed pharmacogenetic testing will ultimately prove to be of value and that it will be used—more likely later than sooner.

At this point, I want to summarize where my analysis has taken me. In a word, yes! I do believe that there is credibility to the claims that expanded genetic testing for disease susceptibility and pharmacogenetic testing to predict drug responsiveness and toxicity will happen and will improve risk assessment, disease prevention, and drug therapy. But, to paraphrase Haga and colleagues, genomic profiling to promote a healthy lifestyle is certainly not yet “ready for prime time!”22 The expansion in testing will occur incrementally over a substantial period of time, and it will be driven by a combination of scientific discoveries, heavy commercial pressures, and public expectations. A number of social, legal, and ethical issues will need to be dealt with along the way,44 and clinical utility will ultimately determine what testing will actually be done. It is likely that genetic testing will initially have discrete targets (cardiovascular diseases, cancer, diabetes, and the like), but these will eventually be coalesced into broader and more comprehensive forms of risk assessment. The same will also be true for pharmacogenetic testing.

We are just beginning to understand what types of data will be required and how they might be acquired, and the process promises to be both lengthy and costly. It is unlikely that the AmpliChip P450 model or other approaches limited to just one or a few genes at a time will prevail in the long run. The costs would be too high and the predictive value too low. Rather, testing will be multiplexed for both susceptibility and pharmacogenetic assessments, and nongenetic factors will be part of the mix. Genetic testing will be just one component of the practice of medicine and provision of health care, and not the all-encompassing personalized medicine that has been widely trumpeted, and it will not replace population-based environmental and lifestyle modification.

The information provided by testing will in most instances be highly probabilistic in nature and will require interpretation, and this brings me to my second major question: How should we, as medical geneticists, relate to all of this?

Muin Khoury has framed the issue in an interesting way. Although the title of a recent article refers to a “continuum,” he in fact sets up a dichotomy between the typical practice of medical genetics, with a “traditional genetic services model,” which focuses on genetic counseling, and the practice of genomics in medicine and public health.45 He then asks how much of our current genetic services model can still hold in the “-omics” era. His answer is that “the paradigm of genetic services will always apply to a small proportion of individuals and families… However, a threshold between delivery of genetic services for ‘genetic disorders’ and communication of ‘genetic information’ as a routine component of practice will have to be delineated and established.”45

I am uncomfortable about this dichotomy and think that it is critical that we—the geneticists—decide whether we should and want to be involved in genetic testing, and if so, how. If we do not decide ourselves, it will assuredly be decided for us, and in a sense, this has already begun. For example, Francis Collins has been quoted to the effect that, “We have to anticipate that every health care provider is going to become a genetic counselor in the next ten years or they’re not going to be doing a good job,”46 and Tom Caskey has written that “Specialists in genetics are needed for the education of primary care physicians, who will be responsible for established medical genetics practice. Clearly, the day-to-day genetic health providers of the future will be the primary care clinicians.”47

Furthermore, a Genetics White Paper published in June of 2003 by the British National Health Service holds the following: “GPs, practice nurses and the primary care practitioner will all be able to help their patients benefit from the new genetic knowledge and its applications. They already understand the long term, psychosocial aspects of illness. They work with individuals in the context of their families over time. They are adept at identifying health problems and making appropriate referrals. They co-ordinate the care of the affected patient. And, they are at the forefront of health promotion and prevention. [Therefore,] the roles for primary care in genetics [will include] managing patients’ concerns and expectations, identifying genetic conditions, assessing risk, managing risk, screening, [and] testing…”48

A point of view similar to that of Khoury and the British National Health Service has been expressed by Guttmacher et al.49 They assert that it would be impractical and, in fact, less desirable to expand greatly the number of genetic professionals than “to enlist all health care providers as active participants” in genetic health care delivery. Furthermore, with regard to the role of the geneticist in the provision of genetic services, they draw an analogy between genetics and infectious diseases: “In the last century, the advent of tests and therapies based in understanding infectious disease and the routine use of infectious disease knowledge and tools by primary care providers did not eliminate, but instead elevated the importance of the infectious disease specialist. Similarly, genetic-based tests and therapies and the expansion of nongenetic specialist providers’ use of genetics will not relegate genetic specialists to the dustbin of medical history, but will define their roles.”49

They then go on to define what this role will be: “Because of the specialized knowledge and experience required, the diagnosis and long-term follow-up of individuals with monogenic and chromosomal conditions will remain part of the genetic specialist’s practice… The [genetic] tests that will require the genetic specialist will be those that are unusual or highly complex in terms of interpretation or impact.”49

This approach has been strongly opposed, for a variety of reasons, by Greendale and Pyeritz.50 It would, they believe, “deliver a mortal blow to clinical genetics as a discipline” by making it economically nonviable. Furthermore, they claim that most primary care providers are neither well-enough educated in genetics nor really interested in becoming so and that, when everything is taken into account, no clinical genetics encounter can be viewed as being “routine.”

Looking at the issue from the pharmacogenetic side, Michael Malinowski, a lawyer interested in genetic testing, also does not agree that primary care physicians will be in a position to shoulder the burden alone: “In light of the towering and still rising wave of information, the all-knowing general practitioner is not a contemporary possibility. The advent of pharmacogenomics may overwhelm the medical community with an even more pervasive set of challenges… The market introduction of a multitude of innovative pharmaceuticals accompanied by genetic profiling and added decision-making … will necessitate significant changes in the delivery of care. Rather than making doctors and nurses assume the entire burden, it is likely that pharmacists and nonphysician clinicians will be assuming an expanded role in the health care process.”43

Well, pharmacists are out of my domain, but I do agree that it is unlikely that the primary care physician will be able to handle the complexities of comprehensive genetic testing and risk assessment alone, and neither will the specialist. Furthermore, I do not agree that all providers are going to become genetic counselors or, indeed, that they should. Although risk assessment might in some ways appear to be a simple black box procedure—with a DNA sample and perhaps a personal health and family history questionnaire being sent in to the laboratory, the black box, and a list of recommendations coming back by return e-mail—this is not all there will be to it. Personal risk assessment will never be an exact science: the genetics behind multiplexed susceptibility testing will be complex, and both epidemiological and genetic thinking will be involved. And, as I have already emphasized, the results will always be probabilistic, and this will necessitate skills in the communication of risk information.51

But, how do we strike the balance? If genetic testing and risk assessment develop in the future as I expect that they will, primary care physicians and specialists will undoubtedly be involved. In fact, they will have to be, as they, especially the primary care providers, will be the ones caring on a long-term basis for those who are tested. However, the primary care physicians and specialists will need to have people to turn to, and I think that needs to be us, the genetic professionals—not just as educators, but as active participants in the process. We are the ones who know genetics and how testing is done. We are comfortable with family histories and probabilities and with counseling and decision-making. We are already doing genetic testing and risk assessment.

Malinowski was worried about the capacity of the primary care provider to handle just pharmacogenetic testing.43 How much greater would the burden be if comprehensive risk assessment were also part of his or her responsibilities? The major issues for geneticists and nongeneticists alike are, of course, knowledge and time. For the nongeneticist primary care providers and specialists, the knowledge problem will, I believe, eventually be dealt with. However, for this to happen, fundamental changes will need to be made in the integration of genetics into medical school curricula and residency training programs. Despite much talk, this has not yet even begun to occur, but when it does we need to become intimately involved. As for the geneticists, we must also be prepared if we are to be involved. The interpretation of genetic tests is not always easy, and even we can and do run into problems with the current forms of testing, as recent experiences with cystic fibrosis carrier screening and testing for APC mutations attest.52,53 When it comes to more complex risk assessment, the problems will be all the greater, and we will all need to be better prepared. I think that we have already learned this from our experience with genetic testing for breast cancer. What this means to me is that, whatever our ultimate role becomes, we ourselves need better training in risk assessment, epidemiology, complex trait analysis, and pharmacogenetics. I can visualize such training as a special track in our clinical genetics residency programs, but it also will need to be part of the core elements of all training in medical genetics and genetic counseling.

I do not know how to handle the time problem for either providers or ourselves, and I shall leave the providers’ problem to them to work out. However, for our part, we geneticists are clearly not going to be doing all of the risk assessment that will be taking place, even if we wanted to and even if our numbers greatly increased. Therefore, we shall need to make ourselves available to primary care providers and specialists and to work out relationships between us and them to ensure appropriate referral and consultation. We shall need to do this proactively and not as a rear-guard action. But, it is not just time. I know that economics are an issue: thinking is not highly valued in the health care reimbursement system. But, this cannot deter us from planning for the future. And, finally, we shall need to make sure that our own house is in order. Clinical and laboratory geneticists and genetic counselors are all going to be in this together, and I do not believe that it will serve any of our interests to head off in different directions.

In closing, I want to affirm once again that although the American College of Medical Genetics is vitally interested in the matters that I have been discussing, what I have said this afternoon represents my own thoughts and opinions and does not reflect the official position of the College. And, to answer the question in the title of this address—hope or hype?—I would say the following. Although much of what has been said publicly about the future of genetic testing must be regarded as hype, I believe that testing will, with the passage of time, ultimately become a source of hope and that we will have to have a part in it.