Oral anticoagulation with warfarin requires an individualized therapeutic approach to establish appropriate dosing. Correct initial dosing is of the utmost importance given warfarin's narrow therapeutic index and potentially devastating complications. Warfarin pharmacogenomic testing is one of many elements that may be used to tailor therapy appropriately. Numerous studies have evaluated the clinical benefit of genotyping, and the results have often been equivocal. The majority of these studies have been performed in the specialized anticoagulation clinic setting.

In this issue of the journal, Caldwell and coworkers1 reported on “A randomized, controlled trial of genetic-based Coumadin initiation.” The study was conducted in an anticoagulation clinic, and the authors found that, despite improved initial therapeutic dose estimation when genotype was known, outcomes were ultimately driven by clinical management regardless of genotype knowledge. When interpreting these conclusions, however, it is important to acknowledge that there are two main oral anticoagulation management settings in the United States: specialized anticoagulation clinics and community clinics (usual care, health maintenance organization setting), whose services and outcomes are not equivalent for numerous reasons. These differences prohibit generalizing such results as universally invalidating the clinical impact of pharmacogenomic testing.

Warfarin is the primary drug prescribed for long-term oral anticoagulation therapy in the United States, with close to 4 million patients taking this medication.2 It is used for the prevention of thromboembolic complications of various disorders, including atrial fibrillation, venous thromboembolism, and valvular heart disease.3 Although warfarin is highly efficacious when titrated to the appropriate therapeutic range, titration and dosing are the most challenging and labor-intensive aspects of this therapy. Many factors influence its pharmacokinetics and pharmacodynamics, including patient age, concurrent medications, diet, comorbidities, and genetics.4 Patient comprehension/health literacy, education, receptivity to details regarding medical illness, and various demographic and psychosocial factors associated with adherence to therapy and the associated monitoring also play a significant role in achieving a desirable therapeutic range.5–7 All these factors combined, many of which are in a state of constant flux, result in significant inter- and intraindividual variability. This makes it difficult to accurately determine an individual's therapeutic dose and maintain a stable and appropriate international normalized ratio (INR). Accuracy in this regard, however, is crucial given warfarin's very narrow therapeutic index and the strong association between time spent in the therapeutic range and the risk of morbidity and mortality due to bleeding and thromboembolism.8–12

Ultimately, correct initial dosing and appropriate monitoring of the therapeutic range by INR are both key components of successful and safe oral anticoagulation therapy. The majority of studies examining treatment setting with regard to the quality and safety of oral anticoagulation management report that specialized anticoagulation clinic services are associated with better INR control and reduced rates of hospitalizations and emergency visits due to anticoagulation-related adverse events, compared with standard community care (usual care/health maintenance organization care).4,13–16 Patients in anticoagulation clinic settings are in the therapeutic INR range 63% of the time. Although this is far from ideal, it is much better than for patients in community care settings, who spend 11–12% less time in the therapeutic range.4,16 Given these data, it is not surprising that the American College of Chest Physicians recommends the use of specialized anticoagulation management services to improve the quality and safety of anticoagulation care.17 The exact number of patients receiving their anticoagulation care in the community versus the specialty clinic is unknown. However, it is safe to say that despite the data and the strong recommendation regarding care setting, a significant number of these patients currently receive their care in the usual community setting.
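As a point of reference for how such figures are derived, time in the therapeutic range is typically estimated by linearly interpolating the INR between consecutive measurements (the Rosendaal approach). The Python sketch below is a simplified illustration of that calculation; the function name, the day-by-day approximation, and the sample INR values are assumptions for demonstration only, not data from the studies cited above.

```python
from datetime import date

def time_in_therapeutic_range(measurements, low=2.0, high=3.0):
    """Approximate percent time in therapeutic range (TTR) by linear
    interpolation between consecutive INR measurements, in the spirit
    of the Rosendaal method. `measurements` is a date-sorted list of
    (date, INR) tuples."""
    in_range_days = 0
    total_days = 0
    for (d0, inr0), (d1, inr1) in zip(measurements, measurements[1:]):
        span = (d1 - d0).days
        if span <= 0:
            continue
        total_days += span
        for step in range(span):
            # Interpolated INR on each day of the interval.
            inr = inr0 + (inr1 - inr0) * (step / span)
            if low <= inr <= high:
                in_range_days += 1
    return 100.0 * in_range_days / total_days if total_days else 0.0

# Illustrative INR series over six weeks of monitoring.
inrs = [
    (date(2011, 1, 1), 1.5),
    (date(2011, 1, 8), 2.4),
    (date(2011, 1, 22), 3.4),
    (date(2011, 2, 12), 2.6),
]
print(f"TTR: {time_in_therapeutic_range(inrs):.0f}%")
```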

The discrepancy in outcomes and therapeutic success, as defined by time spent in the therapeutic INR range, is multifactorial. In general, community physicians' anticoagulation practices have been found to vary widely. This most likely reflects the fact that most community physicians are general practitioners. They do not have subspecialty training, nor are they able to focus entirely on the nuances of anticoagulation care and provide this service with a team of ancillary staff strictly dedicated to that specific task. There is a wide distribution of INR testing intervals in the community setting versus personalized, regular, close monitoring in dedicated specialized anticoagulation clinics. This results in less time in the therapeutic INR range and wider variability in anticoagulation control for community patients. Added to this is the prevalence of significant comorbidities in community patient populations, which conveys a greater overall risk of treatment complications. Previous myocardial infarction, congestive heart failure, previous stroke or transient ischemic attack, and hypertension increase the relative risk of thromboembolism 2–6 times, and previous stroke and hypertension are associated with a 2.5–2.8 times greater risk of bleeding complications from anticoagulation.18

Poor adherence to warfarin is common, with one in five doses taken incorrectly even in the setting of a dedicated anticoagulation clinic.5 One study revealed that for each 10% increase in nonadherence to warfarin, there is a 14% increase in the risk of under-anticoagulation, which increases the risk of thromboembolism in postoperative patients and causes significantly higher rates of morbidity and mortality among stroke patients.6 Nonadherence was the most commonly cited reason for a low INR. It has been observed that low INR is addressed without a sense of urgency in the community setting, but this may be related to the 2008 American College of Clinical Pharmacy guidelines, which reinforce a lack of urgency in addressing low INR values because they contain limited guidance about how to address a low INR despite offering detailed instructions about how to deal with an elevated INR.3

The literature also shows that a significant proportion of patients with unstable INR values are poorly informed about the reasons for and the clinical importance of oral anticoagulation therapy for their health. Most of these patients also had insufficient knowledge of the mechanism of action of oral anticoagulants, the rules of treatment monitoring, and the risk of thrombotic and bleeding complications, and many of them cited this as the reason for poor compliance.7 In general, community practices, when compared with specialized anticoagulation clinics, conceivably have fewer resources and specially trained personnel and less time to accommodate patient diversity and variable levels of health literacy. These limitations may affect patient attitudes and worsen anticoagulant therapy outcomes. Importantly, it has been shown that educating patients about warfarin and oral anticoagulation therapy is effective in reducing the risk of major bleeding in older patients, and it is reasonable to believe that having the time, resources, and personnel available to educate all anticoagulated patients in this regard can result in better patient outcomes regardless of age.19 Dedicated anticoagulation clinics are poised to offer this level of education to their patients, and given its impact on outcomes, this is another factor that explains the differences between specialized care and general community care.

The time required of patients for chronic oral anticoagulation is also considerable.2 The frequency of monitoring can discourage many patients, especially when initiating therapy. Unstable INR values require more frequent monitoring, and identifying an appropriate initial dose to minimize instability and the associated increase in monitoring may help prevent a negative experience for the patient. It is reasonable to believe this will improve compliance. Anticoagulation clinics have more time to spend with each patient to take a complete history and obtain all the pertinent information necessary for dosing algorithms.

The time required of patients for chronic anticoagulation therapy relates not only to monitoring but also to food preparation. Eating a balanced diet with appropriate levels of vitamin K requires planning, effort, and time. It is a challenge for patients to eat a balanced diet at baseline, and this challenge intensifies with oral anticoagulation. All current dosing algorithms assume a balanced diet. A fluctuating dietary intake of vitamin K is a well-known cause of unstable anticoagulant control in patients on warfarin. There is a tenuous balance between vitamin K ingestion and the amount of warfarin needed to produce a consistent INR. The INR is more sensitive to changes in vitamin K intake in patients with low vitamin K status than in those with normal or high status, and dietary vitamin K intake in unstable patients is considerably lower than in stable patients.20,21

Despite this knowledge, the advice given to patients varies. Some patients are told to avoid foods high in vitamin K, whereas others are told to maintain a healthy diet containing sufficient fruits and vegetables.20,21 The principal sources of dietary vitamin K in the United States are also debated, as are the absorption rates and bioactivity of different forms of vitamin K. Some authors state that the primary food source of vitamin K (subtype K1, phylloquinone) is green leafy vegetables.22,23 Although this is correct in the classic sense, many Americans eat mostly processed food and extremely limited fruits and vegetables. Studies have revealed that even though meat, dairy, and fast food in the United States are not adequate individual sources of vitamin K, they are often consumed in high quantities and may therefore contribute importantly to total vitamin K intake.24,25 Processed foods, especially snacks and fast food, contain partially hydrogenated oils, and these foods also contribute to vitamin K intake. They contain dihydrophylloquinone, a form generated during hydrogenation of plant oils.24–26 The physiologic importance of this form depends on its biologic activity, which is currently unknown.25,26 Nondietary factors, such as age, gender, triglyceride levels, and apoE genotype, also affect vitamin K metabolism.25 Vitamin K and dietary counseling for anticoagulated patients is a complex issue beyond the scope of what is taught in most medical schools and residencies. It may be through the multidisciplinary teams available at specialized anticoagulation clinics that “effective” balanced diet counseling becomes a reality.

There is a paucity of information regarding the demographics of patients who obtain anticoagulation care from specialized clinics versus community care. Furthermore, the demographics of community care settings vary markedly by location (urban, suburban, and rural). It is possible that community clinics care for a greater proportion of indigent, poor, uneducated, and non-English-speaking patients. It is also possible that community clinics have a greater number of patients with significant mental illness and substance abuse disorders. This compounds the problems community clinics face when administering oral anticoagulation. It is much more difficult to counsel this patient population regarding diet, and it is reasonable to believe that continuity of care, medication adherence, and regular follow-up monitoring are problematic as well. More studies of these demographic variables would be useful for understanding the challenges faced by community care clinics that administer oral anticoagulation care.

The anticoagulation clinic setting, because of its organization and consistent record keeping, facilitates studies examining the effects of genotype on anticoagulation outcomes. It is not surprising that most studies of the effects of genotype on warfarin response have been performed in the specialized anticoagulation clinic setting, and the results have been equivocal. This may be due to the relative effectiveness of a system with adequate time, resources, and specially trained ancillary staff available to take thorough histories, perform thorough examinations, and educate patients, resulting in appropriate initial dosing with less INR instability. Furthermore, because of the dedicated and relatively tailored monitoring offered at these clinics, genotype information may not have a significant clinical impact on outcomes in this setting.

It is unreasonable, however, to assume that all patients requiring anticoagulation therapy will have access to and be seen in dedicated anticoagulation clinics. It is also unreasonable to dismiss warfarin pharmacogenomic testing in community practice. The inadequacies of current warfarin dosing regimens highlight the need for a more individualized approach. Overanticoagulation occurs mainly in the early stages of treatment, when the optimal dose is being established. The risk of bleeding during the first month of therapy is 10 times the risk at 12 months.27 It is, therefore, of the utmost importance to get the dosing right as soon as possible, and this ideally begins with administering a correct initial dose. This is especially the case in general practice settings, where follow-up monitoring varies and is not as regimented and tailored as that offered at anticoagulation clinics.

In community practice, genotyping may substitute for the time, resources, and focused personnel that constitute the individualized approach used in specialized clinics. Genotype information may help level the playing field, allowing community practitioners to get the initial dosing right the first time, thereby avoiding INR instability and its associated complications, which carry significant morbidity and mortality. Community patients will benefit when the dosing is correct the first time, and it is reasonable to hypothesize that this will improve the quality of the anticoagulation experience and result in better patient adherence to ongoing therapy and monitoring. Further work is necessary to determine whether it is possible to incorporate the methods of anticoagulation clinics into the community setting, but until then, warfarin pharmacogenomic testing may be a key tool that community physicians can use to individualize warfarin therapy.
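To make concrete what genotype-informed initial dosing entails, the sketch below mimics the general structure of published pharmacogenetic dosing algorithms, which model the square root of the weekly maintenance dose as a function of clinical covariates plus CYP2C9 and VKORC1 genotype. The coefficients and genotype adjustments shown are illustrative placeholders only, not validated values; an actual prescriber would rely on a published, validated algorithm or dosing tool rather than this sketch.

```python
# Placeholder genotype adjustments (NOT validated clinical values).
CYP2C9_ADJUST = {
    "*1/*1": 0.0, "*1/*2": -0.5, "*1/*3": -0.9,
    "*2/*2": -1.0, "*2/*3": -1.4, "*3/*3": -1.9,
}
VKORC1_ADJUST = {  # VKORC1 -1639 G>A genotype
    "GG": 0.0, "GA": -0.8, "AA": -1.6,
}

def estimate_weekly_dose_mg(age_years, height_cm, weight_kg,
                            cyp2c9, vkorc1, on_amiodarone=False):
    """Return an illustrative initial weekly warfarin dose estimate (mg).

    Models sqrt(weekly dose) as a linear combination of clinical and
    genetic predictors, then squares the result -- the same general
    form used by published pharmacogenetic algorithms. All coefficients
    here are placeholders for demonstration purposes.
    """
    sqrt_dose = (
        5.6                          # intercept (placeholder)
        - 0.26 * (age_years // 10)   # older patients generally need less
        + 0.009 * height_cm
        + 0.013 * weight_kg
        + CYP2C9_ADJUST[cyp2c9]      # reduced metabolism -> lower dose
        + VKORC1_ADJUST[vkorc1]      # increased sensitivity -> lower dose
        + (-0.6 if on_amiodarone else 0.0)  # interacting drug
    )
    return round(max(sqrt_dose, 0.0) ** 2, 1)

# Example: a 70-year-old, 170 cm, 75 kg patient carrying CYP2C9 *1/*3 and
# VKORC1 GA receives a markedly lower starting estimate than a wild-type
# patient with the same clinical profile.
print(estimate_weekly_dose_mg(70, 170, 75, "*1/*3", "GA"))   # ~21 mg/week
print(estimate_weekly_dose_mg(70, 170, 75, "*1/*1", "GG"))   # ~40 mg/week
```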

The effects of genotype knowledge on initial control, time in therapeutic range, and outcomes in the community care setting have not been thoroughly evaluated. More studies in this setting are necessary. However, community practices vary, making this a difficult task. Each general practitioner also cares for relatively few anticoagulation patients compared with the numbers seen by practitioners in anticoagulation clinics. This presents an obstacle to obtaining the subject numbers necessary to adequately power a study evaluating the effect of warfarin pharmacogenomic testing in the general community practice setting. Furthermore, interest in conducting these studies is lacking in the community, and securing adequate funding for this work in the current economy is unlikely.

Individuals may point to studies performed in specialized clinic settings to support claims that warfarin pharmacogenomic testing is equivocal and unnecessary. However, it is likely that this group of studies is not truly reflective of the actual population of patients who may benefit from genotyping. The medical community should consider this issue when attempting to generalize results.

This is the case in the study by Caldwell and coworkers.1 Although they found that genotype-informed warfarin dosing improved estimation of the therapeutic dose, it ultimately did not have an impact on patient anticoagulation outcomes. This may be because the study was underpowered. It had 80% power to detect a 15% effect size, so a smaller but real difference could have gone undetected. More important, however, is the issue of the generalizability of these results. There are numerous differences in clinical management between anticoagulation clinics and community clinics. Caldwell and coworkers indicate that their current clinical management practices collectively exerted a strong impact on patient anticoagulation outcomes despite genotype-informed dosing. This may reflect the resources, specially trained support staff, and consistent and relatively tailored follow-up monitoring that constitute anticoagulation clinic care. These elements are not typically available to or a part of community anticoagulation care. Follow-up monitoring in community care is variable and, unlike monitoring in specialized centers, is less effective in establishing stable, appropriate oral anticoagulation; it may therefore result in worse clinical outcomes.
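To illustrate why power matters here, the sketch below applies a standard normal-approximation sample-size formula for comparing two proportions. The baseline event rate and effect sizes are hypothetical (they are not the Caldwell study's actual endpoints), chosen only to show that detecting an effect roughly half as large as the one a study was powered for requires about four times as many subjects per arm.

```python
from scipy.stats import norm

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Approximate subjects per arm needed to detect a difference between
    two proportions with a two-sided test (normal approximation)."""
    z_a = norm.ppf(1 - alpha / 2)   # critical value for the significance level
    z_b = norm.ppf(power)           # value corresponding to the desired power
    p_bar = (p1 + p2) / 2
    numerator = (z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_b * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return int(numerator / (p1 - p2) ** 2) + 1

# Hypothetical 50% baseline rate for some anticoagulation outcome:
print(n_per_group(0.50, 0.65))    # 15-point difference -> ~170 per arm
print(n_per_group(0.50, 0.575))   # 7.5-point difference -> ~690 per arm
```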

It is, therefore, invalid to generalize results from anticoagulation clinic settings to community practice settings. Despite the many obstacles to performing the studies needed to gain more insight into this issue, it is plausible that warfarin pharmacogenomic testing may yet be shown to have a clinically relevant, meaningful role in general practice oral anticoagulation settings. Larger trials, such as the National Heart, Lung, and Blood Institute's COAG trial, are necessary.28 Unfortunately, that trial will also suffer from the limitation that the pharmacogenomics group is being compared with expert anticoagulation clinic care at most of the clinical trial sites rather than with general practice.

It is of the utmost importance to get initial warfarin dosing right the first time in community practice settings, because most data show that community-based clinical management and follow-up monitoring are variable and less effective at ensuring safe clinical outcomes. Genotype-informed dosing may improve the likelihood of accurate initial dosing, and at the current time it would be wise not to dismiss warfarin pharmacogenomic testing entirely, especially in the community care setting.