Technological innovations are becoming increasingly ingrained in everyday life, and consumers are beginning to use consumer-grade software and hardware devices to manage their health. Smart wearables are consumer-grade, connected electronic devices that can be worn on the body as an accessory or embedded into clothing. These include smartwatches, rings and wristbands, to name a few, and they have high processing power and numerous sophisticated sensors that can glean new health insights. An estimated 20% of US residents currently own a smart wearable device and the global market is expected to grow at a compound annual growth rate of 25%, reaching US$70 billion by 2025 (refs1,2). Although the integration of this technology into the clinical workplace is still in its infancy, it has rapidly moved through the Gartner Hype Cycle for emerging technologies3 and its adoption has accelerated further since the coronavirus disease 2019 (COVID-19) pandemic and the explosive growth of telehealth4. In this Review, we summarize the basic engineering principles of common wearable sensors and discuss their applications in cardiovascular disease prevention, diagnosis and management. We also highlight several challenges hindering their widespread adoption and how to move forward as we embark on a new decade of clinical innovation. Finally, we propose a practical ‘ABCD’ guide for clinicians to handle wearables in routine clinical practice.

Engineering principles of wearable sensors

Activity sensors

Physical activity is inversely correlated with adverse cardiovascular outcomes5 and all-cause mortality and is recommended by the American Heart Association (AHA) as one of the ‘Life’s Simple 7’ lifestyle recommendations to promote heart health6. The assessment of physical activity levels has traditionally been subjective and recorded only during clinic visits, if at all. This approach is limited by a lack of sufficient detail, recall bias and a failure to objectively assess physical activity in a real-life environment. Common statements such as “I walk five times a week for 30 minutes” do not include important information such as physical activity intensity, distance and sedentary time. The subjective reporting of physical activity levels is therefore likely to become obsolete, because digital health technologies such as wearables and smartphones can objectively and accurately assess physical activity and energy expenditure through various sensors.

The triaxial accelerometer is the dominant method of activity monitoring in current wearables and measures linear acceleration along three different planes. The other major inertial sensor is the gyroscope, which measures angular motion7. The common operating principle of triaxial accelerometers is based on a seismic mass attached to a mechanical suspension system7. These devices exploit Newton’s second law: under acceleration, the seismic mass deflects in the direction opposite to the motion by an amount proportional to the acceleration, and this deflection can be measured electrically. The three most common types of accelerometer are the piezoresistive, piezoelectric and differential capacitive accelerometers, with the differential capacitive accelerometer being the most commonly used in wearables owing to its superior performance7. In differential capacitive accelerometers, the seismic mass is suspended between two electrodes and the acceleration of the seismic mass is proportional to the differential capacitance between the electrodes7. The advantages of these accelerometers include low power consumption, fast response to motion and superior accuracy compared with the piezoresistive model, which can be undermined by its temperature sensitivity7. The placement site on the human body is one of the most important factors affecting the accuracy of accelerometer measurements7. A centrally located sensor on the torso (embedded into vests, mounted by straps or attached to the skin) is best suited for detecting posture, acceleration and whole-body movements and produces the fewest errors compared with other body locations7. An ankle-mounted sensor is better suited to measuring steps and energy expenditure. However, wrist placement has taken precedence over ankle placement in commercial wearables owing to increased compliance and convenience8.
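The mass-deflection principle described above is what step-counting algorithms ultimately build on: the three acceleration channels are combined into a magnitude signal and impact peaks are counted. The following minimal sketch, with synthetic data and illustrative (not vendor) threshold values, shows the idea:

```python
import math

def count_steps(samples, fs=50.0, threshold=1.2, refractory=0.3):
    """Count steps from triaxial accelerometer samples (in units of g).

    A step is registered when the acceleration magnitude exceeds
    `threshold` g, with a refractory period (in seconds) to avoid
    double-counting one heel strike. Parameter values are illustrative.
    """
    steps = 0
    last_step_t = -refractory
    for i, (ax, ay, az) in enumerate(samples):
        t = i / fs
        magnitude = math.sqrt(ax**2 + ay**2 + az**2)
        if magnitude > threshold and (t - last_step_t) >= refractory:
            steps += 1
            last_step_t = t
    return steps

# Synthetic 4-second signal: 1 g gravity baseline with four impulse peaks.
signal = [(0.0, 0.0, 1.0)] * 200
for i in (25, 75, 125, 175):        # four simulated heel strikes
    signal[i] = (0.3, 0.2, 1.4)     # |a| ≈ 1.45 g, above threshold
print(count_steps(signal))          # → 4
```

Commercial devices add band-pass filtering and adaptive thresholds, but the magnitude-plus-peak-detection core is the same.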

Global Positioning System (GPS) receivers and barometers are also included in wearables to more accurately assess physical activity. GPS utilizes a system of 24 or more satellites that continuously emit signals identifying their precise orbital position and time on the basis of extremely accurate and stable atomic clocks9. With the use of complex equations that include signal emission time and the speed of light and account for Einstein’s relativistic concepts, the GPS receiver can determine its distance from at least four satellites. The receiver can then trilaterate its position on earth with an accuracy of up to 4.9 m (ref.9). However, GPS function is limited by satellite geometry, signal blockage, building reflections, atmospheric conditions and receiver design features9. By contrast, barometers utilize a diaphragm mounted on a vacuum chamber that compresses in proportion to pressure. The deformation of the diaphragm is converted to an electrical signal either via a capacitive model, in which the plates move closer together, or via a piezoresistive strain model, in which resistors on the diaphragm change their electrical resistance according to the extent of deformation10. The barometer is then able to determine changes in altitude, track stair count and detect falls on the basis of the principle that atmospheric pressure decreases with increasing altitude10. However, barometric measurements can be error-prone because the device might mistake natural changes in ambient temperature and pressure for a change in altitude10. Although the use of more sensors leads to better estimation of physical activity and energy expenditure, this approach can also put further strain on battery life11.
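The pressure-to-altitude relationship that barometers exploit can be illustrated with the international standard atmosphere approximation. The very small pressure drop corresponding to one flight of stairs also shows why ambient pressure drift is easily mistaken for an altitude change (values here are illustrative):

```python
def altitude_from_pressure(p_hpa, p0_hpa=1013.25):
    """Estimate altitude (m) from barometric pressure (hPa) using the
    international standard atmosphere approximation, referenced to
    sea-level pressure p0."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))

print(round(altitude_from_pressure(1013.25), 1))  # → 0.0
# One floor of stairs (~3 m) corresponds to only ~0.36 hPa near sea level,
# so a weather-driven pressure drift of similar size looks like a climb.
print(round(altitude_from_pressure(1012.89), 1))  # ≈ 3.0
```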

Heart rate and rhythm sensors

Heart rate (HR) measurements during rest and exercise can be used to predict the risk of cardiovascular disease. In healthy populations, a high resting HR has been associated with an increased risk of coronary artery disease and all-cause death12 and is also well recognized as a predictor of adverse outcomes in patients with heart failure (HF)13. An impaired HR recovery after exercise correlates with increased adverse cardiovascular events14. HR variability (HRV) has also been strongly linked to the risk of adverse cardiovascular events in healthy individuals and in patients with HF with reduced ejection fraction15.

Commercial wearables measure HR and heart rhythm through electrocardiography (ECG) or photoplethysmography (PPG) by calculating beat-to-beat time intervals and using algorithms to classify heart rhythm. ECG sensors come in various forms and are the gold standard for HR and heart rhythm measurement. Chest-strap monitors and ECG patches provide continuous monitoring of heart rhythm but are less appealing to the average consumer than other options such as smartwatches given their bulkiness, limited functions and long-term inconvenience. Some smartwatches can record a single-lead ECG as needed by placing a contralateral finger on the crown (negative electrode on the side of the watch), with the back of the watch serving as the positive electrode16. Single-lead ECGs are useful to diagnose simple and common arrhythmias such as atrial fibrillation (AF). However, these single-lead ECGs are often insufficient for the accurate diagnosis of more complex arrhythmias and other conditions such as myocardial infarction (MI) or to detect interval abnormalities unless specific manoeuvres are deployed17.

PPG measures changes in microvascular blood volume that translate into pulse waves and a tachogram recording18. An emitter sends a continuous stream of photons through the skin and a photodetector measures the variable intensity of photons reflected from the tissue18. Most wearables continuously activate the PPG sensor during exercise whereas, during rest and sleep, PPG measurements occur only intermittently to preserve battery life. PPG tachograms, especially when augmented by single-lead ECG, can also identify arrhythmias19. Nevertheless, PPG technology has limitations. The main drawback is that the sensor works best when in direct contact with the skin, which is not always the case with wearables secured with straps. Skin colour, moisture and even tattoos have also been postulated to affect PPG accuracy20, although one study showed similar device performance across a full range of skin tones21.
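The beat-to-beat intervals of an ECG or PPG tachogram are the raw material for the HR and HRV metrics discussed above. A simplified sketch of the standard time-domain calculations (real HRV pipelines also reject ectopic beats and motion artefacts, which this example omits):

```python
import math

def hr_and_hrv(rr_ms):
    """Derive mean heart rate, SDNN and RMSSD from beat-to-beat (RR)
    intervals in milliseconds. SDNN is the standard deviation of the
    intervals; RMSSD is the root mean square of successive differences."""
    n = len(rr_ms)
    mean_rr = sum(rr_ms) / n
    hr = 60000.0 / mean_rr                                # beats per minute
    sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in rr_ms) / n)
    diffs = [rr_ms[i + 1] - rr_ms[i] for i in range(n - 1)]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return hr, sdnn, rmssd

# Five illustrative RR intervals averaging 800 ms (75 beats per minute).
hr, sdnn, rmssd = hr_and_hrv([800, 810, 790, 805, 795])
print(round(hr, 1))  # → 75.0
```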

Given the variability in HR accuracy of PPG sensors across different wearable devices, a number of studies have directly compared their performance. One study that investigated the accuracy of the Apple Watch 3 (Apple, USA) and the Fitbit Charge 2 (Fitbit, USA) showed that both devices provided an acceptable PPG sensor HR accuracy (<10% mean absolute percent error) across 24 h and during various activities22, including sitting, walking, running and activities of daily living (such as chores or brushing the teeth). Over 24 h, the Apple Watch 3 had a mean difference of –1.80 beats per minute and a mean agreement of 95% compared with the gold standard ECG22; the Fitbit Charge 2 had a mean difference of 3.47 beats per minute and a mean agreement of 91% compared with the gold standard ECG22. Another study evaluated the accuracy of several wrist-worn HR monitors (Fitbit Blaze, Apple Watch, Garmin Forerunner 235 (Garmin, USA), TomTom Spark Cardio (TomTom, Netherlands)) and one chest monitor (Polar H7, Polar Electro, Finland) compared with ECG in patients with established heart disease undergoing phase II or phase III cardiac rehabilitation23. The chest strap had the best concordance with the standard ECG during all activities (Lin’s concordance coefficient rc = 0.99); the accuracy of wrist-worn devices varied substantially by the type of activity and 5% of HR measurements were substantially inaccurate23. These studies highlight that PPG HR readings from wrist-worn devices during activity should be interpreted with caution20, whereas less convenient, chest-strap monitors are more accurate. Further studies examining newer devices and their performance in diverse populations are needed to further understand the limits of PPG technology and improve its performance.
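The two agreement metrics quoted above, mean absolute percent error and Lin's concordance correlation coefficient, can be computed as follows. This is a generic sketch with hypothetical HR pairs, not data from the cited studies:

```python
def mape(reference, device):
    """Mean absolute percent error of device HR against reference ECG HR."""
    pairs = list(zip(reference, device))
    return 100.0 * sum(abs(d - r) / r for r, d in pairs) / len(pairs)

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient: penalizes both scatter
    around the best-fit line and systematic bias from the identity line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((a - mx) ** 2 for a in x) / n
    sy = sum((b - my) ** 2 for b in y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * sxy / (sx + sy + (mx - my) ** 2)

# Hypothetical simultaneous readings (beats per minute).
ecg = [60, 72, 85, 100, 120]
watch = [62, 70, 88, 97, 118]
print(round(mape(ecg, watch), 2))
print(round(lins_ccc(ecg, watch), 3))
```

Under the <10% mean absolute percent error criterion used in the Apple Watch/Fitbit comparison, this hypothetical device would pass.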

Blood pressure sensors

Hypertension is a leading cause of morbidity and mortality globally24. Incorporating accurate blood pressure (BP) measurement within consumer-grade wearables has the potential to improve screening for hypertension and identify nocturnal or exercise hypertension, both of which have been linked to worse outcomes24. The HeartGuide wristwatch (Omron, Japan), with a built-in cuff, was compared with an ambulatory BP device in office and ambulatory conditions25. For office BP measurements, patients were seated wearing the HeartGuide wristwatch and the standard BP measurement device on the same non-dominant arm and BP readings were taken twice by each device in alternating 30–60-second intervals. For ambulatory BP measurements, patients were given an ambulatory, upper-arm machine that measures BP at 30-minute intervals over 24 h and were instructed to use the HeartGuide device after each ambulatory BP measurement at least 10 times while awake. The mean difference (±s.d.) in systolic BP between the two devices was 0.8 ± 12.8 mmHg in the office-based setting and 3.2 ± 17.0 mmHg in the ambulatory setting25. These findings are consistent with previously described limitations of wrist-based cuff BP measurements26. BP can now also be measured without a cuff, increasing the feasibility and ease of monitoring BP throughout the day. This technology uses a combination of PPG and ECG measurements to estimate BP by calculating the pulse transit time, that is, the time required for the arterial pressure wave to travel from the heart to a distant vessel27. In a small study, cuff-less BP measurements were compared with ambulatory device measurements28. Mean biases between the wearable and ambulatory devices over 24 h were 0.5 mmHg (–10.1 mmHg to 11.1 mmHg) for systolic BP and 2.24 mmHg (–17.6 mmHg to 13.1 mmHg) for diastolic BP, with the mean bias widening over 7 days to –12.7 mmHg for systolic BP and –5.6 mmHg for diastolic BP.
Several other variables have been considered to measure cuff-less BP such as pulse wave velocity and propagation in combination with deep learning algorithms; however, most of the studies are limited by small sample sizes and a lack of external validation29. Although encouraging, cuff-less BP monitoring is still in its infancy and requires further scrutiny.
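Pulse-transit-time approaches are usually reduced to a calibrated regression of BP on the inverse of the transit time, because the pressure wave travels faster in a stiffer, higher-pressure artery. The following toy least-squares calibration (hypothetical values, not any vendor's algorithm) illustrates the principle, and hints at why the per-subject calibration drifts as vascular tone changes:

```python
def calibrate_ptt_model(ptt_s, sbp_mmhg):
    """Fit a didactic linear model SBP = a * (1/PTT) + b by least squares.
    A stand-in for proprietary cuff-less algorithms; real systems need
    per-subject calibration and periodic recalibration."""
    x = [1.0 / p for p in ptt_s]
    n = len(x)
    mx, my = sum(x) / n, sum(sbp_mmhg) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, sbp_mmhg)) / \
        sum((xi - mx) ** 2 for xi in x)
    b = my - a * mx
    return a, b

# Hypothetical calibration pairs: shorter transit time, higher pressure.
a, b = calibrate_ptt_model([0.25, 0.22, 0.20], [110, 125, 138])
estimate = a * (1.0 / 0.21) + b       # SBP estimate for PTT = 0.21 s
print(round(estimate))
```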

Other sensors

Biochemical sensors can measure body fluid electrolytes with the use of electrochemical transducers, offering valuable information about plasma volume status and analyte concentrations30. However, the accuracy of these sensors changes with skin temperature, skin contamination with dust, dried sweat or other substances, and hair density. One example of a biochemical sensor is the minimally invasive continuous glucose monitor, which has been clinically validated but is difficult to embed in consumer-grade wearables and mostly functions as a stand-alone product31. Non-invasive sensors of sweat and saliva might be more practical to integrate into wearables but still need to be carefully evaluated32.

Biomechanical sensors incorporated into clothing or shoes, such as ballistocardiograms, seismocardiograms and dielectric sensors, have been developed in an attempt to passively and continuously measure variables such as cardiac output, lung fluid volume and weight1, which could be beneficial in managing conditions such as HF. Other biomechanical sensors, such as flexible, tattoo-like sensors based on microfluidics, are also promising for non-invasive, haemodynamic, continuous monitoring. However, all these emerging sensors still require extensive clinical validation33.

Figure 1 summarizes common smart wearable devices, their embedded sensors and their applications in cardiovascular care. Table 1 lists common wearable products on the market, the published studies on these products and their regulatory status.

Fig. 1: Different smart wearable devices and their cardiovascular applications.
figure 1

Summary of common commercial smart wearables available on the market, where they are worn on the body, their built-in sensors, and the different types of measurements collected by each sensor and their various cardiovascular clinical applications. BP, blood pressure; CVD, cardiovascular disease; ECG, electrocardiogram; GPS, Global Positioning System; HR, heart rate; HRR, heart rate recovery; HRV, heart rate variability; PPG, photoplethysmography; SaO2, oxygen saturation.

Table 1 Number of studies and FDA status of common smart wearable devices on the market

Wearables in cardiovascular care

In this section, we discuss the literature supporting the use of wearable devices in cardiovascular patient care, reviewing the critical clinical studies on the most common cardiovascular applications published in the past 15 years (Table 2).

Table 2 Summary of cardiovascular clinical applications of wearable devices and key studies

Risk assessment and lifestyle interventions

Global cardiovascular disease risk assessment is traditionally based on clinical risk scores that estimate the 10-year risk. However, most of these scores do not capture the dynamic changes in personalized risk that closely follow lifestyle habits. The incorporation of subjective lifestyle behaviours in risk assessment has been challenging; therefore, objective data derived from wearables provide a renewed opportunity to make the assessment of the risk of cardiovascular disease more accurate, comprehensive and dynamic over a lifetime. Several studies have shown wearable-measured physical activity to have an inverse dose-dependent relationship with all-cause mortality5,34,35,36,37,38. Moderate-to-vigorous physical activity (MVPA), measured with the use of triaxial accelerometers, was associated with a lower mortality than light physical activity or sedentary behaviour in several US cohorts and in a Swedish population-based cohort34,35,36,37,38. Another study of women with a mean (s.d.) age of 72 (5.7) years showed that as few as 4,400 steps per day were significantly associated with a 41% reduction in mortality compared with 2,700 steps per day, but the benefits levelled off at 7,500 steps per day39. Of note, stepping intensity was not associated with mortality after adjusting for steps per day.

Wearable data also facilitate the application of real-time behavioural change techniques (BCTs) such as just-in-time adaptive interventions, designed to dynamically assess user needs and provide the appropriate amount and type of intervention at the relevant time. Several trials were designed to assess the benefits of wearable-guided BCTs. The mActive trial enrolled 48 outpatients from an academic cardiovascular centre40. The prevalence of hypertension, diabetes mellitus and coronary heart disease in study participants was 50%, 23% and 29%, respectively. Trial participants were randomly assigned into two groups, blinded and unblinded to their Fitbug ORB (Fitbug, USA) activity, and evaluated in two phases40. The initial phase involved the use of only the tracking device and the second phase entailed the use of smart texts with BCTs for the unblinded group. The smart texts were automated and personalized (coaching SMS messages designed by the physician investigators and informed by real-time activity from the device). The messages were divided into positive reinforcement messages, when participants were on track or had already attained their goal of 10,000 steps per day, or boosting messages to motivate participants who were not on track to reach their goal. The text messaging group increased their daily steps by 2,534 compared with the no-texting unblinded group and by 3,376 steps compared with blinded controls40. Gamification, another type of BCT, leverages competition between members of a shared activity network to encourage lifestyle modifications. In the BE FIT Framingham trial41, involving 200 participants from 94 families, participants in the gamification group were more likely to achieve step goals and had a significant increase in mean daily steps from baseline compared with the control group (1,661 versus 636 increase; adjusted difference 953, 95% CI 505–1,401)41. 
Another gamification study randomly assigned participants to either a Fitbit-only group or a Fitbit plus MapTrek (a virtual global racing platform) group to promote physical activity42. Participants with MapTrek had an average of 1,455 more steps per day and 89.6 additional active minutes per week than the Fitbit-only group42. A 2 × 2 factorial randomized trial examined whether goal setting (adaptive versus static goals) and rewards (immediate versus delayed financial incentives) had an effect on step count and MVPA with the use of a Fitbit Zip in 96 participants43. The trial showed that adaptive goals outperformed static goals and small, immediate rewards outperformed larger, delayed rewards, suggesting that adaptive goals with immediate rewards should be preferred to promote physical activity. However, some researchers have questioned the value of activity trackers for health promotion44,45. A four-arm trial from Singapore randomly assigned 800 participants to control (no tracker or incentives), Fitbit Zip, Fitbit Zip plus charity incentives or Fitbit Zip plus cash incentives44. The trial showed that a cash incentive was most effective at increasing MVPA at 6 months, but this effect was not sustained 6 months after the removal of incentives. Moreover, despite the improvement in MVPA at 12 months with activity tracking, with or without incentives, these findings did not translate into an improvement in health outcomes, such as weight and BP reduction, at 12 months. The encouragement of physical activity is a cornerstone of primary and secondary prevention; therefore, cardiovascular specialists should familiarize themselves with BCT platforms that have been proven to work and recommend them to their patients, particularly those who have failed to follow traditional counselling for the promotion of physical activity.

Frequent wearable-generated HR measurements, such as resting average HR, HR recovery and HRV, can potentially be incorporated in cardiovascular risk scores given their correlation with cardiovascular disease, as described in previous sections. Moreover, longitudinal HR data can establish what is normal for an individual and, subsequently, recognize important deviations in lifestyle earlier, before cardiovascular disease develops46. HR-guided training has also been gaining popularity47; however, no clinical trials have examined the benefits of this training.

Screening and diagnosis

Initiating hypertension screening in young adulthood is widely recommended to prevent cardiovascular disease24. Oscillometric or cuff-less wearables that accurately measure BP and are continuously worn on the wrist might be more convenient in the ambulatory setting than traditional upper-arm BP devices for the screening of hypertension, the self-monitoring of BP and the titration of antihypertensive drugs48. However, dedicated studies on the use of these wearable wrist devices for hypertension screening and management are needed. Continuous wearable BP measurements using novel sensors will potentially facilitate the measurement of BP during sleep or activities such as exercise, when oscillometric measurements are not practical. Future studies are needed to determine whether these continuous BP data have any clinical significance33. For example, continuous BP measurement could potentially detect cardiac arrest or haemodynamic shock, thus saving lives.

Atrial fibrillation and other arrhythmias

The global burden of AF and its association with stroke, HF and mortality have been well established49. Wearables might be a convenient tool to diagnose asymptomatic or symptomatic AF20. The mSToPS study50, which included both a randomized trial and a prospective cohort, evaluated the effect of immediate versus delayed continuous ECG monitoring with the use of a Zio patch (iRhythm Technologies, USA) on new AF diagnosis at 4 months and 1 year. The study showed that ECG monitoring led to a higher rate of new AF diagnosis at 4 months and 1 year and was associated with the increased initiation of anticoagulation therapy and outpatient cardiology and primary care visits in patients without previously known AF50. The Apple Heart study19, the largest remotely conducted study to date, enrolled 419,297 participants in the USA over 8 months to ascertain whether a PPG-enabled device could detect AF in individuals without a known history of the disease. Once an initial tachogram met irregularity criteria, the algorithm scanned for PPG irregularities during periods of minimal arm movement. If four subsequent irregular tachograms were confirmed, the participant was notified of an irregular pulse via a notification on the Apple Watch and study app. The positive predictive value for AF detection was 84% and 71% for the irregular notification algorithm and individual tachograms, respectively19. In a proof-of-concept study involving the use of Apple Watch-based PPG sensor data, a deep neural network (DNN) algorithm trained with heuristic pretraining showed excellent prediction of AF (C-statistic 0.97) against the gold standard 12-lead ECG51. Another study that assessed the use of the Huawei Band 2 (Huawei, China) PRO AF algorithm showed a positive predictive value of 99.6% and a negative predictive value of 96.2%52. 
KardiaBand (AliveCor, USA), now commercially discontinued, coupled with the SmartRhythm 2.0 DNN was compared with implantable cardiac monitors in a study that collected >31,000 h of continuous heart rhythm data and showed a sensitivity of >97% for detecting AF episodes lasting ≥1 h (ref.53). However, the KardiaBand experienced difficulty in interpreting up to 33.3% of recordings in another analysis53. The ongoing HEARTLINE trial54 is the first randomized trial to investigate whether detecting symptomatic and asymptomatic AF with the use of an Apple Watch 4 or a newer model (with combined PPG and ECG) improves clinical outcomes20. The trial aims to recruit 150,000 US residents aged ≥65 years and evaluate whether AF detection with a wearable device would improve clinical AF diagnosis, reduce hard outcomes and increase compliance with anticoagulation therapy.
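The irregular-rhythm notifications described above ultimately rest on flagging tachograms whose RR-interval variability exceeds a threshold before confirmation over repeated windows. A toy stand-in for these proprietary algorithms, using the coefficient of variation of the RR intervals (threshold and data are illustrative only):

```python
def tachogram_is_irregular(rr_ms, cv_threshold=0.08):
    """Flag a tachogram as irregular when the coefficient of variation
    of its RR intervals exceeds a threshold. A didactic stand-in for
    proprietary irregular-rhythm notification algorithms, which also
    require confirmation across several tachograms at low arm movement."""
    n = len(rr_ms)
    mean_rr = sum(rr_ms) / n
    variance = sum((x - mean_rr) ** 2 for x in rr_ms) / n
    cv = variance ** 0.5 / mean_rr
    return cv > cv_threshold

regular = [800, 805, 795, 802, 798]     # sinus rhythm-like intervals (ms)
af_like = [620, 910, 540, 1050, 700]    # chaotic, AF-like intervals (ms)
print(tachogram_is_irregular(regular))  # → False
print(tachogram_is_irregular(af_like))  # → True
```

Requiring several consecutive positive windows, as the Apple Heart study algorithm did, trades sensitivity for a higher positive predictive value.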

The diagnosis of symptomatic arrhythmias has also moved from burdensome strategies, such as the use of bulky Holter monitors, to more convenient wearable monitors. Wearable monitors can provide continuous, single-lead ECG monitoring, such as the Zio patch, or continuous PPG heart rhythm monitoring coupled with as-needed ECG, such as the Apple Watch. Although most of these devices are only single lead, they can match or even exceed the ability of conventional Holter monitoring to detect arrhythmias owing to their convenient usability over longer periods of time55,56. A DNN that used single-lead, ambulatory ECG data was able to classify 12 rhythm classes with a high diagnostic performance similar to, and perhaps exceeding, that of practising cardiologists57. This technology could be applied to wearables in the future. The IPED study58 was a multicentre, randomized trial that recruited 243 patients who presented to the emergency department with palpitations and presyncope without a clear aetiology. Participants in the IPED study were randomly assigned to an intervention group with KardiaMobile SL (AliveCor, USA) or to standard care. At 90 days, a symptomatic rhythm was detected in 55.6% of participants in the intervention group compared with only 9.5% in the control group. The mean time to symptomatic rhythm detection in the intervention group was 9.5 days compared with 42.9 days in the control group58. Although KardiaMobile is not considered a wearable device, wearables with single-lead ECG can be similarly used to diagnose arrhythmias in patients with palpitations or presyncope.

Other diagnostic applications

For risk factor screening, a semi-supervised learning algorithm, developed from >57,000 person-weeks of data from Fitbit, Apple Watch and Wear OS (Google, USA) devices, classified high cholesterol levels and hypertension with high accuracy (area under the curve (AUC) 0.7441 and 0.8086, respectively) using HR and step count data available from these commercial wearables59. In another study, a convolutional neural network developed with a training dataset of 35,970 12-lead ECGs and validated in an independent dataset of 52,870 ECGs classified ventricular dysfunction with good accuracy60. After a median follow-up of 3.4 years, patients with a false positive result (those without ventricular dysfunction by echocardiography but classified by the machine learning ECG algorithm as having left ventricular dysfunction) were at fourfold increased risk of developing ventricular dysfunction in the future compared with patients with a true negative result (those without ventricular dysfunction by both echocardiography and the machine learning algorithm)60. Of note, these 12-lead ECG prediction algorithms need to be validated with the use of single-lead ECGs before incorporation into wearable devices.

Cobos Gil proposed a novel method to use the Apple Watch 4 to obtain standard and precordial leads by manipulating the placement of the watch back crystal and crown on the arms, legs and chest17. Although this approach might have limitations, it could be pragmatic as a diagnostic bridge for acute coronary syndromes when a standard ECG cannot be obtained such as in remote rural areas.

The current processing power and battery life of wearables might constrain the use of sophisticated machine learning algorithms. Therefore, Sopic et al. created a unique, two-level classification system for MI61. This system included an initial screening level, which considered only a few features to detect if any ischaemic abnormalities needed further evaluation, followed by a second-level classifier, which was more computationally demanding but more accurate compared with the screening level. The algorithm was tested on SmartCardia INYU devices (SmartCardia, Switzerland) and achieved a clinically relevant accuracy of 90% for classifying MI61. These multi-layered algorithms require extensive validation before being considered for clinical use.

Management of patients

Heart failure

In patients with HF, common wearable data, such as physical activity levels, HR, HR recovery and HRV, can be used for risk stratification. For example, one study showed that administering the 6-minute walk test through a pedometer is feasible and can be used to predict HF severity and death62. HRV in patients with mildly symptomatic HF can help to identify individuals with limited benefit from cardiac resynchronization therapy63. Therefore, the use of wearable-measured HRV might be beneficial in predicting the response to cardiac resynchronization therapy in patients with HF. Emerging biomechanical sensors, such as dielectric sensors, are also promising for HF management but require careful evaluation before clinical adoption.

Several randomized trials have assessed the value of remote invasive and non-invasive telemonitoring interventions in HF, with mixed results64. One of the landmark trials on remote telemonitoring65 randomly assigned patients with NYHA class III HF to either a treatment group, in which clinicians had access to pulmonary artery pressure readings from an implanted CardioMEMS Heart Sensor (CardioMEMS, USA), or to a control group, in which clinicians had no access to this information. After a mean follow-up of 15 months, the telemonitoring group had a 39% reduction in HF-related hospitalization compared with the control group65. The TIM-HF2 trial66 randomly assigned 1,571 patients with NYHA class II–III and a left ventricular ejection fraction (LVEF) of ≤45% (or an LVEF of >45% if receiving oral diuretics) to usual care plus remote management or to usual care only. The structured remote intervention consisted of a multicomponent system comprising a three-channel ECG (PhysioMem PM 1000, GETEMED Medizin und Informationstechnik AG, Germany), a BP device, a weight scale and an oxygen saturation device. The intervention, compared with usual care, was associated with a smaller proportion of days lost from unplanned HF-related hospital admissions and had a lower all-cause mortality (HR 0.70, 95% CI 0.50–0.96) after 393 days of follow-up66. In the TEMA-HF1 trial67, 164 patients with HF and a mean LVEF of 35 ± 15% were randomly assigned to intensive follow-up facilitated by telemonitoring (daily BP, weight and HR measurements with the use of electronic devices) or usual care. The intervention group had a significantly lower all-cause mortality (absolute reduction of 12.5%) and a lower number of follow-up days lost to death, hospitalization or dialysis. However, this effect might have been due to the thorough follow-up and not necessarily to the use of remote devices67. Although certainly promising, other trials have not shown the same favourable outcomes. 
The TEN-HMS home telemonitoring trial68 used wearable and non-wearable non-invasive monitors (home BP monitor, weighing scale and single-lead ECG on a wrist band) in patients with an LVEF of <40% and a recent hospital admission for HF and showed no significant differences in days lost owing to hospitalization or death compared with nurse telephone support or usual care at 240 days, although the telemonitoring group had a lower mean duration of hospital admission by 6 days. The TIM-HF trial69 randomly assigned 710 patients with an LVEF of ≤35%, NYHA class II–III and a history of HF decompensation within the previous 2 years to telemonitoring (weight, BP and heart rhythm data transmission) or usual care and found no differences between the groups in cardiovascular and all-cause mortality or in hospitalizations at a median follow-up of 26 months. One of the largest trials on telemonitoring in HF, the BEAT-HF study70, randomly assigned 1,457 elderly patients hospitalized for HF (median LVEF of 43%) to usual care or to remote care after hospital discharge consisting of telemonitoring (daily collection of BP, HR, symptoms and weight data) combined with telephone-based coaching. The trial had low adherence rates in the remote-care group; only 61.4% and 55.4% of patients assigned to this group were >50% adherent to telephone calls and telemonitoring, respectively, within the first 30 days. Moreover, the remote care intervention did not lower the rate of hospital readmission for any cause at 30 days or 180 days after hospital discharge70. Other smaller studies have shown mixed results52,64. Trials on HF telemonitoring have considerable differences in inclusion criteria and trial design, such as the types of remote devices used and the variables collected, thereby limiting the reproducibility and generalizability of these studies. In addition, most trials had a risk of bias owing to the lack of blinding.
Larger trials with more standardized protocols that utilize validated devices and ensure higher adherence rates might shed further light on the true value of wearable sensors in HF management.

Established atrial fibrillation

We have discussed the use of wearables to diagnose symptomatic or asymptomatic AF in previous sections; here, we focus on the applications of wearables in individuals with established AF. Non-invasive, continuous heart rhythm data generated by wearables have the potential to shift the definition of AF from a categorical to a continuous and quantifiable entity, opening new frontiers in anticoagulation and rhythm control strategies. Wearables can further empower patients with AF in their care through a targeted approach to anticoagulation that coincides with episodes of AF71. A study of patients with AF who underwent catheter ablation showed that twice-daily pulse checking and the initiation of direct oral anticoagulants when a pulse irregularity was detected was a feasible practice72. The REACT.COM pilot study73 recruited patients with an implantable cardiac monitor to initiate anticoagulation if the patient developed an AF episode of ≥1 h. This study showed a 94% reduction in anticoagulation use compared with chronic anticoagulation treatment73, with a saving of approximately US$800 per patient over 3 years74. Similarly, the iCARE-AF study71 randomly assigned patients with paroxysmal AF to continuous or as-needed anticoagulation according to KardiaMobile-generated ECG recordings. The study demonstrated the feasibility of this approach, but patient compliance was a concern compared with an implanted monitor71. Utilizing wearables to tailor anticoagulation regimens holds great potential, but this strategy should be applied judiciously in select patients according to their individual compliance and risk of stroke. Turakhia et al. are currently developing a large (n = 5,000) trial of rhythm-guided treatment with direct oral anticoagulants with the use of smartwatches (M.P.T., unpublished work). The trial will be powered for superiority for major bleeding events and non-inferiority for ischaemic stroke. 
Wearables might also be used to confirm persistent AF in patients before presenting for elective cardioversion, thereby preventing unnecessary hospital visits and costs53. Finally, this technology might have value in optimizing the ventricular rates at rest and during exercise in patients with permanent AF75.

Coronary artery disease

Strategies for the secondary prevention of coronary artery disease aim to prevent recurrent events by improving the control of modifiable risk factors. The MiCORE study76 enrolled 200 patients with type 1 MI across four US hospitals who received a guideline-driven, self-management programme comprising a mobile application integrated with an Apple Watch and a Bluetooth BP cuff. Preliminary results from 164 patients showed a 43% lower likelihood of hospital readmission at 30 days among participants receiving the programme than among participants in a propensity-matched, historical comparison group (n = 695)76. Furthermore, a cost-effectiveness analysis concluded that as much as US$6,000 per patient was saved by implementing this intervention77. Randomized trials examining the benefits of these platforms have the potential to revolutionize care in patients with MI. Although not yet studied, wearables could also potentially be used to titrate β-blocker dosage in patients with chronic coronary syndrome and to screen for atrial or ventricular arrhythmias after MI, especially in patients with a reduced ejection fraction.

QT interval measurement

A prolonged QT interval can predispose patients to life-threatening arrhythmias. However, monitoring this abnormality is difficult and requires a 12-lead ECG. The single-lead ECG patch BodyGuardian (Preventice Solutions Group, USA) was compared with 24-h, 12-lead Holter monitoring in 25 patients with congenital long QT syndrome and in 20 healthy individuals; the mean difference on Bland-Altman analysis was close to zero, with a small standard error (−1.4 ± 1.8 ms)78. Another study applied a convolutional DNN, trained on >560,000 manually annotated QTc intervals from 12-lead ECGs, to analyse prospectively collected, manually annotated lead I and II data from a standard 12-lead ECG and from the KardiaMobile 6-lead ECG in 145 patients with prolonged QTc79. The mean difference between the algorithm-predicted QTc from the KardiaMobile 6-lead ECG and the annotated 12-lead ECG was 4.90 ms, but with a large s.d. of 21.58 ms. The single-lead ECGs embedded in most wearables might not be practical for accurate QTc measurement, but accurate measurement might become possible with innovative approaches such as device manipulation to record multi-lead recordings17 and advances in artificial intelligence (AI) algorithms.
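The agreement statistics quoted above can be reproduced from paired measurements in a few lines. The sketch below uses hypothetical QTc values (not data from these studies) to show how the Bland-Altman bias, the s.d. of the differences and the 95% limits of agreement are computed:

```python
from statistics import mean, stdev

def bland_altman(a, b):
    """Bland-Altman agreement statistics for paired measurements:
    returns the bias (mean difference), the sample s.d. of the
    differences and the 95% limits of agreement."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    sd = stdev(diffs)  # sample s.d. (n - 1 denominator)
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)
    return bias, sd, loa

# Hypothetical paired QTc values (ms): wearable patch vs 12-lead ECG
patch = [452, 438, 461, 470, 449, 455]
twelve_lead = [450, 440, 458, 472, 447, 452]
bias, sd, loa = bland_altman(patch, twelve_lead)
print(f"bias = {bias:.2f} ms, s.d. = {sd:.2f} ms, "
      f"95% LoA = ({loa[0]:.2f}, {loa[1]:.2f}) ms")
```

A bias near zero with narrow limits of agreement, as in the BodyGuardian comparison, indicates good agreement; a small bias with a large s.d., as in the KardiaMobile study, means individual readings can still deviate substantially from the reference.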

Hyperkalaemia

Hyperkalaemia is a common cause of life-threatening arrhythmias in patients with cardiovascular disease, especially in individuals with underlying renal insufficiency80. ECG monitoring has been touted as a method to recognize the arrhythmogenic effects of severe hyperkalaemia, such as peaked T waves, QRS widening, PR shortening and bradycardia. However, evidence is conflicting as to whether these findings are reliable, especially in patients with chronic hyperkalaemia81. A study evaluated the performance of an established DNN82 for the detection of hyperkalaemia in 10 patients undergoing haemodialysis who also underwent 4 h of ECG recording with the use of an investigational AliveCor device and concurrent blood testing83. A total of 5.4 h of data in a hyperkalaemic state and 44.1 h in other states were recorded. The sensitivity by duration of hyperkalaemia was 94% and the specificity was 74%, suggesting that this AI algorithm is a viable and non-invasive option for the screening of hyperkalaemia at home83. However, caveats such as high false-positive and false-negative rates and the need for continuous ECG monitoring curb the current enthusiasm for using available off-the-shelf wearables to diagnose hyperkalaemia. Nevertheless, advances in sensors and software algorithms over the next few years might improve our ability to detect hyperkalaemia or other electrolyte abnormalities via ECG or, even better, with biochemical sensors.
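To make the duration-based performance figures above concrete, the following sketch (our own illustrative arithmetic, not the study's analysis code) translates the reported sensitivity and specificity into hours of recording correctly and incorrectly flagged, which is where the false-positive caveat comes from:

```python
def duration_confusion(hyper_h, normo_h, sens, spec):
    """Convert sensitivity/specificity measured by recording duration
    into hours of true/false positives and negatives."""
    tp = sens * hyper_h          # hyperkalaemic hours correctly flagged
    fn = hyper_h - tp            # hyperkalaemic hours missed
    fp = (1 - spec) * normo_h    # normokalaemic hours falsely flagged
    tn = normo_h - fp            # normokalaemic hours correctly passed
    return tp, fn, fp, tn

# 5.4 h hyperkalaemic and 44.1 h normokalaemic recording,
# with the 94% sensitivity and 74% specificity reported in the study
tp, fn, fp, tn = duration_confusion(5.4, 44.1, sens=0.94, spec=0.74)
print(f"flagged: {tp:.1f} h true positive vs {fp:.1f} h false positive")
```

With these numbers, falsely flagged hours (about 11.5 h) outnumber correctly flagged ones (about 5.1 h) more than twofold, which illustrates why high false-positive rates temper the enthusiasm for off-the-shelf screening.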

Cardiac rehabilitation

Cardiac rehabilitation is a mainstay in the management of many cardiovascular conditions because it provides a structured exercise programme together with comprehensive, guideline-directed secondary prevention. Telerehabilitation programmes supported by real-time wearable data might revolutionize home-based rehabilitation and relieve the inconvenience and cost associated with centre-based programmes, which currently lead to low participation rates. A meta-analysis of 23 randomized trials, including a total of 2,890 patients undergoing cardiac rehabilitation after MI, revascularization or HF diagnosis, showed that home-based cardiac rehabilitation was as effective as centre-based rehabilitation in improving clinical and health-related quality-of-life outcomes84. However, most of these studies did not use wearable devices. A non-inferiority, randomized trial comparing REMOTE-CR — a real-time, remote telerehabilitation platform that includes the use of a chest-worn wearable sensor (BioHarness 3, Zephyr Technology, USA) — with a centre-based programme showed that REMOTE-CR was associated with less sedentary time at 24 weeks and was more cost-effective than the centre-based programme85. The REMOTE-CR platform leveraged social cognitive theory to improve self-efficacy and self-determination (essentially, patients’ belief in their own capacity to exercise more) in the REMOTE-CR group85. Furthermore, a systematic review and meta-analysis of nine trials of patients with cardiovascular disease demonstrated that wearable physical activity monitors that included exercise prescription or advice were superior to no device in improving fitness and step count in the maintenance phase of cardiac rehabilitation86. Although virtual home-based telerehabilitation is already available, additional high-quality and generalizable randomized clinical trials could usher in a future of greatly expanded access to this evidence-based intervention.

Peripheral vascular disease

In patients with peripheral vascular disease, the first-line treatment for intermittent claudication is a supervised exercise programme of gradually increasing intensity. A pilot study randomly assigned 37 patients with peripheral vascular disease to use a feedback-enabled Nike FuelBand (Nike, USA; now commercially discontinued) and a supervised exercise programme (intervention group) or to a supervised exercise programme without the band (control group)87. Instead of exercise prescriptions guided by the number of steps, ‘fuel points’ were used to estimate overall activity. These fuel points were reviewed at follow-up visits and customized exercise prescriptions were programmed into the devices of patients in the intervention group. The study showed significant increases in maximum walking distance, claudication distance and quality of life, sustained over 12 months, in the intervention group compared with the control group87. Two randomized trials of >100 patients with peripheral vascular disease, each utilizing the ankle-based accelerometer StepWatch 3 (Modus, USA), showed that wearable-guided exercise prescriptions were associated with a significant improvement in walking ability, speed and peak oxygen consumption88,89. Conversely, a wearable activity monitor combined with telephone coaching did not improve walking ability or quality of life but improved exercise frequency in a trial of 200 patients with peripheral vascular disease90. Other studies have shown variable results91. Although we need to continue to study the role of wearables in peripheral vascular disease, we must consider the use of the devices and exercise protocols that have proven benefits in this patient population.

Challenges and future directions

Understanding the drivers of and barriers to the adoption of wearable devices requires close collaboration between medicine and technology, as well as knowledge of how the technology will be adopted and accepted by different stakeholders, from clinicians to regulatory bodies to users. Today, several challenges still hinder the widespread adoption of wearable devices in clinical practice; we discuss the most important of these in this section (Table 3). Creative evaluation frameworks, pragmatic regulatory policies and simple clinician guides are needed to accelerate the integration of these devices into routine practice and propel us toward a new decade of health democratization.

Table 3 Challenges and recommendations for wearable use in clinical practice

Hardware and software accuracy and validity

Consumers, fuelled by the hype of trendy but often unproven technologies, are pushing for the rapid adoption of wearables in clinical practice but we, as a scientific community, must tread with caution. Inaccurate data is more harmful than no data. Many validation studies have questioned the accuracy of raw sensor data and software algorithms92,93,94. The heterogeneity in wearable data quality can be attributed to the lack of consensus on the development and design of digital health products and the obscurity of regulatory oversight policies. At the beginning of 2020, Coravos et al. developed a timely evaluation framework to test the accuracy and validity of connected sensor technologies, which included wearables95. In this comprehensive framework covering both the hardware and software components of these technologies, the authors outline the potential resources, evaluation criteria and target thresholds for five dimensions: verification, analytical validation and clinical validation (V3); data security and privacy; data rights and governance; utility and usability; and economic feasibility95. Meticulous scrutiny of these devices through such structured frameworks ensures that they are worthy of clinician and patient trust by the time they reach the market.

The absence of clear regulatory oversight policies governing commercial wearable devices has also led to the emergence on the market of numerous products of unknown safety and efficacy. Medical-grade sensors, such as wearables with ECG capability, are usually overseen by regulatory agencies; in the USA, the FDA, through its risk-based medical device policy, applies the most rigorous oversight and typically labels these sensors as class I or II medical devices. The FDA often clears these sensors through the accelerated 510(k) pathway if the applicant can show substantial equivalence to a US legally marketed predicate device96. Rarely do these devices receive approval for a new indication label, which requires randomized trials showing clinical efficacy (Table 1). Owing to these policies, the performance of medical-grade sensors tends to be more consistent across wearable manufacturers than that of non-medical sensors, such as activity or PPG sensors, which do not undergo FDA evaluation. Clinicians and patients must exercise caution when interpreting data from unregulated sensors. For example, abnormal PPG HR measurements are often error-prone and should be interpreted in the context of device placement, user activities and symptoms.

Software algorithms that deploy AI to analyse sensor-collected data continue to evolve in breadth and complexity, allowing the efficient processing of amounts of data that greatly exceed human cognitive capacity. Augmented AI is expected to complement traditional data analytics and clinical care to improve patient outcomes and lower costs. However, the performance of AI algorithms in medical tasks still faces considerable technical and ethical challenges, such as the ability to detect confounding variables in inadequately labelled datasets, the difficulty of integration into clinical workflows, bias and non-representative population data, the risk of aggravating health disparities, the interpretability of ‘black box’ unsupervised algorithms, and uncertain regulatory and tort environments97. In addition, the lack of transparency and reproducibility of AI computational methods and code in many published studies threatens scientific progress and discovery, which require that independent scientists be able to scrutinize and reproduce results98. The US National Academy of Medicine issued a reference document for the responsible development and maintenance of AI in the clinical setting that provides a guiding framework for all stakeholders, including AI developers and the medical community99. Furthermore, regulatory agencies such as the FDA have realized that the fast, iterative and dynamic development of medical software technologies is vastly different from that of hardware. The FDA’s new digital health innovation plan promises a pragmatic, risk-based approach to regulating software as a medical device through a pre-certification programme that aims to evaluate the software developer first rather than the product100. The FDA then leverages the International Medical Device Regulators Forum risk categorization framework to help the pre-certified company determine the appropriate premarket pathway for its product. 
Finally, post-marketing surveillance of real-world data is used to verify the software’s efficacy and safety. As this novel strategy enters its pilot phase, the FDA should continue to solicit feedback from all stakeholders, including clinicians, scientists, technologists and industry leaders, and learn from proposed comprehensive software and hardware evaluation frameworks, such as that by Coravos et al.95, which could improve this programme. Other global regulatory agencies can closely monitor the implementation of the FDA plan and improve on it to facilitate the rapid introduction of high-quality and low-risk devices into clinical care.

Meaningful use criteria and building clinical evidence

Defining meaningful use criteria for wearables requires building an extensive body of evidence that unequivocally proves benefit and rules out harm. This approach requires that we separate noise, which might be anxiety-provoking for both patients and clinicians, from actionable and meaningful clinical information. For example, does long-term, continuous ECG monitoring in asymptomatic young individuals provide any meaningful clinical information? In Table 1, we summarize the number of completed or ongoing registered studies examining the clinical applications of wearables. Small, proof-of-concept studies are abundant, but the larger, well-designed randomized trials, such as those examining wearables in peripheral vascular disease or AF, set a worthy example for others to follow. The technology industry should learn from the veteran pharmaceutical industry that continued investment in evidence generation, particularly randomized clinical trials, increases patient and clinician trust and maximizes the return on investment if a new label indication, not only clearance, is granted by a regulatory agency.

Once wearable software and hardware components have been validated through evaluation frameworks and regulatory pathways, the duty of our medical community and US federal agencies, such as the Centers for Medicare and Medicaid Services, will be to agree on best-practice clinical guidelines and meaningful use criteria for these devices. For example, the Apple Heart Study sparked a debate on how even a small false-positive rate can increase downstream medical testing and costs if screening is applied to the entire general population. Conversely, false negatives can have detrimental consequences, especially in symptomatic or high-risk patients in whom false reassurance can delay anticoagulation therapy and increase the risk of stroke101. Therefore, cardiology society guidelines have not yet endorsed the use of the Apple Watch for AF screening. However, the ongoing HEARTLINE trial54 aims to further evaluate whether the newer Apple Watch models, augmented by a single-lead ECG sensor, can improve screening accuracy and clinical outcomes, which could potentially change guideline recommendations. Finally, as several colleges and postgraduate training programmes across different health-care disciplines are developing telehealth curricula to cope with the COVID-19 pandemic102, these curricula should incorporate structured learning modules about common wearables, their current clinical applications and their potential to improve remote patient care.
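The false-positive debate above is ultimately an exercise in Bayes’ theorem: at low disease prevalence, even a highly specific test yields mostly false alarms. A minimal sketch, with hypothetical prevalence and accuracy values chosen only for illustration:

```python
def ppv(prevalence, sensitivity, specificity):
    """Positive predictive value: P(disease | positive test)."""
    true_pos = prevalence * sensitivity          # truly diseased and flagged
    false_pos = (1 - prevalence) * (1 - specificity)  # healthy but flagged
    return true_pos / (true_pos + false_pos)

# Hypothetical: AF prevalence of 0.5% in a young screening population,
# with a 98%-sensitive and 98%-specific irregular-rhythm notification
print(f"PPV = {ppv(0.005, 0.98, 0.98):.1%}")
```

Under these assumed numbers, the PPV is only about 20%, so roughly four out of five notifications would be false alarms, which is why guideline bodies weigh downstream testing and costs before endorsing population-wide screening.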

Enacting and maintaining behavioural change

Despite the promise of wearable-guided BCTs, their ability to maintain long-term behavioural change remains a concern. As mentioned previously, a trial with 800 participants questioned the value of wearables in sustaining long-term behavioural changes and improving clinical outcomes44. In another small study, 67 participants with overweight or obesity received a Fitbit One to wear and 50% of the participants were randomly assigned to also receive text message prompts45. The group receiving the text message intervention showed an increase in the number of steps, but this effect was not sustained over the 6-week study period. Heterogeneity in the design and implementation of BCTs has been a considerable challenge and efforts have been made to standardize the methods used to create these tools103. Hekler et al. proposed a framework for the use of digital BCTs, such as just-in-time adaptive interventions, to encourage and maintain health104. The merit of this framework is that it recognizes the personalized health needs of the participants. The framework highlights the importance of the ‘state-space’: essentially, the readiness to respond to an intervention according to intrinsic factors (state) and contextual factors (space). As a real-world example, a behavioural intervention built on this framework might use GPS data and the time of day to identify when a person is at work versus at home and deliver the appropriate amount of encouragement to engage in physical activity. However, digital models often require a strictly programmed definition of the desired outcome and of the type and amount of intervention proposed, and they must be continuously adjusted to tailor interventions104. Zhou et al. produced a tool to pre-empt the problem of non-adherence with the use of a Discontinuation Prediction Score105. 
The researchers utilized data from the mPED study, which enrolled physically inactive women and randomly assigned them to physical activity interventions (counselling or app-based), to create a score that predicts the risk of relapse in the following week from accelerometer data. The score had an AUC of 0.9 with high accuracy (>80%) and specificity (>80%). A simulation was then performed in which financial incentives were allocated on the basis of an individual’s predicted exercise adherence, resulting in theoretically greater adherence than with a random incentivization scheme. Therefore, the use of behavioural economics and cognitive psychology to develop novel social and financial incentives can potentially sustain good habits, especially when initiated at a very young age44. For example, some wearable companies have launched connected lifestyle coaching platforms for insurance plans, employers and health-care systems that leverage inherent reward programmes and financial incentives to promote healthy behaviours in individuals and families106.
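The reported AUC has a simple probabilistic reading: it is the chance that a randomly chosen week that precedes discontinuation receives a higher risk score than a randomly chosen week that does not. A minimal rank-based sketch with made-up scores (not data from the mPED study):

```python
def auc(pos_scores, neg_scores):
    """AUC via the Mann-Whitney formulation: the fraction of
    (positive, negative) score pairs ranked correctly, with ties
    counting as half a correct ranking."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# Made-up weekly risk scores: 'positive' weeks preceded a drop-out
positives = [0.9, 0.8, 0.7]
negatives = [0.4, 0.8, 0.2]
print(f"AUC = {auc(positives, negatives):.2f}")
```

An AUC of 0.9, as reported for the Discontinuation Prediction Score, therefore means the model ranks an at-risk week above a not-at-risk week 90% of the time, regardless of any particular decision threshold.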

Although two-thirds of millennials in one survey agreed to share their wearable data with insurance companies1, applying financial and non-financial rewards with penalties raises several ethical questions. Can consumers simply opt out of these programmes without consequences if they choose not to share their data? How are these programmes managed for individuals who cannot afford a wearable device or for those with low digital literacy? Are there clear data user agreements that govern data sharing with third parties? Future post hoc analyses of these programmes will shed light on their value in enacting long-term behavioural changes, particularly in challenging populations such as in elderly people and individuals with low digital literacy.

Hardware cost and payment models

The cost of wearables varies substantially and the cost–utility ratio is subject to the full range of health and non-health features unique to each device. A survey of 4,272 US adults reported a threefold difference in wearable use between individuals earning ≥US$75,000 and those earning <US$30,000 (ref.107). Therefore, the use of wearables in standard clinical practice might emerge as a new health disparity, with individuals unable to afford the device being subject to substandard health care. Interestingly, a study examining low-cost fitness trackers (defined as <US$105) on the market showed no correlation between the cost of the device and the number of BCTs incorporated. Encouragingly, this finding suggests that incorporating behavioural intervention strategies need not be cost-prohibitive for individuals or unprofitable for manufacturers108.

Reimbursement from health insurance will be crucial to narrow the financial disparity. In 2018, the American Medical Association introduced new CPT (Current Procedural Terminology) codes to incentivize the adoption of remote monitoring in clinical practice. These codes were included in the 2019 Centers for Medicare and Medicaid Services physician fee schedule and are reimbursed by private insurers. In light of COVID-19, insurers have even expanded remote monitoring to include acute as well as chronic conditions. Reimbursement for services, such as reviewing vitals, is currently more straightforward than other services such as reviewing physical activity levels or delivering wearable-guided lifestyle interventions. Payers, in the USA and around the world, should consider expanding reimbursement to include the review of various forms of wearable data and the prescription of wearable-guided BCTs. Providers might then be incentivized to use wearables regularly in their practice and even hand them to their patients for free, through loaner programmes109 or for a reasonable co-pay.

Data security and governance

Three major issues exist in relation to wearable data privacy, especially in an era when big data collection and analysis is sought after by different stakeholders and can yield unprecedented breakthroughs. First, how do we protect sensitive wearable data from undesired data breaches? Although de-identification is a possibility, the meta-data associated with the user can be theoretically used to re-identify them110. Given the sensitivity of wearable data, next-generation cybersecurity tools, such as blockchain111, should be considered because data breaches are expected as an eventuality rather than a mere possibility and because the data constantly move from one platform to another.

Second, how do we facilitate wearable data sharing for research and clinical purposes when desired by the patient? In the USA, the bureaucratic privacy and security requirements of HIPAA/HITECH (the Health Insurance Portability and Accountability Act of 1996/Health Information Technology for Economic and Clinical Health Act) are, to an extent, considered outdated and might need to be amended to cope with the increasing availability and heterogeneity of technologies that personalize and promote patient engagement112. Some institutions currently allow patients to waive rigid security standards for sharing health information, but this opt-out system involves additional costs and is biased towards the most engaged patients. Opt-in systems with transparent privacy policies might be better equipped to improve patient engagement while maintaining autonomy112. Ultimately, trading rigid protection for a flexible but secure system that facilitates technology adoption will depend on societal principles. We envision a future in which patients approach their health information as they approach their financial data today; but to make this transition, we must openly address patients’ privacy concerns and gain their trust.

Third, the plethora of data available from consumer-grade wearables necessitates that expectations are set between patients or consumers and their medical providers. Next-generation data user agreements and e-consents should transparently state details about data transmission medium and frequency, the frequency of data review, the personnel reviewing the data, and the preferred method of communication about actionable or urgent data113.

Data management

Patient records are generally dispersed in data silos across multiple platforms and health-care facilities that are not easily accessible, thereby exposing patients to safety concerns. Unleashing the full potential of wearables in patient care and research requires attaining semantic interoperability with other platforms such as electronic health records and patient portals (Fig. 2). Rather than being handicapped by scepticism114, contemporary policies should be devised to incentivize hospitals, industry and other stakeholders to achieve smooth interoperability without compromising patients’ privacy or their access to their own data. Big data storage and provenance (the process of tracing the origin of data and their movement between databases) also require careful attention, especially when terabytes of medical data are expected to be generated over an individual’s lifetime. Novel technologies, such as blockchain, should be considered in the implementation of these storage and provenance systems111.

Fig. 2: Smart wearable data workflow and integration in clinical practice.
figure 2

Schematic representation of how wearables can be optimally integrated in patient care. Raw and processed wearable data can provide actionable clinical information to health-care professionals that can help them with cardiovascular disease risk assessment, diagnosis and management. In addition, wearable data can be processed to develop personalized, real-time and adaptive health coaching interventions delivered directly to the patient. Finally, wearable data can be continuously stored in secure, personal health clouds or electronic health records (EHR) for advanced data processing and visualization and to share the data with third parties and research studies through transparent data user agreements.

A practical ABCD guide for clinicians

As consumers continue to push for the use of wearable data in clinical care and as health-care leaders and policymakers sift through the challenges mentioned above, clinicians, in the meantime, would benefit from a pragmatic guide to evaluate commercial wearables and integrate them into their clinical practice. We recommend a simple ABCD guide that clinicians can utilize in their daily practice (Table 4). Through this guide, clinicians can assess a device’s accuracy, clinical utility and validity, regulatory approval status, price, and any existing best-practice guidelines. Clinicians can then determine whether the device benefits their patients and clinical practice in terms of quality of care, efficiency, convenience and cost-effectiveness. After the device is deemed to have reasonable accuracy and clinical utility, clinical workflow integration needs to be established. This workflow includes integration with electronic health records, establishing logistics for patient onboarding and staff education, setting parameter alarms and expectations for the frequency of data review, and developing reimbursement or payment structures. Finally, clinicians must institute data rights and governance procedures through state-of-the-art data user agreements and privacy policies that maintain patient confidentiality and trust. This simple guide can potentially accelerate wearable integration into cardiovascular practice and usher in a new era of connected remote patient care.

Table 4 A clinician’s ABCD guide to wearable device use in clinical practice

Conclusions

A new age of consumer-driven health has arrived, with great future benefits in cardiovascular disease prevention, diagnosis and management. Currently, several challenges hinder the widespread use of wearable technologies in clinical practice. As sensor and computing technologies continue to evolve, wearables will acquire more complex functions and become an integral part of our cardiovascular practice armamentarium. These devices must be regulated through comprehensive evaluation frameworks and adequate regulatory oversight policies to ensure safety and efficacy. Moreover, a practical ABCD clinician’s guide can facilitate the integration of these devices into the clinical workplace. As COVID-19 has launched us at rocket speed into a new era of remote and decentralized patient care, this is a golden opportunity to shake off our scepticism and embrace wearable technologies in our clinical practices for the benefit of our patients.