Introduction

There is increasing awareness that remote health monitoring can be beneficial during a clinical trial1,2,3 and for routine health care4,5,6. Smartphone applications (“apps”) that help patients track symptoms and general health through electronic patient-reported outcomes (ePROs) are potentially an effective and low-burden method to collect trial and healthcare data outside the clinical setting7,8,9,10,11,12. Longitudinal data capture via a smartphone app may be a powerful approach to monitor patient progress and/or adverse events in clinical trials1,2 or usual care13,14 without the need for frequent in-person visits to a clinic.

In this relatively new area, little has been documented about what best encourages participants to use mobile app interventions consistently15,16,17. High-quality data depend on users regularly entering accurate information; incomplete questionnaires can compromise the quality of ePRO data18,19.

The term “engagement” has been used to discuss the extent of mobile app utilization by users20. Although engagement has not been consistently measured15, it is primarily conceptualized with three components: behavioral, cognitive, and affective21. The behavioral component—the focus of this article—is generally measured quantitatively through physical interaction of the user with the mobile health system, including metadata captured by the app, adherence with app activities, number of log-ins, time spent on each activity, and number of features accessed and screens viewed22,23. Semi-structured interviews were completed in previously published usability studies of each app; these interviews addressed some of the cognitive and affective components of engagement24,25.

This project developed and tested DigiBioMarC24, a smartphone app for individuals with cancer (“patients”), and TOGETHERCare25, a smartphone app for informal caregivers, to collect health, activity, and psychosocial data about patients and caregivers frequently and easily.

Cancer treatment may result in adverse events that require dose reductions or other interventions, making frequent and timely reporting of patient symptoms important26. There is a large body of literature about the use of mobile health apps by patients to report their health status27,28, but to date no other studies of concurrent reporting by cancer patient–caregiver dyads have been published in the peer-reviewed literature.

The aim of this article is to explore behavioral engagement with apps that collect ePROs by reporting on the adherence of both patients and caregivers: whether cancer patients and caregivers completed app activities as planned, how much time they spent on expected activities, and associations with participants’ characteristics, to explore app burden and potential obstacles to engagement.

Methods

Study design and sample

This is a single-arm, prospective study in which adult cancer patients within the Kaiser Permanente Northern California (KPNC) system and their informal caregivers were asked to use the DigiBioMarC (for cancer patients) or TOGETHERCare (for caregivers) smartphone apps for 28 days after enrollment (the study period). KPNC is an integrated healthcare delivery system serving over 4.5 million members representative of the general Northern California population29. The KPNC Institutional Review Board reviewed and approved the study protocol.

Cancer patients scheduled to receive intravenous chemotherapy or immunotherapy during the study period were identified using the KPNC electronic health record system and were recruited by email invitations from October 2020 to March 2021. Recruitment emails were sent to 2155 potential patients. Of the 247 respondents (11% of recruitment emails), 166 (67% of respondents) were determined to be ineligible (no iPhone (n = 33), no eligible/willing caregiver (n = 24), no scheduled IV therapy during the study (n = 13), not English speaking (n = 5), physician indicated a contraindication to participation (n = 42), invalid emails (n = 41), deceased (n = 2), ineligible unspecified (n = 6)), 20 declined to participate before eligibility could be confirmed, and 7 declined to participate after learning more details about the study. No background information was tabulated for those who did not consent to participate in the study, as our IRB-approved protocol restricted us to EHR information only for individuals who provided informed consent and HIPAA authorization.

Recruitment and all methods were performed in accordance with the relevant guidelines, and consent was completed remotely through a video or phone call. During this call, the protocol coordinator assisted interested and eligible participants in downloading the app and instructed them on how to review and sign the informed consent document in the app. No other training was provided in how to navigate the applications. Participants were required to provide written informed consent within the application prior to proceeding with the study. Both patients and caregivers who continued responding to activities in the app through the study period and completed the two semi-structured virtual interviews received a $100 gift certificate for their time and effort. Based on the IRB recommendations, data collected through the app were not relayed to the participating patients’ or caregivers’ clinical team. The exception was that the research associate was to be automatically alerted to survey scores for mental health measures from specific surveys if they exceeded a predetermined threshold. If an alert was received, the research associate was to call the participant, ask if they were in immediate danger, and, if so, call 911 to request that the police conduct a “health and welfare check.” If the participant was not in immediate danger but expressed distress, the research assistant was to ask if they wanted a Kaiser clinician to talk with them.

Eligibility requirements for cancer patients included being 18 years of age or older, receiving intravenous chemotherapy or immunotherapy treatments at a KPNC medical facility at the time of the study, having an iPhone 6 or higher, and the ability to read and communicate in English24. Patients were also required to have an informal caregiver to be eligible to participate in the study. The informal caregivers were identified by the patient and screened for eligibility during recruitment. Caregivers did not need to be KPNC members but did need to be 18 years of age or older, have an iPhone 6 or higher, and be able to read and communicate in English. There was no requirement for the length of time the caregiver had lived with the patient. Cancer patients with severe mental illness or insufficient cognition to consent (as determined by their physician) were not eligible.

App activities and schedule

As previously described in greater detail24,25, the DigiBioMarC and TOGETHERCare apps were composed of “activities,” defined as surveys and physical assessments. Surveys assessed the cancer patients’ symptoms and functioning, social and financial resources, wellbeing, and stress, among other measures, from the perspective of the cancer patient (DigiBioMarC) and their caregiver (TOGETHERCare). Physical assessments, gait speed30 and “sit to stand”31 tasks, were to be completed by all patient participants but were not asked of caregivers.

Each time an app activity was made available to an individual participant, it counted as an “instance” of that activity. The number of instances varied for different activities. The study protocol expected, but did not require, participants to complete each activity within 24 h. Some activities were expected to be completed only at the beginning of the study period (i.e., one instance), some at the beginning and end (i.e., two instances), and some every 7 or 14 days (i.e., three to five instances). The intervals between instances of an activity were standardized such that after a user completed a planned activity, it could not be completed again before the minimum expected number of days had elapsed, depending on the planned interval. The activity remained open until it was completed. The frequency and timing of the app activities were intended to capture relevant health and wellbeing data pertaining to cancer patients. Research staff did not prompt participants by phone, email, or text to complete open activities. Instead, notifications or reminders were sent within the app to caregiver participants three times per week, and to patients every two to three days, midday on a preset schedule. These reminders were not linked to specific activities or whether the participant had completed them but were instead general reminders to complete any open activities.
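The scheduling rule above can be sketched in a few lines. This is a minimal illustration, not the apps' actual implementation; the helper name and dates are hypothetical.

```python
from datetime import datetime, timedelta

def next_available(last_completed: datetime, min_interval_days: int) -> datetime:
    """Earliest time a repeating activity instance reopens.

    Hypothetical helper: after a user completes a planned activity,
    it cannot be completed again before the minimum expected number
    of days has elapsed. Once reopened, it stays open until completed.
    """
    return last_completed + timedelta(days=min_interval_days)

# Example: a weekly activity completed on study day 3 reopens no
# earlier than study day 10.
enrolled = datetime(2020, 11, 1, 12, 0)
completed = enrolled + timedelta(days=3)
reopens = next_available(completed, min_interval_days=7)
```

Because the interval is measured from completion rather than from a fixed calendar date, a late completion shifts all subsequent instances later, which is consistent with activities "remaining open" rather than expiring.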

While the active study duration was 28 days, for the purposes of this examination, we allowed activities to be completed within a 33-day window past enrollment to allow a few extra days for users to become familiar with the apps and study expectations.

Measures

Adherence. Using the timestamp data (app operational metadata), the study assessed app adherence as: (1) completion of all expected app activities within 33 days; (2) completion of individual app activities; (3) completion of expected app activities within 24 and 48 h; and (4) average time to complete individual app activities at the beginning and end of the study.

Based on the expected instances, we calculated the denominators for the percentages of app activities completed. The numerators were the total completions (across all study participants within each group) over the entire active study period.

We measured the app time commitment as the minutes to complete each activity based on metadata, changes in completion time over the active study period, and total time at the beginning and end of the study period, when the greatest number of activities were expected.
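The adherence and time-commitment measures above can be computed directly from timestamp metadata. The sketch below is illustrative only, assuming hypothetical per-instance records with "opened", "started", and "completed" timestamps; it is not the study's analysis code.

```python
from datetime import datetime, timedelta

def adherence_metrics(instances):
    """Summarize adherence from app timestamp metadata.

    `instances` is a list of dicts with hypothetical keys:
      "opened"    - when the activity instance became available
      "started"   - when the user began it (None if never completed)
      "completed" - when the user finished it (None if never completed)
    """
    expected = len(instances)
    done = [i for i in instances if i["completed"] is not None]
    # Completion within a window is measured from availability, not start.
    within_24h = sum(1 for i in done
                     if i["completed"] - i["opened"] <= timedelta(hours=24))
    within_48h = sum(1 for i in done
                     if i["completed"] - i["opened"] <= timedelta(hours=48))
    # Time commitment: minutes from starting an activity to finishing it.
    minutes = [(i["completed"] - i["started"]).total_seconds() / 60
               for i in done]
    return {
        "pct_completed": 100 * len(done) / expected,
        "pct_within_24h": 100 * within_24h / expected,
        "pct_within_48h": 100 * within_48h / expected,
        "avg_minutes": sum(minutes) / len(minutes) if minutes else None,
    }
```

Pooling such per-instance records across all participants in a group yields the group-level numerators and denominators described above.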

Potential obstacles to app adherence. To provide context for the participants’ adherence and potential app burden, we collected information from the patients’ KPNC medical records about the patients’ cancer status, namely, type, stage, and presentation of cancer at study enrollment. In the apps themselves, we collected information about the participants’ social demographics (age, gender, ethnicity, race, educational attainment, employment status) and the caregivers’ perceived amount of time that they provided care at the beginning and end of the study period. We conducted semi-structured interviews with participants on approximately the seventh day of app use and again after participants completed the final app surveys, using a secure videoconferencing program. The results of these interviews have been previously reported24,25.

Analysis

The analysis is descriptive and exploratory. First, we summarized the baseline characteristics of the study participants. Second, we calculated summary statistics for the adherence measures and the average minutes spent on each activity. Third, we performed two-sided Fisher exact tests to assess possible associations between participant characteristics and completion of 80% or more of the app activities within the active study period.
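The Fisher exact test used in the third step operates on a 2 × 2 table, here cross-tabulating a binary participant characteristic against whether the participant completed ≥ 80% of activities. A standard implementation is available as `scipy.stats.fisher_exact`; the stdlib-only sketch below shows the underlying computation, with illustrative cell counts that are not study data.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact p-value for a 2x2 table [[a, b], [c, d]],
    computed from the hypergeometric distribution.

    Example layout (illustrative): rows = characteristic present/absent,
    columns = completed >= 80% of activities yes/no.
    """
    r1, c1, n = a + b, a + c, a + b + c + d
    denom = comb(n, c1)

    def prob(k):
        # P(top-left cell == k) with all row/column margins fixed.
        return comb(r1, k) * comb(n - r1, c1 - k) / denom

    p_obs = prob(a)
    lo, hi = max(0, c1 - (n - r1)), min(r1, c1)
    # Sum probabilities of all tables at least as extreme as the observed
    # one (small tolerance guards against floating-point ties).
    return sum(prob(k) for k in range(lo, hi + 1)
               if prob(k) <= p_obs * (1 + 1e-9))
```

For the symmetric table [[3, 1], [1, 3]] this returns 34/70 ≈ 0.486; exact tests like this are appropriate for the small cell counts that arise with 50 dyads.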

Results

Study participant baseline characteristics

Fifty of the 54 enrolled patient–caregiver dyads completed the study (three patients stopped participating due to serious declines in health, and one patient did not start the planned treatment, making them ineligible). Seventy-six percent of the patients had advanced cancer (stage 3 or 4), and 28% had a relapsed diagnosis as of their enrollment date. The caregivers in the enrolled dyads were predominantly male (62%) and spouses/partners of the cancer patients (78%), who were predominantly female (78%) (Table 1). Eighty percent of the patients were 50 years of age or older, with about a quarter of them ≥ 70 years. The caregiver cohort was overall younger, with 50% aged 50–69 years. Close to two-thirds of the participants (combined patients and caregivers) were white, and over 80% identified as non-Hispanic or Latino. Fifty-four percent of the caregivers and 70% of the patients had an associate degree or higher. Almost all caregivers (n = 48) and cancer patients (n = 49) had “adequate” cancer health literacy as indicated by the 6-Item Cancer Health Literacy Test (CHLT-6). Unemployment at the start of the study was higher in patients than in caregivers (68% vs 42%, respectively).

Table 1 Characteristics of study participants at baseline (N = 50 dyads).

Overall completion of expected app activities

The mean number of activities completed during the active study period was 24.2 out of 28 (standard deviation [SD] = 4.4) for patients (86% completion rate) and 35.6 out of 42 (SD = 7.1) for caregivers (84% completion rate) (Supplement 1). Seventy-six percent of patients and 66% of caregivers completed 80% or more of the expected activities. Greater activity completion among caregivers was associated with greater activity completion among the corresponding patients (correlation coefficient = 0.5, p ≤ 0.001). No participant survey responses met the threshold to trigger a mental health email notification to the research assistant.

Completion of individual app activities

All study participants completed the “About You” surveys (Table 2). For the caregivers’ repeated surveys, the “Every Other Day” survey was least completed (77.1% of the expected surveys were completed) and the “Quick Check-in” (PHQ4) survey was most completed (93.0% were completed). For the cancer patients’ repeated activities, the “Gait Speed” assessment was the least completed (79.6% completion rate) and, again, the “Quick Check-in” survey was the most completed (95.0% completion rate). For both caregivers and patients, the completion rate of more frequent activities was lower compared with less frequent ones.

Table 2 Completion of TOGETHERCare and DigiBioMarC app activities during the active study perioda.

Ninety percent or more of the “About You” surveys were completed within 24 h of becoming available for both patients and caregivers (Table 2). Lower percentages of the weekly activities were completed within 24 h. For example, 60% of the “Short Symptom Reports” (PRO-CTCAE) surveys were completed by caregivers, and 52% of the “Gait Speed” and “Sit and Stand” physical assessments were completed by cancer patients within 24 h. Expanding the completion window to 48 h increased the percentage completed by a range of 2.2% (caregivers’ “About You”) to 28.1% (caregivers’ “COVID-19 Patient Impact”). The high completion rate of the “Every Other Day” survey by caregivers at the onset of the study began to decline starting with the fourth of 14 expected surveys. By the ninth survey, only 76% of the caregivers completed it (Supplement 2).

Time to complete individual app activities

The average time it took participants to complete the app activities is provided in Table 3. The average time for the cancer patients to complete app activities at baseline ranged from 0.8 min (SD = 0.5) for the “Fast 4” survey to 4.9 min (SD = 1.8) for the “COVID-19 Your Input” survey. Except for the caregivers’ time to complete “App Feedback,” the time to complete app activities declined slightly over time. The time of day when app activities were completed is included in Supplement 3.

Table 3 Minutes to complete activities once started, by instance.

Completion of app activities by background characteristics

Although none of the caregivers’ or patients’ characteristics, including age, gender, and race, were significantly associated with completing 80% or more of the app activities during the active study period (Supplement 4), we observed some trends. Adherence was overall high across all age categories among patients and reached 90% in patients ≥ 70 years of age. Notably, while the lowest adherence rate was seen among patients aged 50–69, caregivers in this age group were the most compliant (75%). Higher levels of education (associate degree or higher) were marginally (p = 0.0592) associated with lower completion of app activities during the active study period.

Caregiving time commitment was also marginally associated with study adherence at both baseline and end of study. When collapsing the caregiving time reported by caregivers into < 20 h versus ≥ 20 h per week, a different trend emerged; at baseline, the rate of adherence by caregivers who committed < 20 h of caregiving was lower than for those who committed ≥ 20 h (59% vs 73%, respectively). This trend became statistically significant at the end of the study, with 92% adherence among caregivers who committed ≥ 20 h caring for their patients compared with only 56% among caregivers committing < 20 h of care (p = 0.034, data not shown).

Adherence among cancer patients with late-stage disease (3 and 4) was higher than adherence among patients diagnosed with early-stage disease (81.6% vs. 58.3%, respectively) (Supplement 4). Inversely, adherence was higher among caregivers of patients with early-stage disease (83.3%) compared to caregivers of patients with late-stage disease (60.5%).

Discussion

In summary, timestamp metadata analysis showed that both adult cancer patients and their informal caregivers were highly adherent over the one-month study duration, with app activity completion at 86% for cancer patients and 84% for caregivers. The percentage of individual activities completed ranged from 91% to 100% for caregivers, and from 79.6% to 100% for patients. The activities least completed by patients were the active physical assessments, gait speed and the sit to stand activity. We suspect these were difficult for patients to complete when they were not feeling well; cancer patients often reported fatigue (94% at the beginning and 80% at the end of the study). The surveys least completed by caregivers over time were the “Every Other Day” surveys. We believe these surveys might have been too frequent, leading to caregiver burnout on completing them, or the caregivers may have seen reporting on their own status as less relevant than reporting on the cancer patient.

Patients and caregivers showed a comparable overall adherence rate (> 85%) for the weekly PRO-CTCAE and PROMIS Physical Function surveys, suggesting the importance of remote monitoring of patients’ symptoms and side effects to patients and caregivers alike. Adherence increased when we allowed 48 h between when the activity appeared in the app and completion, which suggests that whenever possible, it is important to provide a reasonable reporting time window to allow for activity completion. The average time it took caregivers to complete each app activity ranged from less than a minute to 5.8 min; the average time for patients ranged from less than a minute to 4.9 min. Our results do not support the perception that older patients are less compliant with digital health technologies, as the highest (90%) adherence in cancer patients was among the oldest group (≥ 70 years) of participants. This is consistent with a high level of adherence to using physical trackers found in elderly patients with atrial fibrillation32.

Interpreting our findings in light of the existing literature, cancer patient adherence with our app-based approach to remote treatment monitoring falls within the range of other studies of electronic clinical outcomes reporting, including 64.7% average patient adherence for weekly reporting during chemotherapy33, 74% median adherence for reporting three times per week during treatment with immune checkpoint inhibitors7, and 83% adherence for daily reporting in patients with breast or prostate cancer34. The United Kingdom’s eRAPID program (electronic self-reporting of adverse events for patients undergoing cancer treatment) found 72.2% patient adherence to adverse-event symptom reporting13.

This study enhances our knowledge about app engagement by demonstrating that metadata can be used to inform clinical teams about the engagement behaviors of their study participants in remote data capture and monitoring. This rapid assessment of study activities does not require accessing and analyzing clinical outcome data. Instead, metadata can be used to generate insights on risks of non-adherence during clinical trial data collection or during standard treatment.

There are several limitations of this study. First, we had a relatively small sample size (n = 50 patient–caregiver dyads), so the study was not powered for testing hypotheses about predictors of and barriers to app usage. In addition, only those patients who responded to the invitation to participate in the study were included, and these patients may have been more likely to be adherent. Second, notifications were sent on set dates and times, irrespective of whether activities were completed. The apps have since been updated to coordinate notifications with when activities are due. Third, this pilot study was not designed to test the clinical workflow related to ePROs collected by the apps. The app report was not integrated into the EHR, and responses did not go to healthcare providers for review. Further research is needed to assess the feasibility and effect of transmitting urgent data from the apps to healthcare providers. Fourth, there are two recent studies comparing Kaiser membership with the general population as described in the California Health Interview Survey35, and one comparing with the US Census and Behavioral Risk Factor Surveillance System29. These studies found that, compared to the total population in the catchment area, Kaiser members were similar in race, ethnicity, education, and income. However, in this study, we did not attempt to achieve a representative sample of participants. The eligibility criteria for the study skewed the population towards females because breast cancer is frequently treated with IV chemotherapy. We suspect that recruitment via email and the requirement that study participants own an iPhone 6 or higher and be English speakers may have skewed the representativeness of participants by race/ethnicity. We have since developed an Android version. Finally, our month-long study was not long enough to demonstrate sustained adherence over longer courses of IV chemotherapy.

This study’s strengths included its demonstration of adherence with mobile health apps in a population undergoing active cancer treatment, with patients receiving IV chemotherapy or immunotherapy—therapies that frequently result in side effects. The sample included patients with a mix of cancer types. Observations were reported by patients through our DigiBioMarC app and by their caregivers through the TOGETHERCare app. This second set of eyes on the patient’s symptoms may improve the accuracy and completeness of remote monitoring.

Future directions include incorporating more diverse participants across socio-economic, racial, ethnic, and health and digital literacy levels. This will help us examine whether remote monitoring of this type can reduce persistent health disparities by removing requirements to travel to specific study locations, which may not be easily accessible to those with limited resources or in rural areas. We would also like to compare adherence to mobile apps with emailed or texted ePROs. In addition, based upon multiple responses in the semi-structured interviews, we hypothesize that adherence and engagement will be enhanced if clinicians receive the patient-reported outcome data. Some of the team plans to explore how to implement this workflow in a frictionless manner within the EHR, with flags for PROs or activities of concern.

Conclusions

The high rate of participant adherence (86% for cancer patients and 84% for caregivers) in this timestamp metadata analysis suggests that the DigiBioMarC and TOGETHERCare apps can be used to collect patient- and caregiver-reported outcomes data during intensive treatment.

This study assessed user adherence using app metadata to examine completion of expected activities, time to complete activities, and completion of activities by participant characteristics. We suggest that metadata can be used to generate insights on risks of non-adherence to ePRO collection and remote monitoring of symptoms, health, and adverse events during clinical trial data collection or during standard treatment.