Published in Vol 5, No 1 (2022): Jan-Mar

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/30941.
Using a Patient Portal to Increase Enrollment in a Newborn Screening Research Study: Observational Study


Original Paper

1RTI International, Research Triangle Park, NC, United States

2Department of Medicine, University of North Carolina Chapel Hill, Chapel Hill, NC, United States

Corresponding Author:

Lisa M Gehtland, MD

RTI International

3040 E. Cornwallis Road

Research Triangle Park, NC, 27709

United States

Phone: 1 919 541 8054

Email: lgehtland@rti.org


Background: Many research studies fail to enroll enough research participants. Patient-facing electronic health record applications, known as patient portals, may be used to send research invitations to eligible patients.

Objective: The first aim was to determine if receipt of a patient portal research recruitment invitation was associated with enrollment in a large ongoing study of newborns (Early Check). The second aim was to determine if there were differences in opening the patient portal research recruitment invitation and study enrollment by race and ethnicity, age, or rural/urban home address.

Methods: We used a computable phenotype and queried the health care system’s clinical data warehouse to identify women whose newborns would likely be eligible. Research recruitment invitations were sent through the women’s patient portals. We conducted logistic regressions to test whether women enrolled their newborns after receipt of a patient portal invitation and whether there were differences by race and ethnicity, age, and rural/urban home address.

Results: Research recruitment invitations were sent through the patient portals of 4510 women who had not yet enrolled, from November 22, 2019, through March 5, 2020. Among women who received a patient portal invitation, 3.6% (161/4510) enrolled their newborns within 27 days. The odds of enrolling among women who opened the invitation were nearly 9 times the odds of enrolling among women who did not open their invitation (SE 3.24, OR 8.86, 95% CI 4.33-18.13; P<.001). On average, it took 3.92 days for women to enroll their newborn in the study, with 64% (97/161) enrolling their newborn within 1 day of opening the invitation. There were disparities by race and urbanicity in enrollment in the study after receipt of a patient portal research invitation but not by age. Black women were less likely to enroll their newborns than White women (SE 0.09, OR 0.29, 95% CI 0.16-0.55; P<.001), and women in urban ZIP codes were more likely to enroll their newborns than women in rural ZIP codes (SE 0.97, OR 3.03, 95% CI 1.62-5.67; P=.001). Black women (SE 0.05, OR 0.67, 95% CI 0.57-0.78; P<.001) and Hispanic women (SE 0.07, OR 0.73, 95% CI 0.60-0.89; P=.002) were less likely to open the research invitation compared to White women.

Conclusions: Patient portals are an effective way to recruit participants for research studies, but there are substantial racial and ethnic disparities and disparities by urban/rural status in the use of patient portals, the opening of a patient portal invitation, and enrollment in the study.

Trial Registration: ClinicalTrials.gov NCT03655223; https://clinicaltrials.gov/ct2/show/NCT03655223

JMIR Pediatr Parent 2022;5(1):e30941

doi:10.2196/30941


Introduction

Advent of Patient Portals and Their Use in Research Recruitment

Failure to recruit a sufficient number of participants is a common barrier to the successful and timely completion of research studies [1]. Insufficient accrual of participants may require additional resources to achieve target enrollment, and failure to meet enrollment goals may result in underpowered studies [2,3].

Electronic patient portals are web-based applications owned and administered by health care institutions that allow patients to access their electronic health records (EHRs). In the past two decades, a growing number of patients have used this technology to manage their health care and communicate with their providers [4-6]. Estimates of patient portal use vary by study, subpopulation, and measured outcome, but reports range from 25.8% to 84.1%, and longitudinal analyses indicate that utilization rates are growing [7-16]. In a national US sample, Turner et al [17] found that 24.9% of participants reported using at least one patient portal tool in 2017, compared to only 12.6% in 2011. As portal adoption becomes more widespread, researchers have recognized the opportunity to use EHR data to identify eligible research cohorts and send recruitment invitations to potential participants via the patient portal [18,19]. Direct messaging through patient portals enables a study to efficiently contact eligible patients and facilitates low-touch, low-cost outreach to large numbers of patients, an approach that is particularly advantageous for studies with large target sample sizes. In addition, once a system of electronic recruitment is established, the process of identification, prescreening, and outreach can be automated and repeated.

Studies using patient portal research invitations for recruitment report a wide range of study enrollment rates, with 1.8% to 24.7% of those who received an invitation consenting to participate [20-27]. A summary of 14 studies that sent research recruitment invitations through the patient portal at a single medical center found that condition-specific studies had higher response and enrollment rates compared to general health studies [23]. The ADAPTABLE (Aspirin Dosing: A Patient-Centric Trial Assessing Benefits and Long Term Effectiveness) study, a pragmatic trial with a recruitment target of 15,000 participants, reported enrollment rates from four different modes of outreach: patient portal, email, mailed letter, and in-person communication with a research coordinator. Although in-person recruitment had the highest enrollment rate of all the modes, patient portal and email outreach yielded the most overall study enrollment because they allowed the study team to approach many more potential participants than did the other modes [24]. A recent study comparing in-person, email, and patient portal recruitment of adults from primary care and bariatric clinics also found that electronic forms of outreach resulted in the most overall study participants in spite of lower recruitment efficiency compared to in-person recruitment [27].

Despite the advantages of using patient portals to recruit for research, they remain primarily clinical tools, and using them to send research invitations may run the risk of decreasing patients’ trust in the health care system or their use of the platform for clinical purposes. However, there is some evidence that patients find recruitment through patient portals acceptable. Plante et al [25] reported only 2 complaints and 1 request to unsubscribe from future messages in a study that sent 6896 invitations, and Gleason et al [28] noted that most patients reported research recruitment to be an acceptable use of patient portals in a satisfaction survey from a study that sent 1303 invitations. Patients who find patient portal recruitment unacceptable, however, may not open a message, send a complaint, or complete a satisfaction survey. Thus, an in-depth understanding of the factors influencing the acceptability of patient portal recruitment is still needed.

Demographic disparities between patient portal users and nonusers present a major barrier to the recruitment of a representative study sample. Studies have shown that patient portal nonusers are more likely to be members of racial/ethnic minority groups, older, male, of lower socioeconomic status, of lower health literacy, and living in rural areas [8,23,29-31]. Patient portal recruitment may, however, decrease study population disparities that result from certain demographic groups being approached for research participation less frequently in clinic settings [32,33]. There is some evidence that clinicians acting as gatekeepers may contribute to the under-representation of certain populations, particularly patients with minority backgrounds [33]. Mass electronic invitations offer a uniform outreach approach that is applied across the demographic range of eligible patients rather than filtered through individual clinicians. Some studies have recommended that patient portal recruitment be one part of a comprehensive outreach approach, including approaches that specifically target traditionally under-represented groups [34].

Early Check: A Research Study Piloting the Use of a Patient Portal to Recruit Pregnant Women

In this article, we describe our use of invitations sent through the Epic EHR and patient portal (MyChart) within UNC Health (UNCH). At UNCH, the patient portal is branded my UNC Chart. We used my UNC Chart to recruit for Early Check, a research study offering screening to all newborns in the state of North Carolina for a panel of genetic conditions. With a target recruitment rate of over 10,000 newborns per year, an online consent process that does not require contact with a research coordinator, and broad eligibility criteria, Early Check is a study for which recruitment messaging through patient portals is a good fit. Additionally, the target population for recruitment outreach is pregnant women and mothers of newborns, a group that is relatively young and female, two characteristics associated with being more likely to open a patient portal account and use it to manage one's health [15]. Two recent studies reported patient portal utilization rates of 34% and 72% among pregnant patients [21,35]. One of these studies recruited pregnant women to a research study through a patient portal; 34% of pregnant patients used their patient portal, and of those invited to the study, 11% consented and completed the questionnaire [21].

Since Early Check began recruitment in October 2018, the primary outreach method has been personalized direct mail letters and emails on letterhead from the North Carolina Department of Health and Human Services, a study partner, sent postnatally to all women with a mailing or email address listed in the North Carolina newborn screening records. A social media outreach campaign was piloted from March to September 2019, and we resumed social media advertising on Facebook and Instagram on April 1, 2020. An evaluation of the direct mail outreach impact on study enrollment showed that approximately 4% of all women who were sent a recruitment letter enrolled their newborn in the study, and the enrollment rate among women who also received a recruitment email was approximately 5% [36]. An analysis of the social media campaign from March 2019 to September 2019 showed that paid ads on social media resulted in approximately 3.5 additional enrollments per day on which ads were run [37]. To further increase outreach to eligible participants, we used my UNC Chart to send recruitment invitations to pregnant women whose newborns would be eligible for Early Check.

Objective

In this article, we describe the use of a patient portal to recruit research participants for Early Check and report on characteristics of mothers who received and opened a research recruitment message and enrolled their newborns in the study. We addressed two research questions:

  1. Is receipt of a research invitation through my UNC Chart patient portal associated with enrollment in the Early Check research study within 27 days after receipt of the invitation?
  2. Is there a difference in opening a research invitation or enrollment in Early Check by a mother’s race/ethnicity, age, or rural/urban home address location?

To address these questions, we examined data on 4510 UNCH patients who were invited to participate in Early Check through my UNC Chart between November 22, 2019, and March 5, 2020.


Methods

Early Check Research Study

A collaboration between RTI International, the University of North Carolina at Chapel Hill, Wake Forest School of Medicine, Duke University, and the North Carolina State Laboratory of Public Health, Early Check is a large research study offering screening for a panel of conditions to all newborns in the state of North Carolina [38]. The panel includes fragile X syndrome (October 2018-current), spinal muscular atrophy (October 2018-March 2021), and Duchenne and related muscular dystrophies (November 2020-current). Newborns are eligible if they have a newborn screening in North Carolina and live in North Carolina or South Carolina. Newborns may be enrolled in the study by their mother, or by a legally authorized representative in the event the mother is unavailable, between the start of the mother’s second trimester and when the newborn is one month old. During the phase of the study described herein, permission for the newborn to participate was completed entirely online without direct engagement with a research recruiter [39]. The Office of Human Research Ethics at the University of North Carolina at Chapel Hill serves as the central Institutional Review Board for Early Check (#18-0009) and approved these activities.

Recruitment Using my UNC Chart Invitations

The process of identifying women within UNCH to be sent an invitation to participate in Early Check via my UNC Chart began with a computable phenotype, a data query that “use[s] EHR data exclusively to describe clinical characteristics, events, and service patterns for a specific patient population [40].” UNCH’s enterprise data warehouse, the Carolina Data Warehouse for Health (CDWH), was queried using the computable phenotype to identify invitation recipients. UNCH’s appointment discharge paperwork provides patients with a unique ID that can be used to activate their my UNC Chart account. Women were invited if they had ever activated their my UNC Chart account, regardless of how recently they had logged into the account. The primary criteria in the computable phenotype were: a) having an active pregnancy “episode of care,” and b) being in the second or third trimester of pregnancy (ie, 13-42 weeks’ gestation).

In Epic@UNC, a pregnancy Episode of Care groups all prenatal encounters and diagnoses for a pregnancy. A pregnancy Episode of Care can be generated at any time during the pregnancy, although ideally, it is generated at the time a pregnancy is first confirmed or at the time a pregnant woman transfers her care from another health care system. Pregnancy Episodes of Care are designed to automatically resolve after delivery, specifically after any of the following: a) 48 weeks with no linked encounter; b) 364 days after the episode creation date; c) 84 days after the estimated delivery date in the patient’s medical record. An Episode of Care may also be manually resolved after the baby is born. Pregnancy Episodes of Care are not designed to resolve automatically when a woman loses her pregnancy; the Episode of Care must be manually resolved (eg, closed out) in the case of pregnancy loss.

Since a cohort based on active pregnancy Episodes of Care may unintentionally include some women who have lost their pregnancy, women were excluded from receiving an invitation if their health record showed any of a series of International Classification of Diseases, 10th Revision or Current Procedural Terminology codes associated with elective or spontaneous abortion within 10 months of the date that the CDWH was queried. Women were also excluded if they had indicated in their communication preferences that UNCH was not permitted to contact them through my UNC Chart. Women were sent only one invitation, so they were also excluded from the cohort if they had already been sent an invitation for the study during the same pregnancy Episode of Care. The computable phenotype with inclusionary and exclusionary codes and a figure showing the text of the research invitation can be found in Multimedia Appendices 1 and 2.
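To make the cohort selection concrete, the following is a minimal sketch of the filtering logic in Python (pandas). It assumes a hypothetical flat extract of pregnancy Episodes of Care; the actual computable phenotype, including the ICD-10 and CPT exclusion codes, is the one given in Multimedia Appendix 1, and all column names here are illustrative only.

```python
import pandas as pd

# Hypothetical flat extract of pregnancy Episodes of Care pulled from the CDWH;
# the real computable phenotype runs against Epic data (Multimedia Appendix 1).
episodes = pd.DataFrame({
    "patient_id":        [101, 102, 103, 104],
    "episode_active":    [True, True, True, True],
    "gestational_weeks": [20, 10, 30, 25],
    "portal_activated":  [True, True, True, False],    # ever activated my UNC Chart
    "portal_contact_ok": [True, True, False, True],    # communication preferences allow portal contact
    "loss_code_10mo":    [False, False, False, False],  # ICD-10/CPT pregnancy-loss codes in prior 10 months
    "already_invited":   [False, False, False, False],  # invited earlier in the same episode
})

eligible = episodes[
    episodes["episode_active"]
    & episodes["gestational_weeks"].between(13, 42)   # second or third trimester
    & episodes["portal_activated"]
    & episodes["portal_contact_ok"]
    & ~episodes["loss_code_10mo"]
    & ~episodes["already_invited"]                    # one invitation per pregnancy Episode of Care
]
print(eligible["patient_id"].tolist())  # -> [101]
```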

Data

We consolidated data from four sources: (1) UNCH patient records with my UNC Chart invitation data; (2) ZIP code-level rural-urban commuting area (RUCA) approximation codes, from which we derived dichotomous urbanicity status; (3) newborn screening records gathered from the North Carolina State Laboratory of Public Health, to which Early Check mailing list data have been appended; and (4) enrollment information collected through the Early Check permission portal. After cleaning and standardizing the data, we iteratively matched records from my UNC Chart to the newborn screening and Early Check permission portal datasets using several combinations of data fields appearing in two or more sources, including phone number, email address, name, date of birth, and street address (including ZIP codes). We visually inspected the combined data set after each pass to find and update mismatched records or duplicates. The final data set used in the analysis contained one record per patient with variables derived from all four sources.
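As an illustration of the iterative matching described above, the sketch below links two hypothetical extracts, first on email address and then on phone number for the remaining records. The field names, the order of the passes, and the toy data are assumptions rather than the study's actual linkage specification; in practice, each pass was followed by visual inspection for mismatches and duplicates.

```python
import pandas as pd

# Hypothetical cleaned extracts; field names are illustrative only.
portal = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "email": ["a@x.org", "b@x.org", None],
    "phone": ["9195550100", "9195550101", "9195550102"],
})
screening = pd.DataFrame({
    "nbs_id": [10, 11, 12],
    "email": ["a@x.org", None, None],
    "phone": [None, "9195550101", "9195550199"],
})

# Pass 1: match on email address.
m1 = portal.merge(screening.dropna(subset=["email"]), on="email", how="inner")

# Pass 2: match the remaining portal records on phone number.
unmatched = portal[~portal["patient_id"].isin(m1["patient_id"])]
m2 = unmatched.merge(screening.dropna(subset=["phone"]), on="phone", how="inner")

matched = pd.concat([m1, m2], ignore_index=True)
print(matched[["patient_id", "nbs_id"]])  # patient 1 -> record 10, patient 2 -> record 11
```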

The main analyses presented in this report focus on all 4510 women living in North Carolina or South Carolina who had not yet enrolled in Early Check and were sent invitations via my UNC Chart from November 22, 2019, through March 5, 2020. To standardize results from batches of invitations sent on different dates, we set a 27-day window for recruitment and enrollment outcomes starting from the date participants were sent a my UNC Chart invitation. We selected a 27-day window to avoid any overlap with a social media ad campaign for Early Check that began April 1, 2020. We also compiled aggregate data for women with patient records in the UNCH system who did not have an active my UNC Chart account but would otherwise have met the eligibility criteria for an invitation. We used these aggregate data to estimate the proportion of Early Check-eligible UNCH patients who were reachable through my UNC Chart and to examine whether there were differences by age, race and ethnicity, or urbanicity between women who received an invitation and those who did not.

Measures

Early Check Enrollment

We converted enrollment dates into a dichotomous variable indicating whether women granted permission for their babies to participate in Early Check within 27 days of being sent a my UNC Chart invitation, enrolled (1) or not enrolled (0). Among women who enrolled a child in the study within the 27-day timeframe, we also calculated the number of days it took them to enroll.
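A minimal sketch of this derivation, using hypothetical invitation and enrollment dates, is shown below; the same 27-day windowing applies to the opened-invitation and postnatal-contact indicators described in the next subsections.

```python
import pandas as pd

# Illustrative records; all dates are hypothetical.
df = pd.DataFrame({
    "invite_sent": pd.to_datetime(["2019-11-22", "2019-11-22", "2020-01-07"]),
    "enroll_date": pd.to_datetime(["2019-11-30", None, "2020-02-20"]),
})

days_to_enroll = (df["enroll_date"] - df["invite_sent"]).dt.days
df["enrolled_27d"] = ((days_to_enroll >= 0) & (days_to_enroll <= 27)).astype(int)
df["days_to_enroll"] = days_to_enroll.where(df["enrolled_27d"] == 1)

print(df[["enrolled_27d", "days_to_enroll"]])
# enrolled_27d: 1, 0, 0 -- the third enrollment falls outside the 27-day window.
```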

Opened my UNC Chart Invitation

Using the earliest date that women logged into their account to view the invitation, we derived a dichotomous variable recording whether women opened the invitation within 27 days of when it was sent, yes (1) or no (0). We also calculated the number of days it took women to open the invitation.

Contact Via Direct Mail and Email

Using variables recording the dates that postnatal Early Check outreach materials were sent, we created a set of dichotomous variables indicating whether each woman was sent a postnatal recruitment letter or email within 27 days of being sent a my UNC Chart invitation (for each type of mailing, yes [1] or no [0]).

Age

We converted women’s date of birth to age in years, anchoring it to the date when we sent the recipient a my UNC Chart invitation (ie, [invitation date − date of birth]/365.25). We then transformed this into a 5-level categorical variable: under 20, 20 to 24, 25 to 29, 30 to 34, and ≥35 years.
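The sketch below shows one way to carry out this derivation in Python (pandas), with hypothetical dates; the cut points follow the five categories listed above.

```python
import pandas as pd

# Hypothetical invitation and birth dates.
df = pd.DataFrame({
    "invite_sent":   pd.to_datetime(["2019-11-22", "2020-01-07"]),
    "date_of_birth": pd.to_datetime(["1985-03-14", "2001-06-02"]),
})

# Age in years anchored to the invitation date: (invitation date - date of birth)/365.25.
df["age_years"] = (df["invite_sent"] - df["date_of_birth"]).dt.days / 365.25

# Five-level categorical variable used in the models.
bins = [0, 20, 25, 30, 35, 200]
labels = ["Under 20", "20-24", "25-29", "30-34", ">=35"]
df["age_group"] = pd.cut(df["age_years"], bins=bins, labels=labels, right=False)

print(df[["age_years", "age_group"]])  # ~34.7 -> 30-34; ~18.6 -> Under 20
```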

Race and Ethnicity

We used race and ethnicity data from the UNCH patient records and recoded these into a single variable that aligns with the race and ethnicity categories used in resident live birth reports published by the North Carolina Department of Health and Human Services: non-Hispanic White alone, non-Hispanic Black alone, Hispanic, and non-Hispanic any other race or unknown [41].
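A sketch of this recode is shown below; the source race and ethnicity values, and the exact mapping rules applied to the UNCH records, are assumptions for illustration only.

```python
import pandas as pd

# Hypothetical source values from the patient record.
df = pd.DataFrame({
    "ethnicity": ["Not Hispanic", "Hispanic", "Not Hispanic", "Unknown"],
    "race":      ["White", "White", "Black", "Asian"],
})

def combine(row):
    # Collapse race and ethnicity into the four NC DHHS reporting categories.
    if row["ethnicity"] == "Hispanic":
        return "Hispanic"
    if row["ethnicity"] == "Not Hispanic" and row["race"] == "White":
        return "Non-Hispanic White"
    if row["ethnicity"] == "Not Hispanic" and row["race"] == "Black":
        return "Non-Hispanic Black"
    return "Other or unknown"

df["race_ethnicity"] = df.apply(combine, axis=1)
print(df[["ethnicity", "race", "race_ethnicity"]])
```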

Urbanicity

To measure urbanicity, we constructed a variable based on RUCA codes associated with the patient residential ZIP codes recorded in the my UNC Chart data. RUCA codes were developed by the US Department of Agriculture to classify Census tracts by population density, proximity to large urban centers, and daily commuting flows [42]. For this analysis, we used ZIP code RUCA approximation codes developed by the University of Washington and recoded these into a two-level urbanicity measure developed for the National Cancer Institute’s Surveillance, Epidemiology, and End Results database [43,44]. Under this coding scheme, we collapsed RUCA codes associated with each ZIP code of residence into two categories indicating whether the location was urban area commuting focused (ie, urban) or not (ie, rural). ZIP codes with 1 of 10 RUCA codes (ie, 1.0, 1.1, 2.0, 2.1, 3.0, 4.1, 5.1, 7.1, 8.1, or 10.1) were classified as urban (1) and all other codes were classified as rural (0).
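A minimal sketch of the two-level recode is shown below, using the urban RUCA codes listed above; the ZIP codes and their RUCA values are hypothetical.

```python
import pandas as pd

# RUCA approximation codes treated as urban under the two-level SEER scheme (see text).
URBAN_RUCA = {1.0, 1.1, 2.0, 2.1, 3.0, 4.1, 5.1, 7.1, 8.1, 10.1}

zips = pd.DataFrame({
    "zip":  ["27514", "28906"],   # hypothetical residential ZIP codes
    "ruca": [1.0, 10.0],          # hypothetical RUCA approximation codes
})

zips["urban"] = zips["ruca"].isin(URBAN_RUCA).astype(int)  # 1 = urban, 0 = rural
print(zips)
```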

Statistical Analysis

We conducted logistic regressions to test for differences in whether women enrolled their newborns in the study or opened the my UNC Chart invitation by outreach methods, urbanicity, race and ethnicity, and age. The model estimating enrollment included whether women opened the invitation as a predictor variable; all other regressors were the same in both models. Cases with missing values on one or more regressors were excluded by listwise deletion in the logistic regression models. In addition to reporting model estimates, we also present the predicted probabilities for significant categorical variables, which represent the rates of enrollment and invitation-opening within levels of those variables while controlling for other regressors in the models. To examine potential differences by urbanicity, race and ethnicity, and age between women who were sent invitations through my UNC Chart and patients in the UNCH system who were otherwise eligible but did not have an active my UNC Chart account, we conducted χ2 tests of independence. We followed up significant χ2 tests involving independent variables with more than 2 categories using two-sample z tests for the difference of proportions. For these pairwise comparisons, we used a Bonferroni-adjusted alpha level of .0085. We conducted all analyses using Stata Statistical Software (version 16.0; StataCorp).
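The models were fit in Stata; the sketch below shows an analogous specification in Python (statsmodels) on synthetic data, purely to illustrate the form of the logistic regression on the odds ratio scale and the χ2 test of independence. The variable names and simulated data are assumptions, not the study's analytic file.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import chi2_contingency

# Synthetic stand-in for the analytic file.
rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "enrolled": rng.binomial(1, 0.05, n),
    "opened":   rng.binomial(1, 0.68, n),
    "urban":    rng.binomial(1, 0.80, n),
    "race":     rng.choice(["White", "Black", "Hispanic", "Other"], n),
    "age_grp":  rng.choice(["<20", "20-24", "25-29", "30-34", ">=35"], n),
})

# Logistic regression of enrollment on opening the invitation plus covariates.
model = smf.logit("enrolled ~ opened + urban + C(race) + C(age_grp)", data=df).fit(disp=0)
print(np.exp(model.params))      # odds ratios
print(np.exp(model.conf_int()))  # 95% CIs on the odds ratio scale

# Predicted probabilities for opened vs not opened, other covariates held fixed.
new = pd.DataFrame({"opened": [1, 0], "urban": [1, 1],
                    "race": ["White", "White"], "age_grp": ["25-29", "25-29"]})
print(model.predict(new))

# Chi-square test of independence (eg, invitation opening by race/ethnicity).
chi2, p, dof, _ = chi2_contingency(pd.crosstab(df["race"], df["opened"]))
print(chi2, p, dof)
```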


Results

Sample Characteristics

In total, 12,036 patients within the UNCH system fit the computable phenotype that would have made them eligible to receive an invitation from November 22, 2019, through March 5, 2020, but only 4510 out of 12,036 (37.5%) had an active my UNC Chart patient portal account. We compared the demographic characteristics of women to whom we sent a my UNC Chart invitation to those who did not receive an invitation because they did not have an active account. We found no significant differences by age (χ2[4, N=12,036]=3.51; P=.48) or urbanicity (χ2[2, N=12,036]=0.37; P=.54), but we did find differences by race/ethnicity (χ2[3, N=12,036]=180.99; P<.001; Cramér’s V=0.12). A greater percentage of non-Hispanic White patients (2527/5852, 43.2%) had an active my UNC Chart account compared to non-Hispanic Black patients (932/2738, 34.0%; z=8.05; P<.001), Hispanic patients (531/1916, 27.7%; z=12.03; P<.001), or non-Hispanic patients of any other race (520/1530, 34.0%; z=6.50; P<.001). Hispanic patients were significantly less likely to have an active my UNC Chart account than patients in any of the other three race/ethnicity groups. The full contingency table comparing whether women had an active my UNC Chart account by race and ethnicity is shown in Table 1.

Characteristics of North Carolina and South Carolina residents who were sent an invitation to participate in Early Check from November 22, 2019, through March 5, 2020, are presented in Table 2. The sample excludes 18 patients who had already enrolled their newborns in Early Check before their my UNC Chart invitation was sent. Invitations were sent in five batches, with approximately half of the recipients (2466/4510, 54.7%) included in the first mailing on November 22, 2019. Two-thirds of all recipients (3054/4510, 67.7%) logged into their my UNC Chart accounts and opened the invitation within 27 days of when it was sent. Among women who opened the invitation, it took them 2.45 days (SD 4.83) on average to do so; however, the distribution is positively skewed, with 68.6% (2094/3054) opening it within 24 hours. Relatively few women were sent a postnatal recruitment letter (357/4510, 7.9%) or a personalized email (24/4510, 0.5%) within 27 days of being sent the my UNC Chart invitation. This observation is not unexpected, given that the computable phenotype was designed to target women with an active pregnancy as early as the 13th week of gestation.

Table 1. Cross-tabulation of having an active my UNC Chart account by race and ethnicity (N=12,036).a

Active my UNC Chart account | White, n (%) | Black, n (%) | Hispanic, n (%) | Other, n (%) | Total, n (%)
Yes | 2527 (43.2) | 932 (34.0)b | 531 (27.7)c | 520 (34.0)b | 4510 (37.5)
No | 3325 (56.8) | 1806 (66.0) | 1385 (72.3) | 1010 (66.0) | 7526 (62.5)
Total | 5852 (100.0) | 2738 (100.0) | 1916 (100.0) | 1530 (100.0) | 12,036 (100.0)

aχ2(3, N=12,036)=180.99; P<.001; Cramér’s V=0.12.

b,cPercentages among participants with an active my UNC Chart account differ significantly across race/ethnicity columns at a Bonferroni-adjusted P<.009.

Table 2. Characteristics of women who were sent a my UNC Chart invitation (N=4510).a

Characteristic | Values, n (%)b
Enrolled in Early Check within 27 days of my UNC Chart invitationc
  Yes | 161 (3.6)
  No | 4349 (96.4)
Opened my UNC Chart invitation within 27 days
  Yes | 3054 (67.7)
  No | 1456 (32.3)
Postnatal outreach methods sent within 27 days of my UNC Chart invitation
  Recruitment letter
    Sent letter | 357 (7.9)
    No letter | 4153 (92.1)
  Personalized email
    Sent email | 24 (0.5)
    No email | 4486 (99.5)
Date invitation sent
  November 22, 2019 | 2466 (54.7)
  January 7, 2020 | 931 (20.6)
  January 29, 2020 | 423 (9.4)
  February 12, 2020 | 272 (6.0)
  March 5, 2020 | 418 (9.3)
Age (years)
  Under 20 | 169 (3.7)
  20-24 | 791 (17.5)
  25-29 | 1224 (27.1)
  30-34 | 1386 (30.7)
  ≥35 | 940 (20.8)
Race/ethnicity
  White | 2527 (56.0)
  Black | 932 (20.7)
  Hispanic | 531 (11.8)
  Other or unknown | 520 (11.5)
Urbanicity
  Urban | 3615 (80.2)
  Rural | 892 (19.8)
  Unknown | 3 (0.1)

aAnalysis excludes 18 patients who were sent a my UNC Chart invitation after enrolling in Early Check.

bPercentages may not sum to 100 due to rounding.

cFor this analysis, we set a 27-day enrollment window from the date the my UNC Chart invitations were sent to normalize results from batches of invitations sent on different dates.

Early Check Enrollment

In all, 3.6% (161/4510) of women who received a my UNC Chart invitation enrolled their newborns in the study within 27 days. Excluding 8 women who enrolled their newborns in Early Check without opening the invitation, women took on average 3.92 days (SD 6.50) to enroll. Similar to the distribution for the time it took to open the invitation, the enrollment timing distribution was positively skewed, with 63.4% (97/153) enrolling within 1 day of opening the invitation.

Our first research question examined whether and to what extent women who opened a research invitation sent to them through my UNC Chart were more likely to enroll in Early Check within 27 days of receiving the invitation. The overall logistic regression model predicting enrollment was significant (χ2[11, N=4507]=134.90; P<.001; R2McFadden=.10). As shown in Table 3, the odds of enrolling among women who opened the invitation were nearly 9 times the odds among women who did not open, and therefore did not view, their invitation (SE 3.24, OR 8.86, 95% CI 4.33-18.13; P<.001). Expressed in terms of predicted probabilities holding everything else in the model constant, 4.88% of women who opened the invitation (SE 0.38, 95% CI 4.13%-5.63%) enrolled their newborns in Early Check within 27 days of when it was sent, compared to only 0.58% of women who did not open the invitation within that time frame (SE 0.02, 95% CI 0.18%-0.99%) and most likely became aware of the study through another outreach method. Being sent a postnatal recruitment letter (P=.57) or a personalized email invitation (P=.53) did not have a significant additional impact on enrollment within the 27-day period.

Our second research question asked, in part, whether there are differences in enrollment by race/ethnicity, age, or urbanicity. Although we observed no significant differences in enrollment rates across age groups, race/ethnicity and urbanicity were both related to enrollment. The odds of enrolling for Black women who were sent a my UNC Chart invitation were 0.29 times the odds for White women (SE 0.09, OR 0.29, 95% CI 0.16-0.55; P<.001). Expressed in terms of predicted probabilities, whereas 4.49% of White women (SE 0.40, 95% CI 3.72%-5.27%) enrolled their newborns in Early Check within 27 days of when their invitations were sent, only 1.38% of Black women did so (SE 0.41, 95% CI 0.57%-2.19%). We found no other significant differences in enrollment across race/ethnicity groups. Additionally, women with a home address in urban ZIP codes were more likely to enroll than women from rural ZIP codes (SE 0.97, OR 3.03, 95% CI 1.62-5.67; P=.001). Controlling for the other variables in the model and expressed in terms of predicted probabilities, 4.04% of urban women (SE 0.32, 95% CI 3.41%-4.67%) enrolled their newborns in Early Check compared to 1.40% of rural women (SE 0.42, 95% CI 0.57%-2.22%).

Table 3. Logistic regression analysis predicting Early Check enrollment (N=4507).

Predictor | OR | SE | P value | 95% CIa
Opened invitation within 27 days of when it was sent
  Reference = no | 1.0b
  Yes | 8.86 | 3.24 | <.001 | 4.33-18.13
Postnatal recruitment letter
  Reference = not sent a recruitment letter | 1.0
  Sent a recruitment letter | 1.20 | 0.37 | .565 | 0.65-2.20
Postnatal personalized email invitation
  Reference = not sent an email invitation | 1.0
  Sent an email invitation | 1.98 | 2.16 | .531 | 0.23-16.82
Race/ethnicity
  Reference = White | 1.0
  Black | 0.29 | 0.09 | <.001 | 0.16-0.55
  Hispanic | 0.63 | 0.20 | .135 | 0.34-1.16
  Other | 0.62 | 0.16 | .068 | 0.36-1.04
Urbanicityc
  Reference = rural | 1.0
  Urban | 3.03 | 0.97 | .001 | 1.62-5.67
Age (years)
  Reference = under 20 | 1.0
  20-24 | 0.97 | 0.75 | .969 | 0.21-4.43
  25-29 | 1.87 | 1.38 | .396 | 0.44-7.92
  30-34 | 1.92 | 1.41 | .376 | 0.45-8.07
  ≥35 | 2.58 | 1.90 | .198 | 0.61-10.90
Constant | 0.00 | 0.00 | <.001 | 0.00-0.01

aThe analysis excluded 3 women for whom geolocation data were insufficient to compute urbanicity.

bThe reference levels are fixed parameters, not estimates, so no measures of precision were calculated.

cUrbanicity is a variable indicating whether women live in an urban or rural area based on residential ZIP code (see Measures section).

Opened my UNC Chart Invitation

Our second research question also considers whether race/ethnicity, age, and urbanicity were associated with whether women opened the research invitation sent to them through my UNC Chart. As shown in Table 4, the logistic regression model predicting opening the invitation was significant (χ2[10, N=4507]=62.38; P<.001; R2McFadden=.01). Women who were sent a postnatal recruitment letter by mail within 27 days of a my UNC Chart invitation were significantly less likely to open the invitation (SE 0.09, OR 0.76, 95% CI 0.60-0.96; P=.02). Holding everything else constant and expressed in terms of predicted probabilities, 62.1% of women who were sent a postnatal recruitment letter opened their my UNC Chart invitations (SE 2.6, 95% CI 56.9%-67.2%) versus 68.2% of women who were not sent a recruitment letter (SE 0.7, 95% CI 66.8%-69.6%). Whether women were sent a postnatal personalized email within the 27-day timeframe was not significantly associated with opening the my UNC Chart invitation (P=.19), nor was urbanicity (P=.75). However, race/ethnicity and age were both significantly related to opening the invitation. Black women were significantly less likely than White women to open the invitation (SE 0.05, OR 0.67, 95% CI 0.57-0.78; P<.001), with 61.4% of Black women (SE 1.6, 95% CI 58.3%-64.5%) opening the invitation compared to 70.4% of White women (SE 0.9, 95% CI 68.6%-72.2%). Hispanic women were also less likely to open the invitation than were White women (SE 0.07, OR 0.73, 95% CI 0.60-0.89; P=.002), with an estimated 63.4% of Hispanic women opening their invitations. Lastly, opening the my UNC Chart invitation differed significantly by age. Compared to women under 20 years of age, women aged 25 to 29 years (SE 0.26, OR 1.51, 95% CI 1.08-2.10; P=.02), 30 to 34 years (SE 0.28, OR 1.67, 95% CI 1.20-2.33; P=.003), or 35 years or older (SE 0.25, OR 1.44, 95% CI 1.03-2.03; P=.04) were significantly more likely to open the invitation. The predicted probabilities of opening the my UNC Chart invitation by age are shown in Table 5.

Table 4. Logistic regression analysis predicting whether the my UNC Chart invitation was opened (N=4507).

Predictor | OR | SE | P value | 95% CI
Postnatal recruitment letter
  Reference = not sent a recruitment letter | 1.0a
  Sent a recruitment letter | 0.76 | 0.09 | .022 | 0.60-0.96
Postnatal personalized email invitation
  Reference = not sent an email invitation | 1.0
  Sent an email invitation | 0.57 | 0.24 | .185 | 0.24-1.31
Race/ethnicity
  Reference = White | 1.0
  Black | 0.67 | 0.05 | <.001 | 0.57-0.78
  Hispanic | 0.73 | 0.07 | .002 | 0.60-0.89
  Other | 0.99 | 0.11 | .928 | 0.80-1.22
Urbanicityb
  Reference = rural | 1.0
  Urban | 0.97 | 0.08 | .747 | 0.83-1.14
Age, years
  Reference = under 20 | 1.0
  20-24 | 1.26 | 0.22 | .178 | 0.90-1.78
  25-29 | 1.51 | 0.26 | .015 | 1.08-2.10
  30-34 | 1.67 | 0.28 | .003 | 1.20-2.33
  ≥35 | 1.44 | 0.25 | .035 | 1.03-2.03
Constant | 1.70 | 0.29 | .002 | 1.22-2.38

aThe reference levels are fixed parameters, not estimates, so no measures of precision were calculated.

bThe analysis excluded 3 women for whom geolocation data were insufficient to compute urbanicity.

Table 5. Predicted probability of opening a my UNC Chart invitation by age (N=4507).

Predictor | %a | SE | 95% CI
Age, years
  Under 20 | 58.9 | 3.8 | 51.5-66.3
  20-24 | 64.4 | 1.7 | 61.0-67.7
  25-29 | 68.3 | 1.3 | 65.7-70.9
  30-34 | 70.4 | 1.2 | 68.0-72.8
  ≥35 | 67.3 | 1.5 | 64.3-70.4

aPredicted probability expressed as a percentage controlling for covariates included in the logistic regression model.


Discussion

Principal Findings

We examined the utility of sending research invitations to pregnant women through a patient portal and whether opening an invitation was associated with enrollment in the study. We found an association between opening a patient portal research invitation and enrollment in the study, but we also found disparities by race/ethnicity in having an active my UNC Chart patient portal account, opening the invitation, and enrolling in the study.

The use of EHR data to identify and contact eligible participants through their patient portal proved to be successful. The findings show that the my UNC Chart patient portal within UNCH could be used to send recruitment invitations to over 4500 pregnant women whose newborns would be eligible for Early Check over a period of approximately 15 weeks. These results demonstrate the efficiency of using patient portals to send recruitment invitations to large numbers of potential research participants, compared to the time and effort it would require to contact thousands of participants through other methods like phone or in-person recruitment. As such, patient portals are especially valuable for studies seeking to approach and enroll very large numbers of participants.

Despite contacting thousands of eligible women, those we contacted accounted for a minority (4510/12,036, 37.5%) of patients at UNCH who met the computable phenotype during the time period of this study; a majority of eligible women did not have active my UNC Chart accounts and thus could not receive a recruitment message. However, for a study like Early Check with broad eligibility criteria and for which patients will become newly eligible over time as women become pregnant, my UNC Chart still proved an efficient method to contact thousands of eligible women.

Overall, patient portal research invitations sent through my UNC Chart were associated with enrollment in the Early Check study among women who opened those invitations. We found that a majority of women who received a my UNC Chart research invitation opened it, and of those who opened it, 4.88% (expressed as a predicted probability) enrolled their newborn. In comparison, a previous analysis of the other primary recruitment method for Early Check, postnatal letters and emails to new mothers, showed an overall statewide enrollment rate of 4% [36]. For women who were sent a postnatal letter in addition to the recruitment invitation through my UNC Chart, receipt of the postnatal letter did not increase the odds of enrollment. We did not independently compare my UNC Chart recruitment with direct letters and emails.

The findings demonstrated disparities in the use of patient portals, opening of research invitations, and enrollment in the study by race/ethnicity. There were also disparities in enrollment by urban/rural home address and in opening research invitations by age. Black women and Hispanic women were less likely than non-Hispanic White women to open an Early Check recruitment invitation sent through my UNC Chart and were less likely to enroll in the study after opening the invitation. We also found disparities by race and ethnicity among women we had hoped to reach with my UNC Chart invitations but could not because they did not have an active my UNC Chart account: members of traditionally underrepresented racial and ethnic minority groups were less likely than non-Hispanic White women in our target audience to have an active account. Hispanic women were least likely to be my UNC Chart users, a finding that may be partially due to the availability of my UNC Chart in English only.

In our analysis of my UNC Chart use by age and rural/urban home address, we found no difference by age or urbanicity in having an active my UNC Chart account. Age was not significantly associated with opening the message or enrolling in the study, except that women younger than 20 years were less likely to open the invitation. We found that women from urban areas were significantly more likely to enroll their newborns in the study compared to women from rural areas. It is not clear why urban women were more likely to enroll their newborns, although proximity to academic medical institutions and familiarity with research may play a role.

Comparison With Prior Work

Our enrollment rate among women who received Early Check my UNC Chart patient portal research recruitment invitations was similar to other studies using patient portals for recruitment, including the ADAPTABLE study performed in the same health system using a similar messaging protocol (4.4%) and a review of 13 studies recruiting through the patient portal of a single health system (2.9%-3.4%) [20,23,24]. Some studies have reported higher enrollment rates using patient portals ranging from 7% to 38% [22,28,32,34]. Bower et al [21], a study that also recruited pregnant women through patient portals, had a higher enrollment rate (11%) compared to the enrollment rate we report here (161/4510, 3.6%). The reasons for the differing enrollment rates across these studies are unclear but may be partially due to the target study population, demographics of patient portal users at an institution, type of study, demand on participants, formatting of the message, and the timing of the invitation in relation to a scheduled medical appointment. More research is needed on the factors associated with successful recruitment through patient portals and on the acceptability of using patient portals to recruit for research, to identify those studies for which a patient portal recruitment approach is likely to be most productive and acceptable.

The findings of racial and ethnic disparities in the users of my UNC Chart, the opening of the recruitment invitation, and enrollment in the study are consistent with findings across other studies examining the use of patient portals for recruitment and for clinical care [8,23,29,32,34]. It is important to recognize that patient portal recruitment approaches have limited reach and may compound the problem of underrepresentation in health research. Identifying barriers to patient portal use for clinical care and intervening with specific subgroups to address those barriers may improve the reach of patient portals and their utility in recruiting a diverse research sample [17]. In the meantime, research administrators should use patient portals as part of a broader recruitment strategy rather than as the sole recruitment method.

Limitations

The study examined patient portal research invitations sent to pregnant women, and findings may have limited generalizability to other types of patients. Findings may also have limited generalizability to organizations that use a patient portal other than Epic MyChart. It is also a limitation of the study that we did not directly compare the effectiveness of recruitment to Early Check through my UNC Chart research invitations to recruitment through postnatal letters and emails. We were unable to conclude whether one of these recruitment approaches was superior in enrolling newborns in the Early Check study or whether one approach would have resulted in a more representative sample.

Conclusions

Patient portals are an effective way to recruit participants for research studies and are especially useful for studies with large target sample sizes. There remain substantial racial and ethnic disparities in the use of patient portals, the response to receipt of an invitation, and enrollment in the study.

Acknowledgments

This research was supported by the National Center for Advancing Translational Sciences (NCATS) and the National Institutes of Health (grant number UL1TR002489). The Early Check infrastructure was supported by NCATS (grant number 5U01TR001792) and by grants from The John Merck Fund.

The findings and conclusions in this publication are those of the authors and do not necessarily represent the views of the North Carolina Department of Health and Human Services, Division of Public Health.

Authors' Contributions

LG, RP, SA, AL, AG, MD, EP, and DB conceptualized the study. EP, SA, AL, RP, LG, and DB contributed to the methodology, while AL and MD provided the software. RP performed the formal analysis, and MD, AL, RP, and SA completed the data curation. LG, RP, and SA drafted the original manuscript. LG, RP, SA, AL, AG, MD, EP, and DB reviewed and edited the manuscript. LG, SA, and AL administered the project, and DB acquired the necessary funding.

Conflicts of Interest

DBB reports current external funding to RTI from Janssen Pharmaceuticals and The John Merck Fund and prior external funding to RTI from Orchard Therapeutics, Travere, BioMarin, and Sarepta Pharmaceuticals. RTI also received donated reagents and equipment from Asuragen.

RSP reports prior external funding to RTI from Inflexxion, a subsidiary of Uprise Health, and Parent Project Muscular Dystrophy (PPMD) with support for PPMD’s Patient Preference Research program provided by Solid Bioscience and Pfizer.

LMG reports receiving grants from Janssen Pharmaceuticals, the John Merck Fund, Sarepta Therapeutics, Muscular Dystrophy Association, and donated reagents and equipment from Asuragen, outside the submitted work.

AYG reports current external funding to RTI from Janssen Pharmaceuticals, the John Merck Fund, the Foundation for Angelman Syndrome Therapeutics, Alcyone Therapeutics, and Lipedema Foundation, and prior external funding to RTI from Orchard Therapeutics, Travere, BioMarin, Sarepta Pharmaceuticals, and the Parent Project Muscular Dystrophy. RTI also received donated reagents and equipment from Asuragen.

SMA reports current external funding to RTI from Janssen Pharmaceuticals, Sarepta Pharmaceuticals, The John Merck Fund, and the Foundation for Angelman Syndrome Therapeutics and prior external funding to RTI from Orchard Therapeutics, Travere Therapeutics, BioMarin, and the EveryLife Foundation for Rare Diseases. RTI also received donated reagents and equipment from Asuragen.

Multimedia Appendix 1

Computable Phenotype for Carolina Data Warehouse for Health Query (CDWH).

PDF File (Adobe PDF File), 103 KB

Multimedia Appendix 2

Text of the Early Check my UNC Chart Research Invitation.

PDF File (Adobe PDF File), 100 KB

  1. Carlisle B, Kimmelman J, Ramsay T, MacKinnon N. Unsuccessful trial accrual and human subjects protections: an empirical analysis of recently closed trials. Clin Trials 2015 Feb;12(1):77-83 [FREE Full text] [CrossRef] [Medline]
  2. Treweek S, Pitkethly M, Cook J, Fraser C, Mitchell E, Sullivan F, et al. Strategies to improve recruitment to randomised trials. Cochrane Database Syst Rev 2018 Feb 22;2:MR000013 [FREE Full text] [CrossRef] [Medline]
  3. Stein MA, Shaffer M, Echo-Hawk A, Smith J, Stapleton A, Melvin A. Research START: A Multimethod Study of Barriers and Accelerators of Recruiting Research Participants. Clin Transl Sci 2015 Dec;8(6):647-654 [FREE Full text] [CrossRef] [Medline]
  4. Irizarry T, DeVito Dabbs A, Curran CR. Patient Portals and Patient Engagement: A State of the Science Review. J Med Internet Res 2015 Jun 23;17(6):e148 [FREE Full text] [CrossRef] [Medline]
  5. Emani S, Yamin CK, Peters E, Karson AS, Lipsitz SR, Wald JS, et al. Patient perceptions of a personal health record: a test of the diffusion of innovation model. J Med Internet Res 2012;14(6):e150 [FREE Full text] [CrossRef] [Medline]
  6. Powell KR. Patient-Perceived Facilitators of and Barriers to Electronic Portal Use: A Systematic Review. Comput Inform Nurs 2017 Nov;35(11):565-573. [CrossRef] [Medline]
  7. Bush RA, Barlow H, Pérez A, Vazquez B, Mack J, Connelly CD. Internet Access Influences Community Clinic Portal Use. Health Equity 2018;2(1):161-166 [FREE Full text] [CrossRef] [Medline]
  8. Graetz I, Gordon N, Fung V, Hamity C, Reed ME. The Digital Divide and Patient Portals: Internet Access Explained Differences in Patient Portal Use for Secure Messaging by Age, Race, and Income. Med Care 2016 Aug;54(8):772-779. [CrossRef] [Medline]
  9. Peacock S, Reddy A, Leveille SG, Walker J, Payne TH, Oster NV, et al. Patient portals and personal health information online: perception, access, and use by US adults. J Am Med Inform Assoc 2017 Apr 01;24(e1):e173-e177. [CrossRef] [Medline]
  10. Lockwood MB, Dunn-Lopez K, Pauls H, Burke L, Shah SD, Saunders MA. If you build it, they may not come: modifiable barriers to patient portal use among pre- and post-kidney transplant patients. JAMIA Open 2018 Oct;1(2):255-264 [FREE Full text] [CrossRef] [Medline]
  11. Plate JF, Ryan SP, Bergen MA, Hong CS, Attarian DE, Seyler TM. Utilization of an Electronic Patient Portal Following Total Joint Arthroplasty Does Not Decrease Readmissions. J Arthroplasty 2019 Feb;34(2):211-214. [CrossRef] [Medline]
  12. Rounds JA, Merianos AL, Bernard AL. Cardiometabolic risk factors and MyChart enrollment among adult patients. Health Policy and Technology 2017 Sep;6(3):302-308. [CrossRef]
  13. Siegel C, Gill A, Esteghamat N, Tu Y, Rodin MB. Barriers to online patient portal adoption among adult oncology patients. JCO 2017 Nov 01;35(31_suppl):44-44. [CrossRef]
  14. Son H, Nahm E. Older Adults' Experience Using Patient Portals in Communities: Challenges and Opportunities. Comput Inform Nurs 2019 Jan;37(1):4-10. [CrossRef] [Medline]
  15. Wallace LS, Angier H, Huguet N, Gaudino JA, Krist A, Dearing M, et al. Patterns of Electronic Portal Use among Vulnerable Patients in a Nationwide Practice-based Research Network: From the OCHIN Practice-based Research Network (PBRN). J Am Board Fam Med 2016 Oct;29(5):592-603 [FREE Full text] [CrossRef] [Medline]
  16. Woods SS, Forsberg CW, Schwartz EC, Nazi KM, Hibbard JH, Houston TK, et al. The Association of Patient Factors, Digital Access, and Online Behavior on Sustained Patient Portal Use: A Prospective Cohort of Enrolled Users. J Med Internet Res 2017 Oct 17;19(10):e345 [FREE Full text] [CrossRef] [Medline]
  17. Turner K, Hong Y, Yadav S, Huo J, Mainous AG. Patient portal utilization: before and after stage 2 electronic health record meaningful use. J Am Med Inform Assoc 2019 Oct 01;26(10):960-967. [CrossRef] [Medline]
  18. Coorevits P, Sundgren M, Klein GO, Bahr A, Claerhout B, Daniel C, et al. Electronic health records: new opportunities for clinical research. J Intern Med 2013 Dec;274(6):547-560. [CrossRef] [Medline]
  19. Obeid JS, Beskow LM, Rape M, Gouripeddi R, Black RA, Cimino JJ, et al. A survey of practices for the use of electronic health records to support research recruitment. J Clin Transl Sci 2017 Aug;1(4):246-252 [FREE Full text] [CrossRef] [Medline]
  20. Samuels MH, Schuff R, Beninato P, Gorsuch A, Dursch J, Egan S, et al. Effectiveness and cost of recruiting healthy volunteers for clinical research studies using an electronic patient portal: A randomized study. J Clin Transl Sci 2017 Dec;1(6):366-372 [FREE Full text] [CrossRef] [Medline]
  21. Bower JK, Bollinger CE, Foraker RE, Hood DB, Shoben AB, Lai AM. Active Use of Electronic Health Records (EHRs) and Personal Health Records (PHRs) for Epidemiologic Research: Sample Representativeness and Nonresponse Bias in a Study of Women During Pregnancy. EGEMS (Wash DC) 2017;5(1):1263 [FREE Full text] [CrossRef] [Medline]
  22. Leveille SG, Huang A, Tsai SB, Weingart SN, Iezzoni LI. Screening for chronic conditions using a patient internet portal: recruitment for an internet-based primary care intervention. J Gen Intern Med 2008 Apr;23(4):472-475 [FREE Full text] [CrossRef] [Medline]
  23. Miller HN, Gleason KT, Juraschek SP, Plante TB, Lewis-Land C, Woods B, et al. Electronic medical record-based cohort selection and direct-to-patient, targeted recruitment: early efficacy and lessons learned. J Am Med Inform Assoc 2019 Nov 01;26(11):1209-1217 [FREE Full text] [CrossRef] [Medline]
  24. Pfaff E, Lee A, Bradford R, Pae J, Potter C, Blue P, et al. Recruiting for a pragmatic trial using the electronic health record and patient portal: successes and lessons learned. J Am Med Inform Assoc 2019 Jan 01;26(1):44-49 [FREE Full text] [CrossRef] [Medline]
  25. Plante TB, Gleason KT, Miller HN, Charleston J, McArthur K, Himmelfarb CD, STURDY Collaborative Research Group. Recruitment of trial participants through electronic medical record patient portal messaging: A pilot study. Clin Trials 2020 Feb;17(1):30-38 [FREE Full text] [CrossRef] [Medline]
  26. Obeid JS, Shoaibi A, Oates JC, Habrat ML, Hughes-Halbert C, Lenert LA. Research participation preferences as expressed through a patient portal: implications of demographic characteristics. JAMIA Open 2018 Oct;1(2):202-209 [FREE Full text] [CrossRef] [Medline]
  27. Bennett WL, Bramante CT, Rothenberger SD, Kraschnewski JL, Herring SJ, Lent MR, et al. Patient Recruitment Into a Multicenter Clinical Cohort Linking Electronic Health Records From 5 Health Systems: Cross-sectional Analysis. J Med Internet Res 2021 May 27;23(5):e24003 [FREE Full text] [CrossRef] [Medline]
  28. Gleason KT, Ford DE, Gumas D, Woods B, Appel L, Murray P, et al. Development and preliminary evaluation of a patient portal messaging for research recruitment service. J Clin Transl Sci 2018 Feb;2(1):53-56 [FREE Full text] [CrossRef] [Medline]
  29. Anthony DL, Campos-Castillo C, Lim PS. Who Isn't Using Patient Portals And Why? Evidence And Implications From A National Sample Of US Adults. Health Aff (Millwood) 2018 Dec;37(12):1948-1954. [CrossRef] [Medline]
  30. Ancker JS, Barrón Y, Rockoff ML, Hauser D, Pichardo M, Szerencsy A, et al. Use of an electronic patient portal among disadvantaged populations. J Gen Intern Med 2011 Oct;26(10):1117-1123 [FREE Full text] [CrossRef] [Medline]
  31. Arcury TA, Quandt SA, Sandberg JC, Miller DP, Latulipe C, Leng X, et al. Patient Portal Utilization Among Ethnically Diverse Low Income Older Adults: Observational Study. JMIR Med Inform 2017 Nov 14;5(4):e47 [FREE Full text] [CrossRef] [Medline]
  32. Kannan V, Wilkinson KE, Varghese M, Lynch-Medick S, Willett DL, Bosler TA, et al. Count me in: using a patient portal to minimize implicit bias in clinical research recruitment. J Am Med Inform Assoc 2019 Aug 01;26(8-9):703-713 [FREE Full text] [CrossRef] [Medline]
  33. Hussain-Gambles M, Atkin K, Leese B. Why ethnic minority groups are under-represented in clinical trials: a review of the literature. Health Soc Care Community 2004 Sep;12(5):382-388. [CrossRef] [Medline]
  34. Tabriz AA, Fleming PJ, Shin Y, Resnicow K, Jones RM, Flocke SA, et al. Challenges and opportunities using online portals to recruit diverse patients to behavioral trials. J Am Med Inform Assoc 2019 Dec 01;26(12):1637-1644 [FREE Full text] [CrossRef] [Medline]
  35. Ukoha EP, Feinglass J, Yee LM. Disparities in Electronic Patient Portal Use in Prenatal Care: Retrospective Cohort Study. J Med Internet Res 2019 Sep 23;21(9):e14445 [FREE Full text] [CrossRef] [Medline]
  36. Paquin RS, Lewis MA, Harper BA, Moultrie RR, Gwaltney A, Gehtland LM, et al. Outreach to new mothers through direct mail and email: recruitment in the Early Check research study. Clin Transl Sci 2020 Dec 31:880-889. [CrossRef] [Medline]
  37. Guillory J, Jordan A, Paquin RS, Pikowski J, McInnis S, Anakaraonye A, et al. Using Social Media to Conduct Outreach and Recruitment for Expanded Newborn Screening. Front. Commun 2020 May 6;5:1-11. [CrossRef]
  38. Bailey DB, Gehtland LM, Lewis MA, Peay H, Raspa M, Shone SM, et al. Early Check: translational science at the intersection of public health and newborn screening. BMC Pediatr 2019 Jul 17;19(1):238 [FREE Full text] [CrossRef] [Medline]
  39. Early Check.   URL: https://portal.earlycheck.org [accessed 2020-08-01]
  40. Richesson RL, Hammond WE, Nahm M, Wixted D, Simon GE, Robinson JG, et al. Electronic health records based phenotyping in next-generation clinical trials: a perspective from the NIH Health Care Systems Collaboratory. J Am Med Inform Assoc 2013 Dec;20(e2):e226-e231 [FREE Full text] [CrossRef] [Medline]
  41. North Carolina Department of Health and Human Services.   URL: https://schs.dph.ncdhhs.gov/schs/births/babybook/2018/northcarolina.pdf [accessed 2020-10-01]
  42. 2010 Rural-urban commuting area codes (revised 7/3/19). US Department of Agriculture Economic Research Service. 2010.   URL: https://www.ers.usda.gov/data-products/rural-urban-commuting-area-codes.aspx [accessed 2020-10-01]
  43. ZIP code RUCA approximation codes, version 2.0. Rural Health Research Center, University of Washington.   URL: https://depts.washington.edu/uwruca/ruca-download.php [accessed 2020-10-01]
  44. Moss JL, Stinchcomb DG, Yu M. Providing Higher Resolution Indicators of Rurality in the Surveillance, Epidemiology, and End Results (SEER) Database: Implications for Patient Privacy and Research. Cancer Epidemiol Biomarkers Prev 2019 Sep;28(9):1409-1416 [FREE Full text] [CrossRef] [Medline]


CDWH: Carolina Data Warehouse for Health
EHR: electronic health record
NCATS: National Center for Advancing Translational Sciences
RUCA: rural-urban commuting area
UNCH: UNC Health


Edited by S Badawy; submitted 03.06.21; peer-reviewed by A Alishahi, T Payne; comments to author 30.07.21; revised version received 12.08.21; accepted 11.12.21; published 10.02.22

Copyright

©Lisa M Gehtland, Ryan S Paquin, Sara M Andrews, Adam M Lee, Angela Gwaltney, Martin Duparc, Emily R Pfaff, Donald B Bailey Jr. Originally published in JMIR Pediatrics and Parenting (https://pediatrics.jmir.org), 10.02.2022.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Pediatrics and Parenting, is properly cited. The complete bibliographic information, a link to the original publication on https://pediatrics.jmir.org, as well as this copyright and license information must be included.