Objectives. To compare health care utilization and costs between beneficiaries randomly assigned to usual services versus a community health worker (CHW) program implemented by 3 Medicaid health plans.

Methods. From February 2018 to June 2019, beneficiaries residing in Detroit, Michigan’s Cody Rouge neighborhood with more than 3 emergency department (ED) visits or at least 1 ambulatory care‒sensitive hospitalization in the previous 12 months were randomized. CHWs reached out to eligible beneficiaries to assess their needs and link them to services. We compared ED and ambulatory care visits, hospitalizations, and related costs over 12 months.

Results. In intention-to-treat analyses among 2457 beneficiaries, the 1389 randomized to the CHW program had lower adjusted ratios of ED visits (adjusted rate ratio [ARR] = 0.96; P < .01) and ED visit costs (ARR = 0.96; P < .01), but higher adjusted ratios of ambulatory care costs (ARR = 1.15; P < .01) and no differences in inpatient or total costs compared with the usual-care group.

Conclusions. Initial increases in ambulatory care use from effective programs for underserved communities may offset savings from decreased acute care use. Longer-term outcomes should be followed to assess potential cost savings from improved health.

Trial Registration: ClinicalTrials.gov identifier: NCT03924713. (Am J Public Health. 2022;112(5):766–775. https://doi.org/10.2105/AJPH.2021.306700)

Emergency department (ED) visits and hospitalizations because of ambulatory care‒sensitive conditions are important markers of unmet needs and impaired access to health care.1 If people diagnosed with conditions such as asthma, heart failure, and type 2 diabetes have access to high-quality ambulatory care and the resources to effectively manage these conditions, they are less likely to require acute care. However, many low-income urban residents face unmet social needs and barriers to accessing services.2,3 These barriers contribute to high rates of ED visits and hospitalizations, low rates of ambulatory care visits, and poor health outcomes.

A necessary but often insufficient prerequisite for access to outpatient care is health insurance. In 2014, the Healthy Michigan Plan, Michigan’s Medicaid expansion program, extended health insurance to more low-income residents. Yet many Medicaid beneficiaries still struggle with unmet social needs such as food insecurity and face other barriers to managing their health and navigating outpatient health care.4

Community health worker (CHW) programs are one effective approach to provide outreach, support, and linkages to health and social services for individuals facing barriers to care. Trained frontline health workers who share characteristics such as culture, ethnicity, language, and community with those they serve,5 CHWs have improved clinical outcomes among adults with a range of ambulatory care‒sensitive conditions.6–12 In efficacy trials, CHW programs have decreased hospital readmission rates13 and lowered costs.14–18

To date, however, few controlled effectiveness trials have evaluated CHW programs implemented in real-life practice. Moreover, the impact of most CHW initiatives remains limited by their dependence on short-term grants. For any model to be sustainable, payors must be willing to cover the costs of CHW services, or a fee-for-service billing code for CHW services must be established.19 In addition, few studies have examined effects of CHW programs on both acute and ambulatory care. Our study addresses these gaps by evaluating a potentially sustainable CHW program designed in collaboration with 3 Medicaid health plans, the Detroit Health Department, a neighborhood-based community organization, and a university, and implemented as a regular covered program staffed by salaried Medicaid health plan CHWs.

Health plans are well-positioned to address population health needs because most health care spending in the United States flows through them, and health plans typically bear financial risk for their enrollees. Thus, targeting investments to address social, behavioral, and medical needs that contribute to high health care costs can make financial sense for these plans. Since 2016, Medicaid health plans in Michigan have been required to provide CHW services to their beneficiaries, either with CHWs they hire or through contracts with community-based organizations. Thus, Michigan offers an excellent opportunity to evaluate the impact of CHW services on beneficiaries’ health care utilization. Accordingly, we worked with 3 Medicaid health plans to design and evaluate a demonstration project building on their existing CHW services that prioritized beneficiaries with high acute care use living in one low-income urban Detroit, Michigan, community.

We hypothesized that beneficiaries randomized to the program would have decreased acute care utilization (ED visits and ambulatory care‒sensitive hospitalizations), increased ambulatory care (primary care and subspecialty medical) visits, and lower overall costs compared with beneficiaries receiving usual health plan services.

The program was implemented in Cody Rouge, a low-income neighborhood in Detroit, with about 36 000 predominantly Black (81%) residents, strong community organizations, and a federally qualified health center.20 The participating Medicaid health plans also determined that Cody Rouge has among the highest concentrations in Detroit of Medicaid enrollees who overutilize acute care yet underutilize ambulatory care.

Over a 12-month period before initiating the project, we conducted interviews with stakeholders from 10 community health and social services organizations in Cody Rouge to inform program development, implementation, and evaluation. These interviews further helped establish a community advisory committee with representatives from neighborhood organizations to inform program activities. We then partnered with 3 Medicaid health plans, the Detroit Health Department, and the Joy-Southfield Community Development Corporation to design and implement a CHW-led, Cody Rouge‒focused program that incorporated best practices from our previous work and the CHW literature.5–18

Selection, Recruitment, and Randomization

The study protocol is described elsewhere.21 Briefly, from February 2018 to June 2019, on a monthly basis, each plan compiled lists of its members who (1) resided in Cody Rouge zip codes and (2) either had more than 3 ED visits, defined for eligibility as a unique date with an ED visit claim as identified by Current Procedural Terminology version 4 (CPT4)22 and Universal Bill version 4 (UB-04) revenue codes,23 or had at least 1 ambulatory care‒sensitive hospitalization, defined through International Classification of Diseases, Tenth Revision (ICD-10) diagnosis codes,24 in the previous 12 months (Appendix A, available as a supplement to the online version of this article at http://www.ajph.org). These enrollees were randomized by a random number generator to be offered either the CHW program or usual health plan services. To ensure adequate representation of the highest-need members, randomization was stratified so that half of each arm consisted of enrollees with 5 or more ED visits in the previous 12 months and half with fewer than 5.
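The stratified allocation described above can be sketched as follows. This is an illustrative Python version only (the plans ran their own randomization processes); the `id` and `ed_visits` field names are hypothetical.

```python
import random

def randomize_monthly(members, seed=2018):
    """Illustrative stratified 1:1 randomization: members are split into a
    high-use stratum (>= 5 ED visits in the previous 12 months) and a
    lower-use stratum, then shuffled and assigned within each stratum so
    both arms receive similar numbers of highest-need members.
    `members` is a list of dicts with hypothetical 'id' and 'ed_visits' keys."""
    rng = random.Random(seed)
    assignments = {}
    for high_use in (True, False):
        stratum = [m for m in members if (m["ed_visits"] >= 5) == high_use]
        rng.shuffle(stratum)
        half = len(stratum) // 2
        for m in stratum[:half]:
            assignments[m["id"]] = "CHW program"
        for m in stratum[half:]:
            assignments[m["id"]] = "usual services"
    return assignments
```

Randomizing within each stratum, rather than over the pooled list, is what guarantees that roughly half of each arm consists of the highest-use members.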


Each of the health plans assigned their own salaried CHWs to lead the program (2 CHWs at 1 health plan who worked part-time on the program and 1 each at the other 2 plans). Each plan assigned Black CHWs from or familiar with Cody Rouge. The CHWs underwent training by a trainer from the Detroit Health Department (R. G.), under a contract with the Michigan Community Health Worker Alliance. Although each health plan had provided their CHWs with in-house training, the Michigan Community Health Worker Alliance’s core competency‒based training ensured a common set of skills and approaches aligned with the national CHW Core Consensus.25 In response to recommendations from the community advisory committee, 2 “program trainees” who were residents of Cody Rouge were recruited to participate in the training and consult with the CHWs on neighborhood-specific issues.

Each month, each health plan provided its CHWs with the list of members randomized to the program, and the CHWs reached out to those members by phone or in person to offer their services. These lists included more members than the CHWs had time to contact. Members who remained eligible and whom no one had attempted to contact were carried over to the next month’s list, but each month there were still more eligible members than the CHWs had time to try to contact. Participants met with their health plan’s CHW, who was tasked with (1) conducting an initial comprehensive health, behavioral, and social needs assessment; (2) developing an individualized action plan; and (3) linking members to necessary services. The frequency and duration of follow-up support depended on the needs identified, as determined collaboratively by the CHW and the participant.

Each CHW provided services to his or her own health plan’s eligible members. All CHWs, however, followed the same outreach protocol, assessed the same domains in their assessments, and followed similar counseling, action plan, and follow-up protocols. Each CHW completed brief encounter forms to track contacts and log key activities and referrals.

The health plan CHWs met as a group at regular intervals with the trainer (R. G.) to reinforce skills and share best practices and information on Cody Rouge resources. These “reflective consultation” sessions built mutual support among the CHWs, provided opportunities for ongoing training, and encouraged the program trainees to offer their perspectives related to neighborhood-specific social needs and services.

Usual Care

Members randomized to the control arm were eligible for usual services. Each health plan has algorithms for identifying which members meet criteria for outreach (e.g., not completing quality measures).

Data Collection and Outcomes Measures

Data on the primary outcomes—ED visits, ambulatory care‒sensitive and all hospitalizations, and ambulatory care visits—and claims summaries used to compute standardized costs were obtained from health plan limited data sets. The health plans provided the evaluation team individual-level data on billing (CPT4, UB-04) codes, diagnosis (ICD-10) codes, and dates of health care services for a 36-month period from 24 months before to 12 months after the date individuals were randomized. Outcomes were measured for the 12-month postrandomization period, and baseline utilization and costs were measured for the 12-month period immediately preceding randomization. Charlson comorbidities were identified using the “charlson” command in Stata version 16 (StataCorp LP, College Station, TX), and participants were grouped into those having 0, 1, or 2 or more comorbid conditions.26,27

For purposes of study eligibility, ED visits were counted as the number of unique days with an ED claim (CPT4 code: 99281–99285 or UB-04 revenue code: 0450–0452, 0456, 0459, 0981). To achieve a more accurate count of unique ED visits for comparing outcomes and baseline utilization, we counted ED claims within 3 days of one another as a single visit28 and compared the number of visits (not days) between groups. Ambulatory care‒sensitive hospitalizations were identified from inpatient stays with an ambulatory care‒sensitive condition as the primary diagnosis.
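The 3-day collapsing rule can be sketched as follows. This is an illustrative Python version (the study's claims processing used Stata and R); it chains claims, folding each claim within 3 days of the previous one into the same visit, which is one reasonable reading of the rule.

```python
from datetime import date

def count_ed_visits(claim_dates, gap_days=3):
    """Count unique ED visits from a list of claim dates, treating any
    claim within `gap_days` of the previous claim as part of the same
    visit (claims are chained, so a run of near-daily claims still
    counts as one visit)."""
    visits = 0
    previous = None
    for d in sorted(claim_dates):
        if previous is None or (d - previous).days > gap_days:
            visits += 1  # a new visit starts after a gap of > gap_days
        previous = d
    return visits

# Three claims: Jan 1 and Jan 2 collapse into one visit; Jan 10 is a new one.
count_ed_visits([date(2018, 1, 1), date(2018, 1, 2), date(2018, 1, 10)])  # -> 2
```

Note the contrast with the eligibility screen, which counted unique claim days rather than collapsed visits.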

Sample Size and Power Calculations

We calculated the required sample size by using the following assumptions. Among high utilizers, we anticipated a mean of 2 ED visits per beneficiary-year in the usual health plan services arm. We considered a reduction in mean ED visits of 0.65 per year to be clinically meaningful. Thus, we required 125 participants per arm to provide 80% power to detect this difference with a 0.05-level 2-sided test, assuming 0.01 within-CHW correlation. For hospitalizations, we expected 1.5 hospitalizations per beneficiary in the control arm; the proposed sample size provided 80% power to detect a difference in the number of hospitalizations of 0.37 between the 2 groups.


We first compared the 2 study arms to check for balance in age; gender; prevalence of baseline comorbid conditions; Medicaid eligibility through the Healthy Michigan Plan; previous-year rates of ED visits, office visits, hospitalizations, and ambulatory care‒sensitive hospitalizations; and previous-year standardized costs, both in total and separated into ED visits, non-ED outpatient care, and inpatient hospitalizations. Because of privacy concerns, the health plans did not provide race or income data. To be eligible for the Healthy Michigan Medicaid plan, however, household income had to be 133% or less of the federal poverty level29 (e.g., $16 000 for a single person in 2018). We compared continuous variables by using the 2-sample t test, with cost variables compared on the log($cost + $1) scale. We used the χ2 test to compare categorical variables, and univariate Poisson regression for rate variables, with an offset for a participant’s (log) months enrolled during the 12-month baseline period.

In intention-to-treat analyses comparing numbers of ED visits and hospitalizations between the 2 study groups over the 12-month period after randomization, we used separate quasi-Poisson regression models, each with a CHW group indicator as the primary predictor and adjusting for age, gender, and rate of utilization in the previous year. To account for within-plan clustering, we fit the quasi-Poisson model using generalized estimating equations (GEEs).30 We estimated adjusted rate ratios (ARRs) with their 95% confidence intervals (CIs) based on the models and reported them as a summary measure of comparison. We used similar approaches to examine differences in ambulatory care visits between groups. As a planned secondary “as-treated” analysis, we compared differences in outcomes between health plan members randomized to the CHW program whom the CHWs recorded in their disposition logs as “partially engaged” or “fully engaged,” or for whom the CHWs had provided resources or taken any action on their behalf. We repeated all analyses described previously adjusting for this “active” treatment status and compared the active treatment group to both controls and inactive treatment members. While these comparisons unavoidably conflate treatment and selection effects, they provide a useful upper bound on the likely treatment effect among active participants.

Standardized costs for professional billing for outpatient services were taken from Medicare national average payment amounts by CPT4 code using Medicare public-use files.31 Costs for services for which these estimates were unavailable were imputed using the average cost of all CPT4 codes sharing the first 4, 3, or 2 digits, with more digits preferred when applicable. Costs for ED visits included any associated professional billing and an estimated per-visit facility charge of $1118 based on the national average.32 Costs for inpatient hospitalizations were based on diagnosis-related groups and length of stay using an imputation model derived from the 2017 State Inpatient Database for Michigan assembled by the Healthcare Cost and Utilization Project33 (Appendix B, available as a supplement to the online version of this article at http://www.ajph.org). All costs were Winsorized to the 98th percentile to reduce the influence of potential outliers.34
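The prefix-fallback imputation and the Winsorization step can be sketched as follows. This is an illustrative Python version under stated assumptions: the `known_costs` dict stands in for the Medicare public-use files, and the nearest-rank percentile convention is our assumption, not necessarily the one used in the original analysis.

```python
def impute_cpt_cost(cpt, known_costs):
    """Fallback cost imputation in the spirit of the paper's rule: if a
    CPT4 code has no average payment on file, use the mean cost of codes
    sharing the longest available prefix (4, then 3, then 2 characters).
    `known_costs` is a hypothetical dict mapping CPT4 code -> cost."""
    if cpt in known_costs:
        return known_costs[cpt]
    for k in (4, 3, 2):  # prefer longer (more specific) prefixes
        matches = [c for code, c in known_costs.items() if code[:k] == cpt[:k]]
        if matches:
            return sum(matches) / len(matches)
    return None

def winsorize_costs(values, pct=0.98):
    """Cap values at the given percentile (nearest-rank convention, an
    assumption here) to reduce the influence of outliers."""
    ranked = sorted(values)
    cap = ranked[int(round(pct * (len(ranked) - 1)))]
    return [min(v, cap) for v in values]
```

For example, a missing code "99285" would be imputed from other codes starting with "9928" before falling back to the broader "992" or "99" families.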

We compared ED, outpatient, and total costs (plus $1) between groups using γ-family GEE models with log link and otherwise identical to those described previously. Most patients had zero hospitalizations, so they had no inpatient costs. We thus modeled inpatient costs using a 2-part model with a binomial GEE for whether a participant was hospitalized and a γ-family GEE for inpatient costs conditional on hospitalization(s). We summarized these models using the average ARR of expected inpatient cost (probability of hospitalization multiplied by expected cost). We estimated the standard error of this average ARR for inpatient costs using 200 bootstrap replications, with replication stratified by health plan and intervention arm. Data were cleaned and organized using Stata version 16 with statistical analyses done in R version 4.0.2 and 4.1.1 using the “geepack”35 library for GEE models.
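The two-part summary for inpatient costs can be sketched as follows. This is an illustrative pure-Python reduction of the idea (the actual models were γ-family and binomial GEEs fit in R); the function and argument names are ours.

```python
def expected_inpatient_cost(p_hospitalized, mean_cost_if_hospitalized):
    """Two-part model summary: expected inpatient cost is the probability
    of any hospitalization times the mean cost conditional on at least one."""
    return p_hospitalized * mean_cost_if_hospitalized

def average_cost_ratio(intervention, control):
    """Ratio of group-average expected inpatient costs, the kind of summary
    the paper reports as an average adjusted rate ratio. Each argument is a
    list of (p_hospitalized, mean_cost_if_hospitalized) pairs, one per member."""
    mean_expected = lambda group: sum(p * c for p, c in group) / len(group)
    return mean_expected(intervention) / mean_expected(control)
```

Splitting the model this way handles the mass of zero-cost members cleanly: the binomial part absorbs the zeros, and the γ part only ever sees positive costs.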

There were no missing values for baseline variables. Some randomized participants left their health plans before the 12-month end point and were censored when they disenrolled. To account for this censoring, our generalized linear models included offsets for the (log) number of days of follow-up divided by 365. These “days-covered” values represent the number of postrandomization days for which we have data on each enrollee’s health care utilization.
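The censoring adjustment above amounts to measuring utilization per unit of observed exposure rather than per enrollee; a minimal sketch, with hypothetical function names:

```python
import math

def person_years(days_covered):
    """Convert postrandomization days of available data into person-years."""
    return days_covered / 365

def rate_per_person_year(events, days_covered):
    """Censoring-adjusted utilization rate: events per person-year of
    observed follow-up rather than per enrollee."""
    return events / person_years(days_covered)

# In a log-link (Poisson-type) model the same adjustment enters as an
# offset: log E[events] = X*beta + log(days_covered / 365), so a member
# followed for only half a year contributes half a year of exposure.
def expected_events(linear_predictor, days_covered):
    return math.exp(linear_predictor + math.log(person_years(days_covered)))
```

A member with 6 ED visits over 2 full years of data and a member with 3 visits over 1 year thus contribute the same rate, which is the intended behavior of the offset.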

The CONSORT diagram (Figure 1) shows participant flow. The CHWs attempted to contact 1090 (61%) of 1782 beneficiaries randomized to the program. A total of 284 beneficiaries (16%) had at least some recorded engagement in the program. Participants’ baseline characteristics are reported in Table 1. We had outcome data for at least 1 month of follow-up on 1068 eligible controls (78%) and 1389 eligible participants (78%). Groups were balanced in terms of mean age (29.4 vs 29.9 years) and gender (36.1% vs 36.6% male). Relative to the national Medicaid population younger than 65 years, our cohort was less likely to be younger than 21 years (25.7% vs 58.1%) and more likely to be aged 21 to 26 (16.4% vs 7.5%), 27 to 45 (28.7% vs 20.4%) or 46 to 65 years (19.4% vs 15.1%).36


TABLE 1— Baseline Characteristics of Medicaid Enrollees in the 3 Detroit, Michigan, Medicaid Health Plans Randomized From February 2018 to June 2019


Values are % (No.) or mean (IQR).

Characteristic | Active (n = 284) | Inactive (n = 1105) | Control (n = 1068) | Intervention vs Control, P
Age, y | 31.5 (21.0–44.2) | 28.9 (20.0–39.0) | 29.9 (20.0–41.0) | .46
 < 21 | 23.9 (68) | 26.5 (293) | 25.4 (271) | .77
 21–26 | 15.1 (43) | 16.2 (179) | 17.0 (182) |
 27–44 | 37.3 (106) | 39.5 (437) | 37.6 (402) |
 45–65 | 23.6 (67) | 17.7 (196) | 19.9 (213) |
Male | 34.5 (98) | 37.2 (411) | 36.1 (386) | .83
Charlson comorbidities | | | | .92
 0 | 34.5 (98) | 37.9 (419) | 37.4 (399) |
 1 | 25.7 (73) | 29.1 (321) | 28.9 (309) |
 ≥ 2 | 39.8 (113) | 33.0 (365) | 33.7 (360) |
Healthy Michigan | 35.6 (101) | 31.3 (346) | 29.3 (313) | .14
Office visits per person-year, previous 12 mo | 6.4 (1.0–10.0) | 5.6 (1.0–8.0) | 5.9 (1.0–8.0) | .29
ED visits per person-year, previous 12 moa | 4.4 (3.0–5.0) | 4.7 (3.0–5.0) | 4.7 (3.0–5.0) | .37
% with hospitalization, previous 12 mo | 29.6 (84) | 28.9 (319) | 29.3 (313) | .91
% with ambulatory care‒sensitive hospitalizations, previous 12 mo | 10.6 (30) | 9.2 (102) | 9.2 (98) | .84
ED costs per person-year, $ | 5 757 (3 703–6 528) | 6 108 (3 830–7 068) | 6 140 (3 875–6 953) | .97
Outpatient costs per person-year, excluding ED, $ | 4 902 (1 062–4 850) | 4 836 (1 058–4 650) | 4 898 (1 115–5 232) | .28
Inpatient costs per person-year, $ | 14 109 (0–10 714) | 14 190 (0–10 519) | 12 749 (0–10 698) | .88
Total costs per person-year, $ | 24 767 (5 771–19 971) | 25 134 (6 040–20 727) | 23 787 (6 135–21 351) | .33

Note. ED = emergency department; IQR = interquartile range.

aOccurring > 3 d after last ED visit.

Program Engagement

Among the 284 participants for whom CHWs logged engagement in the program, the mean number of days between randomization and the first contact attempt was 76.4 (SD = 90.2) and the median was 43.5 (interquartile range = 25.0‒80.3). Across these participants, 23.1% of the follow-up period preceded the first contact attempt. The 12-month postrandomization follow-up period therefore includes approximately 2.8 preintervention months and 9.2 intervention months, which can be expected to attenuate the estimated intervention effect. Of participants with at least some engagement in the program, 55.3% (157/284) had only 1 recorded interaction with their CHW. On average, CHWs recorded 1.9 (95% CI = 1.7, 2.1) contacts with engaged participants. A majority (59.2%; 168/284) were referred to at least 1 community resource, with an average of 0.7 (95% CI = 0.6, 0.8) referrals per engaged participant.


Table 2 shows unadjusted outcomes at 12 months. Of “active” participants, 77.5% had 360 or more days of follow-up compared with 72.2% of the inactive group and 71.4% of the control group; the difference was not statistically significant (P = .13). In the fully adjusted intention-to-treat analyses (Table 3), enrollees randomized to the CHW program on average had fewer ED visits than control patients (ARR = 0.96; 95% CI = 0.94, 0.98; P < .01) over the 12-month follow-up period. There were no significant differences in average ambulatory care‒sensitive or overall hospitalizations, or ambulatory care visits. Those randomized to the CHW program had significantly lower ED visit costs (ARR = 0.96; 95% CI = 0.94, 0.98; P < .001) but higher ambulatory care visit costs (ARR = 1.06; 95% CI = 1.00, 1.11; P < .05), with no significant between-group differences in inpatient or total costs.


TABLE 2— Unadjusted Outcomes for Medicaid Enrollees in the 3 Detroit, Michigan, Medicaid Health Plans Randomized From February 2018 to June 2019


Values are % (No.) or meana (IQR).

Characteristic | Active (n = 284) | Inactive (n = 1105) | Control (n = 1068)
Days covered, 12 mo after randomizationb | 331.3 (365–365) | 324.7 (335–365) | 323.0 (335–365)
 < 330 | 15.5 (44) | 21.5 (238) | 21.8 (233)
 330–359 | 7.0 (20) | 6.2 (69) | 6.7 (72)
 360–366 | 77.5 (220) | 72.2 (798) | 71.4 (763)
Office visits per person-year | 6.4 (1–8) | 4.8 (1–6) | 5.3 (1–7)
ED visits per person-year (3-d gap) | 2.8 (1–4) | 3.0 (1–4) | 3.1 (1–4)
% with hospitalization | 19.4 (55) | 16.6 (183) | 19.1 (204)
Hospitalizations per person-year | 0.34 (0–0) | 0.36 (0–0) | 0.36 (0–0)
% with ambulatory care–sensitive hospitalizations | 6.0 (17) | 4.0 (44) | 4.6 (49)
Ambulatory care–sensitive hospitalizations per person-year | 0.08 (0–0) | 0.09 (0–0) | 0.07 (0–0)
ED costs per person-year,c $ | 3 465 (1 187–5 033) | 3 862 (1 176–5 346) | 3 971 (1 205–5 485)
Outpatient costs per person-year,d $ | 3 989 (503–4 764) | 3 442 (453–3 894) | 3 516 (467–4 061)
Inpatient costs per person-year, $ | 7 325 (0–0) | 7 156 (0–0) | 7 196 (0–0)
Total costs per person-year,e $ | 15 815 (1 947–15 209) | 15 608 (1 958–12 741) | 15 495 (1 931–14 540)

Note. ED = emergency department; IQR = interquartile range.

aMeans are weighted by length of follow-up (e.g., for ED visits, the mean here is total ED visits for each group divided by total years of follow-up for that group).

bDays covered equal the number of days in the 12 months following randomization for which we have participant data.

cED costs include an estimated per-visit facility charge of $1118.

dOutpatient costs exclude ED costs.

eTotal cost is the sum of ED, outpatient, and inpatient costs and includes the ED facility charge.


TABLE 3— Results of Adjusted Analyses for all Outcomes of Medicaid Enrollees in the 3 Detroit, Michigan, Medicaid Health Plans Randomized From February 2018 to June 2019


Outcome | Intention-to-Treat: Intervention vs Control | As Treated: Inactive vs Control | As Treated: Active vs Control
ED visits, AIRR (95% CI) | 0.96 (0.94, 0.98) | 0.97 (0.93, 1.02) | 0.91 (0.86, 0.96)
Office visits, AIRR (95% CI) | 1.01 (0.98, 1.03) | 0.97 (0.93, 1.02) | 0.94 (0.80, 1.09)
Hospitalizations, AIRR (95% CI) | 0.97 (0.77, 1.24) | 0.98 (0.75, 1.29) | 0.94 (0.80, 1.09)
Ambulatory care‒sensitive hospitalizations, AIRR (95% CI) | 1.21 (0.69, 2.14) | 1.22 (0.64, 2.36) | 1.17 (0.74, 1.82)
ED costs,a ARR (95% CI) | 0.96 (0.94, 0.98) | 0.98 (0.95, 1.00) | 0.90 (0.79, 1.02)
Outpatient costs,b ARR (95% CI) | 1.06 (1.00, 1.11) | 1.04 (0.98, 1.10) | 1.14 (1.08, 1.21)
Inpatient costs,c ARR (95% CI) | 0.83 (0.52, 1.14) | 0.82 (0.56, 1.08) | 0.91 (0.56, 1.27)
Total costs,d ARR (95% CI) | 0.97 (0.90, 1.05) | 0.97 (0.91, 1.02) | 0.98 (0.83, 1.17)

Note. AIRR = adjusted incident rate ratio; ARR = adjusted rate ratio; CI = confidence interval; ED = emergency department. All models adjusted for age, gender, Charlson Comorbidity Index, and the corresponding utilization or cost measure (e.g., ED visits) from the 12 mo before randomization.

aED costs include a $1118 facility charge for each ED visit, based on the national average reported here: https://consumerhealthratings.com/healthcare_category/emergency-room-typical-average-cost-of-hospital-ed-visit.

bOutpatient costs exclude ED costs.

cInpatient costs are compared using an average ARR.

dTotal costs = (outpatient + ED + inpatient).

Enrollees with some reported engagement with a CHW (Table 3) also had higher ambulatory outpatient costs (ARR = 1.14; 95% CI = 1.08, 1.21; P < .001) and fewer ED visits (ARR = 0.91; 95% CI = 0.86, 0.96; P < .01) relative to the control group. There were no significant differences in average numbers of hospitalizations, inpatient costs, or total costs. (Appendix C, available as a supplement to the online version of this article at http://www.ajph.org, shows adjusted models with all variables.)

In this Medicaid health plan CHW-led demonstration program, only 16% of plan beneficiaries randomized to the CHW program had any recorded engagement in it. Even so, in intention-to-treat analyses, those randomized to the program had fewer ED visits and greater ambulatory care use at 12-month follow-up than beneficiaries randomized to usual care. Because of the greater ambulatory care costs, the lower rates of ED visits did not translate into a decrease in total costs.

This study contributes to the literature on CHW program effects on health care utilization in 3 key ways. First, we evaluated a real-world CHW program using a highly rigorous methodology. Many CHW programs have been implemented successfully in the United States37 and globally.38 Evaluations of these programs often have not employed rigorous methodologies such as random assignment, comparison groups, intention-to-treat samples, and 12-month follow-up, in part because of resource and time constraints and the need for rapid within-program feedback. At the opposite end of the spectrum are programs that are efficacious in randomized controlled trials conducted under well-controlled conditions but may not be effective when implemented under real-life conditions.39,40 Studies can increase engagement and limit attrition by paying participants and employing research staff to facilitate enrollment and follow-up. Because these tools are unavailable to programs in the real world, the leap from research to practice can expose a promising intervention to problems that attenuate its effects.

The current study thus took place at a unique intersection of real-world CHW programming and methodological rigor, allowing for exploration of important implementation factors. Although the current program was effective in reducing ED visits and increasing ambulatory resource use relative to the control group, CHWs reported engaging with less than 20% of eligible beneficiaries. We explored this challenge through an ancillary qualitative interview study41 that found that barriers to successful outreach included the CHWs’ schedules (not working evenings or weekends), out-of-date enrollee phone numbers and addresses, and concerns among would-be recipients that the CHWs were affiliated with child protective services or other enforcement agencies. These barriers would not have been identified in a clinical randomized controlled trial in which participants are recruited and consented. Nor would the ultimate effectiveness of the intervention under real-life conditions—in spite of these barriers—have been established.

Second, while prior studies have found CHW programs to reduce ED visits and hospitalizations,13–18 few studies have examined both acute and ambulatory care use and costs. In many underserved populations such as ours, high acute care use is often combined with little or no ambulatory care use. Thus, as in our study, a successful program may in the short term increase use of ambulatory care enough to offset cost savings from decreased acute care utilization. More than 12 months of follow-up data are necessary to assess longer-term patterns of health care use and potential cost savings from improved health.

Third, this study illustrates an important cross-sectoral model of partnership among a university, 3 Medicaid health plans, a city health department, and a local community organization in which the partners worked cooperatively to implement and evaluate a potentially sustainable demonstration program with already employed plan CHWs that incorporated best practices and prioritized a high-need urban neighborhood.


This study should be understood in the context of some limitations. First, because this was a nonregulated program evaluation, we had no direct contact with participants and, thus, were not able to examine patient-centered outcomes. Second, we lacked data on participant characteristics such as race/ethnicity, and the study lacked power to conduct subgroup analyses to determine whether results varied by participants’ characteristics. Third, the program was implemented by 3 separate health plans with differing histories, practices, and preexisting CHW services. The study team worked with the CHWs and their supervisors before the program’s launch to establish standard operating guidelines with respect to practices such as number of outreach phone calls and home visits, assessment domains, and “action plan” approaches. However, each health plan has its own culture and workflow. These may have introduced subtle differences in intervention delivery. Finally, privacy concerns made it infeasible to audio-record meetings or otherwise introduce fidelity or uniformity checks for CHW counseling or actions.

Notwithstanding these limitations, this evaluation represents one of the first efforts to examine the effects on health care use and costs of a real-world CHW demonstration program conducted by Medicaid health plans with their own salaried CHWs focused on beneficiaries in a specific urban neighborhood. Our findings can inform other programs and public policy on sustainable financing of CHW services.

Public Health Implications

Our study suggests that even with outreach barriers and low rates of engagement, significant positive outcomes are possible. Those hoping to implement real-world CHW programs are encouraged to build on the lessons learned in our demonstration project. First, a useful strategy may be for health plans and systems to contract with CHWs employed by community-based organizations that are trusted, have close linkages to the specific communities they serve, and are able to field a range of flexible outreach and engagement strategies. It is encouraging that recently Michigan’s Medicaid program has incentivized Medicaid Health Plans to contract for CHW services with such community-based organizations.

Second, increasing use of ambulatory care—thereby leading to a short-term increase in costs despite decreased acute care utilization—should be considered a marker of success for programs seeking to benefit underserved communities. It will be necessary to follow outcomes over a longer period than 12 months to assess potential cost savings from the improved health that can flow from increased ambulatory care utilization. Third, efforts to reach beyond the health care system to improve health require multisectoral partnerships such as this study’s partnership. Fourth, as the State of Michigan has supported a financially sustainable model of CHW programming through its Medicaid health plans, other state Medicaid programs should test this and other models to provide sustained funding for evidence-based CHW programs.

See also Rodriguez, p. 697.


This project was supported by the Blue Cross Blue Shield of Michigan Foundation, the Michigan Department of Health and Human Services, the Ralph C. Wilson Foundation, Poverty Solutions at the University of Michigan, and the National Institute of Diabetes and Digestive and Kidney Diseases (grant P30DK092926).

We thank the 5 incredible community health workers who implemented the evaluated program. We thank all the organizational partners who participated in the design and execution of the model community health worker program evaluated in this study. These include not only the 3 Medicaid health plans, the Detroit Health Department, and the Joy-Southfield Community Development Corporation but also the Cody Rouge community organizations that participated in the advisory council.

Note. The funding sources had no role in the study design; data collection; administration of the interventions; analysis, interpretation, or reporting of data; or decision to submit the findings for publication.


The authors have no conflicts of interest or financial disclosures to report.


As this was an evaluation of a program offered by Medicaid health plans, with each plan conducting its own randomization processes, the University of Michigan institutional review board deemed the outcomes analysis nonregulated.


1. Brown AD, Goldacre MJ, Hicks N, et al. Hospitalization for ambulatory care-sensitive conditions: a method for comparative access and quality studies using routinely collected statistics. Can J Public Health. 2001;92(2):155–159. https://doi.org/10.1007/BF03404951
2. Butkus R, Rapp K, Cooney TG, Engel LS; Health and Public Policy Committee of the American College of Physicians. Envisioning a better US health care system for all: reducing barriers to care and addressing social determinants of health. Ann Intern Med. 2020;172(2 suppl):S50–S59. https://doi.org/10.7326/M19-2410
3. Patel MR, Piette JD, Resnicow K, Kowalski-Dobson T, Heisler M. Social determinants of health, cost-related nonadherence, and cost-reducing behaviors among adults with diabetes: findings from the National Health Interview Survey. Med Care. 2016;54(8):796–803. https://doi.org/10.1097/MLR.0000000000000565
4. Rosland AM, Kieffer EC, Tipirneni R, et al. Diagnosis and care of chronic health conditions among Medicaid expansion enrollees: a mixed-methods observational study. J Gen Intern Med. 2019;34(11):2549–2558. https://doi.org/10.1007/s11606-019-05323-w
5. American Public Health Association. Support for community health workers to increase health access and to reduce health inequities, APHA Policy No 20091. Available at: https://www.apha.org/policies-and-advocacy/public-health-policy-statements/policy-database/2014/07/09/14/19/support-for-community-health-workers-to-increase-health-access-and-to-reduce-health-inequities. Accessed February 13, 2022.
6. Palmas W, March D, Darakjy S, et al. Community health worker interventions to improve glycemic control in people with diabetes: a systematic review and meta-analysis. J Gen Intern Med. 2015;30(7):1004–1012. https://doi.org/10.1007/s11606-015-3247-0
7. Spencer MS, Kieffer EC, Sinco B, et al. Outcomes at 18 months from a community health worker and peer leader diabetes self-management program for Latino adults. Diabetes Care. 2018;41(7):1414–1422. https://doi.org/10.2337/dc17-0978
8. Brownstein JN, Bone LR, Dennison CR, Hill MN, Kim MT, Levine DM. Community health workers as interventionists in the prevention and control of heart disease and stroke. Am J Prev Med. 2005;29(5 suppl 1):128–133. https://doi.org/10.1016/j.amepre.2005.07.024
9. Roland KB, Milliken EL, Rohan EA, et al. Use of community health workers and patient navigators to improve cancer outcomes among patients served by federally qualified health centers: a systematic literature review. Health Equity. 2017;1(1):61–76. https://doi.org/10.1089/heq.2017.0001
10. Kangovi S, Mitra N, Grande D, Huo H, Smith RA, Long JA. Community health worker support for disadvantaged patients with multiple chronic diseases: a randomized clinical trial. Am J Public Health. 2017;107(10):1660–1667. https://doi.org/10.2105/AJPH.2017.303985
11. Kangovi S, Mitra N, Norton L, et al. Effect of community health worker support on clinical outcomes of low-income patients across primary care facilities: a randomized clinical trial. JAMA Intern Med. 2018;178(12):1635–1643. https://doi.org/10.1001/jamainternmed.2018.4630
12. Kim K, Choi JS, Choi E, et al. Effects of community-based health worker interventions to improve chronic disease management and care among vulnerable populations: a systematic review. Am J Public Health. 2016;106(4):e3–e28. https://doi.org/10.2105/AJPH.2015.302987
13. Cardarelli R, Horsley M, Ray L, et al. Reducing 30-day readmission rates in a high-risk population using a lay-health worker model in Appalachia Kentucky. Health Educ Res. 2018;33(1):73–80. https://doi.org/10.1093/her/cyx064
14. Galbraith AA, Meyers DJ, Ross-Degnan D, et al. Long-term impact of a postdischarge community health worker intervention on health care costs in a safety-net system. Health Serv Res. 2017;52(6):2061–2078. https://doi.org/10.1111/1475-6773.12790
15. Jack HE, Arabadjis SD, Sun L, Sullivan EE, Phillips RS. Impact of community health workers on use of healthcare services in the United States: a systematic review. J Gen Intern Med. 2017;32(3):325–344. https://doi.org/10.1007/s11606-016-3922-9
16. Cross-Barnet C, Ruiz S, Skillman M, et al. Higher quality at lower cost: community health worker interventions in the health care innovation awards. J Health Dispar Res Pract. 2011;11(2):150–164.
17. Viswanathan M, Kraschnewski JL, Nishikawa B, et al. Outcomes and costs of community health worker interventions: a systematic review. Med Care. 2010;48(9):792–808. https://doi.org/10.1097/MLR.0b013e3181e35b51
18. Vasan A, Morgan JW, Mitra N, et al. Effects of a standardized community health worker intervention on hospitalization among disadvantaged patients with multiple chronic conditions: a pooled analysis of three clinical trials. Health Serv Res. 2020;55(suppl 2):894–901. https://doi.org/10.1111/1475-6773.13321
19. Lapidos A, Lapedis J, Heisler M. Realizing the value of community health workers—new opportunities for sustainable financing. N Engl J Med. 2019;380(21):1990–1992. https://doi.org/10.1056/NEJMp1815382
20. City of Detroit. West Design Region. Available at: https://detroitmi.gov/departments/planning-and-development-department/neighborhood-plans#West-Design-Region. Accessed March 29, 2021.
21. Heisler M, Lapidos A, Henderson J, et al. Study protocol for a community health worker (CHW)-led comprehensive neighborhood-focused program for Medicaid enrollees in Detroit. Contemp Clin Trials Commun. 2019;16:100456. https://doi.org/10.1016/j.conctc.2019.100456
22. Agency for Healthcare Research and Quality. United States Health Information Knowledgebase. 2008. Available at: https://ushik.ahrq.gov/ViewItemDetails?system=mu&itemKey=86651000. Accessed February 13, 2022.
23. Find-a-Code. UB04/CMS1450 Revenue Codes. 2018. Available at: https://www.findacode.com/ub04-revenue/ub04-revenue-cms-1450-codes-01-group.html. Accessed February 13, 2022.
24. Michigan Department of Health and Human Services. Ambulatory care sensitive (ACS) conditions ICD-9-CM and ICD-10-CM groupings. 2021. Available at: http://www.mdch.state.mi.us/pha/osr/CHI/HOSP/icd9cm1.htm. Accessed February 13, 2022.
25. Community Health Worker Core Consensus Project. 2021. Available at: https://www.c3project.org. Accessed February 13, 2022.
26. Quan H, Sundararajan V, Halfon P, et al. Coding algorithms for defining comorbidities in ICD-9-CM and ICD-10 administrative data. Med Care. 2005;43(11):1130–1139. https://doi.org/10.1097/01.mlr.0000182534.19832.83
27. Stagg V. Charlson: Stata module to calculate Charlson index of comorbidity, Statistical Software Components S456719. 2006. Available at: https://econpapers.repec.org/software/bocbocode/s456719.htm. Accessed February 13, 2022.
28. Venkatesh AK, Mei H, Kocher KE, et al. Identification of emergency department visits in Medicare administrative claims: approaches and implications. Acad Emerg Med. 2017;24(4):422–431. https://doi.org/10.1111/acem.13140
29. Healthy Michigan Plan. Who is eligible. 2022. Available at: https://www.michigan.gov/healthymiplan/0,5668,7-326-67874,00.html. Accessed February 13, 2022.
30. Liang KY, Zeger SL. Longitudinal data analysis using generalized linear models. Biometrika. 1986;73(1):13–22. https://doi.org/10.1093/biomet/73.1.13
31. US Department of Health and Human Services. Guidance portal. Physician and other supplier data CY 2017. Available at: https://www.hhs.gov/guidance/document/physician-and-other-supplier-data-cy-2017-0. Accessed February 13, 2022.
32. Consumer Health Ratings. Emergency room—typical average cost of hospital ED visit. Available at: https://consumerhealthratings.com/healthcare_category/emergency-room-typical-average-cost-of-hospital-ed-visit. Accessed February 13, 2022.
33. Agency for Healthcare Research and Quality. Overview of the State Inpatient Databases (SID). Available at: https://www.hcup-us.ahrq.gov/sidoverview.jsp. Accessed February 13, 2022.
34. Tukey J. The future of data analysis. Ann Math Stat. 1962;33(1):1–67. https://doi.org/10.1214/aoms/1177704711
35. Højsgaard S, Halekoh U, Yan J. The R package geepack for generalized estimating equations. J Stat Softw. 2006;15(2):1–11. https://doi.org/10.18637/jss.v015.i02
36. Centers for Medicare & Medicaid Services. Medicaid and CHIP beneficiaries at a glance. February 2020. Available at: https://www.medicaid.gov/medicaid/quality-of-care/downloads/beneficiary-ataglance.pdf. Accessed October 28, 2021.
37. London K, Tikkanen R. Sustainable financing models for community health worker services in Connecticut: translating science into practice. Commonwealth Medicine Publications. 2017. Available at: https://escholarship.umassmed.edu/commed_pubs/32. Accessed February 13, 2022.
38. Vaughan K, Kok MS, Witter S, Dieleman M. Costs and cost-effectiveness of community health workers: evidence from a literature review. Hum Resour Health. 2015;13(1):71. https://doi.org/10.1186/s12960-015-0070-y
39. Heisler M, Hofer TP, Schmittdiel JA, et al. Improving blood pressure control through a clinical pharmacist outreach program in patients with diabetes mellitus in 2 high-performing health systems: the adherence and intensification of medications cluster randomized, controlled pragmatic trial. Circulation. 2012;125(23):2863–2872. https://doi.org/10.1161/CIRCULATIONAHA.111.089169
40. Damschroder LJ, Lowery JC. Evaluation of a large-scale weight management program using the Consolidated Framework for Implementation Research (CFIR). Implement Sci. 2013;8(1):51. https://doi.org/10.1186/1748-5908-8-51
41. Lapidos A, Kieffer E, Guzman R, Hess K, Flanders T, Heisler M. Barriers and facilitators to community health worker outreach and engagement in Detroit, Michigan: a qualitative study. Health Promot Pract. 2021; epub ahead of print. https://doi.org/10.1177/15248399211031818






Michele Heisler, MD, MPA, Adrienne Lapidos, PhD, Edith Kieffer, MPH, PhD, James Henderson, PhD, Rebeca Guzman, BA, Jasmina Cunmulaj, BA, Jason Wolfe, MPP, Trish Meyer, EdM, and John Z. Ayanian, MD, MPP. Michele Heisler is with the University of Michigan Medical School, Ann Arbor. Adrienne Lapidos is with the University of Michigan Department of Psychiatry, Ann Arbor. Edith Kieffer is with the University of Michigan School of Social Work, Ann Arbor. James Henderson is with the University of Michigan Consulting for Statistics, Computing and Analytics Research, Ann Arbor. Rebeca Guzman is with the Detroit Health Department, Detroit, MI. Jasmina Cunmulaj is with the University of Michigan School of Public Health, Ann Arbor. Jason Wolfe, Trish Meyer, and John Z. Ayanian are with the University of Michigan Institute for Healthcare Policy and Innovation, Ann Arbor. “Impact on Health Care Utilization and Costs of a Medicaid Community Health Worker Program in Detroit, 2018–2020: A Randomized Program Evaluation.” American Journal of Public Health 112, no. 5 (May 1, 2022): pp. 766–775.


PMID: 35324259