Objectives. We have described the practice of designing for dissemination among researchers in the United States with the intent of identifying gaps and areas for improvement.

Methods. In 2012, we conducted a cross-sectional study of 266 researchers using a search of the top 12 public health journals in PubMed and lists available from government-sponsored research. The sample involved scientists at universities, the National Institutes of Health, and the Centers for Disease Control and Prevention in the United States.

Results. In the pooled sample, 73% of respondents estimated they spent less than 10% of their time on dissemination. About half of respondents (53%) had a person or team in their unit dedicated to dissemination. Seventeen percent of all respondents used a framework or theory to plan their dissemination activities. One third of respondents (34%) always or usually involved stakeholders in the research process.

Conclusions. The current data and the existing literature suggest considerable room for improvement in designing for dissemination.

The effective dissemination of information on priorities, health risks, and evidence-based interventions in public health is a formidable challenge.1,2 Dissemination is an active approach of spreading evidence-based information to the target audience via determined channels using planned strategies.3 Studies from both clinical and public health settings suggest that evidence-based practices are not being disseminated effectively.4–6 For example, in a study of US adults, only 55% of overall care received was based on what is recommended in the scientific literature.7 In a study of US public health departments, an estimated 58% of programs and policies were reported as “evidence-based.”8

To illustrate the dissemination challenges and possible solutions, research on evidence-based interventions has now taught us several important lessons:

    dissemination generally does not occur spontaneously and naturally4;

    passive approaches to dissemination are largely ineffective9,10;

    single-source prevention messages are generally less effective than comprehensive, multilevel approaches11,12;

    stakeholder involvement in the research or evaluation process is likely to enhance dissemination (so-called practice-based research)13–19;

    theory and frameworks for dissemination are beneficial20,21; and

    the process of dissemination needs to be tailored to specific audiences.22

The difficulty in dissemination is the result of differing priorities.23,24 For researchers, the priority is often the discovery (not application) of new knowledge, whereas for practitioners and policymakers, the priority is often practical ways of applying these discoveries in their settings.25 The chasm between researchers and practitioners was illustrated in a “Designing for Dissemination” workshop sponsored by the US National Cancer Institute.26 In this workshop, all participants acknowledged the importance of dissemination. Researchers reported that their role was to identify effective interventions, but that they were not responsible for dissemination of research findings. Similarly, practitioners did not believe they were responsible for dissemination.

It has been recommended that researchers should identify dissemination partners before conducting discovery research, so that those who might adopt the discoveries will see the research process and results in a collaborative manner.24,27 Ultimately, we need to better understand how to design interventions with the elements most critical for external validity in mind,28–30 addressing these issues during early, developmental phases, and not near the end of a project.24,31 To date, few studies have evaluated the extent to which researchers are designing their studies for dissemination and how the design process may differ by researcher background and setting of research.

In the present study, we described the practice of designing for dissemination (D4D) among researchers in the United States with the intent of identifying gaps and areas for improvement.

To begin the sampling, we conducted a PubMed search for the top 12 journals in the category “public, environmental and occupational health” with the highest impact factors.32 The following journals were searched: American Journal of Epidemiology; American Journal of Preventive Medicine; American Journal of Public Health; Annual Review of Public Health; Bulletin of the World Health Organization; Cancer Epidemiology, Biomarkers & Prevention; Environmental Health Perspectives; Epidemiologic Reviews; Epidemiology; International Journal of Epidemiology; Tobacco Control; and World Health Organization Technical Report Series. The goal of the search was to identify a set of lead authors to constitute the sample. The search was limited to authors based in the United States, from October 1, 2008, to October 1, 2011, and excluded certain article types (commentaries, biographies, historical articles, classical articles, reviews, meta-analyses, webcasts). We chose the set of 12 journals and time frame to represent a range of disciplines across public health and enough researchers from the National Institutes of Health (NIH) and the Centers for Disease Control and Prevention (CDC) to allow subgroup analyses. The lead author’s affiliation was used to identify 100 intramural NIH researchers and 91 CDC researchers. We then drew a random sample of 200 names of the 2738 researchers not affiliated with NIH or CDC.

Next, using the NIH RePORTER database (an electronic tool for searching NIH-funded research projects), we identified 57 NIH extramural grantees in dissemination and implementation research. In addition, we randomly selected 100 names from a list of 335 investigators affiliated with the CDC’s Prevention Research Centers (PRCs) Program (compiled from the PRC website). We identified e-mail addresses through PubMed, RePORTER, university websites, and the directory of the US Department of Health and Human Services (for NIH and CDC employees). Together, these sources provided an initial pool of 548 researchers. Of these, 488 constituted the valid denominator for the response rate (excluding failed e-mail addresses, previously unrecognized duplicates, and individuals who were deceased or disabled).
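The two random draws described above can be sketched in a few lines of Python. This is only an illustration of the sampling arithmetic: the researcher identifiers and the seed are hypothetical, and only the list sizes come from the text.

```python
import random

# Hypothetical ID lists; only the sizes (2738 and 335) come from the study.
non_federal_pool = [f"researcher_{i}" for i in range(2738)]
prc_pool = [f"prc_investigator_{i}" for i in range(335)]

rng = random.Random(2012)  # fixed seed so the illustrative draw is reproducible
non_federal_sample = rng.sample(non_federal_pool, 200)  # 200 of 2738 non-NIH/CDC authors
prc_sample = rng.sample(prc_pool, 100)                  # 100 of 335 PRC investigators

# Combined with the 100 intramural NIH, 91 CDC, and 57 RePORTER names,
# this yields the initial pool of 548 researchers reported in the text.
initial_pool = len(non_federal_sample) + len(prc_sample) + 100 + 91 + 57
print(initial_pool)  # 548
```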

Questionnaire Development

The survey tool was based in part on a similar study conducted in the United Kingdom (UK) by Wilson et al.33 The project team obtained the UK tool, modified certain questions for the US context, and added new questions (e.g., questions asking participants to rank barriers to dissemination and to identify methods for involving stakeholder groups in dissemination efforts). The tool was pilot tested with 10 researchers housed at the PRC in St. Louis, Missouri. It was then revised, resulting in a final questionnaire with 35 questions. The questionnaire is available from the first author on request.

Data Collection and Analyses

Data were collected using the online survey software developed by Qualtrics (Provo, UT).34 Each participant received a unique link to the survey, and nonrespondents received 3 reminder e-mails. Because the initial response rates were lower among NIH and CDC respondents, 5 weeks after the launch we attempted to call nonrespondents at NIH and CDC. Because incentives increase response rates,35 we offered a $20 gift card (from Amazon.com) to all participants who completed the survey. The survey remained open for 8 weeks (from January 10, 2012, to March 6, 2012). The overall response rate was 54.5% (n = 266 of 488), with a response rate of 61% among university researchers, 41% among CDC scientists, and 38% among NIH scientists. The median completion time for the survey was 11 minutes.
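As a quick check on the arithmetic, the overall response rate follows directly from the counts reported above (266 completions against the valid denominator of 488):

```python
# Response-rate arithmetic from the reported counts: 266 completed surveys
# out of 488 valid contacts (548 initial names minus failed e-mail
# addresses, duplicates, and deceased or disabled individuals).
completed = 266
valid_denominator = 488
response_rate = 100 * completed / valid_denominator
print(f"{response_rate:.1f}%")  # 54.5%
```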

Descriptive analyses were conducted. Sample sizes varied in subgroup analyses because of missing data (e.g., “don’t know” coded to missing). Bivariate relationships were analyzed using the independent sample t-test or the Pearson χ2 test.

The largest percentage of respondents worked in a university (65%), followed by the CDC (13%), NIH (9%), and a variety of other settings, such as nonprofit organizations, think tanks, and health departments (13%). Across the entire sample, 66% had previously worked in a practice or policy setting where research findings might be applied. Previous practice experience was most common among CDC researchers (79%) and least common among NIH researchers (36%). When asked to estimate the time they spent on dissemination, 73% of respondents in the pooled sample reported spending less than 10% of their time on dissemination; 11% spent at least 20%. There was considerable variation by setting: the proportion spending at least 20% of their time on dissemination was highest among university researchers (13.5%) and lowest among NIH researchers (4%).

Findings related to infrastructure (i.e., people or physical structures to support dissemination) showed that 73% of respondents had formal training in communication or ready access to someone with skills in health communication (Table 1). Researchers at NIH and CDC had the highest likelihood of access to health communication experts (88%). About half of respondents (53%) had a person or team in their unit dedicated to dissemination. Researchers at NIH had the highest likelihood of a dedicated unit (80%), and those in universities not affiliated with PRCs had the lowest frequency of access to such a unit (41%). Those with a dedicated person or unit were asked to provide characteristics of the unit (open-ended responses). This was nearly always a person or unit in health communication, media relations, or public relations. A few respondents reported having a person to craft messages for policymakers or develop newsletters for community members. Those who did not have a formal dissemination unit were asked if they would like to have such a unit, and 68% expressed desire for such an entity. About one third of respondents reported that their unit or department had a formal communication or dissemination strategy (32%), with the highest proportion with a strategy at NIH (52%) and the lowest at non-PRC universities (22%).


TABLE 1— Dissemination Infrastructure by Setting: United States, Public Health Researchers Designing for Dissemination Survey, 2012

| Variable | All (n = 266), No. (%) | No PRC Affiliation (n = 109), No. (%) | PRC Affiliation (n = 63), No. (%) | NIH (n = 25), No. (%) | CDC (n = 34), No. (%) | Other (n = 34), No. (%) | P^a |
|---|---|---|---|---|---|---|---|
| Formal training in or ready access to someone with formal training in communication | | | | | | | |
| Yes | 194 (73) | 68 (62) | 54 (86) | 22 (88) | 30 (88) | 19 (56) | < .001 |
| No | 71 (27) | 41 (38) | 9 (14) | 3 (12) | 4 (12) | 14 (41) | |
| Not sure | 1 (0.4) | 0 | 0 | 0 | 0 | 1 (2.9) | |
| Dedicated person/team for dissemination in unit/organization | | | | | | | |
| Yes | 140 (53) | 45 (41) | 31 (49) | 20 (80) | 23 (68) | 20 (59) | .001 |
| No | 109 (41) | 55 (51) | 32 (51) | 3 (12) | 8 (24) | 11 (32) | |
| Not sure | 17 (6) | 9 (8) | 0 | 2 (8) | 3 (9) | 3 (8.8) | |
| Unit/department has formal communication/dissemination strategy | | | | | | | |
| Yes | 84 (32) | 24 (22) | 21 (34) | 13 (52) | 16 (47) | 10 (29) | .002 |
| No | 114 (43) | 56 (52) | 32 (52) | 3 (12) | 9 (27) | 14 (41) | |
| Not sure | 66 (25) | 28 (26) | 9 (15) | 9 (36) | 9 (27) | 10 (29) | |

Note. CDC = Centers for Disease Control and Prevention; NIH = National Institutes of Health; PRC = Prevention Research Center.

^a P values were determined by using the Pearson χ2 test (excludes “Other” category and “Not sure” when there were 0 cells).
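As an illustration of the Pearson χ2 test behind these P values, the “dedicated person/team” Yes/No counts from Table 1 can be checked in pure Python. This is a sketch: “Other” and “Not sure” responses are excluded here, which approximates, but may not exactly match, the exclusions used for the published P value.

```python
# Pearson chi-square for the "dedicated person/team" row of Table 1,
# using the Yes/No counts for the four main settings ("Other" and
# "Not sure" excluded). Pure Python; no SciPy required.
observed = [
    [45, 31, 20, 23],  # Yes: no-PRC university, PRC university, NIH, CDC
    [55, 32, 3, 8],    # No
]
row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
n = sum(row_totals)

# Sum of (observed - expected)^2 / expected over all cells,
# where expected = row_total * col_total / n.
chi2 = sum(
    (observed[i][j] - row_totals[i] * col_totals[j] / n) ** 2
    / (row_totals[i] * col_totals[j] / n)
    for i in range(2)
    for j in range(4)
)
# Critical value for alpha = .05 with (2-1)*(4-1) = 3 df is 7.815,
# so a statistic this large is significant, consistent with Table 1.
print(round(chi2, 1), chi2 > 7.815)
```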

Seventeen percent of all respondents always or usually used a framework or theory to plan their dissemination activities (Table 2). Those respondents affiliated with a PRC (26%) and those with experience in a practice setting (20%) were more likely than other subgroups to use a framework or theory to plan dissemination activities. Among those reporting use of a framework or theory (reported in an open-ended question), the most commonly used theory was Diffusion of Innovations. Other frameworks or theories included RE-AIM (Reach, Effectiveness, Adoption, Implementation, Maintenance), social cognitive theory, social marketing, network theory, and several different frameworks for community-based participatory research. One third of respondents always or usually produced summaries (e.g., issue briefs) for nonresearch audiences. Participants from universities with a PRC were most likely to produce summaries for nonresearch audiences (54% reported always or usually doing so), and university researchers without a PRC were least likely (20% reported always or usually doing so).


TABLE 2— Dissemination Activities by Setting and Practice Experience: United States, Public Health Researchers Designing for Dissemination Survey, 2012

| Variable | All (n = 266), No. (%) | University, No PRC Affiliation (n = 109), No. (%) | University With PRC Affiliation (n = 63), No. (%) | NIH (n = 25), No. (%) | CDC (n = 34), No. (%) | P^a | Practice Experience: Yes (n = 173), No. (%) | Practice Experience: No/Not Sure (n = 91), No. (%) | P^a |
|---|---|---|---|---|---|---|---|---|---|
| Use of framework/theory to plan dissemination activities | | | | | | | | | |
| Always/usually | 44 (17) | 17 (16) | 16 (26) | 3 (12) | 5 (15) | .002 | 35 (20) | 9 (10) | < .001 |
| Sometimes/rarely | 101 (38) | 40 (37) | 30 (48) | 6 (24) | 13 (39) | | 77 (45) | 23 (26) | |
| Never | 68 (26) | 29 (27) | 13 (21) | 4 (16) | 11 (33) | | 42 (24) | 26 (29) | |
| Not sure | 23 (9) | 10 (9) | 3 (5) | 5 (20) | 0 (0) | | 10 (6) | 12 (14) | |
| Does not plan dissemination-related activities | 27 (10) | 12 (11) | 0 (0) | 7 (28) | 4 (12) | | 8 (5) | 19 (21) | |
| How often produce summaries for nonresearch audiences | | | | | | | | | |
| Always/usually | 85 (32) | 22 (20) | 34 (54) | 7 (28) | 11 (32) | .001 | 68 (39) | 14 (18) | < .001 |
| Sometimes/rarely | 159 (60) | 76 (70) | 26 (41) | 15 (60) | 22 (65) | | 96 (56) | 53 (67) | |
| Never | 21 (8) | 10 (9) | 3 (5) | 3 (12) | 1 (3) | | 9 (5) | 12 (15) | |
| How often stakeholders involved | | | | | | | | | |
| Always/usually | 87 (34) | 24 (23) | 42 (67) | 4 (19) | 9 (27) | < .001 | 69 (40) | 16 (19) | < .001 |
| Sometimes/rarely | 126 (49) | 53 (50) | 21 (33) | 10 (48) | 21 (62) | | 83 (48) | 43 (51) | |
| Never | 45 (17) | 29 (27) | 0 (0) | 7 (33) | 4 (12) | | 20 (12) | 25 (30) | |
| How stakeholders are involved^b | | | | | | | | | |
| Engage stakeholders as advisors | 154 (72) | 51 (66) | 54 (86) | 9 (64) | 17 (57) | .013 | 112 (74) | 40 (68) | .392 |
| Work with stakeholders to make your research relevant to their setting | 133 (62) | 49 (64) | 46 (73) | 6 (43) | 15 (50) | .062 | 103 (69) | 28 (48) | .006 |
| Understand how to enhance the relevance of your study to stakeholders | 126 (59) | 47 (61) | 41 (65) | 8 (57) | 12 (40) | .134 | 101 (66) | 23 (39) | < .001 |
| Involve stakeholders in the research process | 110 (52) | 34 (44) | 42 (67) | 7 (50) | 11 (37) | .018 | 90 (59) | 19 (32) | < .001 |
| Understand how study findings fit with their organizational goals/mission | 105 (49) | 37 (48) | 36 (57) | 6 (43) | 12 (40) | .413 | 88 (58) | 17 (29) | < .001 |
| Seek resources for dissemination | 73 (34) | 23 (30) | 28 (44) | 5 (36) | 6 (20) | .098 | 59 (39) | 13 (22) | .021 |
| Stage of stakeholder involvement | | | | | | | | | |
| Proposal | 73 (27) | 30 (28) | 27 (43) | 2 (18) | 6 (18) | < .001 | 56 (32) | 16 (18) | .006 |
| Data collection/analysis or draft report/manuscript | 39 (14) | 10 (9) | 10 (16) | 0 (0) | 10 (29) | | 22 (13) | 16 (18) | |
| Final report/manuscript | 65 (24) | 32 (29) | 5 (8) | 11 (44) | 13 (38) | | 42 (24) | 23 (25) | |
| All stages | 47 (18) | 20 (18) | 16 (25) | 2 (8) | 1 (3) | | 34 (20) | 13 (14) | |
| Rarely plans these dissemination activities | 42 (16) | 17 (16) | 5 (8) | 10 (40) | 4 (12) | | 19 (11) | 23 (25) | |

Note. CDC = Centers for Disease Control and Prevention; NIH = National Institutes of Health; PRC = Prevention Research Center.

^a P values were determined by using the Pearson χ2 test.

^b Respondents who never involve stakeholders (n = 45) did not see this question.

The extent of stakeholder involvement in the research process was examined (Table 2). One third of respondents (34%) always or usually involved stakeholders in the research process. The subgroups most likely to involve stakeholders were those affiliated with a PRC (67%) and those with practice experience (40%). The respondents least likely to involve stakeholders were those at NIH (19%). The 3 most common methods of stakeholder involvement were engaging them as advisors (72%), working together to make research more relevant to their setting (62%), and understanding how to enhance the relevance of research (59%). Among the various stages of involvement, stakeholders were most often involved during proposal development (27%) and at the time of final report or article writing (24%). One quarter of respondents at PRCs reported stakeholder involvement at all stages of the research process, and 3% of researchers at CDC also reported such involvement.

Designing for dissemination is an active process that helps to ensure that public health interventions, often evaluated by researchers, are developed in ways that match well with adopters’ needs, assets, and time frames. The findings from this study, which might be the first of its kind in the United States, showed gaps in how well researchers are designing for dissemination. For example, only 17% of respondents always or usually based efforts on a framework or theory, and 34% of researchers always or usually involved stakeholders in their study processes. Although some of the D4D factors were harder to change (e.g., infrastructure), other D4D activities were readily within the control of the research team (e.g., involving stakeholders, producing user-friendly summaries of research).

There were interesting patterns by setting and location. For infrastructure variables (Table 1), federal respondents (NIH and CDC) had greater access than university researchers to a dedicated person or team for dissemination and were more likely to have a formal dissemination strategy. However, for dissemination activities (Table 2), university respondents with a PRC affiliation were the most likely to use a framework or theory, produce summaries for nonresearch audiences, and engage stakeholders in the process. The higher level of activity for PRC respondents was not surprising, given their focus on translational research and participatory methods.36 The present findings could be compared with results from a recent study from the United Kingdom that sampled 243 principal investigators and had a response rate similar to ours.33 In the present study, 73% of respondents spent less than 10% of their time on dissemination compared with 66% in the United Kingdom. Researchers in the United States appeared to have greater access to a person or team for dissemination than those in the United Kingdom (United States = 53%; United Kingdom = 20%) and were more likely to have a formal communication or dissemination strategy (United States = 32%; United Kingdom = 20%). Future research could better address the type of research setting and country differences via triangulated (quantitative and qualitative) methods.37

Only 17% of respondents always or usually relied on a framework or theory to guide their dissemination efforts. Furthermore, respondents reported using Diffusion of Innovations, RE-AIM, social cognitive theory, social marketing, or network theory; however, these represented only a fraction of the frameworks available to researchers. Although these might be the frameworks most common in public health, many frameworks that researchers would find beneficial might have been developed in other fields. Given the value of theory38 and the large number of available theories and frameworks, which are quite diverse in origin and scope,20 an initial action could involve reviewing possible frameworks and adopting 1 or more within a research unit. Several reviews demonstrated that there is a wealth of such frameworks from which researchers and practitioners can draw.20,21,39

It is also important to consider that the D4D process may vary, depending on the type of research (e.g., analysis of existing data) and maturity of a body of literature. Not all research should lead to public health action because findings may be equivocal, the cost of action may be high, or public health audiences may not be equipped to implement new interventions. Some have referred to the “anxiety of the week” syndrome, where early or equivocal findings can receive intense media attention.40 Even in these cases, there are advantages to having public health practice and policy involved and informed. For researchers who primarily analyze existing data (i.e., data collected by others) or conduct etiologic research, it is unrealistic to expect a D4D process that involves stakeholders at every stage from proposal development to data analysis and interpretation. Many of these D4D challenges need particular attention in settings with high health disparities, where the system constraints may be the greatest, and the delivery systems are often underdeveloped.1,41

These findings raised a number of important questions: What might enhance efforts in dissemination? What system-level changes are needed? What might motivate and encourage researchers to focus more strongly on dissemination? Based on the findings of the present study and related literature,10,20,22,24,28,31,42–44 Table 3 addresses some of these issues by describing possible D4D actions at the levels of system, process, and production. At the systems level, fundamental changes are needed in how research is funded and how researchers are incentivized. These actions (e.g., how research grants are evaluated, promotion or tenure guidelines) will require a long-term commitment to dissemination, and in some cases, are not easy to change. There is evidence of progress on several systems-level actions, including an NIH Program Announcement on dissemination and implementation research,45 some grant mechanisms that include a specific section calling for a dissemination plan,46 and new tools for tracking measures, such as the National Cancer Institute’s Grid-Enabled Measures.47 System-level variations may in part explain the differences between the United States and the United Kingdom.


TABLE 3— Principles for Designing for Dissemination: United States, Public Health Researchers Designing for Dissemination Survey, 2012

| Domain | Sample Actions |
|---|---|
| System changes | |
| Shift research funder priorities and processes | Make dissemination (e.g., a dissemination plan) a scoreable part of funding announcements |
| | Include stakeholders in the grant review process |
| | Provide rapid funding for practice-based research with high dissemination potential |
| | Provide supplemental funding for dissemination |
| Shift researcher incentives and opportunities | Provide academic incentives and credit, including impacts on promotion and tenure decisions (provide prototype promotion and tenure policies) |
| | Hire faculty with practice experience |
| | Provide opportunities for faculty to spend time in practice settings |
| | Conduct trainings to improve dissemination, implementation, evaluation, and translation |
| Develop new measures and tools | Identify measures for evaluating dissemination efforts |
| | Maintain systems for tracking the measures |
| | Develop tools for designing for dissemination |
| Develop new reporting standards | Develop standards for reporting research that focus more fully on dissemination |
| | Promote new dissemination and implementation reporting standards |
| Identify infrastructure requirements | Identify people required for dissemination and evaluation |
| | Identify system requirements (information technology, media) |
| Involve stakeholders as early in the process as possible | Engage as advisors and collaborators |
| | Engage in the research process |
| Engage key stakeholders (receptors) for research through audience research | Identify gaps in research, relevance of methods, messages |
| | Ensure stakeholders represent potential adopter organizations |
| | Identify opinion leaders for uptake |
| | Identify barriers to dissemination |
| | Identify success and failure stories |
| Identify frameworks or theories for dissemination efforts | Review existing frameworks for applicable constructs |
| | Pilot test measures for assessing model constructs among key stakeholders |
| | Develop models for dissemination actions that are context relevant |
| Identify the appropriate means of delivering the message | Identify the optimal disseminator (usually not the researcher) |
| | Link the researcher, practice, and policy specialists with the disseminator |
| | Identify channels for dissemination or mode of knowledge transfer |
| Identify the appropriate message | For interventions, document evidence of effectiveness, cost of implementation, and cost-effectiveness |
| | For etiologic research, address risk communication |
| | Document evidence of disseminability or ease of use |
| Develop summaries of research in user-friendly, nonacademic formats (audience tailoring) | Develop issue briefs, policy briefs, case studies |
| | Identify potential roles for social media (e.g., Twitter, Facebook) |
| | Deliver presentations to stakeholders |

In general, these D4D principles apply best to research studies that are testing interventions or other approaches in dissemination research where an evidence-based approach and target audience are clear. This set of principles also raises the question of who the disseminator should be. For more than half of the respondents to our survey, that disseminator was likely to be an expert or team in dissemination or communication. In most cases, the disseminator should not be the researcher.15,25 As articulated by Kreuter et al.,25 the public health sector can learn from commercial marketers, where systems are set up for distribution management, marketing, technical assistance, and other services. In addition to these marketing skills, successful D4D efforts are likely to involve transdisciplinary teams that include experts in communication, participatory research, and translational research.

In addressing the D4D actions in Table 3, researchers should recognize that different audiences require specific communication messages and channels.24,26,43 The intent of the D4D activities is to ensure that findings from research projects are useful, relevant, and ready for widespread dissemination when the research funding ends. Two related fields of inquiry can assist in these efforts. First, health communication calls for audience segmentation, in which experts ask thoughtful questions, conduct audience research, and learn from the literature.44 In addition, principles of community-based participatory research (i.e., systematic inquiry that involves the participation of those affected by the issue being studied, for the purposes of education and taking action48) place emphasis on genuine stakeholder engagement throughout the research process and the use of findings to help bring about change.18 These participatory methods are likely to be useful in the D4D process.

There were a few limitations of this study. First, the data were self-reported, and we had no way to objectively compare the reported dissemination activities with actual infrastructure and practices; additionally, reliability of the measures was not assessed. Second, among the various D4D activities, there was sparse literature on the relative effectiveness of various approaches, making it difficult to evaluate their precise impacts on public health practice and policy. Third, the focus of the present study was on high impact journals, which might omit important, practice-oriented journals with a significant focus on dissemination. Finally, response bias might be present, given our response rate of 54.5%.

In summary, the current data and the existing literature suggest considerable room for improvement in D4D. Researchers need to better recognize the practical applications of their findings, and learn to identify collaborations and build partnerships with key stakeholders that can address the many complexities of moving a project from discovery to widespread dissemination. To accomplish more effective dissemination, it is important to address D4D issues early in the research process (not when a grant is ending). In this time of increasing pressure on scientific resources, researchers should meet the implied obligation to the public that the billions of dollars invested in scientific discovery will yield specific and tangible benefits to their health.


This study was supported in part by the Centers for Disease Control and Prevention (Cooperative Agreement Number U48/DP001903; the Prevention Research Centers Program); National Cancer Institute (Transdisciplinary Research in Energetics and Cancer; grant U48/CA155496); the National Institutes of Health-National Center for Research Resources and the National Center for Advancing Translational Sciences (grants UL1 TR000448 and TL1 TR000449/KL2 TR000450); and the National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK; grant 1P30DK092950).

We are grateful to Paul M. Wilson of the University of York, United Kingdom, for sharing his study instrument. We also thank David Chambers, Russ Glasgow, and Jon Kerner for their helpful input.

Note. The contents of this article are solely the responsibility of the authors and do not necessarily represent the official views of the NIDDK.

Human Participant Protection

Human participant approval was obtained from the Washington University institutional review board.


1. Glasgow RE, Vinson C, Chambers D, Khoury MJ, Kaplan RM, Hunter C. National Institutes of Health approaches to dissemination and implementation science: current and future directions. Am J Public Health. 2012;102(7):1274–1281.

2. Stamatakis K, Vinson C, Kerner J. Dissemination and implementation research in community and public health settings. In: Brownson R, Colditz G, Proctor E, eds. Dissemination and Implementation Research in Health: Translating Science to Practice. New York, NY: Oxford University Press; 2012:359–383.

3. Rabin B, Brownson R. Developing the terminology for dissemination and implementation research. In: Brownson R, Colditz G, Proctor E, eds. Dissemination and Implementation Research in Health: Translating Science to Practice. New York, NY: Oxford University Press; 2012:23–51.

4. Glasgow RE, Marcus AC, Bull SS, Wilson KM. Disseminating effective cancer screening interventions. Cancer. 2004;101(suppl 5):1239–1250.

5. Green LW, Ottoson JM, Garcia C, Hiatt RA. Diffusion theory, and knowledge dissemination, utilization, and integration in public health. Annu Rev Public Health. 2009;30(1):151–174.

6. Lomas J. Words without action? The production, dissemination, and impact of consensus recommendations. Annu Rev Public Health. 1991;12(1):41–65.

7. McGlynn EA, Asch SM, Adams J, et al. The quality of health care delivered to adults in the United States. N Engl J Med. 2003;348(26):2635–2645.

8. Dreisinger M, Leet TL, Baker EA, Gillespie KN, Haas B, Brownson RC. Improving the public health workforce: evaluation of a training course to enhance evidence-based decision making. J Public Health Manag Pract. 2008;14(2):138–143.

9. Bero LA, Grilli R, Grimshaw JM, Harvey E, Oxman AD, Thomson MA. Closing the gap between research and practice: an overview of systematic reviews of interventions to promote the implementation of research findings. The Cochrane Effective Practice and Organization of Care Review Group. BMJ. 1998;317(7156):465–468.

10. Lehoux P, Denis JL, Tailliez S, Hivon M. Dissemination of health technology assessments: identifying the visions guiding an evolving policy innovation in Canada. J Health Polit Policy Law. 2005;30(4):603–641.

11. Richard L, Gauvin L, Raine K. Ecological models revisited: their uses and evolution in health promotion over two decades. Annu Rev Public Health. 2011;32(1):307–326.

12. Zaza S, Briss PA, Harris KW, eds. The Guide to Community Preventive Services: What Works to Promote Health? New York, NY: Oxford University Press; 2005.

13. Green LW. Making research relevant: if it is an evidence-based practice, where’s the practice-based evidence? Fam Pract. 2008;25(suppl 1):i20–i24.

14. Greene J. Stakeholder participation in evaluation design: is it worth the effort? Eval Program Plann. 1987;10(4):379–394.

15. Harris JR, Cheadle A, Hannon PA, et al. A framework for disseminating evidence-based health promotion practices. Prev Chronic Dis. 2012;9(1):E22.

16. Glasgow RE, Goldstein MG, Ockene JK, Pronk NP. Translating what we have learned into practice. Principles and hypotheses for interventions addressing multiple behaviors in primary care. Am J Prev Med. 2004;27(2 suppl):88–101.

17. Mendel P, Meredith LS, Schoenbaum M, Sherbourne CD, Wells KB. Interventions in organizational and community context: a framework for building evidence on dissemination and implementation in health services research. Adm Policy Ment Health. 2008;35(1–2):21–37.

18. Minkler M, Salvatore A. Participatory approaches for study design and analysis in dissemination and implementation research. In: Brownson R, Colditz G, Proctor E, eds. Dissemination and Implementation Research in Health: Translating Science to Practice. New York, NY: Oxford University Press; 2012:192–212.

19. Wandersman A, Duffy J, Flaspohler P, et al. Bridging the gap between prevention research and practice: the interactive systems framework for dissemination and implementation. Am J Community Psychol. 2008;41(3–4):171–181.

20. Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012;43(3):337–350.

21. Wilson PM, Petticrew M, Calnan MW, Nazareth I. Disseminating research findings: what should researchers do? A systematic scoping review of conceptual frameworks. Implement Sci. 2010;5(1):91.
22. Lomas J. Diffusion, dissemination, and implementation: who should do what? Ann N Y Acad Sci. 1993;703(1):226235; discussion 235–237. Crossref, MedlineGoogle Scholar
23. Colditz GA, Emmons KM, Vishwanath K, Kerner JF. Translating science to practice: community and academic perspectives. J Public Health Manag Pract. 2008;14(2):144149. Crossref, MedlineGoogle Scholar
24. Owen N, Goode A, Fjeldsoe B, Sugiyama T, Eakin E. Designing for the dissemination of environmental and policy initiatives and programs for high-risk groups. In: Brownson R, Colditz G, Proctor E, eds. Dissemination and Implementation Research in Health: Translating Science to Practice. New York, NY: Oxford University Press; 2012:114127. CrossrefGoogle Scholar
25. Kreuter MW, Bernhardt JM. Reframing the dissemination challenge: a marketing and distribution perspective. Am J Public Health. 2009;99(12):21232127. LinkGoogle Scholar
26. National Cancer Institute. Designing for Dissemination: Conference Summary Report. Washington, DC: National Cancer Institute; 2002. Google Scholar
27. Brownson R, Dreisinger M, Colditz G, Proctor E. The path forward in dissemination and implementation research. In: Brownson R, Colditz G, Proctor E, eds. Dissemination and Implementation Research in Health: Translating Science to Practice. New York, NY: Oxford University Press; 2012:498508. CrossrefGoogle Scholar
28. Glasgow RE, Emmons KM. How can we increase translation of research into practice? Types of evidence needed. Annu Rev Public Health. 2007;28(1):413433. Crossref, MedlineGoogle Scholar
29. Green LW, Glasgow RE. Evaluating the relevance, generalization, and applicability of research: issues in external validation and translation methodology. Eval Health Prof. 2006;29(1):126153. Crossref, MedlineGoogle Scholar
30. Klesges LM, Estabrooks PA, Dzewaltowski DA, Bull SS, Glasgow RE. Beginning with the application in mind: designing and planning health behavior change interventions to enhance dissemination. Ann Behav Med. 2005;29(suppl):6675. Crossref, MedlineGoogle Scholar
31. Scullion PA. Effective dissemination strategies. Nurse Res. 2002;10(1):6577. Crossref, MedlineGoogle Scholar
32. Journal Citation Reports. Thomson Reuters. Available at: http://thomsonreuters.com/products_services/science/science_products/a-z/journal_citation_reports/#tab1. Accessed June 9, 2012. Google Scholar
33. Wilson PM, Petticrew M, Calnan MW, Nazareth I. Does dissemination extend beyond publication: a survey of a cross section of public funded research in the UK. Implement Sci. 2010;5(1):61. Crossref, MedlineGoogle Scholar
34. Qualtrics: Survey Research Suite. Available at: http://www.qualtrics.com. Accessed December 24, 2011. Google Scholar
35. Dillman D, Smyth J, Melani L. Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method. 3rd ed. Hoboken, NJ: John Wiley & Sons, Inc; 2009. Google Scholar
36. Ammerman A, Harris JR, Brownson RC, Tovar-Aguilar JA. CDC’s Prevention Research Centers program: translating research into action with communities. J Prim Prev. 2011;32(3–4):131134. Crossref, MedlineGoogle Scholar
37. Creswell J, Klassen A, Plano Clark V, Clegg Smith K. Best Practices for Mixed Methods Research in the Health Sciences. Bethesda, MD: Office of Behavioral and Social Sciences Research, National Institutes of Health; 2011. Google Scholar
38. Glanz K, Bishop DB. The role of behavioral science theory in development and implementation of public health interventions. Annu Rev Public Health. 2010;31(1):399418. Crossref, MedlineGoogle Scholar
39. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):50. Crossref, MedlineGoogle Scholar
40. Taubes G. Epidemiology faces its limits. Science. 1995;269(5221):164169. Crossref, MedlineGoogle Scholar
41. Yancey A, Glenn B, Bell-Lewis L, Ford C. Dissemination and implementation research in populations with health disparities. In: Brownson R, Colditz G, Proctor E, eds. Dissemination and Implementation Research in Health: Translating Science to Practice. New York, NY: Oxford University Press; 2012:459482. CrossrefGoogle Scholar
42. Caburnay CA, Kreuter MW, Donlin MJ. Disseminating effective health promotion programs from prevention research to community organizations. J Public Health Manag Pract. 2001;7(2):8189. Crossref, MedlineGoogle Scholar
43. Slater MD. Theory and method in health audience segmentation. J Health Commun. 1996;1(3):267283. Crossref, MedlineGoogle Scholar
44. Slater MD, Kelly KJ, Thackeray R. Segmentation on a shoestring: health audience segmentation in limited-budget and local social marketing interventions. Health Promot Pract. 2006;7(2):170173. Crossref, MedlineGoogle Scholar
45. National Institutes of Health. Dissemination and Implementation Research in Health (R01). Vol PAR-13-055. Bethesda, MD: National Institutes of Health; 2013. Available at: http://grants.nih.gov/grants/guide/pa-files/PAR-13-055.html. Accessed May 1, 2013. Google Scholar
46. National Institutes of Health. Cancer Education Grants Program (R25). Vol PAR-12-049. Bethesda, MD: National Institutes of Health, National Cancer Institute; 2012. Available at http://grants.nih.gov/grants/guide/pa-files/PAR-12-049.html. Accessed May 1, 2013. Google Scholar
47. Rabin BA, Purcell P, Naveed S, et al. Advancing the application, quality and harmonization of implementation science measures. Implement Sci. 2012;7(1):119. Crossref, MedlineGoogle Scholar
48. Institute of Health Promotion Research, The University of British Columbia, and the BC Consortium for Health Promotion Research. Study of Participatory Research in Health Promotion. Review and Recommendations for the Development of Participatory Research in Health Promotion in Canada. Vancouver, BC: The Royal Society of Canada; 1995. Google Scholar
Ross C. Brownson, PhD, Julie A. Jacobs, MPH, Rachel G. Tabak, PhD, Christine M. Hoehner, PhD, MSPH, and Katherine A. Stamatakis, PhD, MPH

Ross C. Brownson is with the Prevention Research Center in St Louis, Brown School, and the Division of Public Health Sciences and Alvin J. Siteman Cancer Center, School of Medicine, Washington University, St Louis, MO. Julie A. Jacobs and Rachel G. Tabak are with the Prevention Research Center in St Louis, Brown School, Washington University. Christine M. Hoehner and Katherine A. Stamatakis are with the Division of Public Health Sciences and Alvin J. Siteman Cancer Center, School of Medicine, Washington University.

"Designing for Dissemination Among Public Health Researchers: Findings From a National Survey in the United States", American Journal of Public Health 103, no. 9 (September 1, 2013): pp. 1693–1699.


PMID: 23865659