To help close the vast gap between current knowledge and practice in dissemination and implementation research, we discuss terminology, provide examples of successful applications of this research, describe key sources of support, and highlight directions and opportunities for future advances. There is a need for research testing approaches to scaling up and sustaining effective interventions, and we propose that further advances in the field will be achieved by focusing dissemination and implementation research on 5 core values: rigor and relevance, efficiency, collaboration, improved capacity, and cumulative knowledge.

Despite the demonstrable benefits of many new medical discoveries, we have done a surprisingly poor job of putting research findings into practice. The ultimate goal of new discoveries is to enhance human health, yet most discoveries fulfill this promise slowly, if at all. The challenge of moving health research innovations from discovery to practice is complex and multifaceted.1 For example, in clinical practice, modern medications that dissolve blood clots are one of the best illustrations of efficacious medical care.2 Nevertheless, substantial portions of the population who could benefit from these medicines do not receive them.

In addition, although the most recent evidence suggests that approximately 80% of people with hypertension are aware that they have this problem, only about 70% of these individuals receive treatment, and only about 50% have their blood pressure controlled even in the short to medium term.3 This suggests that only about half of people with high blood pressure are being successfully treated, apart from consideration of the challenging issues associated with long-term medication adherence.
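The arithmetic behind that conclusion can be made explicit with a simple care-cascade calculation. A minimal sketch, using the approximate percentages cited above (treating the 70% figure as conditional on awareness is our reading, not a claim from the source):

```python
# Hypertension care cascade (approximate figures; illustration only).
aware_rate = 0.80           # ~80% of people with hypertension know they have it
treated_given_aware = 0.70  # ~70% of those aware receive treatment

# Fraction of all people with hypertension who receive any treatment:
# each stage multiplies through to the next.
treated_overall = aware_rate * treated_given_aware
print(f"Treated overall: {treated_overall:.0%}")  # roughly 56%, i.e., about half
```

Even before considering whether treatment achieves blood pressure control, the cascade already shrinks the treated population to roughly half of those affected.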

The disconnect between the percentage of those who could potentially benefit from an evidence-based therapy and those who actually do benefit appears across all areas of health,4–6 with the final percentage of those benefitting from an efficacious intervention often being as low as 1% to 5%.7 The challenges for public health practice are similar. As but one example, only about 65% of individuals older than 65 years have received the pneumococcal vaccine despite evidence that it offers life-extending protection for the elderly.8

How many human health gains could be achieved by decreasing the gap between optimal treatment and what patients actually receive? According to Woolf and Johnson,9 an experimental trial showed that putting red notices on the front of medical charts that simply reminded practitioners about guidelines for cholesterol treatment resulted in initiation of or increases in proper drug use in 94% of patients. By comparison, only 10% of patients in a randomized control group initiated or improved their use of medication.10 Similar points have been made recently by Gawande on how checklists have improved surgical practice.11

Notwithstanding the significant advances in treatments, the public health benefits associated with these improved treatments tend to be modest because they are not widely implemented. Wlodarczyk et al.,12 for example, conducted a meta-analysis of 25 head-to-head studies comparing the cholesterol-lowering drugs rosuvastatin and atorvastatin. The studies reviewed, all randomized clinical trials, involved more than 20 000 participants. Aggregated across studies, there was only a small advantage of rosuvastatin with respect to lowering cholesterol. Weng et al.13 systematically reviewed head-to-head randomized comparisons of statin medications and found that, although the difference was statistically significant, powerful new statins had only a 7% advantage over established statins in lowering cholesterol. Furthermore, there was insufficient evidence that the new medications resulted in incremental reductions in coronary events. However, increasing the reach and consistent use of statins in general (or even aspirin) can have a profound effect on heart attacks and deaths.14

As argued by Woolf and Johnson,9 the return on investment for dissemination and implementation research dwarfs the return on investment for new discovery. In the current federal research model, basic biomedical and behavioral research accounts for the lion’s share of funding. The National Institutes of Health (NIH) has traditionally been committed to discovery and spends about $30 billion each year on basic and efficacy research. Funding for dissemination and implementation research has traditionally been small, and although NIH funding for such research is growing, it remains a very small fraction of spending on basic and efficacy research. In 2010, the Agency for Healthcare Research and Quality spent only about $270 million on research relevant to health quality, dissemination, and outcomes. In other words, for each dollar spent in discovery, mere pennies are spent learning how interventions known to be effective can be better disseminated. Discovery of new and improved interventions is important; to fully realize public health benefits, however, greater attention needs to be devoted to dissemination and implementation sciences to enhance the reach, adoption, implementation, use, and maintenance of new research discoveries.4,15

This article is a collaboration among NIH scientists involved in major research, funding, training, and partnership efforts in dissemination and implementation. It is not a comprehensive review of the field but an explication of current and future directions. Also, it is not an official NIH position paper and represents only our individual opinions. We briefly discuss terminology and provide examples of successful applications of dissemination and implementation science, describe NIH support for this area of research, and highlight current directions and opportunities for future advances.

Terminology

As is common in emerging areas of science, there are different perspectives on the precise definitions and terminology within dissemination and implementation science. Several related definitions differ in their relative emphasis on specific issues. Although space limitations prevent a comprehensive discussion of these issues, there are important commonalities across the different definitions (see Rabin and Glasgow16). However, it is important to understand how NIH currently defines dissemination and implementation.

For present purposes, we adopted the definitions in the NIH program announcement on dissemination and implementation research.17 Dissemination is defined as:

the targeted distribution of information and intervention materials to a specific public health or clinical practice audience. The intent is to spread knowledge and the associated evidence-based interventions.

The active process of dissemination is distinguished from the more passive process of “naturalistic” diffusion that occurs without concerted promotion.18 Implementation is “the use of strategies to adopt and integrate evidence-based health interventions and change practice patterns within specific settings.”

This latter definition is congruent with that used by the leading international journal in the field, Implementation Science, which defines implementation as “the methods to promote the systematic uptake of clinical research findings and other evidence-based practices into routine practice and hence improve the quality and effectiveness of health care.”19 Effectiveness research, which assesses how an intervention that has demonstrated efficacy functions in practice, is a related concept but is not our main focus here. Effectiveness research is similar to dissemination and implementation research in its emphasis on adaptation and testing in real-world settings and diverse populations, but it does not explicitly focus on understanding the spread and adoption of these intervention strategies.

We believe that effectiveness research and dissemination and implementation research are much stronger in tandem than as separate areas of inquiry. Effectiveness research is stronger if it anticipates and includes issues related to dissemination and implementation processes such as adoption decisions and implementation questions within intervention testing, and dissemination or implementation research is stronger if it assesses continued effectiveness as interventions or prevention approaches spread to and are adopted by large, diverse populations.

In addition, reducing dissemination and implementation research to a question of the effectiveness of interventions when they are tested in clinical and community settings omits the multilevel nature of the successful integration of treatment and preventive strategies within practice. Dissemination and implementation research requires attention not only to the individual but to the staff and organization delivering an intervention, the financial and political environment through which services are provided, and the broader societal context through which population health is derived. Successes in dissemination and implementation occur through the development of strategies that facilitate practice improvements,20 organizational change,21 and policy implementation.22 Successful integration of research into practice or policy requires partnership approaches and team science,23 practical interventions,24 greater cost-effectiveness, an increased focus on external validity and reach, and reductions in health disparities.25–27

Despite these areas of consensus, there is ongoing discussion about the number of stages involved in translational research; the value of a linear progression from basic science to efficacy research and effectiveness studies, large-scale demonstrations, and, finally, dissemination; the role and necessity of using nonrandomized controlled trial designs28–30; and other conceptual issues relating to how best to frame this important evolving area.

Recognizing that there are a number of ways to frame the stages of translational research, we propose 5 key phases in the process of moving research into practice and policy.31,32 Focusing on these phases, each of which addresses different issues and requires somewhat different methods, provides greater clarity about what is needed if evidence-based approaches are to be successfully implemented and sustained in real-world settings. Research is not a one-way process: findings at each stage inform findings in the other stages. Rather than a linear process of translating research findings into practice, we propose a more differentiated approach to the science of dissemination and implementation, as illustrated in Figure 1.

As shown, there are 5 overlapping, interrelated phases of research in advancing from scientific discoveries to population health. Whatever terms are used for these phases and whether public health or medical issues are under consideration, the diagram illustrates the highly iterative nature of the cycle from discovery to translation. The process starts with the identification of a problem and the “discovery” of an opportunity or approach to tackle a health issue (T0). These discoveries can result from multiple sources and disciplines such as molecular or biological insights, behavioral research, or epidemiological research.

The first translational research phase (T1) involves research allowing for the development of tests or other clinical interventions, but it can also lead to nonmedical interventions such as policy, behavioral, social, or other public health interventions. The second research phase (T2) involves a rigorous analysis and investigation of whether the new interventions improve health outcomes (in randomized trials or other study designs). The end result of T2 is evidence-based guidelines and recommendations by professional organizations and independent panels. (NIH often focuses primarily on T1 and T2 research, usually in the context of drug and other clinical interventions, and incorporates all T2–T4 activities under T2.33)

However, even in the case of medical procedures and other interventions, moving from evidence to widespread use involves additional activities: dissemination of interventions, decisions by health care practitioners or organizations to adopt or use them, implementation of the interventions into standard practice or standard operating procedures of organizations, and maintenance of changes in health care practices by organizations, individual health care practitioners, and patients. Dissemination and implementation science emphasizes investigation and understanding of the processes involved in the adoption, implementation, and sustainability of research.

An incomplete classification of translation can miss important and qualitatively different tasks and issues involved in “later-stage” translation. We focus primarily on such later-stage translation or, as we term it, dissemination and implementation science; we do not address in any detail issues associated with moving from discovery to drug development, an area worthy of its own discussion.33 T3 research includes investigations designed to increase uptake and implementation of evidence-based recommendations into practice, whereas T4 research involves evaluation of the effectiveness and cost-effectiveness of such interventions in the “real world” and in diverse populations. Some discoveries move rapidly through this cycle or skip steps, some are adopted more quickly than others, and some are put into use before evidence-based recommendations are made. However, many discoveries never reach the point of becoming standards of care.26,34

Our 5-phase model illustrates the complexity of large-scale uptake and the opportunity to address this issue through partnerships between medicine and public health,35 and it implies transdisciplinary team interactions23 among basic, clinical, and population sciences. T4 research should also lead to new insights that will fuel new discoveries (e.g., postmarketing surveillance of drugs can uncover new teratogenic or carcinogenic agents). Finally, as can be seen in Figure 1, the translation cycle is guided by ongoing and updated knowledge synthesis, with structured reviews of the status of science to guide implementation and large-scale dissemination research.

Dissemination and Implementation Successes

Although there is a long history at NIH of applied research with a focus on diffusion of innovations and issues such as institutionalization, the specific dissemination and implementation research portfolio at NIH is relatively new and comprises a small proportion of biomedical research funding. However, there have already been noticeable successes. For instance, Lorig et al. have conducted a series of studies of their 6-week chronic disease self-management training program, most often delivered by peer coleaders who themselves have one or more chronic illnesses.36,37 Originally evaluated in randomized studies with arthritis patients, versions of the program have been found to be successful in controlled evaluations among Spanish-speaking diabetes patients and those with multiple chronic illnesses; it has been successfully delivered via the Internet as well.38,39 The Lorig et al. program has been implemented across general practice settings throughout the United Kingdom.40 In addition, their evidence-based model of goal setting, peer support, action planning, and follow-up has been adapted for and replicated across diverse conditions and settings.

Another example is the Diabetes Prevention Program (DPP). Following a successful multisite efficacy trial,41 several investigative teams have explored efficient, practical methods of replicating the results in real-world community settings. For example, Katula et al.42 found that a lifestyle intervention that was based on the DPP and delivered by community health workers resulted in impressive 1-year glucose and weight changes. Other researchers,43 partnering with the national YMCA to offer an adaptation of the DPP lifestyle intervention, achieved changes in weight similar to those observed in the DPP. This work illustrates the importance of adapting interventions to fit delivery contexts.

With respect to tobacco cessation, the widely replicated state “quit line” programs (see reviews in the December 2007 special issue of Tobacco Control44) have probably produced more public health benefits than any other dissemination program. Based on years of successful, smaller scale research as well as pioneering statewide applications in California and Massachusetts, proactive telephone-based cessation counseling has proven not only cost-effective but also feasible to integrate with primary care, mass media promotions, and pharmacological interventions.44 This tobacco example illustrates the importance of large-scale natural experiments and the importance of consistent implementation combined with incentives to produce population change.

NIH Support for Dissemination and Implementation Research

NIH has been working to advance the knowledge base related to disseminating and implementing evidence-based health strategies in real-world settings. These efforts include funding of investigator-initiated applications and targeted activities to build capacity in this area. Since 2000, individual NIH institutes have established dissemination and implementation portfolios, issuing independent funding announcements to solicit applications on the optimal ways to translate research into practice.

A need for trans-NIH opportunities has also been recognized. In 2003, the NIH Office of Behavioral and Social Science Research (OBSSR) convened a trans-NIH committee around this emerging area of research. In 2005, NIH issued the first set of multi-institute program announcements on dissemination and implementation research, covering small grant, exploratory or developmental grant, and research project grant mechanisms. Eight institutes participated along with OBSSR and the Office of Dietary Supplements. Over the next 4 years, 40 dissemination and implementation grants were funded through these program announcements.

Another 4 institutes were added when the program announcements were reissued in 2009, and there was increased attention to global health with the participation of the Fogarty International Center (international investigators can serve as principal investigators on these grants). As a result of the growing number of applications and investments in dissemination and implementation research, the NIH Center for Scientific Review established a standing review committee, Dissemination and Implementation Research in Health, in 2010.

Dissemination and implementation research at NIH is now linked to one of the goals included in the strategic plan of the US Department of Health and Human Services: to identify key factors influencing the scaling up of research-tested interventions across large networks of service systems, such as primary care, specialty care, and community practice, by 2015.45 NIH reports quarterly to the secretary of the Department of Health and Human Services on progress toward this goal, further evidence of the importance to this agency of advancing dissemination and implementation science.

At NIH, there has also been a marked growth in overall interest in the field. Since 2007, NIH has held an annual conference on the science of dissemination and implementation at which research findings are presented and topics are discussed for further development. The first conference, which showcased NIH-funded research, was attended by some 300 people. Starting with the second conference, NIH issued a call for abstracts and balanced original research presentations, symposia, and organized think tanks with plenary sessions around an annual conference theme. The fifth conference, held in March 2012 in partnership with the US Department of Veterans Affairs (VA), had more than 1200 registrants and reflected on the progress made in recent years in developing dissemination and implementation science, spotlighting the promise of technology and policy for improving population health. Each of the annual conferences has also afforded researchers a chance to receive grant writing assistance from NIH staff and funded researchers. The 2011 conference technical assistance workshop was attended by approximately 100 participants.

To further build research capacity, OBSSR, the National Cancer Institute, the National Institute of Mental Health, and the VA sponsored the weeklong Training Institute for Dissemination and Implementation Research in Health in August 2011 to provide in-depth training for researchers interested in moving into the field. Contingent on future budgets, the intent is to make this an annual training opportunity. Finally, the National Institute of Diabetes and Digestive and Kidney Diseases has launched the Centers for Diabetes Translation Research, which will directly support rigorous translation research aimed at prevention and improved treatment of diabetes. These examples are not comprehensive but reflect the commitment of NIH to train researchers in undertaking studies that balance rigor with relevance and in using designs and methods appropriate for the complex processes involved in dissemination and implementation.

Dissemination and implementation research is at a crossroads. To move the field forward, it will be important to focus on the following key opportunities.

Scaling Up and Sustainability

There has been progress in the quality, quantity, and scope of dissemination and implementation research. However, true return on research investment requires improvements in the adoption and implementation of effective interventions within discrete clinical and community settings. It requires advances in 2 additional dimensions as well: scaling up and sustainability.

Given the diversity of health care settings in the United States, a key challenge is understanding how best to scale up successful interventions to regional, national, or international levels.40,46 There are notable examples, including the DPP initiatives mentioned earlier and the Centers for Disease Control and Prevention’s Diffusion of Effective Behavioral Interventions project, which provides training and technical assistance on selected evidence-based sexually transmitted disease prevention interventions. Enthusiastic about results from efficacy trials, practitioners often want to disseminate results without further research. Yet, a great deal of benefit can be “lost in translation.” In public health, examples of scaling up include universal assessments or screenings such as measurement of blood pressure at primary care visits, vaccinations, and regular Papanicolaou test screening. Despite some successes, little evidence exists to recommend optimal strategies for effective large-scale rollout. We encourage studies designed to tackle this important research question.

Sustainability is the long-term integration of effective interventions within specific settings. Although implementation is meaningful only if program and policy results can be sustained, limited data exist on how to maintain changes. Sustainability is commonly viewed either as exogenous (“What will happen once a study ends?”) or as a discrete postimplementation phase (“Twelve months after the implementation, we examined whether change had been sustained”). Current efficacy research is inadequate to effectively inform real-world practice, and indeed recent articles have noted the need to consider explicitly a sustainability phase wherein evidence-based interventions are reexamined to determine how well they fit within the structure and workflow of the organizations where they are delivered.47 We encourage longitudinal observational studies to examine the course of effective interventions once implemented, as well as prospective trials to test sustainability strategies in generalizable populations and settings.

To advance these opportunities, as well as the broader field of dissemination and implementation research, we suggest that the next generation of dissemination and implementation research studies be anchored around 5 core tenets: rigor and relevance, efficiency and speed, collaboration, improved capacity, and cumulative knowledge (Table 1).

TABLE 1— Core Dissemination and Implementation Values and Example Activities


Rigor and relevance: studies focused on diverse, low-resource settings; recognition of the importance of alternative research designs (simulation modeling, pragmatic trials, rapid learning studies, and systematic studies combined with environmental and community data) to address important public health challenges.

Efficiency and speed: limited funding for large-scale, multisite randomized controlled trials requires a shift to research designs that access existing and expanding data sets; growth in the availability of electronic health record data under new meaningful use guidelines; rapid learning health care systems and organizations.

Collaboration: team science; community/clinical partnerships through community-based participatory research; Clinical and Translational Science Awards.

Improved capacity: Training Institute for Dissemination and Implementation Research in Health; e-learning; Web 2.0 social networking.

Cumulative knowledge: emerging textbooks to consolidate knowledge; source materials in diverse fields.
Rigor and Relevance

Many of the current approaches to dissemination and implementation as well as comparative effectiveness research48,49 are limited in scope and applicability, underemphasizing the value of context and health services research and often focusing solely on drugs and devices.50 We know that no more than 20% of the variation in health outcomes is affected by the medical care system. The other 80% is determined by factors outside the system, including social and environmental influences.35 Dissemination and implementation research needs to investigate these factors, include diverse and low-resource settings, and recognize the necessity for nonrandomized experimental designs in answering crucial systems and public health questions.

Traditional randomized controlled trials focused on efficacy and effectiveness have advantages with respect to internal validity and testing of treatments under optimal conditions, but they are less beneficial in terms of external validity, relevance to many complex health care questions, and amount of time needed to produce answers.28,51–53 Alternative research designs and approaches should be considered. One example in the area of health care is well-controlled “rapid learning” studies (e.g., “plan, do, study, act” study cycles54) based on electronic health records (EHRs) from thousands of patients receiving care in real-world settings.55,56 Simulation modeling57 studies, along with longitudinal observational studies, are also needed to address scale up and sustainability.

With the availability of linked community health indicators and the explosion of publicly available, actionable data under Open Government Platform initiatives, geospatial methodologies will make the use of alternative designs more feasible and robust. This is particularly relevant for dissemination and implementation research, given that the goal is to allow diverse populations to benefit from interventions rather than to orchestrate optimal yet unsustainable practice conditions solely for research purposes. As evidenced by the findings from an NIH methodology conference published in 200729 and recent comparative effectiveness research meetings, methods need to fit the question and not vice versa.

Furthermore, the availability of such data is likely to dramatically change the way research is done. There may be fewer prospective studies involving new data collection. Replacing them will be systematic studies that involve information acquired as part of the health care process, combined with social, environmental, and community data from other sources. The major need is to develop methodologies that allow us to harmonize and integrate the enormous amount of data becoming available.58 Engineering sciences and other recent examples (e.g., Community Health Status Indicators) demonstrate that these methodologies and practical tools can be created. For instance, vast amounts of information can be quickly collated and summarized in dashboard applications. Yet, we need new analytic methods to allow appropriate analyses and meaningful use of these large amounts of information. Methods are needed to distinguish true signals from noise and chance findings in such enormous data sets, to handle huge amounts of longitudinal data, and to guard against inappropriate inferences.
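As one concrete illustration of separating signal from chance when many associations are screened at once, false discovery rate procedures such as Benjamini-Hochberg are widely used in large-scale data analysis. The sketch below uses invented p-values purely for illustration:

```python
def benjamini_hochberg(p_values, alpha=0.05):
    """Return indices of hypotheses rejected at false discovery rate alpha."""
    m = len(p_values)
    # Sort p-values ascending, remembering their original positions.
    order = sorted(range(m), key=lambda i: p_values[i])
    # Find the largest rank k with p_(k) <= (k/m) * alpha;
    # reject the hypotheses with the k smallest p-values.
    k = 0
    for rank, idx in enumerate(order, start=1):
        if p_values[idx] <= rank / m * alpha:
            k = rank
    return sorted(order[:k])

# Screening many candidate associations: most are noise, a few may be real.
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.2, 0.5, 0.8, 0.9]
print(benjamini_hochberg(pvals))  # [0, 1]
```

With a naive per-test threshold of 0.05, five of these ten "associations" would be declared significant; controlling the false discovery rate retains only the two strongest, which is the kind of guard against inappropriate inference the text calls for.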

Efficiency and Speed

Funding for future large-scale, multisite randomized clinical trials is likely to be limited, partly as a result of fiscal issues but also because of the need for new methods that can more rapidly inform health care practice. NIH has limited capacity to fund large-scale evaluations of the ever-expanding array of promising health care therapies, devices, and practices. However, as noted earlier, data that can inform decisions about treatment effectiveness and degree of implementation within care settings will be abundant. Currently, about half of all US providers use EHRs.59 In the next few years, this percentage is expected to increase to 80%. This means that information will be available on hundreds of millions of real-world medical encounters between typical patients and typical health care teams in settings with typical constraints. Although there are some limitations, data from these encounters can help inform the effectiveness and implementation of health care interventions.

NIH, in conjunction with the Society of Behavioral Medicine, sponsored a workshop in May 2011 to identify standard patient-reported health behavior and psychosocial data elements that could be routinely collected in EHRs. Development of a consensus on this issue could be an important step in moving toward a large, harmonized database of information gathered from individuals in naturally occurring health care encounters and would also support both patient-centered care60,61 and population-based research.

One common health service research finding is substantial variability in treatments provided across different settings.62 For example, patients in Los Angeles are likely to receive much more aggressive and expensive treatments than demographically and geographically equivalent patients in San Diego.63 Comparisons of such variations in health care reflect ongoing “natural” dissemination and implementation experiments. Although these are not randomized designs, equivalent populations are receiving different treatments on a quasi-random basis. It may be possible, through the use of EHR data, to make valid inferences about implementation strategies that are used with some providers and not others, especially when results are replicated across multiple states, health care systems, and populations. This new science can be more rapid and efficient, considerably less expensive, and more contextually relevant.28,64
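The quasi-experimental logic described here can be sketched with a simple difference-in-differences contrast, one common way to draw inferences from such natural experiments. All numbers below are invented for illustration:

```python
# Difference-in-differences on invented outcome rates (illustration only).
# Region A adopts an implementation strategy between the two periods;
# Region B does not and serves as the comparison group.
a_before, a_after = 0.40, 0.55  # e.g., share of eligible patients treated
b_before, b_after = 0.42, 0.45

change_a = a_after - a_before   # change in the adopting region
change_b = b_after - b_before   # secular trend in the comparison region

# The estimated effect is the adopting region's change net of the trend.
did = change_a - change_b
print(f"Estimated implementation effect: {did:.2f}")
```

The key assumption, which would need scrutiny with real EHR data, is that the two populations would have followed parallel trends absent the strategy; replication across multiple states, systems, and populations, as the text suggests, strengthens that inference.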

Collaboration

To advance to the next level, new dissemination and implementation research–practice collaborations are required among health care researchers, economists, information scientists, biostatisticians, and, most important, key stakeholders, including the citizens and practitioners who will need to implement and will be affected by innovations.65 Addressing the substantial public health challenges we face will require team science66 and a blending of clinical, public health, and community research to a greater extent than ever before. The Clinical and Translational Science Awards should provide fertile laboratories for such research, as do various research networks such as the VA, the HMO Research Network, and the NIH-supported Cancer Research Network, Cardiovascular Research Network, and Mental Health Research Network. The growing need for team science will necessitate changes in academic systems to account for the time and challenges of developing these collaborations and recognize the importance of dissemination and implementation science.67

Improved Capacity

Despite the promise of emerging methodological and analytical approaches, many dissemination and implementation methods are not widely known or understood. The next generation of scientists is now being trained, but not in methods relevant to dissemination and implementation science such as EHR data interpretation or designs that can produce rapid, replicable, and relevant solutions to real-world problems. There is a pressing need to make these methods available to traditional scientists, scientists in training, and key stakeholders in settings such as primary care clinics, community hospitals, health care plans, workplaces, community-based organizations, and voluntary health associations. NIH training efforts such as the earlier-mentioned Training Institute for Dissemination and Implementation Research in Health are important initial steps, but much more is needed.

In particular, an investment in e-learning and other online and Web 2.0 social media is needed to produce a new generation of scientists with this expertise and to provide retraining for established scientists. Both the Veterans Affairs Quality Enhancement Research Initiative in the United States and the Knowledge Translation Canada network have excellent training and ongoing support programs in implementation science; to substantially advance this scientific area, however, more such opportunities are needed to train practitioners, researchers, and policymakers.

To fully realize the potential for public health impact, systems for delivering evidence-based approaches must be expanded and efforts must be made to increase demand by both practitioners and individuals for evidence-based treatments and interventions.67,68 Tools that promote dissemination of evidence-based interventions such as the National Registry of Evidence-Based Programs and Practices69 and the Research-tested Intervention Programs70 should be expanded, and vehicles such as Research to Reality71 that promote interactions between researchers and practitioners should be more broadly promoted.

Cumulative Knowledge

Texts are starting to emerge in the area of dissemination and implementation science,72 and contributions have been made by numerous fields, including biology, business, sociology, economics, and organizational, educational, and systems sciences. Both the pioneering journal in this field, Implementation Science, and newer publications are important repositories of accumulated knowledge. New investigators would do well to refer to such sources, as well as original resources such as Christensen et al.,73 McLeroy,74 Epping-Jordan et al.,75 Rogers,18 Steckler and Linnan,76 and Stokols,77 rather than having to rediscover key lessons learned in each new subarea.

The ultimate goal of dissemination and implementation science is to ensure that advances in health science become standards for care in all populations and all health care settings. Our perspective and knowledge base are admittedly heavily influenced by our NIH setting, and again we make no claim that this is a comprehensive review. We explicitly acknowledge the important work being done by sister agencies such as the Agency for Healthcare Research and Quality, the Centers for Disease Control and Prevention, the Health Resources and Services Administration, the VA, and the Patient-Centered Outcomes Research Institute; the crucial efforts of numerous private foundations; and the valuable pioneering work being done on dissemination and implementation in other countries that has not been discussed here.

In the coming years, we envision the continued development of a robust dissemination and implementation evidence base that not only demonstrates success in integrating the knowledge gained into clinical and community practice but feeds back knowledge to improve the rigor, relevance, efficiency, speed, and impact of the biomedical research enterprise.

## Human Participant Protection

No protocol approval was needed for this study because no human participants were involved.

# References

1. Collins FS. Opportunities for research and NIH. Science. 2010;327(5961):36–37.
2. Lloyd-Jones DM, Hong Y, Labarthe D, et al. Defining and setting national goals for cardiovascular health promotion and disease reduction: the American Heart Association's strategic impact goal through 2020 and beyond. Circulation. 2010;121(4):586–613.
3. Egan BM, Zhao Y, Axon RN. US trends in prevalence, awareness, treatment, and control of hypertension, 1988–2008. JAMA. 2010;303(20):2043–2050.
4. Glasgow RE. What types of evidence are most needed to advance behavioral medicine? Ann Behav Med. 2008;35(1):19–25.
5. Glasgow RE, Klesges LM, Dzewaltowski DA, Bull SS, Estabrooks P. The future of health behavior change research: what is needed to improve translation of research into health promotion practice? Ann Behav Med. 2004;27(1):3–12.
6. McGlynn EA, Asch SM, Adams J, et al. The quality of health care delivered to adults in the United States. N Engl J Med. 2003;348(26):2635–2645.
7. Abildso CG, Zizzi SJ, Reger-Nash B. Evaluating an insurance-sponsored weight management program with the RE-AIM model, West Virginia, 2004–2008. Prev Chronic Dis. 2010;7(3):A46.
8. Centers for Disease Control and Prevention. Early release of selected estimates based on data from the January–March 2011 National Health Interview Survey. Available at: http://www.cdc.gov/nchs/nhis/released201203.htm. Accessed April 24, 2012.
9. Woolf SH, Johnson RE. Woolf and Johnson respond. Am J Public Health. 2005;95(8):1306–1307.
10. Stamos TD, Shaltoni H, Girard SA, Parrillo JE, Calvin JE. Effectiveness of chart prompts to improve physician compliance with the National Cholesterol Education Program guidelines. Am J Cardiol. 2001;88(12):1420–1423.
11. Gawande A. The Checklist Manifesto: How to Get Things Right. New York, NY: Metropolitan Books; 2009.
12. Wlodarczyk J, Sullivan D, Smith M. Comparison of benefits and risks of rosuvastatin versus atorvastatin from a meta-analysis of head-to-head randomized controlled trials. Am J Cardiol. 2008;102(12):1654–1662.
13. Weng T, Yang Y-K, Lin S, Tai S. A systematic review and meta-analysis on the therapeutic equivalence of statins. J Clin Pharm Ther. 2010;35(2):139–151.
14. Lazar LD, Pletcher MJ, Coxson PG, Bibbins-Domingo K, Goldman L. Cost-effectiveness of statin therapy for primary prevention in a low-cost statin era. Circulation. 2011;124(2):146–153.
15. Woolf SH, Johnson RE, Fryer GE Jr, Rust G, Satcher D. The health impact of resolving racial disparities: an analysis of US mortality data. Am J Public Health. 2004;94(12):2078–2081.
16. Rabin BA, Glasgow RE. Dissemination of ehealth communication programs. In: Noar SM, Harrington NG, eds. eHealth Applications: Promising Strategies for Behavior Change. New York, NY: Routledge; 2012:221–245.
17. US Dept of Health and Human Services. Program announcement number PAR-10-038. Available at: http://grants.nih.gov/grants/guide/pa-files/PAR-10-038.html. Accessed April 24, 2012.
18. Rogers EM. Diffusion of Innovations. 5th ed. New York, NY: Free Press; 2003.
19. Implementation Science. Aims and scope. Available at: http://www.implementationscience.com/about#aimsscope. Accessed April 24, 2012.
20. Ruhe MC, Weyer SM, Zronek S, Wilkinson A, Wilkinson PS, Stange KC. Facilitating practice change: lessons from the STEP-UP clinical trial. Prev Med. 2005;40(6):729–734.
21. Glisson C, Schoenwald SK, Hemmelgarn A, et al. Randomized trial of MST and ARC in a two-level evidence-based treatment implementation strategy. J Consult Clin Psychol. 2010;78(4):537–550.
22. Brownson RC, Chriqui JF, Stamatakis KA. Understanding evidence-based public health policy. Am J Public Health. 2009;99(9):1576–1583.
23. Stokols D. Translating social ecological theory into guidelines for community health promotion. Am J Health Promot. 1996;10(4):282–298.
24. Zwarenstein M, Treweek S. What kind of randomised trials do patients and clinicians need? Evid Based Med. 2009;14(4):101–103.
25. Green LW, Ottoson JM, Garcia C, Hiatt RA. Diffusion theory and knowledge dissemination, utilization, and integration in public health. Annu Rev Public Health. 2009;30:151–174.
26. Glasgow RE, Emmons KM. How can we increase translation of research into practice? Types of evidence needed. Annu Rev Public Health. 2007;28:413–433.
27. Damschroder L, Aron D, Keith R, Kirsh S, Alexander J, Lowery J. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):50.
28. Kessler R, Glasgow RE. A proposal to speed translation of healthcare research into practice: dramatic change is needed. Am J Prev Med. 2011;40(6):637–644.
29. Mercer SM, DeVinney BJ, Fine LJ, Green LW. Study designs for effectiveness and translation research: identifying trade-offs. Am J Prev Med. 2007;33(2):139–154.
30. Tinetti ME, Studenski SA. Comparative effectiveness research and patients with multiple chronic conditions. N Engl J Med. 2011;364(26):2478–2481.
31. Khoury MJ, Gwinn M, Ioannidis JP. The emergence of translational epidemiology: from scientific discovery to population health impact. Am J Epidemiol. 2010;172(5):517–524.
32. Khoury MJ, Gwinn M, Yoon PW, Dowling N, Moore CA, Bradley L. The continuum of translation research in genomic medicine: how can we accelerate the appropriate integration of human genome discoveries into health care and disease prevention? Genet Med. 2007;9(10):665–674.
33. Collins FS. Reengineering translational science: the time is right. Sci Transl Med. 2011;3(90):90cm17.
34. Contopoulos-Ioannidis DG, Alexiou GA, Gouvias TC, Ioannidis JPA. Life cycle of translational research for medical interventions. Science. 2008;321(5894):1298–1299.
35. Teutsch SM, Fielding JE. Comparative effectiveness—looking under the lamppost. JAMA. 2011;305(21):2225–2226.
36. Kennedy A, Reeves D, Bower P, et al. The effectiveness and cost effectiveness of a national lay-led self care support programme for patients with long-term conditions: a pragmatic randomised controlled trial. J Epidemiol Community Health. 2007;61(3):254–261.
37. Lorig KR, Ritter P, Stewart AL, et al. Chronic disease self-management program: 2-year health status and health care utilization outcomes. Med Care. 2001;39(11):1217–1223.
38. Gonzalez VM, Stewart A, Ritter PL, Lorig K. Translation and validation of arthritis outcome measures into Spanish. Arthritis Rheum. 1995;38(10):1429–1446.
39. Lorig KR, Mazonson PD, Holman HR. Evidence suggesting that health education for self-management in patients with chronic arthritis has sustained health benefits while reducing health care costs. Arthritis Rheum. 1993;36(4):439–446.
40. Rogers A, Kennedy A, Bower P, et al. The United Kingdom expert patients programme: results and implications from a national evaluation. Med J Aust. 2008;189(suppl 10):S21–S24.
41. Knowler WC, Barrett-Connor E, Fowler SE, et al. Reduction in the incidence of type 2 diabetes with lifestyle intervention or metformin. N Engl J Med. 2002;346(6):393–403.
42. Katula JA, Vitolins MZ, Rosenberger EL, et al. One-year results of a community-based translation of the Diabetes Prevention Program: Healthy-Living Partnerships to Prevent Diabetes (HELP PD) project. Diabetes Care. 2011;34(7):1451–1457.
43. Ackermann RT, Finch EA, Brizendine E, Zhou H, Marrero DG. Translating the Diabetes Prevention Program into the community: the DEPLOY pilot study. Am J Prev Med. 2008;35(4):357–363.
44. Lichtenstein E, ed. Quitlines [special issue]. Tob Control. 2007;16(suppl 1):1–86.
45. US Dept of Health and Human Services. Strategic plan: fiscal years 2010–2015. Available at: http://www.hhs.gov/secretary/about/priorities/strategicplan2010-2015.pdf. Accessed April 24, 2012.
46. Glasgow RE. HMC research translation: speculations about making it real and going to scale. Am J Health Behav. 2010;34(6):833–840.
47. Scheirer MA, Dearing JW. An agenda for research on the sustainability of public health programs. Am J Public Health. 2011;101(11):2059–2067.
48. Agency for Healthcare Research and Quality. What is comparative effectiveness research? Available at: http://www.effectivehealthcare.ahrq.gov/index.cfm/what-is-comparative-effectiveness-research1. Accessed April 24, 2012.
49. Glasgow RE, Steiner J. Comparative effectiveness to accelerate translation: recommendations for an emerging field of science. In: Brownson R, Colditz G, Proctor E, eds. Dissemination and Implementation Research in Health: Translating Science to Practice. New York, NY: Oxford University Press; 2012:55–71.
50. Fuchs VR, Milstein A. The $640 billion question—why does cost-effective care diffuse so slowly? N Engl J Med. 2011;364(21):1985–1987.
51. Glasgow RE, Klesges LM, Dzewaltowski DA, Estabrooks PA, Vogt TM. Evaluating the overall impact of health promotion programs: using the RE-AIM framework to form summary measures for decision making involving complex issues. Health Educ Res. 2006;21(5):688–694.
52. Glasgow RE, Nelson CC, Strycker LA, King DK. Using RE-AIM metrics to evaluate diabetes self-management support interventions. Am J Prev Med. 2006;30(1):67–73.
53. Kessler R, Glasgow RE. A proposal to speed translation of healthcare intervention research into practice: dramatic change is needed. Am J Prev Med. 2011;40(6):637–644.
54. Langley GL, Nolan KM, Nolan TW, Norman CL, Provost LP. The Improvement Guide: A Practical Approach to Enhancing Organizational Performance. 2nd ed. San Francisco, CA: Jossey-Bass; 2009.
55. Etheredge LM. A rapid-learning health system. Health Aff (Millwood). 2007;26(2):w107–w118.
56. Institute of Medicine. The Learning Healthcare System: Workshop Summary. Washington, DC: National Academies Press; 2007.
57. Milstein B, Homer J, Hirsch G. Analyzing national health reform strategies with a dynamic simulation model. Am J Public Health. 2010;100(5):811–819.
58. Holdren JP. America COMPETES Act keeps America's leadership on target. Available at: http://www.whitehouse.gov/blog/2011/01/06/america-competes-act-keeps-americas-leadership-target. Accessed April 24, 2012.
60. Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academies Press; 2001.
61. Glasgow RE, Green LW, Taylor MV, Stange KC. An evidence integration triangle for aligning science with policy and practice. Am J Prev Med. In press.
62. Wennberg JE. Tracking Medicine: A Researcher's Quest to Understand Health Care. New York, NY: Oxford University Press; 2010.
63. Kaplan RM. Variation between end-of-life health care costs in Los Angeles and San Diego: why are they so different? J Palliat Med. 2011;14(2):215–220.
64. Green LW, Glasgow RE. Evaluating the relevance, generalization, and applicability of research: issues in external validity and translation methodology. Eval Health Prof. 2006;29(1):126–153.
65. Hall KL, Feng AX, Moser RP, Stokols D, Taylor BK. Moving the science of team science forward: collaboration and creativity. Am J Prev Med. 2008;35(suppl 2):S243–S249.
66. Stokols D, Hall KL, Taylor BK, Moser RP. The science of team science: overview of the field and introduction to the supplement. Am J Prev Med. 2008;35(suppl 2):S77–S89.
67. Colditz GA, Emmons KM, Vishwanath K, Kerner JF. Translating science to practice: community and academic perspectives. J Public Health Manag Pract. 2008;14(2):144–149.
68. Kerner J, Guirguis-Blake J, Hennessy K, et al. Translating research into improved outcomes in comprehensive cancer control. Cancer Causes Control. 2005;16(suppl 1):27–40.
69. National Registry of Evidence-Based Programs and Practices. Available at: http://www.nrepp.samhsa.gov. Accessed May 10, 2012.
70. Research-Tested Intervention Programs. Available at: http://rtips.cancer.gov/rtips. Accessed May 10, 2012.
71. Research to Reality. Available at: https://researchtoreality.cancer.gov. Accessed May 10, 2012.
72. Brownson RC, Colditz GA, Proctor EK. Dissemination and Implementation Research in Health: Translating Science to Practice. New York, NY: Oxford University Press; 2012.
73. Christensen CM, Grossman JH, Hwang J. The Innovator's Prescription: A Disruptive Solution for Health Care. New York, NY: McGraw-Hill; 2009.
74. McLeroy K. Thinking of systems. Am J Public Health. 2006;96(3):402.
75. Epping-Jordan JE, Pruitt SD, Bengoa R, Wagner EH. Improving the quality of health care for chronic conditions. Qual Saf Health Care. 2004;13(4):299–305.
76. Steckler AB, Linnan L. Process Evaluation for Public Health Interventions and Research. San Francisco, CA: Jossey-Bass; 2002.
77. Stokols D. Establishing and maintaining health environments: toward a social ecology of health promotion. Am Psychol. 1992;47(1):6–22.


### ARTICLE CITATION

Russell E. Glasgow, PhD, Cynthia Vinson, MPA, David Chambers, DPhil, Muin J. Khoury, MD, PhD, Robert M. Kaplan, PhD, and Christine Hunter, PhD. "National Institutes of Health Approaches to Dissemination and Implementation Science: Current and Future Directions." American Journal of Public Health. 2012;102(7):1274–1281.

Russell E. Glasgow, Cynthia Vinson, and Muin J. Khoury are with the Division of Cancer Control and Population Sciences, National Cancer Institute, Bethesda, MD. David Chambers is with the Division of Services and Interventions Research, National Institute of Mental Health, Bethesda, MD. Robert M. Kaplan is with the Office of Behavioral and Social Sciences Research, National Institutes of Health, Bethesda, MD. Christine Hunter is with the Division of Diabetes, Endocrinology, and Metabolic Diseases, National Institute of Diabetes and Digestive and Kidney Diseases, Bethesda, MD.

https://doi.org/10.2105/AJPH.2012.300755

PMID: 22594758