The term “global health” is rapidly replacing the older terminology of “international health.” We describe the role of the World Health Organization (WHO) in both international and global health and in the transition from one to the other. We suggest that the term “global health” emerged as part of larger political and historical processes, in which WHO found its dominant role challenged and began to reposition itself within a shifting set of power alliances.
Between 1948 and 1998, WHO moved from being the unquestioned leader of international health to being an organization in crisis, facing budget shortfalls and diminished status, especially given the growing influence of new and powerful players. We argue that WHO began to refashion itself as the coordinator, strategic planner, and leader of global health initiatives as a strategy of survival in response to this transformed international political context.
Even a quick glance at the titles of books and articles in recent medical and public health literature suggests that an important transition is under way. The terms “global,” “globalization,” and their variants are everywhere, and in the specific context of international public health, “global” seems to be emerging as the preferred authoritative term.1 As one indicator, the number of entries in PubMed under the rubrics “global health” and “international health” shows that “global health” is rapidly on the rise, seemingly on track to overtake “international health” in the near future (Table 1). Although universities, government agencies, and private philanthropies are all using the term in highly visible ways,2 the origin and meaning of the term “global health” are still unclear.
We provide historical insight into the emergence of the terminology of global health. We believe that an examination of this linguistic shift will yield important fruit, and not just information about fashions and fads in language use. Our task here is to provide a critical analysis of the meaning, emergence, and significance of the term “global health” and to place its growing popularity in a broader historical context. In particular, we focus on the role of the World Health Organization (WHO) in both international and global health and as an agent in the transition from one concept to the other.
Let us first define and differentiate some essential terms. “International health” was already a term of considerable currency in the late 19th and early 20th century, when it referred primarily to a focus on the control of epidemics across the boundaries between nations (i.e., “international”). “Intergovernmental” refers to the relationships between the governments of sovereign nations—in this case, with regard to the policies and practices of public health. “Global health,” in general, implies consideration of the health needs of the people of the whole planet above the concerns of particular nations. The term “global” is also associated with the growing importance of actors beyond governmental or intergovernmental organizations and agencies—for example, the media, internationally influential foundations, nongovernmental organizations, and transnational corporations. Logically, the terms “international,” “intergovernmental,” and “global” need not be mutually exclusive and in fact can be understood as complementary. Thus, we could say that WHO is an intergovernmental agency that exercises international functions with the goal of improving global health.
Given these definitions, it should come as no surprise that global health is not entirely an invention of the past few years. The term “global” was sometimes used well before the 1990s, as in the “global malaria eradication program” launched by WHO in the mid-1950s; a WHO Public Affairs Committee pamphlet of 1958, The World Health Organization: Its Global Battle Against Disease3; a 1971 report for the US House of Representatives entitled The Politics of Global Health4; and many studies of the “global population problem” in the 1970s.5 But the term was generally limited and its use in official statements and documents sporadic at best. Now there is an increasing frequency of references to global health.6 Yet the questions remain: How many have participated in this shift in terminology? Do they consider it trendy, trivial, or trenchant?
Supinda Bunyavanich and Ruth B. Walkup tried to answer these questions in a report published under the provocative title “US Public Health Leaders Shift Toward a New Paradigm of Global Health,” based on conversations conducted in 1999 with 29 “international health leaders.”7 Their respondents fell into 2 groups. About half felt that there was no need for a new terminology and that the label “global health” was meaningless jargon. The other half thought that there were profound differences between international health and global health and that “global” clearly meant something transnational. Although these respondents believed that a major shift had occurred within the previous few years, they seemed unable to articulate or define it clearly.
In 1998, Derek Yach and Douglas Bettcher came closer to capturing both the essence and the origin of the new global health in a 2-part article on “The Globalization of Public Health” in the American Journal of Public Health.8 They defined the “new paradigm” of globalization as “the process of increasing economic, political, and social interdependence and integration as capital, goods, persons, concepts, images, ideas and values cross state boundaries.” The roots of globalization were long, they said, going back at least to the 19th century, but the process was assuming a new magnitude in the late 20th century. The globalization of public health, they argued, had a dual aspect, one both promising and threatening.
In one respect, there was easier diffusion of useful technologies and of ideas and values such as human rights. In another, there were such risks as diminished social safety nets; the facilitated marketing of tobacco, alcohol, and psychoactive drugs; the easier worldwide spread of infectious diseases; and the rapid degradation of the environment, with dangerous public health consequences. But Yach and Bettcher were convinced that WHO could turn these risks into opportunities. WHO, they argued, could help create more efficient information and surveillance systems by strengthening its global monitoring and alert systems, thus creating “global early warning systems.” They believed that even the most powerful nations would buy into this new globally interdependent world system once these nations realized that such involvement was in their best interest.
Despite the long list of problems and threats, Yach and Bettcher were largely uncritical as they promoted the virtues of global public health and the leadership role of WHO. In an editorial in the same issue of the Journal, George Silver noted that Yach and Bettcher worked for WHO and that their position was similar to other optimistic stances taken by WHO officials and advocates. But WHO, Silver pointed out, was actually in a bad way: “The WHO’s leadership role has passed to the far wealthier and more influential World Bank, and the WHO’s mission has been dispersed among other UN agencies.” Wealthy donor countries were billions of dollars in arrears, and this left the United Nations and its agencies in “disarray, hamstrung by financial constraints and internal incompetencies, frustrated by turf wars and cross-national policies.”9 Given these realities, Yach and Bettcher’s promotion of “global public health” while they were affiliated with WHO was, to say the least, intriguing. Why were these spokesmen for the much-criticized and apparently hobbled WHO so upbeat about “global” public health?
To better understand Yach and Bettcher’s role, and that of WHO more generally, it will be helpful to review the history of the organization from 1948 to 1998, as it moved from being the unquestioned leader of international health to searching for its place in the contested world of global health.
WHO formally began in 1948, when the first World Health Assembly in Geneva, Switzerland, ratified its constitution. The idea of a permanent institution for international health can be traced to the organization in 1902 of the International Sanitary Office of the American Republics, which, some decades later, became the Pan American Sanitary Bureau and eventually the Pan American Health Organization.10 The Rockefeller Foundation, especially its International Health Division, was also a very significant player in international health in the early 20th century.11
Two European-based international health agencies were also important. One was the Office International d’Hygiène Publique, which began functioning in Paris in 1907; it concentrated on several basic activities related to the administration of international sanitary agreements and the rapid exchange of epidemiological information.12 The second agency, the League of Nations Health Organization, began its work in 1920.13 This organization established its headquarters in Geneva, sponsored a series of international commissions on diseases, and published epidemiological intelligence and technical reports. The League of Nations Health Organization was poorly funded and faced covert opposition from other national and international organizations, including the US Public Health Service. Despite these complications, which limited the Health Organization’s effectiveness, both the Office International d’Hygiène Publique and the Health Organization survived through World War II and were present at the critical postwar moment when the future of international health would be defined.
An international conference in 1945 approved the creation of the United Nations and also voted for the creation of a new specialized health agency. Participants at the meeting initially formed a commission of prominent individuals, among whom were René Sand from Belgium, Andrija Stampar from Yugoslavia, and Thomas Parran from the United States. Sand and Stampar were widely recognized as champions of social medicine. The commission held meetings between 1946 and early 1948 to plan the new international health organization. Representatives of the Pan American Sanitary Bureau, whose leaders resisted being absorbed by the new agency, were also involved, as were leaders of new institutions such as the United Nations Relief and Rehabilitation Administration (UNRRA).
Against this background, the first World Health Assembly met in Geneva in June 1948 and formally created the World Health Organization. The Office International d’Hygiène Publique, the League of Nations Health Organization, and UNRRA merged into the new agency. The Pan American Sanitary Bureau—then headed by Fred L. Soper, a former Rockefeller Foundation official—was allowed to retain autonomous status as part of a regionalization scheme.14 WHO formally divided the world into a series of regions—the Americas, Southeast Asia, Europe, Eastern Mediterranean, Western Pacific, and Africa—but it did not fully implement this regionalization until the 1950s. Although an “international” and “intergovernmental” mindset prevailed in the 1940s and 1950s, naming the new organization the World Health Organization also raised sights to a worldwide, “global” perspective.
The first director general of WHO, Brock Chisholm, was a Canadian psychiatrist loosely identified with the British social medicine tradition. The United States, a main contributor to the WHO budget, played a contradictory role: on the one hand, it supported the UN system with its broad worldwide goals, but on the other, it was jealous of its sovereignty and maintained the right to intervene unilaterally in the Americas in the name of national security. Another problem for WHO was that its constitution had to be ratified by nation states, a slow process: by 1949, only 14 countries had signed on.15
As an intergovernmental agency, WHO had to be responsive to the larger political environment. The politics of the Cold War had a particular salience, with an unmistakable impact on WHO policies and personnel. Thus, when the Soviet Union and other communist countries withdrew from WHO in 1949, the United States and its allies were easily able to exert a dominating influence. In 1953, Chisholm completed his term as director general and was replaced by the Brazilian Marcolino Candau. Candau, who had worked under Soper on malaria control in Brazil, was associated first with the “vertical” disease control programs of the Rockefeller Foundation and then with their adoption by the Pan American Sanitary Bureau when Soper moved to that agency as director.16 Candau would be director general of WHO for over 20 years. From 1949 until 1956, when the Soviet Union returned to WHO, the organization was closely allied with US interests.
In 1955, Candau was charged with overseeing WHO’s campaign of malaria eradication, approved that year by the World Health Assembly. The ambitious goal of malaria eradication had been conceived and promoted in the context of great enthusiasm and optimism about the ability of widespread DDT spraying to kill mosquitoes. As Randall Packard has argued, the United States and its allies believed that global malaria eradication would usher in economic growth and create overseas markets for US technology and manufactured goods.17 It would build support for local governments and their US supporters and help win “hearts and minds” in the battle against Communism. Mirroring then-current development theories, the campaign promoted technologies brought in from outside and made no attempt to enlist the participation of local populations in planning or implementation. This model of development assistance fit neatly into US Cold War efforts to promote modernization with limited social reform.18
With the return of the Soviet Union and other communist countries in 1956, the political balance in the World Health Assembly shifted and Candau accommodated the changed balance of power. During the 1960s, malaria eradication was facing serious difficulties in the field; ultimately, it would suffer colossal and embarrassing failures. In 1969, the World Health Assembly, declaring that it was not feasible to eradicate malaria in many parts of the world, began a slow process of reversal, returning once again to an older malaria control agenda. This time, however, there was a new twist; the 1969 assembly emphasized the need to develop rural health systems and to integrate malaria control into general health services.
When the Soviet Union returned to WHO, its representative at the assembly was the Soviet deputy minister of health. He argued that it was now scientifically feasible, socially desirable, and economically worthwhile to attempt to eradicate smallpox worldwide.19 The Soviet Union wanted to make its mark on global health, and Candau, recognizing the shifting balance of power, was willing to cooperate. The Soviet Union and Cuba agreed to provide 25 million and 2 million doses of freeze-dried vaccine, respectively; in 1959, the World Health Assembly committed itself to a global smallpox eradication program.
In the 1960s, technical improvements—jet injectors and bifurcated needles—made the process of vaccination much cheaper, easier, and more effective. The United States’ interest in smallpox eradication sharply increased; in 1965, Lyndon Johnson instructed the US delegation to the World Health Assembly to pledge American support for an international program to eradicate smallpox from the earth.20 At that time, despite a decade of marked progress, the disease was still endemic in more than 30 countries. In 1967, now with the support of the world’s most powerful players, WHO launched the Intensified Smallpox Eradication Program. This program, an international effort led by the American Donald A. Henderson, would ultimately be stunningly successful.21
Within WHO, there have always been tensions between social and economic approaches to population health and technology- or disease-focused approaches. These approaches are not necessarily incompatible, although they have often been at odds. The emphasis on one or the other waxes and wanes over time, depending on the larger balance of power, the changing interests of international players, the intellectual and ideological commitments of key individuals, and the way that all of these factors interact with the health policymaking process.
During the 1960s and 1970s, changes in WHO were significantly influenced by a political context marked by the emergence of decolonized African nations, the spread of nationalist and socialist movements, and new theories of development that emphasized long-term socioeconomic growth rather than short-term technological intervention. Rallying within organizations such as the Non-Aligned Movement, developing countries created the UN Conference on Trade and Development (UNCTAD), where they argued vigorously for fairer terms of trade and more generous financing of development.22 In Washington, DC, more liberal politics succeeded the conservatism of the 1950s, with the civil rights movement and other social movements forcing changes in national priorities.
This changing political environment was reflected in corresponding shifts within WHO. In the 1960s, WHO acknowledged that a strengthened health infrastructure was prerequisite to the success of malaria control programs, especially in Africa. In 1968, Candau called for a comprehensive and integrated plan for curative and preventive care services. A Soviet representative called for an organizational study of methods for promoting the development of basic health services.23 In January 1971, the WHO Executive Board agreed to undertake this study, and its results were presented to the assembly in 1973.24 Socrates Litsios has discussed many of the steps in the transformation of WHO’s approach from an older model of health services to what would become the “Primary Health Care” approach.25 This new model drew upon the thinking and experiences of nongovernmental organizations and medical missionaries working in Africa, Asia, and Latin America at the grass-roots level. It also gained saliency from China’s reentry into the UN in 1971 and the widespread interest in Chinese “barefoot doctors,” who were reported to be transforming rural health conditions. These experiences underscored the urgency of a “Primary Health Care” perspective that included the training of community health workers and the resolution of basic economic and environmental problems.26
These new approaches were spearheaded by Halfdan T. Mahler, a Dane, who served as director general of WHO from 1973 to 1988. Under pressure from the Soviet delegate to the executive board, Mahler agreed to hold a major conference on the organization of health services in Alma-Ata, in the Soviet Union. Mahler was initially reluctant because he disagreed with the Soviet Union’s highly centralized and medicalized approach to the provision of health services.27 The Soviet Union succeeded in hosting the September 1978 conference, but the conference itself reflected Mahler’s views much more closely than it did those of the Soviets. The Declaration of Primary Health Care and the goal of “Health for All in the Year 2000” advocated an “intersectoral” and multidimensional approach to health and socioeconomic development, emphasized the use of “appropriate technology,” and urged active community participation in health care and health education at every level.28
David Tejada de Rivero has argued that “It is regrettable that afterward the impatience of some international agencies, both UN and private, and their emphasis on achieving tangible results instead of promoting change . . . led to major distortions of the original concept of primary health care.”29 A number of governments, agencies, and individuals saw WHO’s idealistic view of Primary Health Care as “unrealistic” and unattainable. The process of reducing Alma-Ata’s idealism to a practical set of technical interventions that could be implemented and measured more easily began in 1979 at a small conference—heavily influenced by US attendees and policies—held in Bellagio, Italy, and sponsored by the Rockefeller Foundation, with assistance from the World Bank. Those in attendance included the president of the World Bank, the vice president of the Ford Foundation, the administrator of USAID, and the executive secretary of UNICEF.30
The Bellagio meeting focused on an alternative concept to that articulated at Alma-Ata—“Selective Primary Health Care”—which was built on the notion of pragmatic, low-cost interventions that were limited in scope and easy to monitor and evaluate. Thanks primarily to UNICEF, Selective Primary Health Care was soon operationalized under the acronym “GOBI” (Growth monitoring to fight malnutrition in children, Oral rehydration techniques to defeat diarrheal diseases, Breastfeeding to protect children, and Immunizations).31
In the 1980s, WHO had to reckon with the growing influence of the World Bank. The Bank had initially been formed in 1946 to assist in the reconstruction of Europe and later expanded its mandate to provide loans, grants, and technical assistance to developing countries. At first, it funded large investments in physical capital and infrastructure; in the 1970s, however, it began to invest in population, health, and education programs, with an emphasis on population control.32 The World Bank approved its first loan for family planning in 1970. In 1979, the World Bank created a Population, Health, and Nutrition Department and adopted a policy of funding both stand-alone health programs and health components of other projects.
In its 1980 World Development Report, the Bank argued that both malnutrition and ill health could be countered by direct government action—with World Bank assistance.33 It also suggested that improving health and nutrition could accelerate economic growth, thus providing a good argument for social sector spending. As the Bank began to make direct loans for health services, it called for more efficient use of available resources and discussed the roles of the private and public sectors in financing health care. The Bank favored free markets and a diminished role for national governments.34 In the context of widespread indebtedness by developing countries and increasingly scarce resources for health expenditures, the World Bank’s promotion of “structural adjustment” measures at the very time that the HIV/AIDS epidemic erupted drew angry criticism but also underscored the Bank’s new influence.
In contrast to the World Bank’s increasing authority, in the 1980s the prestige of WHO was beginning to diminish. One sign of trouble was the 1982 vote by the World Health Assembly to freeze WHO’s budget.35 This was followed by the 1985 decision by the United States to pay only 20% of its assessed contribution to all UN agencies and to withhold its contribution to WHO’s regular budget, in part as a protest against WHO’s “Essential Drug Program,” which was opposed by leading US-based pharmaceutical companies.36 These events occurred amidst growing tensions between WHO and UNICEF and other agencies and the controversy over Selective versus Comprehensive Primary Health Care. As part of a rancorous public debate conducted in the pages of Social Science and Medicine in 1988, Kenneth Newell, a highly placed WHO official and an architect of Comprehensive Primary Health Care, called Selective Primary Health Care a “threat . . . [that] can be thought of as a counter-revolution.”37
In 1988, Mahler’s 15-year tenure as director general of WHO came to an end. Unexpectedly, Hiroshi Nakajima, a Japanese researcher who had been director of the WHO Western Pacific Regional Office in Manila, was elected the new director general.38
The first citizen of Japan ever elected to head a UN agency, Nakajima rapidly became the most controversial director general in WHO’s history. His nomination had not been supported by the United States or by a number of European and Latin American countries, and his performance in office did little to assuage their doubts. Nakajima did try to launch several important initiatives—on tobacco, global disease surveillance, and public–private partnerships—but fierce criticism persisted, raising questions about his autocratic style and poor management, his inability to communicate effectively, and, worst of all, cronyism and corruption.
Another symptom of WHO’s problems in the late 1980s was the growth of “extrabudgetary” funding. As Gill Walt of the London School of Hygiene and Tropical Medicine noted, there was a crucial shift from predominant reliance on WHO’s “regular budget”—drawn from member states’ contributions on the basis of population size and gross national product—to greatly increased dependence on extrabudgetary funding coming from donations by multilateral agencies or “donor” nations.39 By the period 1986–1987, extrabudgetary funds of $437 million had almost caught up with the regular budget of $543 million. By the beginning of the 1990s, extrabudgetary funding had overtaken the regular budget by $21 million, contributing 54% of WHO’s overall budget.
Enormous problems for the organization followed from this budgetary shift. Priorities and policies were still ostensibly set by the World Health Assembly, which was made up of all member nations. The assembly, however, now dominated numerically by poor and developing countries, had authority only over the regular budget, frozen since the early 1980s. Wealthy donor nations and multilateral agencies like the World Bank could largely call the shots on the use of the extrabudgetary funds they contributed. Thus, they created, in effect, a series of “vertical” programs more or less independent of the rest of WHO’s programs and decisionmaking structure. The dilemma for the organization was that although the extrabudgetary funds added to the overall budget, “they [increased] difficulties of coordination and continuity, [caused] unpredictability in finance, and a great deal of dependence on the satisfaction of particular donors,”40 as Gill Walt explained.
Fiona Godlee published a series of articles in 1994 and 1995 that built on Walt’s critique.41 She concluded with this dire assessment: “WHO is caught in a cycle of decline, with donors expressing their lack of faith in its central management by placing funds outside the management’s control. This has prevented WHO from [developing] . . . integrated responses to countries’ long term needs.”41
In the late 1980s and early 1990s, the World Bank moved confidently into the vacuum created by an increasingly ineffective WHO. WHO officials were unable or unwilling to respond to the new international political economy structured around neoliberal approaches to economics, trade, and politics.42 The Bank maintained that existing health systems were often wasteful, inefficient, and ineffective, and it argued in favor of greater reliance on private-sector health care provision and the reduction of public involvement in health services delivery.43
Controversies surrounded the World Bank’s policies and practices, but there was no doubt that, by the early 1990s, it had become a dominant force in international health. The Bank’s greatest “comparative advantage” lay in its ability to mobilize large financial resources. By 1990, the Bank’s loans for health surpassed WHO’s total budget, and by the end of 1996, the Bank’s cumulative lending portfolio in health, nutrition, and population had reached $13.5 billion. Yet the Bank recognized that, whereas it had great economic strengths and influence, WHO still had considerable technical expertise in matters of health and medicine. This was clearly reflected in the Bank’s widely influential World Development Report 1993: Investing in Health, in which credit is given to WHO, “a full partner with the World Bank at every step of the preparation of the Report.”44 Circumstances suggested that it was to the advantage of both parties for the World Bank and WHO to work together.
This is the context in which WHO began to refashion itself as a coordinator, strategic planner, and leader of “global health” initiatives. In January 1992, the 31-member WHO Executive Board decided to appoint a “working group” to recommend how WHO could be most effective in international health work in light of the “global change” rapidly overtaking the world. The executive board may have been responding, in part, to the Children’s Vaccine Initiative, perceived within WHO as an attempted “coup” by UNICEF, the World Bank, the UN Development Program, the Rockefeller Foundation, and several other players seeking to wrest control of vaccine development.45 The working group’s final report of May 1993 recommended that, if WHO was to maintain leadership of the health sector, it would have to overhaul its fragmented management of global, regional, and country programs, diminish the competition between regular and extrabudgetary programs, and, above all, increase the emphasis within WHO on global health issues and WHO’s coordinating role in that domain.46
Until that time, the term “global health” had been used sporadically and, outside WHO, usually by people on the political left with various “world” agendas. In 1990, G. A. Gellert of International Physicians for the Prevention of Nuclear War had called for analyses of “global health interdependence.”47 In the same year, Milton and Ruth Roemer argued that further improvements in “global health” would be dependent on the expansion of public rather than private health services.48 Another strong source for the term “global health” was the environmental movement, especially debates over world environmental degradation, global warming, and their potentially devastating effects on human health.49
In the mid-1990s, a considerable body of literature was produced on global health threats. In the United States, a new Centers for Disease Control and Prevention (CDC) journal, Emerging Infectious Diseases, began publication, and former CDC director William Foege started using the phrase “global infectious disease threats.”50 In 1997, the Institute of Medicine’s Board of International Health released a report, America’s Vital Interest in Global Health: Protecting Our People, Enhancing Our Economy, and Advancing Our International Interests.51 In 1998, the CDC’s Preventing Emerging Infectious Diseases: A Strategy for the 21st Century appeared, followed in 2001 by the Institute of Medicine’s Perspectives on the Department of Defense Global Emerging Infections Surveillance and Response System.52 Best-selling books and news magazines were full of stories about Ebola and West Nile virus, resurgent tuberculosis, and the threat of bioterrorism.53 The message was clear: there was a palpable global disease threat.
In 1998, the World Health Assembly reached outside the ranks of WHO for a new leader who could restore credibility to the organization and provide it with a new vision: Gro Harlem Brundtland, former prime minister of Norway and a physician and public health professional. Brundtland brought formidable expertise to the task. In the 1980s, she had been chair of the UN World Commission on Environment and Development and produced the “Brundtland Report,” which led to the Earth Summit of 1992. She was familiar with the global thinking of the environmental movement and had a broad and clear understanding of the links between health, environment, and development.54
Brundtland was determined to position WHO as an important player on the global stage, move beyond ministries of health, and gain a seat at the table where decisions were being made.55 She wanted to refashion WHO as a “department of consequence”55 able to monitor and influence other actors on the global scene. She established a Commission on Macroeconomics and Health, chaired by economist Jeffrey Sachs of Harvard University and including former ministers of finance and officers from the World Bank, the International Monetary Fund, the World Trade Organization, and the UN Development Program, as well as public health leaders. The commission issued a report in December 2001, which argued that improving health in developing countries was essential to their economic development.56 The report identified a set of disease priorities that would require focused intervention.
Brundtland also began to strengthen WHO’s financial position, largely by organizing “global partnerships” and “global funds” to bring together “stakeholders”—private donors, governments, and bilateral and multilateral agencies—to concentrate on specific targets (for example, Roll Back Malaria in 1998, the Global Alliance for Vaccines and Immunization in 1999, and Stop TB in 2001). These were semiautonomous programs bringing in substantial outside funding, often in the form of “public–private partnerships.”57 A very significant player in these partnerships was the Bill & Melinda Gates Foundation, which committed more than $1.7 billion between 1998 and 2000 to an international program to prevent or eliminate diseases in the world’s poorest nations, mainly through vaccines and immunization programs.58 Within a few years, some 70 “global health partnerships” had been created.
Brundtland’s tenure as director general was not without blemish, nor was it free from criticism. Some of the initiatives credited to her administration had actually been started under Nakajima (for example, the WHO Framework Convention on Tobacco Control), others may be looked upon today with some skepticism (the Commission on Macroeconomics and Health, Roll Back Malaria), and still others arguably did not receive enough attention from her administration (Primary Health Care, HIV/AIDS, Health and Human Rights, and Child Health). Nonetheless, few would dispute the assertion that Brundtland succeeded in achieving her principal objective, which was to reposition WHO as a credible and highly visible contributor to the rapidly changing field of global health.
We can now return briefly to the questions implied at the beginning of this article: how does a historical perspective help us understand the emergence of the terminology of “global health” and what role did WHO play as an agent in its development? The basic answers derive from the fact that WHO at various times in its history alternately led, reflected, and tried to accommodate broader changes and challenges in the ever-shifting world of international health. In the 1950s and 1960s, when changes in biology, economics, and great power politics transformed foreign relations and public health, WHO moved from a narrow emphasis on malaria eradication to a broader interest in the development of health services and the emerging concentration on smallpox eradication. In the 1970s and 1980s, WHO developed the concept of Primary Health Care but then turned from zealous advocacy to the pragmatic promotion of Selective Primary Health Care as complex changes overtook intra- and interorganizational dynamics and altered the international economic and political order. In the 1990s, WHO attempted to use leadership of an emerging concern with “global health” as an organizational strategy that promised survival and, indeed, renewal.
But just as it did not invent the eradicationist or primary care agendas, WHO did not invent “global health”; other, larger forces were responsible. WHO certainly did help promote interest in global health and contributed significantly to the dissemination of new concepts and a new vocabulary. In that process, it was hoping to acquire, as Yach and Bettcher suggested in 1998, a restored coordinating and leadership role. Whether WHO’s organizational repositioning will serve to reestablish it as the unquestioned steward of the health of the world’s population, and how this mission will be effected in practice, remain open questions at this time.
Table 1. PubMed entries under the rubrics “international health” and “global health,” 1950 to July 2005

Decade              International Health(a)    Global Health(a)
1950s                      1 007                       54
1960s                      3 303                      155
1970s                      8 369                    1 137
1980s                     16 924                    7 176
1990s                     49 158                   27 794
2000–July 2005            52 169(b)                39 759(b)

(a) Picks up variant term endings (e.g., “international” also picks up “internationalize” and “internationalization”; “global” also picks up “globalize” and “globalization”).
(b) Number for 55 months only.
The authors are grateful to the Joint Learning Initiative of the Rockefeller Foundation, which initially commissioned this article, and to the Global Health Histories Initiative of the World Health Organization, which has provided a supportive environment for continuing our research.