Last fall, a year after the anthrax attacks sounded a bioterrorism alarm, Anthony S. Fauci, the director of the National Institute of Allergy and Infectious Diseases, promised that he would strive to produce results from his agency to justify its receipt of an anticipated appropriations increase of $1.5-billion -- the largest in the history of the National Institutes of Health. “In three or four years when the question is asked, ‘What did you learn?’ the wrong answer is, ‘We learned a lot,’” he said. “The right answer is, ‘We learned a lot and now we have the following deliverables for you.’ ... We will maintain the basic science base, but we will have deliverables.”
It is a notable occasion when the director of an NIH institute publicly holds himself, and the scientists whom his institute supports, accountable for finding solutions to specific public-health threats within a given time frame. Dr. Fauci’s statement contrasts with the NIH’s long tradition of emphasizing the support of investigator-initiated research in the basic sciences, which is far less often guided by the pursuit of “deliverables.”
Dr. Fauci seems confident that his mission-driven research efforts will yield effective results. And our society deems it appropriate to devote huge amounts of money to combat an uncertain -- though potentially catastrophic -- threat of biological or chemical attack, in a concentrated effort similar to that which sent astronauts to the moon. But the basic-research establishment resists comparable efforts to tackle the absolutely certain, demographically driven explosion of illness and death associated with chronic conditions like diabetes, heart disease, cancer, depression, and Alzheimer’s disease.
Instead, the scientific establishment asks the public to wait patiently for scientific investigations to yield results that we can use in clinics and hospitals, a process propelled not by calculations of what might provide the most benefit for the most people, but rather by the unpredictable winds of third-party reimbursement, professional adoption, and pharmaceutical marketing. It often takes decades for new medical approaches to reach the public, especially when the new practice or procedure has no commercial potential.
Consider the study that John Stover, a scientist with the Futures Group International, and others published in The Lancet in conjunction with the 2002 World AIDS Conference. It reported that proven methods of prevention, like programs of education, needle exchanges, blood screening, and condom distribution, could prevent 29 million new infections over the next eight years. That shocking conclusion brings to mind some of the many other instances where the benefits of health research have taken a long time to reach even some of the people they could help.
For instance, in the early 1980s, an Australian researcher discovered that stomach ulcers, which doctors had thought resulted from stress, diet, and smoking, were often caused by a bacterium and were treatable with antibiotics. In spite of articles published in scientific journals and an endorsement of the researcher’s work by an NIH panel, many primary-care providers still have not changed the way they treat ulcers to conform to current medical knowledge.
And in 1979, researchers published a report strongly supporting regular eye examinations for diabetics. Since then, many studies have shown that such examinations help prevent eye disease, yet today only half of all diabetics are screened.
Dr. Fauci’s response to his institute’s increased appropriation signals a sea change. Members of Congress, other politicians and policy makers, and some members of the public are asking what short- and long-term benefits we are gaining from our growing investment in health research.
One key part of that question is whether we are succeeding in applying the results of research to interventions to improve health. Increased support for basic research throughout the NIH has not been matched by an equivalent investment in research to synthesize findings across disciplines and accelerate their application. In fact, the Agency for Healthcare Research and Quality, which has primary responsibility for supporting applied health research, typically receives only one-hundredth as much money as the NIH, the agency charged with discovering cures for diseases.
That imbalance is not new. As the late Stephen Jay Gould observed, the “history of ideas emphasizes innovation and downgrades popularization.” The deep investment in basic health research rests in large part on three assumptions: first, that basic research will ultimately, though perhaps indirectly, lead to solutions to pressing real-world problems; second, that the best science will naturally trickle down (i.e., be used to improve health) through eventual diffusion of research findings; and third, that the market will select and promote the most promising solutions.
How accurate those assumptions have been in the past is open to question. There are a number of reasons to re-evaluate the strategy of investing primarily in basic health research while allowing innovations to filter into policy and practice at their own rate:
The strategy of deep investment in basic research and relatively limited funds for applied research and activities to put knowledge into practice is short-sighted. It is producing a huge amount of knowledge that will capsize our already overloaded process of applying new discoveries. Identifying valuable discoveries is a haphazard enterprise. We have no regular series of steps to help us identify significant new findings and shepherd them to widespread application. Similarly, no single agency or institution is charged with ensuring that application: The Agency for Healthcare Research and Quality and similar public and private enterprises rarely deal with the whole picture, or coordinate their activities. We currently have a huge backlog of knowledge that is still not part of routine medical care, public-health practice, and individual behavioral norms. We must find a way to streamline the process by which that knowledge, as well as future discoveries, is brought into our lives.
The current strategy contributes to the disparity in health between the haves and the have-nots. The fact that a great many AIDS cases around the world could be prevented by adopting practices that we already know can reduce the spread of the disease illustrates how the lag between discovery and application disproportionately affects people living in poverty, who have little access to information, condoms, and life-extending medicines. Similarly, many poor Americans do not have access to the latest genetic-screening technology, scientific information on the Internet, or even basic preventive immunizations and vaccines.
The strategy restricts the growth of a “translation” work force. We need many more experts to study and evaluate health services, develop guidelines for standards of care, and otherwise improve the quality of public health. Yet today our society offers few opportunities or incentives to young people for careers in such translation-related activities.
Ultimately, the strategy may undermine the public’s willingness to invest in basic science. Voters are not likely to continue to support basic health research if they cannot clearly see the impact of that support at the personal level, as well as the national level. Investing in health research is an investment in hope -- hope for a cure for oneself, a relative, or a friend, and for future generations. If few cures materialize, or even seem imminent, the public may lose faith in science.
How can we rework the current strategy to prevent those unintended consequences? We need to build the nation’s health-research portfolio in the same way that a person builds the elements of a financial portfolio -- to provide the greatest possible benefit over time. That means diversification. In the case of health, it means finding a better balance between investigator-driven basic research and mission-driven applied research. It means devising ways to identify discoveries without commercial potential and develop them for use. It means investing in, and disseminating, syntheses of research. And it means building the capacity of health systems and professionals to absorb new information and practices.
Currently we are distracted by events in Iraq and possible future terrorist attacks, and by a struggling economy. Those issues are important, but we cannot allow them to blind us to the dangers that chronic diseases pose to our nation’s health and productivity. Preventable and treatable illnesses will kill more Americans than bioterrorism. By promising tangible results in the fight against bioterrorism, Anthony Fauci has made a heroic declaration, not just because it is a wartime call to action, but because it illuminates a path of accountability to a public that pays a high price for health research, and a higher price for the current laissez-faire approach to achieving specific aims in the prevention, treatment, and management of chronic diseases.
Jessie C. Gruman is the president and executive director of the Center for the Advancement of Health, in Washington.
The Chronicle Review, Volume 49, Issue 29, Page B20. http://chronicle.com