A June 12 article in the New York Times entitled "A Decade Later, Genetic Map Yields Few New Cures" makes the case that 10 years of genomic research have been profoundly disappointing. Announced with fanfare by then-President Bill Clinton and British Prime Minister Tony Blair in 2000, the mapping of the human genome was expected to reveal the root genetic causes of diverse, serious diseases, and engender therapeutic insights, targeted treatments and elusive cures.
That the cures have not yet ensued is perhaps neither cause for surprise nor disappointment -- as this was always expected to take some time. After all, the work on cures for genetic diseases cannot begin in earnest until the culpable genes have been indicted. The disappointment, rather, is that the ranks of such genetic culprits are surprisingly thin.
The problem is not with the map of the genome, which is largely all it was claimed to be. Rather, the notion that specific variants of specific genes can be identified as the "cause" of a cancer, or of Alzheimer's disease, may simply be wrong. In many cases, the relevant genetic variants may be rare and difficult to find. In many more, there may be multiple genes involved rather than one.
Genomic research has led to basic biological insights, but it has failed to deliver thus far on the promise of real-world biomedical advance. Thus, while biologists and geneticists may still see the cup of genomic promise as half full, medical scientists -- and, apparently, New York Times reporters -- are starting to see it as half empty.
My view is that we are talking about the wrong cup. We have had another cup, overflowing with promise to advance the human condition, in our hands since 1993 at least.
In that year, a paper entitled "Actual Causes of Death in the United States" was published in the Journal of the American Medical Association by Drs. William Foege and J. Michael McGinnis. McGinnis and Foege revealed the obvious we had all overlooked: when someone dies of, say, a heart attack, it is not very illuminating to cite the cause as disease of the cardiovascular system. What we all really want to know is: what caused that?
Such answers were readily available. Overwhelmingly, premature death and chronic disease were attributable to just ten behaviors each of us ostensibly has the capacity to control: tobacco use, dietary pattern, physical activity level, alcohol consumption, exposure to microbial agents, exposure to toxic agents, use of firearms, sexual behavior, motor vehicle crashes, and illicit use of drugs. That list of ten was, in turn, much dominated by the top three -- tobacco use, dietary pattern, and physical activity level -- which alone accounted for nearly 800,000 premature deaths in 1990.
When CDC scientists reassessed this landscape a decade later, publishing their findings, again in JAMA, in 2004, they found relatively little had changed. Across the span of a decade, injudicious use of feet, forks, and fingers remained the dominant determinants of unwelcome fate.
In the summer of 2009, yet another paper was published by CDC scientists and their colleagues -- in the Archives of Internal Medicine this time -- examining lifestyle factors and health. The investigators surveyed over 23,000 German adults about four behaviors: smoking (yes or no); eating well (yes or no); getting regular physical activity (yes or no); and maintaining a recommended weight (yes or no). That weight is not a behavior is an important aside, but a topic for another day.
Those with all good answers -- not smoking, eating well, staying active, and being lean -- had, as compared to those with all bad answers, roughly an 80% lower likelihood of experiencing any major chronic disease. Flipping the switch from bad to good on any one of the factors was associated with a 50% reduced probability of chronic disease. Any drug with a fraction of such potential would be a blockbuster.
And finally, to make the tale entirely current, a group of researchers from Norway and England found much the same thing in a study of over 5,000 adults in the U.K., reported in the Archives of Internal Medicine in April of this year.
The compelling case for feet, forks, and fingers as the master levers of medical destiny reaches further still. In fact, it reaches to our very genes.
In a study reported in 2008 in the Proceedings of the National Academy of Sciences, 30 men with early stage prostate cancer received an intensive lifestyle intervention for three months: wholesome, plant-based nutrition, stress management, moderate exercise, and psychosocial support. Standard measures -- weight, blood pressure, cholesterol, and so on -- all improved significantly, as one would expect. But what makes this study unique -- and ground-breaking -- is that it measured, using advanced laboratory techniques, the effects of the intervention on genes. Roughly 50 cancer suppressor genes became more active, and nearly 500 cancer promoter genes became less so.
This study, and others like it, goes so far as to indicate that the long-standing debate over the relative power of nature versus nurture is something of a boondoggle, for there is no true dichotomy. We can, in fact, nurture nature.
From a preventionist's perspective, the lesser shame is our failure, to date, to wrest much new knowledge of practical value from our genome. By far the greater shame -- measurable in years lost needlessly from life, and life lost needlessly from years -- is our failure to use the knowledge we already have -- and have had since 1993 at least -- about the overwhelming influence of lifestyle on health. Feet, forks, and fingers are, indeed, the master levers of medical destiny. This is unlikely to change, no matter what secrets our genomes ultimately reveal.
Genomic disappointments notwithstanding, we have long since led ourselves to another cup of medical promise that veritably "runneth over." The question, now looming for some 20 years, is: can we make ourselves drink?
Dr. David L. Katz; www.davidkatzmd.com
Follow David Katz, M.D. on Twitter: www.twitter.com/DrDavidKatz