Recently, I saw two petitions. One was at the site of the Alzheimer Foundation, asking for increased funds for research into the disease. I don't know how long it has been there, but it has 166,000 signatures. The other was at the White House, asking that the budget of the National Institutes of Health (NIH) be increased a measly 7 percent instead of staying flat -- which, with inflation, means an effective decrease.
For the latter petition merely to be considered, without a guarantee it would be acted upon, it needed to gather 25,000 signatures by March 19. It failed to do so by 400 signatures (February is a short month but even so, this is a frankly pitiful comment on the public understanding of how research works). So a new petition has been started, with a deadline of April 17.
Why am I describing these dull statistics and why should anyone care? The brief answer is: because the chances of finding a way to prevent or cure Alzheimer's (or any other dementia) are almost entirely dependent on research funded by the NIH. Also known as basic biomedical research, this is the largely invisible spring that enables the existence of the enormous forest of medical applications, many of them hugely profitable for their practitioners. Research scientists don't get rich on NIH monies; other motives tend to drive them.
Let me give you a parallel example from cancer. A few decades ago, "war" was declared on cancer, just as people are trying to do today for dementia (not surprisingly, the shift in focus follows the aging of baby boomers). Targeted research yielded minuscule returns. Two discoveries propelled the field from trial-and-error guesses to understanding and targeting the underlying causes. One was telomerase and its functions, originally discovered in Tetrahymena -- an obscure freshwater protozoan. The other was oncogenes, first found as activated copies in exotic entities called retroviruses. The latter were themselves considered niche interests until the HIV epidemic exploded in the First World.
If you guessed that these crucial, groundbreaking studies were funded by the NIH, you would be right. And if you guessed that the NIH is the sole significant source of money for this research, you would also be right. The funds from private foundations, state and other federal sources are peanuts in comparison. But the fountain is about to run dry: NIH grant success rates have been in the single digits for a while and continue to drop. NIH grants provide salaries, fringe benefits, supplies, and infrastructure costs. Many university faculty are essentially NIH employees who rent space at their universities while laying golden eggs in the shape of patents and biotech spinoffs, to say nothing of the boosts to local and national jobs. Academic institutions, long addicted to NIH largesse, have made few provisions for retaining research faculty during dry spells. Labs are closing in droves, and their expertise and reagents are irrevocably lost. The unbroken chain that trained new apprentices and fed the pipeline of applications has snapped.
The early rounds of cuts were called "pruning dead wood" or "trimming fat." Except that for the last two decades we haven't been cutting into fat but into muscle -- and now, into bone. All the changes in the domain of basic research are essentially guaranteeing several outcomes: young people will have fewer and fewer chances (or reasons) to become researchers or independent investigators; new and small labs will disappear; and despite lip service to innovation, research will seek refuge in increasingly safe topics and/or become "big ticket" science, doing large-scale, politically dictated projects in huge labs that employ lots of robots and low-level technicians -- a danger foreseen by Eisenhower in his famous "military-industrial complex" address, which today would get him labeled as fringe hard left.
Biology is an intrinsically artisan discipline: it looks like a crazy quilt of intricately interwoven threads (take a look at the diagram of any biological pathway and you get the picture, to say nothing of how things translate across scales). It is also the science in which details matter, often crucially. What is true in mice is often not true in humans; results routinely defy common-sense assumptions, and therapies based on incomplete knowledge can cause more harm than good. A concrete example is the dementia vaccines based on the assumption that tangles and plaques are the toxic entities that cause the death of brain neurons (they're actually neutral warehouses; the real toxic species are soluble oligomers of tau and amyloid, so vaccines that dissolve tangles and plaques in fact speed up the disease process).
Some argue that larger labs are likelier to be innovative, because they have the money to pursue risky work and the gadgets to do so. However, large labs are often large because they pursue fashionable topics with whatever techniques are du jour, regardless of the noise and artifacts they generate (plus their heads have time for politicking at all kinds of venues). They also have enormous burnout rates and tend to train young scientists by rote and distant proxy. Few projects in biology flourish by having more automation and brute-force methodologies thrust at them. The sequencing of the human genome was one such project, and it has distorted people's perceptions of what works for the discipline at large. National labs adopted the big-scale approach. The result is that their biology wings are essentially empty except for pieces of sophisticated equipment that now serve as expensive doorstops.
Granted, we need big-science approaches; but we need the other kind, too -- the kind that is now going extinct. And it's the latter kind that has given us most of the unexpected insights that have translated into real knowledge advances, often from neglected, poorly lit corners of the discipline. So if you do want a cure for dementia, by all means sign the Alzheimer's petition -- but make sure you also sign the one about the NIH budget increase at the White House.