The Alzheimer’s “Hockey Stick” and the History of Late Twentieth Century Biomedicine

In this post, I use Google’s Ngrams to graphically summarize the history of Alzheimer’s disease, and discuss the direction of my current research.

My research on the history of Alzheimer’s began with a very simple historical question: how and why did the unlikely eponym Alzheimer’s emerge seemingly out of nowhere in the late 1970s to very quickly become a household word in the United States – one of the most feared medical diagnoses and the object of one of the most well-funded disease-specific campaigns for public awareness and biomedical research in American history? As can be seen in the Google Ngram below, which graphs the percentage of the millions of English-language books in the Google database in which the term Alzheimer’s disease appeared every year from Alzheimer’s first publication of a case in 1906 to the year 2000, Alzheimer’s disease only became widely discussed in print after 1980. Since Alzheimer’s was available as a medical category from the start of the century, how do we account for the iconic “hockey stick” shape of public awareness about Alzheimer’s, suggesting that it only became a hot issue late in the 20th century?
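For readers who want to see what a “hockey stick” looks like in data terms, the sketch below is a minimal illustration in Python – not the method used for the graphs in this post. The frequency values are invented placeholders shaped like the curve described above (roughly flat for decades, then a sharp rise around 1980), not actual Google Books data, and the heuristic simply picks the observation with the largest proportional jump over its predecessor.

```python
# Minimal sketch of locating a "hockey stick" elbow in a term-frequency series.
# NOTE: the numbers below are invented placeholders shaped like the curve
# described in the text -- NOT actual Google Books Ngram data.

freq = {
    1940: 0.02, 1950: 0.02, 1960: 0.03, 1970: 0.05,   # long, nearly flat handle
    1980: 0.30, 1990: 1.10, 2000: 2.40,               # sharp blade after 1980
}  # hypothetical relative frequencies, arbitrary units

def elbow_year(series):
    """Return the year whose value shows the largest proportional jump
    over the previous observation -- a crude elbow detector."""
    years = sorted(series)
    ratios = {y2: series[y2] / series[y1] for y1, y2 in zip(years, years[1:])}
    return max(ratios, key=ratios.get)

print(elbow_year(freq))  # with these placeholder values: 1980
```

With these illustrative values the largest proportional jump falls at 1980, mirroring the post-1980 explosion the real Ngram shows for “Alzheimer’s disease.”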

It is commonly argued that this simply reflects heightened awareness of the aging of the population over the past few decades, making Alzheimer’s a much more visible problem. But a second ngram below for the terms old age, aged and aging suggests that it is not so simple. Though there is an uptick around 1980, public awareness and discussion of aging has been much more stable throughout the modern period. The sudden emergence of Alzheimer’s must be linked to something more specific in American culture and society.

My explanation of the Alzheimer’s hockey stick has two parts. First, of course, was that Alzheimer’s did not really emerge out of nowhere in the late 1970s, but was in fact part of a long developing set of cultural anxieties about aging, senility and selfhood. I argued in my first book that the roots of public anxiety about age-associated cognitive deterioration in American society and culture can be found in the mid-nineteenth century emergence of industrial capitalism, which saw ongoing heated public debate about whether older workers, burdened with what medical science characterized as inevitably deteriorating bodies and brains, could possibly keep up with the pace of industrial work and complex, bureaucratic management. Through the mid-twentieth century, as consumer culture became more deeply embedded in American society, anxiety about aging gradually shifted to whether older people could create and sustain a coherent self-identity in the vacuum of retirement and endless leisure, which by the late 1940s had become an ambivalent reality for most Americans over the age of 65. The quick and dirty summary of my argument here is that, as selfhood became a more problematic cultural category in industrial and mass consumer society, and as identity became more of a project than an ascribed status in modern America, old age increasingly became a site of cultural anxiety, and senile dementia – which destroyed the ability to craft a coherent and cohesive self-narrative – became a far more frightening condition, indeed one of the most frightening of all medical conditions in post-World War II America. And this fear set the stage for the explosive emergence of Alzheimer’s after 1980.

The second part of my argument about the Alzheimer’s hockey stick concerns the way in which Alzheimer’s disease was re-constructed as a specific disease category in the mid-1970s. From the time it was created by Emil Kraepelin and Alois Alzheimer in the early twentieth century, Alzheimer’s disease has always been a troublesome construction – at once one of the most stable and one of the most ambiguous of disease categories. On the one hand, Alzheimer and Kraepelin found it an interesting and potentially important disease category because it defined one of the very few psychiatric conditions with a set of clear clinical symptoms that could be correlated with an equally clear set of pathological structures in the brain. From their work to the present, Alzheimer’s has been understood within the same basic framework of brain pathology – the neuritic plaques and neurofibrillary tangles – correlated with a clear clinical picture of global cognitive deterioration. But on the other hand, it has been difficult to disentangle Alzheimer’s disease from aging. In 1910, Kraepelin created the eponym Alzheimer’s disease based on his protégé’s description of the characteristic brain pathology and clinical symptoms of senile dementia in a patient who was only 51 years old. Kraepelin thought that early age of onset – which he arbitrarily defined as before the age of 65 – was itself enough to warrant putting such cases in a separate disease category. The basic rationale was that while dementia in older people might be thought of as an extreme variant of the common if not universal cognitive decline associated with the normal aging process, the same condition occurring at an earlier age was very unusual and so could reasonably be thought of as a disease. Thus, as conceptualized by Kraepelin and Alzheimer, this condition occupied a rather marginal nosological space.
As senile dementia, it was a very common condition, but was so closely associated with aging that it seemed questionable to call it a disease. As Alzheimer’s pre-senile dementia, it seemed reasonable to think of it as a disease, but one that was vanishingly rare. This distinction persisted in the medical literature for decades, though it was clearly understood that they were essentially the same condition.

The distinction was eliminated by a group of American researchers, government officials within NIH and activist caregivers who sought to generate awareness and funding for biomedical research into dementia, and who were savvy about the political ramifications of disease categorization. In the mid-1970s, through a number of prominent editorials in various medical journals and NIH sponsored consensus conferences, they successfully argued that since there was no meaningful clinical or pathological distinction between Alzheimer’s and senile dementia, the two should be considered a single entity – and that entity, crucially, should be called Alzheimer’s. By combining the two categories, they could claim that the condition was a major health problem afflicting millions of people; by calling it Alzheimer’s rather than senile dementia, they could claim it was not “just aging,” but a dread disease worthy of a massive, publicly funded research initiative to understand its cause and discover a means of effective treatment or prevention. This strategy brilliantly harnessed deep anxieties about aging to the growing power and prestige of biomedicine in the second half of the twentieth century.

All of this can be read quite nicely into the final ngram below – the terms senility and senile dementia gradually rise in prevalence in English language books from the late-nineteenth through the twentieth century – culminating in the explosion of interest in Alzheimer’s disease after 1980.

My current research will go beyond these broad political and social ramifications of the emergence of Alzheimer’s as a major public issue to explore in some detail the medical world that was actually created by it. The massive investment of financial, institutional, and intellectual capital into research on the causes of and possible treatments for Alzheimer’s disease by both the federal government and private industry since the 1970s transformed dementia research from a small field with a broad agenda to a massive, multi-faceted research enterprise focused much more narrowly on pathological mechanisms. As I’ve remained peripherally connected to the Alzheimer’s field with various projects, I’ve been most impressed with the difficulty of maintaining coherent institutional and intellectual frameworks that connect and coordinate the efforts of diverse practitioners working on different agendas within modern biomedicine. In many ways, what happened in the Alzheimer’s field reflects broader changes in the scale and structure of medicine during the same period. Understanding how these developments shaped the way that researchers in the dementia field worked, the way that physicians diagnosed and treated dementia, and the way that patients and their family members experienced both dementia and the treatment and care they received from medical and health care providers can move us toward a richer, more nuanced understanding of what medicine has become since the latter half of the twentieth century.

Medical Journalism in the War on Alzheimer’s

They say that truth is the first casualty of war. So how is the truth doing in news coverage of medical research on dementia now that we have finally declared War on Alzheimer’s?

The question is prompted by the recent major article on the genetics of early onset Alzheimer’s disease by New York Times science reporter Gina Kolata in last week’s NYT Magazine. The article immediately generated a lot of positive buzz in the Alzheimer’s research and caregiver communities. It’s the kind of piece that usually garners awards, not opprobrium. So let me begin by saying what Kolata does right before making the case that she gets the most important things very, very wrong.

Kolata’s writing is beautiful, and she tells a compelling story of the members of a family struggling to live with the burden of knowing that they may have a gene for early onset Alzheimer’s, and their sometimes enthusiastic and sometimes ambivalent involvement in medical research that can tell them for sure. She also describes some complicated genetic science with commendable clarity.

The problem is that Kolata uncritically accepts the perspective of Alzheimer’s researchers in a way that violates the fundamental value of systematic skeptical inquiry that ought to be at the heart of both journalism and science. There is nothing new or exceptional in this, of course. Frankly, Kolata’s many articles in the Times hyping the latest Alzheimer’s research, like so much of medical reporting in general, remind me of the sort of journalistic failure, most egregiously by Judith Miller of the Times, that led so many to accept the Bush administration’s claims about weapons of mass destruction in Iraq. Just as uncritical reporting of the Bush administration’s false claims about the presence of WMDs, and its rosy assessment of how American troops would be received by the Iraqi people after dislodging Saddam, influenced the public and Congress to support a war in Iraq, uncritical reporting of the sorts of claims made in the article about the imminence of therapeutic breakthroughs will influence the public and Congress to continue supporting the war on Alzheimer’s and the growth of the biomedical industrial complex behind it.

Now I am not saying that the motivations of medical researchers in Alzheimer’s or other fields are the same as warmongers in the Bush administration. On the whole, I am a fan of Alzheimer’s and other medical researchers and the work they do. But good journalists, whether they are covering the war on terrorism or the war on disease, should be skeptical of sources that have an obvious self-interest. And medical researchers have an obvious self-interest in presenting their research in the most favorable light possible. Thus it should be no surprise, even to people familiar with the daunting complexities of understanding, treating and preventing dementia, that the researchers profiled in the article “say that within a decade there could be a drug that staves off brain destruction and death.” But Kolata should have raised questions about this claim, and talked to experts not directly involved in the research who are far less optimistic about its potential to so quickly lead to effective treatments.

Kolata’s article uncritically reiterates two other important aspects of the perspective of many Alzheimer’s researchers: a warped view of history, and an oversimplification of the disease.

Fairytale History

Regarding history, Kolata spends about 800 words connecting German psychiatrist Alois Alzheimer’s first encounter in 1901 with a patient with what we would today call early-onset Alzheimer’s to contemporary research. Alzheimer brilliantly described the pathological features of the disease, but lacked the scientific tools needed to understand what caused it, let alone how to do anything to stop it. “There matters stood until the latter part of the 20th century,” when contemporary researchers heroically enter the stage with powerful new technologies to penetrate the mysteries of the brain and will soon, we are assured, be able to set things right.

Alois Alzheimer (1864-1915)

While it is attractively simple and flattering for researchers to think of themselves as part of a unified, continuous research enterprise stretching back more than a century, in which they are finally able to make progress on the medical mystery Alois Alzheimer unearthed in the brain of his patient, the truth is a good deal more complicated than that. Scientific and clinical research on dementia has never been a unified enterprise. The goals and approaches of researchers and clinicians are strongly shaped by the historical contexts in which they practice, and given the dramatically different context in which Alzheimer practiced, it is highly unlikely that he shared our concerns about age-associated cognitive decline. Though no one has done a serious historical study focusing on Alzheimer and his lab, I have looked at the available evidence enough to conclude that Alzheimer and his contemporaries simply did not view the disease that was named for him as terribly interesting or important. When he died in 1915, none of the admiring colleagues who eulogized him – not even Emil Kraepelin, who named the disease for him in 1910 – listed the discovery of Alzheimer’s disease as one of his major accomplishments.

Moreover, the claim that nothing of significance happened regarding the medical understanding of Alzheimer’s disease and senile dementia between Alzheimer and the latter twentieth century ignores a major historical development. In the middle decades of the twentieth century, a group of American psychiatrists developed a psychodynamic framework for understanding and managing dementia and made sweeping claims about how it finally removed some of the mystery shrouding the condition and would soon lead to efficacious means of preventing cognitive deterioration in old age. It is perhaps understandable that contemporary researchers, trained in biological psychiatry and neuroscience and focused on pathology in the brain, have forgotten this chapter in the history of Alzheimer’s research. It is unfortunate, though, if only because considering the now largely forgotten work of researchers who thought themselves on the cusp of being able to prevent dementia just might lead today’s researchers to consider the virtues of humility and circumspection in making claims about imminent progress.

While it might be presumptuous of me to suggest that Kolata should have known all this by making herself familiar with my work, she need not perpetuate a history of the field that is so obviously driven by the biases of contemporary researchers. It should not take a historian to recognize a self-serving fairy tale.

Simplifying Alzheimer’s

Regarding the oversimplification of the disease, there are a couple of points to make. First, the many scholars, clinical professionals and caregivers who have been working to lessen the stigma and despair associated with dementia – whom I am proud to count myself among – might take issue with the unremitting grimness with which Kolata represents having dementia. Without denying or diminishing the very real losses and challenges imposed by dementia, we have been working in different ways to show that a life cannot be reduced to a disease, even a disease that brings profound cognitive deterioration. Possibilities for human flourishing remain. Kolata’s story, like so much reporting on Alzheimer’s, represents people with dementia as pure victims – unable to comprehend or resist in any way a disorder that takes everything from them.

But I am willing to cut Kolata some slack here. While we need more stories about living well with dementia, that is not the story Kolata set out to write, and the one she did is important. Her story is about the dread associated with early-onset, familial Alzheimer’s disease, the very rare form of dementia that is associated with several autosomal-dominant gene mutations. Frankly, early-onset familial Alzheimer’s does seem more dreadful to me than the much more common variant that occurs at much older ages. Some may regard this as ageism, but developing a profound cognitive disability in your fifties or even forties does seem much worse than developing it in your seventies, eighties or nineties. And living with the sharp either/or risk of a Mendelian gene for dementia in one’s family seems much more dreadful to me than the gradually increasing risk for dementia associated with the normal vicissitudes of growing older.

The problem is that Kolata’s story tends to conflate this very rare form of early-onset dementia, which is estimated to account for only one to five percent of all cases of Alzheimer’s, with the category as a whole. The story acknowledges this explicitly at only one point, nearly 700 words into a 5,400 word story, when it describes the rationale of the Dominantly Inherited Alzheimer Network (DIAN) project:

Though as much as 99 percent of all Alzheimer’s cases are not a result of a known genetic mutation, researchers have determined that the best place to find a treatment or cure for the disease is to study those who possess a mutation that causes it. It’s a method that has worked for other diseases. Statins, the drugs that are broadly prescribed to block the body’s cholesterol synthesis, were first found effective in studies of people who inherited a rare gene that led to severe and early heart disease.

Alzheimer’s is the sixth leading cause of death in this country, and is the only disease among the 10 deadliest that cannot be prevented, slowed or cured. But DIAN investigators say that within a decade there could be a drug that staves off brain destruction and death.

Throughout the rest of the story, Kolata drops the qualifiers “early onset,” “familial” and most importantly, “rare” and simply uses the term “Alzheimer’s disease.” In the online version of the story, the first reference to Alzheimer’s disease is even linked to the NYT Health Guide’s general entry for Alzheimer’s disease. The elision of this important distinction has undoubtedly fueled the needless fear many people have of being at greatly increased risk for dementia because a relative developed cognitive problems in their seventies or eighties, which can be seen in some of the many comments from readers to the online version of the article. It also reinforces the central dogma of the contemporary Alzheimer’s field – that it is a single disease, distinct from aging, caused by some unified patho-physiological mechanism that can be isolated and addressed with a linear therapeutic intervention.

As Kolata must surely be aware, many if not most researchers in the Alzheimer’s field will acknowledge privately, though not so often in public, that this central dogma is shaky. In this case, some closer attention to the real history of Alzheimer’s research would be helpful. The term Alzheimer’s disease was originally created to describe cases of dementia – such as the 51-year-old woman Alzheimer encountered in 1901 – where the clinical and pathological features of senility appeared at a relatively early age. Though Alzheimer and his contemporaries had no inkling of the genetic basis, they thought that early onset was sufficient grounds for a separate disease category. In the late 1970s, a group of American researchers, government officials within NIH and activist caregivers lobbied successfully to drop the distinction. Their goal was to generate awareness and funding for biomedical research into dementia, and they were very savvy about the political ramifications of disease categorization. Since there was no meaningful clinical or pathological distinction between Alzheimer’s and senile dementia, they argued that the two should be considered a single entity – and that entity, crucially, should be called Alzheimer’s. By combining the two categories, they could claim that the condition was a major health problem afflicting millions of people; by calling it Alzheimer’s rather than senile dementia, they could claim it was not aging, but a dread disease worthy of a massive, publicly funded research initiative to understand its cause and discover a means of effective treatment or prevention.

Ironically, among the most important research findings generated by the torrent of funding that was unleashed by the political power of the re-conceptualization of Alzheimer’s was the discovery of the genes associated exclusively with the early onset form – which would logically seem to support a return to the original distinction made by Alzheimer and his contemporaries. But the concept of Alzheimer’s as a single, unified disease distinct from aging remains too powerful to abandon. So at the level of policy advocacy and popular news accounts at least, most researchers continue to talk as though Alzheimer’s disease were quite a clear-cut thing, when the reality is much more complicated.

Researchers in the DIAN project and others described in the article are exploring the distinction between presenile and senile dementia, and hoping that the ability to identify and follow subjects from families with early onset genes to test the use of drugs at much earlier stages in the development of Alzheimer’s will be a quick route to a drug that can effectively prevent the disease.

I hope this strategy pays off, and that in a few years we will see more big stories of successful drug trials from the DIAN project – though even then I hope they will be stories that more accurately represent the complexities of medical research on Alzheimer’s. But given the complexity of dementia and the difficulty of identifying efficacious patho-physiological targets for drugs in a disorder with multiple, inter-related causal mechanisms, I think it much more likely that the drugs tested in these trials will be of limited value. In that case, if we read about it in publications like the NYT at all, it will likely be a much smaller item buried in the back pages. Meanwhile reporters like Kolata will be on to writing splashy front page articles about the next imminent breakthrough.

That’s how we roll in the War on Alzheimer’s.

Blogging and dementia: Why this blog? Why any blog?

Like a lot of academics, I have found it difficult to stay focused on my research and writing following the publication of my first book. In part this reflects the usual challenge university faculty have in balancing teaching and administrative obligations, and taking quite seriously the quaint notion of “having a life” outside of work. In my case, it also reflects some ambivalence about my work that I think comes from the peculiar emotional demands of this topic – ambivalence that I may explore on this blog soon. In any case, I intend to use this blog as a way to get and keep my head back in the game. It will be a space for thinking out loud, forging some new connections and staking out some commitments in public as I begin work on a new book.

So this blog will serve as a kind of open notebook to explore issues related to my ongoing work on the history of Alzheimer’s disease and aging in the modern world, which I described in some detail in the “about this blog” and “about me” pages. In the remainder of this post, I want to talk about the questionable enterprise of blogging itself and how it might relate to dementia.

My friend and colleague in the history of the neurosciences, Stephen Casper, has written a very nice post on his blog that lays out all the good reasons an academic might have for blogging. Still, the sheer absurdity of this enterprise ought to be acknowledged up front.

When I write for any public, regardless of whether the medium is a book for an academic press, an article for a general audience publication, or a Facebook status update, I write out of a stubborn commitment to the idea that my life can and should have meaning, and that writing for others can be a means of solidifying that meaning, of somehow making a difference in the world.

But there are at least two problems with publication that have developed over the past few decades that threaten to overwhelm the modern ideal of writing as an act of meaning. The first is the crisis of the overproduction of information that threatens to overwhelm the ability of any individual to know what is important. In the academic world, lip service to interdisciplinarity aside, the accelerating production of new scholarship in every field has made increasingly quaint the notion that, as university faculty, we should be sufficiently aware of major developments across the university to be able to engage meaningfully with any and all of the important ideas of the day. Even in a very narrowly defined subfield, scholars struggle to keep up with the proliferation of new research.

In the blogosphere, where the flow of information is not controlled by peer review and the costs of production are nil, things have quickly become truly absurd. The best current estimate suggests that there are now more than 172 million blogs out there, with about 75,000 new blogs created and more than a million blogposts published each day. The old saw that a roomful of chimpanzees randomly typing for long enough will reproduce a literary masterpiece seems now extended toward its logical conclusion: an infinitely expanding number of monkeys typing will eventually produce everything that it is theoretically possible to say.

In any case, there certainly aren’t enough monkeys in the world to read all that is earnestly being written. Given such daunting numbers, how can choking the virtual world with one more blog possibly be an act of communicative meaning?

A second problem concerns the creation and maintenance of self-identity in a hyper-mediated world. Since the early 1980s, social critics have argued that post-World War II mass-consumer society has presented acute challenges to the creation and maintenance of selfhood. For my purposes here, the most cogent of these critics is psychologist Kenneth Gergen, who argued in The Saturated Self (1991) that technology has wildly proliferated human relationships to the point of “social saturation,” with the result that “the very concept of personal essences is thrown into doubt. Selves as possessors of real and identifiable characteristics – such as rationality, emotion, inspiration, and will – are dismantled.” As suggested by the artwork (shamelessly stolen below) illustrating Teddy Wayne’s brilliant send-up in last month’s New Yorker of the cultural practices of “reposting,” the emergence of blogging and other social media since then would seem only to have exacerbated this problem, as the opportunities to communicate about oneself and what we think is important through new forms of media to an expanding circle of “friends” or “contacts” are coming to seem more like obligations and burdens than opportunities.

And here is where blogging connects to dementia. As I argued in my book, it is no accident that these sorts of social critiques became commonplace at roughly the same historical moment that Alzheimer’s disease was emerging as a major public issue. A disease whose most prominent feature is the destruction of memory, and whose most dreaded moment is when victims no longer recognize friends and family members they have known for a lifetime, seems to perfectly embody these concerns about the erosion of self. Alzheimer’s disease, it seems, is one of the emblematic disorders of a post-modern culture. And conversely, blogging and social media seem to embody the fragility and fragmentation of postmodern selfhood that has come to make Alzheimer’s so frightening.

Having said all that, I immediately feel the need to issue caveats (which is perhaps symptomatic of the very problems I am describing). Though the overproduction of information certainly undermines the ideal of writing to create meaning, it does not make meaning impossible. Moreover, though I think the connection I point to between the symptoms of dementia and the way that the hyper-mediation of the social world challenges our ideas of selfhood is real and significant, to assert that they are the same would be absurd and dismissive of the real challenges faced by people with dementia. I make a distinction between the dementia produced by the hypocognitive situation of the person with Alzheimer’s, and the confusion produced by a hypercognitive society. Both are profoundly, perhaps at times even equally, disorienting and disruptive of a coherent sense of self. But there is a difference between having one’s cognitive abilities impaired to the degree that one cannot successfully perform expected social roles, and experiencing confusion – even extreme confusion – because the social roles that one successfully performs are contradictory and incoherent.

I do not know the answers to these problems, but I do think we do well and grow by acknowledging and learning to live with them. So however absurd it may be in the crowded, narcissistic echo chamber of the new media world of discourse, this blog will be a means for me to engage in the ritual speech act of talking about some things that I think are important, of taking stands in the world in the hope of connecting to some larger human purpose.

Of course, I do recognize that I may be deceiving myself, that this enterprise may be part of some larger process, some larger, grander scheme beyond my ken. Perhaps I am merely being recruited into that vast and growing army of monkeys typing endlessly toward the information apocalypse that will be brought about when every conceivable idea has been expressed. And on that day the unfolding universe of words and its illusive promise of meaning will finally be brought to its fulfillment. The vast human stream of words will expand to become everything and nothing at all.

Whatever the true purpose of my efforts, I will humbly endeavor to play my part…