Time to Put Up or Shut Up on the Book Project; or, A Genuine Fraud Comes Clean

If any of the handful of followers this blog has ever had over the years remain, they certainly know not to get their hopes up when, every two or three (or seven!) years, a new post appears on this webpage promising a fresh burst of activity. After a brief torrent of posts, as inevitably as the setting sun, the blog slips back into dormancy.

But perhaps that’s a good sign! Perhaps the author has not had time to blog because he has been diligently writing other, more important things? Perhaps some real scholarship?

Alas, dear reader…. Nope.

It would be fair enough to blame my lack of writing productivity on my job. As I’ve explained on this blog before, I’ve never had a tenure-track research position after my postdoc, and since January 2014 I have held a teaching position that just does not include much time for scholarship. Nonetheless, I’ve kept the embers of my research and writing glowing with a handful of book chapters, articles, and conference presentations. Most importantly, in 2019 I submitted a book proposal to Polity for a crossover project on the history of dementia that got stellar peer reviews, resulting in a contract to deliver an 80,000-word manuscript in January 2021.

Since then…. A few fevered thoughts about different aspects of the project sketched out in longhand in my notebooks, the initial bibliographic work, and a couple of thousand words toward the first substantive chapter. In other words, practically nothing. When talking about the project, I still say I have just begun – which is certainly true in terms of what I have accomplished. But after three years with so little progress, one begins to feel like a fraud.

Of course, a lot of academics talk about imposter syndrome: that haunting feeling of knowing far less and accomplishing far less than everyone around you seems to think, the occasional stabs of irrational fear that you will be publicly humiliated and have it all taken away. But given the bona fides described above, I’m here to say that, with me, it’s not just talk! I’m the real thing – a genuine academic fraud! Plus, as untenured teaching faculty, I really could have it all taken away in short order.

The truth is, since taking this job, I’ve mostly been talking a good game about continuing my research and writing. I remember, shortly after starting at Drexel, describing my teaching load to an academic colleague. “So you’ve given up the idea of continuing with research then,” he replied. I corrected his assumption. I did not intend to give up on my research and writing. And I have certainly done more than is expected for a teaching position. But maybe I have been kidding myself.

In any case, I can’t stand continuing to talk about this book project if I’m not willing and able to actually write it. So it’s time to put up or shut up. My current renegotiated due date for the completed manuscript is December 1, 2023. Over the next year, I’ll give it my best shot. Finishing the entire manuscript in that time is obviously not realistic, but I’ll either make real progress on it – let’s say at least half the chapters completely drafted – or I’ll acknowledge reality and abandon the project altogether.

One of the things that attracts me to blogging and social media is that they can provide some form of public accountability for what I’ve set out to do. So, for anyone interested in following my work, I’ll post updates here and on Twitter (at least as long as Twitter remains a functional platform rather than a Musky hellscape). And if, over the next year, I do not report significant progress – I promise I’ll slip quietly into the mists and speak no more of being a writer.

Ageism and Generational Conflict in the COVID-19 Pandemic and Beyond

I wrote this piece back in April for a journal’s rapid-response CfP. It was understandably rejected; I was trying to fit too much into a very short word count. Given time and an occasion, I’ll expand it. But in the meantime, I’ll leave it here in case someone finds it useful.
Like all disasters, the COVID-19 pandemic pushes on fault lines running through the structure of society. In normal times, these cracks may be all too easy to ignore. But under the pressure of crisis they become unstable, threatening to break society apart. This essay is about a particularly disturbing set of cracks widening in the time of the novel coronavirus: ageism and generational conflict.

Ageism is deeply entrenched in western societies, and in early March the director-general of the World Health Organization, Tedros Adhanom Ghebreyesus, charged that it lay behind the lack of political commitment in some countries to controlling the spread of coronavirus. “If anything is going to hurt the world, it is moral decay,” he said. “And not taking the death of the elderly or the senior citizens as a serious issue is moral decay. Any individual, whatever age, any human being matters.”[i] In early April, Human Rights Watch issued a statement calling on governments around the world to combat ageism and ensure that older people retain access to necessary medical and social services.[ii]

In the United States, critics of ageism have decried the way that media accounts and the pronouncements of public health officials about COVID-19 revolve around age-associated vulnerability. In January and February, many articles appeared seeking to quell panic with some version of the claim that “COVID-19 only kills old people”; since early March, articles have warned that we must take social distancing guidelines seriously because it is “not just old people” who are getting seriously ill and dying from the disease. Geriatrician and best-selling author Louise Aronson, among others, pointed out that such accounts frame the lives of older people as less valuable, less worthy of concern.[iii]

By early April, advocacy groups had filed complaints with the federal government arguing that the guidelines some states had formulated to deal with the agonizing prospect of rationing ventilators and other life-saving resources, should hospitals become overwhelmed by a surge of COVID-19 patients, unfairly discriminated against patients with physical and cognitive disabilities.[iv]

Age studies scholar Margaret Morganroth Gullette criticized guidelines that explicitly or implicitly include age as a factor. For example, the guidelines produced by the University of Pittsburgh’s Department of Critical Care Medicine use age as a “tiebreaker,” favoring younger patients over older ones who would benefit equally from a ventilator. Age-based criteria in triage guidelines are typically justified in terms of life-course justice, in which dying at an older age is given less moral weight because the person has had a chance to live a full life. Gullette rejects such reasoning, arguing that the perceived value of potential future life is a judgment about the social worth of a person based solely on their age and should no more be used as an exclusion criterion in triage than race, gender, or social class. She is also skeptical that older people’s willingness to forego treatment to save scarce resources for younger patients – or to sacrifice themselves for the sake of the economy, as Texas Lieutenant Governor Dan Patrick suggested – can be taken at face value. Just as other socially marginalized groups learn to internalize inferiority, older people begin to believe ageist messages and, given the intensity of ageism in the COVID-19 crisis, may come to accept the idea that they are expendable. That does not make it true or morally defensible.[v]

Generational conflict has also grown worse in the early months of the COVID-19 pandemic, but it is important to disentangle it from ageism. As a set of negative attitudes toward the elderly based solely on chronological age, ageism in one form or another has been a fixed feature of the United States since at least 1900.[vi] Generational conflict may reflect or reinforce ageism, but it is grounded in the more specific social and political issues of a given point in history. In the United States, a conflict has been simmering, in the mass media at least, between millennials and baby boomers since the early 2000s over issues ranging from economic opportunity to climate change.

The joking description of COVID-19 as the “Boomer Remover,” which began trending on Twitter in mid-March, may be ageist. But it is also much more clearly connected to specific grievances younger generations hold toward the baby boomers. Few #boomerremover tweets expressed hostility toward older people simply for being old. Most of them were framed around the idea that baby boomers specifically were out of touch with the most pressing problems in the world today. Many were from millennials complaining that their older relatives were not taking social distancing guidelines in the emerging pandemic seriously enough. (On the other hand, a countervailing social media trope depicted young people recklessly partying in large groups with no regard for the possible harm this could cause to vulnerable people they came into contact with, particularly their parents and grandparents; so far as I know, there are no data on whether there has been a significant generational pattern in following social distancing guidelines.) Many other tweets portrayed the elevated risk boomers faced as comeuppance for decades of obstruction on progressive political issues such as wealth inequality, universal health care, gun violence, and climate change, and stressed the irony of boomers happily ignoring all of these concerns but becoming enraged because some teenagers started using a mean hashtag about them on social media.[vii]

Generational conflict inevitably trades on overly broad generalizations and stereotypes. Baby boomers have of course been prominent in the ranks of every progressive cause, just as plenty of younger people have supported conservative politics. The injustices and threats that concern many millennials are the result of social and institutional power structures that cannot be attributed to an entire generation. All that said, there is a core of reality to the millennial/boomer conflict that should not be dismissed. The generation gap has been one of the most striking features of America’s divisive politics since 2016, with younger voters tacking far to the left of older voters. One of the most important factors in the failed campaigns of progressive champions Elizabeth Warren and Bernie Sanders, in whom many young people had placed their political hopes, was how poorly they did with older voters.

Beyond frustration at what can reasonably be seen as a baby boom electoral roadblock to the progressive political agenda that a large majority of young voters support, millennials have good reason to feel especially concerned about economic security. Entering the workforce around the time of the Great Recession of 2008, they struggled to find stable jobs with good pay and benefits. Now the pandemic presents an economic crisis that will likely be far worse, and early data suggests that millennials are suffering a significantly greater loss of employment than older, more established workers in the baby boom generation.[viii]  

COVID-19 should not be seen as an isolated event, but as a nodal point in a cluster of seismic events – social, economic, political, and natural – that have been unfolding over time to produce the public health and economic catastrophe we are currently experiencing. Ageism and generational conflict are part of that cluster. It is a cluster that will become more complex, unstable, and dangerous as we enter an era of increasingly frequent “natural” disasters associated with global warming. Ageism and generational conflict must be addressed in the vital conversations we are having now about justice in the COVID-19 pandemic, and they must be addressed in the broader conversations we will have about the meaning of a good individual and collective life in a future of growing climate chaos and risk.


[i] P. Barnes, “Did U.S. Response to Covid-19 Lag Due to Age Discrimination?” Forbes, March 13, 2020. https://www.forbes.com/sites/patriciagbarnes/2020/03/13/did-us-response-to-covid-19-lag-due-to-age-discrimination/#5417b4c71784.

[ii] Human Rights Watch, “Rights Risks to Older People in Covid-19 Response.” Human Rights Watch, April 8, 2020. https://www.hrw.org/news/2020/04/07/rights-risks-older-people-covid-19-response#.

[iii] L. Aronson, “Covid-19 Kills Only Old People. Only?” New York Times (New York, N.Y.), March 22, 2020. https://www.nytimes.com/2020/03/22/opinion/coronavirus-elderly.html?smid=tw-share.

[iv] S. Armour, “Plan to Ration Treatment Is Unfair to Frailest Patients, Advocates Say.” Wall Street Journal (New York), 2020. https://www.wsj.com/articles/rationing-plans-in-coronavirus-crisis-draw-growing-discrimination-complaints-11586430000.

[v] M.M. Gullette, “Avoiding Ageist Bias and Tragedy in Triage: Even a Lottery Is Fairer Than Triage by Age.” Tikkun, April 14, 2020. https://www.tikkun.org/avoiding-bias-and-tragedy-in-triage.

[vi] T.K. McNamara and J.B. Williamson, Ageism: Past, Present, and Future. Routledge, 2019.

[vii] A. Whalen, “What is ‘Boomer Remover’ and Why is it Making People So Angry?” Newsweek, March 13, 2020. https://www.newsweek.com/boomer-remover-meme-trends-virus-coronavirus-social-media-covid-19-baby-boomers-1492190.

[viii] A. Lowrey, “Millennials Don’t Stand a Chance: They’re Facing a Second Once-in-a-Lifetime Downturn at a Crucial Moment.” The Atlantic, April 13, 2020. https://www.theatlantic.com/ideas/archive/2020/04/millennials-are-new-lost-generation/609832/.

Richard Taylor: Humanity Unleashed

Last weekend, the dementia care and advocacy community lost one of its most distinctive and powerful voices with the death of Richard Taylor. A few years after being diagnosed with probable Alzheimer’s-type dementia, Richard Taylor emerged as a passionate, uncompromising advocate for people living with dementia. He challenged stigma wherever he found it, and was often a harsh critic of the Alzheimer’s establishment, which he argued over-emphasized the search for a future cure at the expense of programs to support people struggling with dementia today. Anyone with an interest in dementia should know his remarkable book, Alzheimer’s from the Inside Out.

I wish I had gotten to know Richard much better. I met him through email and then at a conference about five years ago, and continued to stay in touch with him from time to time.  The interest he took in my work on dementia has been an important motivation for me to keep going.

Many of those who knew Richard well have written tributes to him, some of them gathered in a post on Truthful Loving Kindness. Kate Swaffer wrote a particularly moving post about how Richard inspired other people with dementia to become advocates. Geriatrician Al Power’s tribute included a remarkable quote that I think captures the radical challenge Richard Taylor posed:

“I believe that as people progress with dementia, their humanity increases. People have to get ready for that humanity to be unleashed.”

Aging, Death, and the Completion of Being

Blink – there went another three months.

Don’t quit reading yet – this is not going to be just another apology for failing to keep up with this blog. I have no illusions about the world needing it, waiting for it, or missing it when it’s not there. This post is frankly about what I need, and what I fear as I grow deeper into middle age.

But bear with me, I do need to explain my absence. There are, of course, perfectly good reasons why I have not kept up with this blog. I teach full time – which for me means four classes per term, a load, suffice it to say, that most academics would consider incompatible with research and writing – and have taken on the administrative burden of directing a rapidly growing graduate degree program. I try to stay involved in the broader academic world. In the spring, I participated in an excellent workshop on MCI (which I will try to blog about soon), and reviewed a book manuscript and a book proposal for two different academic presses. And I have a family and friends, commitments to church and civic life, and I try to give those relationships and affiliations the time they need. I have a life, as they say. So yeah, I’ve been busy.

There are, of course, also some not so good but perfectly understandable reasons why I have not kept up with this blog. I waste time. I get distracted by social media and lost in the interwebs. I forget priorities, get caught up in a labyrinth of delusion and self-doubt. In other words, I’m human.

Oh, there is no end to reasons why I have not kept up with this blog. As I suggested in my last back-from-the-dead post, the idea of keeping a blog is absurd, but really no more absurd than any of my dreams of creative work – and that’s my reason for trying to do it. Life gets in the way. But this is not a complaint or an apology. My life, for all of its imperfections, is good.

And life has been like this for a long time, decades at least, the steady accumulation of responsibilities large and small filling my life to overflowing, dreams and aspirations bobbing along in those waters like survivors of a shipwreck. But what’s new in my life now is a real awareness that it is finite. I hope to live a long, healthy life and continue creative work deep into old age. That’s possible, but there are no guarantees. A few months back, I got news of the death of a good friend from grad school whom I had sadly fallen out of touch with over the decades. And just recently I’ve learned that another old friend, some years younger than me, is dealing with a serious cancer. These are tragedies, but they are neither unexpected nor unusual events in the long course of human life. I may be able to continue doing creative work for three or more decades – that’s a reasonable hope. But there are reminders every day that I may have much less time than that.

Of course, I can still remember being young, when the problem was knowing what to do with the seemingly endless void of time, days stretched out to a barren, seemingly infinite horizon. But I remember youth too well to want to go back there even if I could. No, I’m happy to accept this as my dilemma, my rock of Sisyphus, my life: an endless, growing pile of things to do, more than I can ever hope to accomplish in the limited time I have left.

Sometimes I succumb to the maudlin and imagine that my situation is like that of Hemingway’s hero in “The Snows of Kilimanjaro,” musing on an incomplete life with the nearness and inevitability of death palpable in the stink of his gangrenous leg: “Now he would never write the things that he had saved to write until he knew enough to write them well. Well, he would not have to fail at trying to write them either. Maybe you could never write them, and that was why you put off the starting. Well he would never know, now.”

But on good days at least, I don’t wallow in narcissism. My creative work is not about my self-fulfillment as an individual, a man. Not really about me at all. My best ideas and dreams, to the extent that I am able to make them real through creative work, are nothing more nor less  than a small, ultimately anonymous contribution to the vast human susurrus, the mumbling into being of a meaningful world. 

Still, masculine melodramatics aside, Hemingway’s musings undeniably capture something real about being human in the modern world. Acknowledging that my life is finite, I can’t help but think about what I will inevitably leave undone: At least two scholarly books fairly concrete in conception, though the necessary research is barely begun; more vaguely conceived but perhaps more vividly imagined, a series of novels, stories and poems that would be a dramatic departure from the kind of creative work I’ve done so far. If I live to be a hundred, I probably could not accomplish half of this. But, for me at least, staying alive means keeping these projects alive in my mind and doing what I can to make them real in the world. So I find it unnerving to think about these grand ideas and dreams disappearing with my death, their potentiality flickering out within the dying neurons enclosed in my skull.


Aging, Death and the Completion of Being. 

This comes from the title of a 1979 book edited by my mentor and dissertation advisor, the late David D. Van Tassel, one of four books he edited from the late 1970s through the early 1990s that laid the foundation for the historical study of aging. The memorable phrase, “the completion of being” has always struck me as saying something important about aging, but I’ve never been sure what. From the perspective of a hopelessly overbusy middle age, bringing my own life to final and satisfying completion is difficult to imagine. Perhaps the art of living a good life, whenever the time for its ending comes, is to find closure and completion in the face of all that is left undone.

Creating Time: The Magic of Writing

Well, it’s been quite a while, so I suppose some explanation is in order. I regret losing contact with the handful of loyal readers this blog had, and apologize especially for failing to follow up with some people I met through it who reached out to me more directly. As I noted in my last apology for a long period of absence from the blogosphere (this must be the second most prevalent genre of blog posts, right after “Hello World – Welcome to my Blog!”), one of my particular challenges has been the elimination in 2012 of the Science, Technology and Society Program at Penn State, where I worked for more than ten years. It’s hard to imagine that I will ever again find an environment as congenial and supportive of my work.

The last year and a half has been about moving on, getting over the loss and building a new life around a new job at Drexel University in Philadelphia. Since my last blog post, there has been the job search and interview, moving to Philadelphia ahead of my family last January to start the job, countless hours on the Megabus between State College and Philly, selling our house there and buying a new one here, and moving the family and household here last summer. Until now it’s really been quite impossible to think about the research and writing that this blog represents.

And even now. My official title is associate teaching professor, which means I do a lot of teaching. In my first year here, I taught twelve classes on the quarter system, including the summer quarter. (I believe that would be nine on the semester system).  I’m not complaining. I’ve always loved teaching, and I love it now.

But still, pursuing an ambitious agenda of research and writing with this kind of teaching load is daunting. When I was explaining my new situation to a friend and colleague from Penn State, he said, “So you’ve given up the idea of continuing with research then.”

No, no I haven’t. It will be hard, and there is no guarantee I will be able to manage it. And I am ok with failing. But I believe enough in the work I’ve laid out that I’m not ok with failing to try. So, however long the odds, I’m still in the research and writing game.

But given all that, why blog? Why carve time for this out of the precious little time I have for research and writing? For me, blogging is a way to stay connected to the world, to commit to paying attention and formulating thoughts about the things that matter to me beyond the quotidian demands of my life.

Blogging is for me also a means of developing the discipline and practice of writing – to train my mind to be as motivated by the prospect of filling a blank page as it is by the flash of a Facebook or Twitter notification, or the prospect of spending an evening passively soaking up a movie. Without that discipline, no amount of time will be sufficient for getting this work done. With it, I may find an abundance of time in the cracks and crevices of a demanding life. My theory is that blogging as a regular writing commitment will not just be a slice of the pie, but a way of making the pie bigger.

Maybe that’s magical thinking, but my hope is that it’s part of the real magic of writing.

Climate Change, Alzheimer’s Disease and the Conundrum of Scientific Authority

There is perhaps no greater source of authority in modern society than science.  As a result, scientific claims are nearly ubiquitous, and often controversial. How do we decide when to trust and when to doubt ostensibly authoritative science?

This question was brought home to me in a couple of exchanges I’ve been part of this year concerning climate change and Alzheimer’s disease. When Peter Whitehouse recently wrote a post on the Myth of Alzheimer’s blog asserting that extreme weather events associated with climate change pose a significant threat to elders, particularly those with cognitive impairment, a climate change denier thanked him for exposing the myth of Alzheimer’s but took him to task for falling for the myth of global warming.

Going in the opposite direction, back in March someone took great offense at a talk I gave questioning the ostensibly authoritative claims of Alzheimer’s researchers and accused me of engaging in the equivalent of climate denial: making specious arguments that are dismissive of the very real problem Alzheimer’s disease presents to individuals and society, and thus reducing the global commitment needed to recognize and respond to it. (This exchange happened to take place as I was preparing for a four-day bicycle trip with my 12-year-old daughter from central PA to Washington DC to raise awareness about the need for climate action, so it did get under my skin.)

It’s tempting to dismiss these criticisms as simple ignorance. After all, as I pointed out in reply to the same climate denier’s 30-link torrent in response to a subsequent post by Whitehouse on climate change, the very strong scientific consensus on anthropogenic climate change is in the realm of objective fact: a survey of nearly 12,000 relevant peer-reviewed scientific articles published from 1991 to 2011 shows that 97% of them support the basic consensus on climate change, and virtually every prestigious scientific society in the United States and around the world has issued or signed on to statements supporting the consensus that climate change is being driven by emissions from the burning of fossil fuels and poses a serious threat to human society. And my criticism of the emphasis over the past thirty years in the Alzheimer’s field on cure and prevention rather than support for creative, stable caregiving hardly amounts to a denial that dementia is real and causes real suffering and loss to society.

But a legitimate question remains. It seems that, on the one hand, I am pleased to accept the claims of a majority of climate scientists as authoritative. On the other, I seem equally pleased to criticize the claims of a majority of scientists and practitioners in the Alzheimer’s field. How can I justify this apparent inconsistency? Perhaps I have enough direct familiarity with the content of the relevant branches of science in both of these broad fields to make an informed judgment? Absurd. I’m very knowledgeable about Alzheimer’s for a non-scientist, and probably better read than the average person on climate science. But the volume and degree of specialization in modern scientific research make it a challenge for scientists to keep up with research even in their own narrow fields. Directly assessing the volume of work in broad fields like dementia or climate research is simply impossible. At some point, no matter how broad or thorough your scientific education and competency may be, you will need to trust (or not) the claims of others about science. But how to decide whom and what to trust?

Here I think the academic fields that have formed me as a scholar – the history of medicine and STS (Science, Technology and Society) – have much to offer. The implicit idea of these fields is that understanding some of the science itself is necessary but far from sufficient. To understand science deeply enough to reach sound judgments about when to trust and when to question scientific claims, one must learn and think more about science and the way it is actually practiced in the world. One must understand the social and cultural contexts that shape scientific interest and help determine what kind of scientific questions are pursued. One must consider the social, economic and political factors that inevitably influence scientists. One must be able to recognize the way that social and cultural values are embedded in seemingly mundane questions of scientific method and analysis. The point of these sorts of questions is not to dismiss or diminish science, but to understand its real power, and in so doing reach better judgments about how it should be used to better serve human flourishing.

It’s the consideration of these sorts of questions that leads to my different stances toward climate science and Alzheimer’s research. As I mentioned above, in simple numerical terms the consensus on climate change is very strong. But in socio-historical terms, the breadth and resilience of the consensus is even more impressive. As physicist and historian of science Spencer Weart’s comprehensive research shows, the consensus around the theory of anthropogenic climate change is not supported by evidence generated from the work of a single scientific field, but emerged from the convergence of many lines of research across a broad range of fields: geology, chemistry, atmospheric physics, meteorology, oceanography, computer modeling, and many more. Practitioners in these fields use different methods and approaches to what counts as evidence, so the theory has been challenged and tested from multiple directions. Scientists in different fields also get research funding from different sources within the federal government and the private sector, so the potential for funding bias is less than when funding comes from a narrower range of sources. Moreover, since the theory of anthropogenic climate change implicates the energy industry, it is profoundly threatening to some of the most powerful political interests, who have responded by spending vast sums to discredit it. Historians Naomi Oreskes and Erik Conway have shown that this involved funding the activities of a handful of scientists with an anti-regulatory bent who have attacked not only the climate change consensus but were also involved in earlier attacks on scientific research showing the harmful effects of DDT, CFCs, and tobacco. Journalists have also begun to trace the donations of hundreds of millions of dollars from conservative billionaires with fossil fuel industry ties to public relations and lobbying campaigns aimed at attacking the climate change consensus in the media and on Capitol Hill. That a strong consensus supporting the theory of anthropogenic climate change remains despite decades of well-funded, systematic attack enhances its credibility.

While I in no way intend to dismiss research in the Alzheimer’s field over the past several decades that has produced much important knowledge about some of the likely pathological mechanisms that lead to dementia, a consideration of socio-historical factors raises questions that are not asked often enough, especially in media coverage. First, while there is no credible denial that age-associated progressive dementia exists as a significant individual and social problem, respected researchers in the field hold many different theories about what causes it, and there is debate within the field about whether it can truly be disentangled from the usually more benign processes of systemic brain aging. Second, while researchers from diverse fields certainly conduct Alzheimer’s research, the dominant approach emphasizing the drive toward pharmacological treatment and prevention is mostly the product of psychiatrists and neurologists, and this group is largely supported by a narrow funding stream from the pharmaceutical industry. Critics like David Healy, Carl Elliott, John Abramson, and others have documented the distorting effects of pharma money on medical research in general, and several authors in a book on dementia treatment I co-edited show that this happens in the Alzheimer’s field as well. Finally, unlike climate change, the dominant approach to Alzheimer’s disease is in sync with the interests of the pharmaceutical industry, which has accordingly spent vast sums to persuade the public and lawmakers of its importance to society, so there has been relatively little public debate about it. None of this amounts to a reason to dismiss mainstream Alzheimer’s research outright, but it does suggest there is a need to ask some critical questions.

Both climate change and Alzheimer’s disease are complicated problems, and much will no doubt continue to change in the scientific understanding of both. But the persistence of such a strong scientific consensus around the fundamentals of climate change, despite factors that we would normally expect to weaken that consensus – especially the strong resistance of powerful economic interests – helps convince me that it is time to take strong steps as a society to lower carbon emissions. While we must also continue to take the challenge of Alzheimer’s disease very seriously, I see a need for a broader debate about whether the emphasis on developing a pharmaceutical solution – which has been promoted by powerful economic interests – has led us to pay too little attention to other ways of responding effectively.

And beyond both of these issues, we need to move beyond ubiquitous claims of scientific authority and superficial controversy to a more thoughtful public discourse about science and its place in society.

Skewering the Emerging “Brain Fitness” Industry

Perhaps I should look into joining The New Yorker as a staff writer now that major articles on dementia seem to have become a regular feature of the magazine. In this week’s issue, humorist Patricia Marx skewers the emerging “brain fitness” industry. Her tone is breezy and light compared to the gravity of the two articles I reviewed in my previous post, going for laughs by probing the tension between mid-life anxiety about cognitive decline and the range of improbably diverse claims for the cognitive benefit of various activities:

It’s a pretty regular occurrence for me to leave my reading glasses God knows where or lose my train of thought or have trouble recalling the word ‘phlogiston’ – and, egads, what happened to all that stuff I used to know about Charlemagne’s in-laws? In my darkest moments, I imagine that my friends are humoring me when they insist that their amnesiac lapses are no less alarming than mine. (“Have you ever squeezed toothpaste onto your contact lenses?” a friend asked triumphantly.) Am I, like so many of my gang, just another one of the ‘worried well’? (A 2011 survey found that baby boomers were more afraid of losing their memory than of death.) Should I get out a crossword? Learn to play bridge? Chew gum? Take a nap? Drink more coffee? Eat blueberries? Give up tofu? There are studies that tout the benefits that each of these undertakings has on the brain. What to do?”

Readers who are themselves struggling with significant cognitive loss, or caring for people with dementia, may be put off by the tone of the article. But Marx is making an important point about how the purveyors of brain-training software and other “neurobic exercise” programs are forging a billion-dollar industry out of anxiety about dementia and scientific-sounding claims about various techniques for re-invigorating the brain.

Marx is not, however, completely dismissive of recent findings about neuroplasticity and the potential for diet, exercise, and stress management to prevent cognitive decline. The work of Kenneth Kosik, a neuroscientist at the University of California at Santa Barbara and founder of a non-profit brain fitness center, comes off as reasonable, legitimate, and firmly grounded in real science. Citing the well-known nun study and other recent research, Kosik explains the scientific basis for believing that social, physical, and intellectual enrichment that promotes good brain health across the lifespan can increase the resilience of the brain and help prevent dementia. Kosik, who is also involved in more mainstream biomedical research, is careful to keep claims about brain fitness modest, arguing that such activities should be part of an overall healthy lifestyle. Of his brain fitness center, Kosik says, “I’m sure some of my colleagues in Boston would look at this as a fringe operation, a storefront with walnuts and incense. On the other hand, we can wait for science to come up with a cure or we can jump in and try to create an atmosphere that is conducive to good brain health.”

Though Kosik is clearly right that this holistic approach to brain health as a factor in dementia is at odds with the reductionist drive in biomedicine to find the key to curing or preventing dementia in specific pathological mechanisms, he is quite wrong about how the brain fitness industry looks. The companies Marx spends most of the article quite justifiably mocking are careful not to look like smoky dens of new age mysticism. Rather, they relentlessly deploy neuroscience lingo and wildly extrapolate from limited research evidence to make absurdly inflated claims for the efficacy of their products.

After a crash course of several weeks, Marx concludes:

Judging from the series of questionnaires I’d filled out during the course of my training, my mood brightened, my sleep was more restful and I felt more confident. I may also have become a bigger liar on questionnaires but that was not evaluated. As for the exercises, my scores were higher across the board. In an email summing up my progress, Merzenich [neuroscientist and co-founder of BrainHQ] wrote, “Your advances on these exercises comes from brain remodeling. If we had recorded from/imaged your brain before and after training, we could have easily shown that you now have a ‘better’ (stronger, faster, more reliable, more accurate) brain.” (Wouldn’t they make dandy wallet photos?) Compared with my poky old brain, my souped-up brain, according to Merzenich, has more synapses, better wiring, stronger connections, and more forceful activity. (Doesn’t that sound like an ad for a five-thousand-dollar stereo?)

I’m not sure I noticed my newfound cognitive abilities in everyday life. It’s hard to be both scientist and lab rat. On the positive side, I am slightly less troubled about the size of my hippocampus. On the negative side, why did I sprinkle NutraSweet on my broiled salmon last night?”

The New Yorker on Dementia: Crisis in Scientific Research and Progress in Caregiving

The peer-reviewed article may be the coin of the realm in academic science, but the high-profile magazine article is the bellwether of popular attitudes toward health, medicine, and disease. Thus it’s notable that The New Yorker, one of the few remaining popular periodicals aimed at well-educated general readers, recently published two major articles about the search for new approaches to Alzheimer’s and dementia. One focuses on the lack of progress in scientific research, and the other on encouraging developments in nursing home care.

Jerome Groopman’s “Before Night Falls,” which appeared in the June 24 issue, aims to describe the current state of research on Alzheimer’s disease and the search for new directions. Groopman holds a prestigious medical research chair at Harvard Med, but is more widely known as the author of a number of bestselling books that explore the experience of illness, the often tangled relationship between doctors and patients, and the inherent difficulties of the diagnostic process. Groopman’s popularity is well-deserved. He is an excellent writer, and here he provides a compelling description of the struggle of leading Alzheimer’s researchers to find a viable direction following “three decades of Alzheimer’s research [that] has done little to change the course of the disease.” But Groopman’s qualities as a writer and critical observer of medicine make the limitations of this article all the more disappointing.

Groopman depicts the Alzheimer’s field as essentially split between two camps – the majority of researchers who continue to believe that the key to the disease is the excess accumulation of the protein beta-amyloid that ultimately forms plaques in the brain, and the smaller group of dissenting researchers who call instead for a focus on the tau protein involved in the formation of neurofibrillary tangles or some other putative cause, or who suggest that the disease is simply too complex and scientific understanding too limited to warrant focus on a single target.

The article strikes a fair enough balance between these two camps. It focuses on the work of champions of the amyloid orthodoxy like Dennis Selkoe and Reisa Sperling, and especially on the emerging emphasis on early diagnosis and prevention in the wake of the failure of anti-amyloid drugs to improve the cognitive functioning of people who have been diagnosed with dementia. But it also gives ample space to the views of prominent critics like George Perry and Peter Davies to explain their doubts.

Indeed, the article is far from the sort of naïve expression of fervent belief in the inevitability of medical progress that is all too typical of medical journalism today. It does not gloss over the uncertainties, risks and costs involved in the new emphasis on early diagnosis and treatment targeting amyloid accumulation. Moreover, the personal and societal consequences of Alzheimer’s are depicted as so grave that, as the title suggests, the overall tone of the piece is more gloomy than triumphant. The final section of the article reiterates the apocalyptic demography that characterizes so much Alzheimer’s advocacy: the looming economic burden of the disease, reckoned to be more than a trillion dollars a year by 2050, makes spending billions of dollars on research, whatever the uncertainties, seem like the only rational course of action – even to budget hawks like Republicans Eric Cantor and Newt Gingrich. Groopman gives researcher Reisa Sperling the final, almost desperate word:

“I think it’s a war – a war against Alzheimer’s disease, and we are losing, so I’m going to use military terms,” she said. She argues that those who are concerned about an ill-conceived rush to preventive trials lack the appropriate sense of urgency. “The idea of waiting another ten years, just to study the natural history of Alzheimer’s disease, is not tenable,” she said. “These are the dilemmas. How do we make the best possible decision right now, in the absence of all the data we need?” She paused for a long moment, then said, “My biggest fear is that we are just doing too little too late, and that even if we move sooner, we are not lowering amyloid sufficiently. So we will get to the end of the trial and say, ‘Well, here we are, and we have the same conundrum. We just did not do enough.’”

There are two glaring omissions in Groopman’s article. First, there is virtually no discussion of big pharma’s role in the development and continued dominance of the amyloid approach. Critics like David Healy, Carl Elliott, John Abramson, and others have documented the distorting effects of pharma money on medical research, and several authors in a book on dementia treatment I co-edited show that this happens in the Alzheimer’s field as well. While reasonable people may disagree about the effect of the pharmaceutical industry on medical research, it is inexcusable to simply ignore the issue in a story of this length and prominence. Second, there is absolutely no mention of the many studies suggesting that non-pharma approaches to overall brain health like diet, exercise, and social engagement can effectively lower the risk of dementia – most recently, two studies showing that dementia rates in Denmark and the United Kingdom may actually be falling even as the population ages. Such approaches, which anecdotally at least appear as promising as any drug, have the advantage of being safe and cheap – and something we should want to do for other compelling reasons anyway. It is surely worth asking why Alzheimer’s researchers working to establish a new regime of early diagnosis and prevention would not want to rigorously explore how non-pharma approaches could be part of that strategy.

As an antidote to the feeling of gloom and desperation in Groopman’s depiction of the search for an effective medical treatment for Alzheimer’s, one should turn to exciting developments in the field of nursing home care that are promising to dramatically improve the lives of people with dementia and their families today. These developments are described in great depth by Rebecca Mead in an article called “The Sense of an Ending” that appeared in the May 20th issue.

Mead focuses on the work in dementia care at the Beatitudes Campus in Phoenix, AZ, and the NYC chapter of the Alzheimer’s Association, which is exploring ways to incorporate its approach in nursing homes there. Beatitudes is part of a culture change movement in nursing home care that encourages moving away from a medical model of care focused on treatment and efficient service delivery, toward a more holistic, person-centered approach to dementia care that aims to create environments that minimize distress and value the enjoyment and meaning inherent in the non-cognitive human capacity for sensory pleasures and intimacy. “Without any immediate prospect for a cure,” Mead writes, “advocacy groups have begun promoting ways to offer people with dementia a comfortable decline instead of imposing on them a medical model of care, which seeks to defer death.”

But the culture change movement is about more than making people with dementia comfortable as they await the release of death. It is based on a profound re-valuing of the person with dementia, focusing not just on the cognition that is lost, but on the physical and emotional capacities that remain, capacities that might even be enhanced in dementia. Mead writes that Tena Alonzo, director of education and research at Beatitudes, views people with dementia as “closer to the higher being. This is who they are: real, honest, and sometimes raw. There is no ability to reason, or to cover up who you really are. And so, for much of the time, you see the loveliness of the soul – it is bare for everyone to acknowledge.”

Seen from this perspective, the aim of the caregiver is no longer to change or manage the person with dementia, but to overcome the anxieties and defenses that prevent caregivers from developing a real, life-giving relationship with who the person with dementia has become. Mead acknowledges the difficulty of this, especially for children or partners who cannot help but grieve the abilities that the people with dementia they love have lost. Nonetheless, one can see in her rich description of the world of care at Beatitudes what caregivers can gain from these relationships. Mead describes a woman who spent most of every day walking up and down the corridors. In a typical nursing home, this behavior might be seen as pathological and worrisome, a risky problem behavior to be managed and perhaps suppressed. But the staff learned that she had worked in retail for decades and determined that she was not going to stop walking just because she was in a dementia unit, so they sought merely to minimize her discomfort by considering additional painkillers when her feet appeared painfully tired. Mead describes how it felt to interact with her:

I spent some time one day walking along the corridor with this woman, into the sunroom and out of it again. Her face brightened at the company, and she was eager to talk, even when her side of the conversation devolved into singsong nonsense. Sometimes she stopped to shimmy for a moment, and I could imagine what a figure she must have cut at a dance. Being in her company triggered memories of being a child alongside my grandmother, now long dead. I recalled the at-a-loss-for-words feeling that I used to experience with her, even as she made me feel held within a sphere of affection. This woman’s powers of cognition were limited, but her capacity to experience emotion seemed unimpaired, and she demanded engagement the way a small child does: it made no more sense to resist her impulse toward intimacy than it would to withhold a smile from a baby. When she grasped my hands and told me I was perfect, I told her that she was perfect too.”

What will strike many people as most remarkable about Beatitudes is that it is cost-effective. Mead notes that despite the innovations that make Beatitudes a much more gracious place to live than the typical nursing home, the average cost of care is roughly the same. Not least among the ways that a person-centered approach to dementia care can save money is reducing the use of expensive anti-psychotic drugs, which can cost hundreds of dollars per month for every patient on them. Mead quotes Alice Bonner, director of the Centers for Medicare and Medicaid Services: “People are starting to realize that, with some creativity and curiosity, we can figure out other ways of taking care of people with dementia.”

Taken together, these two articles suggest a change in emphasis. The future of the dementia field may lie less in developing high-tech medical interventions aimed at curing or preventing dementia and more in developing and practicing innovative models of care. Perhaps we can learn to see dementia less as a personal and collective apocalypse than as a great human challenge that can bring out the best in us.

Wiring for Dummies: What Neuroscience Can and Can’t Tell Us about Ourselves


A Review of Sebastian Seung’s Connectome: How the Brain’s Wiring Makes Us Who We Are. Mariner Books, 2013.

There is no scientific field more exciting than contemporary neuroscience, poised as it is to develop concepts and technologies that will finally penetrate and manipulate fundamental mechanisms of the human mind. There is also no scientific field more arrogantly simplistic, bedazzled as it is by reductionist theory and fantasies of human power and control over complex biological processes. MIT neuroscientist Sebastian Seung’s brilliant book manages to be both.

Seung is a rising rockstar scientist with drop-dead good looks, bold fashion sense (he typically sports jeans and a t-shirt, and showed up to a recent high-profile debate wearing shiny gold sneakers), and a knack for making complicated ideas and profound problems in neuroscience not only understandable but cool. He gave a popular TED talk laying out the central thesis of his work: that “I am my connectome,” that the sum total of synaptic connections within the brain constitutes human selfhood in all its rich uniqueness. This book is a fuller elaboration of that idea and its intellectual history, and a fascinating account of the development of the theories and technologies that Seung believes will make possible what he views as the ultimate goal of neuroscience – a complete map of the human connectome.

Pursuing this goal, according to Seung, will have both philosophical benefits in terms of understanding human nature and practical benefits in terms of getting at the true causes of mental disorders like autism and schizophrenia, which he argues may usefully be thought of as “connectopathies” – abnormal patterns of neural connection. Beyond this, Seung argues that it is theoretically possible (or at least not yet a proven impossibility) that connectomics will one day advance to the point of being able to fully read off and duplicate the contents of individual human consciousness, thus essentially achieving the dream of immortality. Though Seung hedges on whether he really believes projects like cryonics and the uploading of consciousness are plausible or even desirable, he clearly embraces the broader transhumanist dream of developing technological means of human improvement. He concludes the book with a grand vision that evokes the technological singularity, when humans will finally transcend the limits of their biology:

Connectomics marks a turning point in human history. As we evolved from apelike ancestors on the African savannah, what distinguished us was our larger brains. We have used our brains to fashion technologies that have given us ever more amazing capabilities. Eventually these technologies will become so powerful that we will use them to know ourselves—and to change ourselves for the better.”

Seung acknowledges that neuroscience is very far from achieving the extraordinarily difficult goal of mapping the connectome. It took scientists more than a decade to map the connectome of the nematode C. elegans, whose simple brain consists of only 300 neurons and 7,000 synaptic connections. The human connectome is 100 billion times larger, with a million times more connections than the human genome has letters. But as Seung explains, technologies that automate much of the work are beginning to make this enormous project feasible.

However, the real problem is not just size but complexity. While the nematode connectome is relatively fixed and stable from one individual organism to the next, the connectome of every human being is unique, reflecting the ongoing re-wiring of synaptic connections within the brain as each individual interacts with the environment. Yes, there is an underlying structure common to all human brains, but the wiring of the connectome evolves in a way that reflects the unique experience of the world each individual gains through their senses, their thoughts and feelings, and even the ideas they have encountered through participation in human culture. Connectionism itself suggests that human consciousness can never be reducible to a universal brain structure or state; it seems quite plausible that ostensibly identical synaptic connections could be encoded with different mental content.

In recognizing the dialectical relationship between brain and environment, connectionism would seem to avoid the crude reductionism of discredited intellectual ancestors like phrenology, which sought to identify specific regions of the brain with complex personality traits and mental functions. Indeed, one of Seung’s clearest messages is that “the connectome is where nature meets nurture.” If that is true, then traditional understandings of the role of culture and society should be seen as equally important in the formation of self, and psycho-social approaches as no less important than biotechnology in dealing with mental disorders. But Seung and other neuroscientists are not much interested in the dense matrix of “nurture” constituted by human culture and society. They are really interested only in the wiring.

For example, Seung makes much of the infamous discovery of the Jennifer Aniston neuron – a specific neuron in the medial temporal lobe (MTL) that fires only when research subjects look at a photograph of her, along with neurons that fire exclusively on perception of other celebrities. He extrapolates from this to theorize about how perception works in general by comparing the brain to an army of paparazzi employed by a magazine that seeks to publish titillating photos of movie stars:

One hounds Jennifer Aniston with his camera, another devotes himself to Halle Berry, and so on. Every week, their activities determine which celebrities appear in the magazine, just as the spiking of MTL neurons determines which celebrities are perceived by the person.”


While this is a fascinating speculative theory about the mechanism of perception in the brain, it actually tells us very little about the mind. Following the actions of this “army of paparazzi” might tell us how certain things end up in “the magazine” of the mind, but it really does not tell us why that magazine devotes attention to celebrities at all, who “buys” it, etc. Seung tries to meet this kind of objection by suggesting that the “Jen neuron” is activated by a population of neurons triggered by neurons devoted to components of the perception – neurons that fire for blonde hair, dazzling smiles, blue eyes, and other attributes. But this comes to seem very much like a return to phrenology. It’s as though Seung seeks to explain why you assembled songs into a playlist by examining the circuitry of your iPod. To explain your playlist, we are going to need to know some things about the society and culture you live in.

All of that said, Seung’s book is well worth reading for its clear discussion of what neuroscience is uncovering about the workings of the brain. Neuroscience is indeed opening an exciting frontier, and work toward mapping the brain may tell us some valuable things about human nature and mental illness. But as we think about these possibilities, and how to prioritize our investments in science, we should also acknowledge that there are things it will never be able to tell us.

Around the Web: Omphalos and Hole Ousia

As I sat down to write this, I realized that it was actually one year ago today that I launched this humble blog. One of the best reasons for blogging and social media is the chance of making contact with people you likely never would have encountered otherwise. This happened with my first post, when Peter Gordon introduced himself in a comment. Peter is a geriatric psychiatrist in Scotland who has also been thoughtfully exploring, in video and writing, the need to develop a richer approach to dementia informed by the humanities, and the broader problem of the division between science and the humanities in biomedicine.

I was especially flattered that he used extensive quotations from my book in one of his first videos, called The Diseased Other. Lately, he has been courageously raising vital critical questions about the push for early diagnosis of dementia, which he summarizes in this short film:

Peter makes his videos as Omphalos and shares them on Vimeo. He blogs at Hole Ousia and at the Myth of Alzheimer’s blog. Check out his work.