I recently gave a talk about Alzheimer’s disease and asked people to imagine two individuals, Manny and Sue. Manny died at 85; he was showing signs of age but living independently and was mentally “all there.” Sue lived until 99. From the time she was 88, she began a slow cognitive decline. By 93, she could no longer live independently. By 95, she could not recognize friends and family. Eventually she became incontinent and was unable to walk, speak, or feed herself. She lived two more years in that state until she died in her sleep.
I asked my audience which they would rather be, Manny or Sue. To a person, they chose Manny. Although Sue had lived much longer than Manny, for most people her long decline into dementia outweighed her extra time on earth. I have given this talk a number of times and it is the rare person who would prefer to live and die like Sue.
I have been thinking of this because of the article in Nature Medicine by a group of scientists who have identified a blood test to predict whether someone will get Alzheimer’s disease in the near future. Basically, they took 525 healthy people aged 70 or older. Over five years, 74 of them were diagnosed with either mild cognitive impairment (often a precursor to Alzheimer’s) or Alzheimer’s. Their blood test was able to predict cognitive decline with 90 percent accuracy.
Biomarkers to predict Alzheimer’s disease are not new, but up to now they have been expensive, invasive, time-consuming, or all three. In contrast, this is a simple blood test that in a few years’ time could be ordered by your doctor in the same way she checks your cholesterol.
Responses by the Alzheimer’s Association, and the scientists themselves, were predictable. First, calls for caution as the research has yet to be replicated. Then, a reminder that there are currently no proven therapies to halt the disease once it is detected. Next, acknowledgement that some people will prefer not to know what fate awaits them; this test is not for everyone. Finally, a list of ways in which one can respond to bad results, for example by communicating with friends and family, planning one’s financial future, making arrangements for one’s care, or entering a clinical trial.
No one seems willing to name the elephant in the room. This test is good news for those of us who would passionately prefer to die before becoming demented, even to the point where we would end our own lives. Here are the facts. If you are 70 or older and “fail” this test, then you know with 90 percent accuracy that you will become cognitively impaired in the near future. The average time to impairment in this study was 2.1 years. The average life span after showing signs of Alzheimer’s is 10 years.
I believe that preemptive suicide is one reasonable response to impending dementia. Many people will not agree, of course, and there are strong societal taboos against suicide (although public discussions about legalizing assisted suicide for terminally ill people may have opened up some space for thoughtful discussion). To me, it seems quite odd to openly and passionately wish that one would die before one becomes demented (a very common sentiment) and yet recoil from the idea of making that happen.
For people who share my sentiments about preemptive suicide, a major barrier has been pinpointing the right time to act. We don’t want to act too soon, and lose what could have been many good years, but we don’t want to wait until it is too late. Some of the first things to go in dementia are awareness of one’s cognitive decline, and the ability to make a plan and carry it out. Predictive biomarker tests such as this one can be an invaluable tool to address this dilemma.
Once this new test is validated, important policy questions will need to be addressed. Will the test be available only to subjects in clinical trials, to help researchers test the efficacy of new drugs that may prevent or slow the disease? Or should it be available in your physician’s office, like a test for cholesterol or thyroid function? Unfortunately, that will put many patients in an adversarial position with their doctors, who may be afraid of legal liability if they order a test for a patient who expresses an interest in preemptive suicide. Forcing patients to lie to their doctors is demeaning and counterproductive. But why should doctors control this test at all? Why shouldn’t this be more like a pregnancy test, available over the counter? Yes, it requires a blood sample, which presents a few logistical problems, but these are easily overcome. Direct-to-consumer blood testing labs already exist, which will do various tests without a doctor’s order.
As more of these predictive tests are created and validated, we must ask whether people will have access to these biomarkers and the information they produce, or whether misplaced paternalism will prompt healthcare professionals to attempt to monopolize control. Prediction is crucial for dementing diseases because, unlike other diseases such as cancer, a person cannot wait until the disease takes hold to decide to end her life; once the disease holds sway it is already too late to act.
Critics will point out that the test is only 90 percent accurate, and that no one should end her life without complete assurance that the disease is imminent. To these critics I respond that people make many serious decisions with less than 100 percent certainty. Women who test positive for the BRCA gene variants that confer an added risk of breast and ovarian cancer often choose bilateral mastectomies, despite the fact that their risk is not 100 percent and that the mastectomy is not a guarantee against breast cancer. Suicide is irrevocable, but so is Alzheimer’s disease. Once one starts down that path, there is no turning back.
Dena S. Davis, J.D., Ph.D., is the Presidential Endowed Chair in Health and a professor of Religion Studies at Lehigh University.