by Craig Klugman, Ph.D.

According to an article in Nature Medicine, a new blood test appears to be accurate for predicting whether an individual is likely to develop Alzheimer’s disease. Although not yet available for public use, the technique may offer a faster, cheaper, easier, and less invasive method of diagnosis. The researchers looked at fats present in the blood of seniors in the subject pool who were diagnosed with Alzheimer’s disease. Five years earlier, this group, along with the others in the study, had a baseline blood draw performed. As it turns out, the levels of 10 lipids in the blood of Alzheimer’s patients differ from those of individuals with even mild cognitive impairment. The researchers report a predictive accuracy of about 90%.

More research is needed on this test in a much larger sample pool, and it would need to receive FDA approval before it could become a regular screening tool. Current screening techniques for Alzheimer’s disease include MRI and PET scans, which are expensive and inconvenient, as well as spinal taps, which are invasive and uncomfortable. Definitive diagnosis is available only via autopsy, when telltale signs of the disease can be seen in the brain. Other methods are also being investigated, such as testing for genes associated with the disease, like APOE-4.

Assuming that one of these tests becomes available and approved by the FDA, consider that interpreting the results can be tricky. Just because these fats or this gene is associated with the disease does not mean the tests are 100% accurate. You could live your life waiting to be stricken and never come down with the disease. Nor does the lack of these indicators mean that you have zero chance of developing it. A 90% accuracy rate can mean that 10% of cases lack the lipids (or that 10% of people with the lipids never develop Alzheimer’s).
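To make that arithmetic concrete, here is a minimal sketch (my own illustration, not from the Nature Medicine article) of what a “90% accurate” test can look like when used for screening. The 10% prevalence figure and the equal 90% sensitivity and specificity are assumptions chosen only for illustration.

```python
# Hypothetical illustration of "90% accuracy" in a screened population.
# Assumed numbers: 90% sensitivity, 90% specificity, 10% of the screened
# group will actually go on to develop the disease.

population = 10_000          # hypothetical people screened
prevalence = 0.10            # assumed share who will develop Alzheimer's
sensitivity = 0.90           # share of future patients the test flags
specificity = 0.90           # share of unaffected people the test clears

will_develop = population * prevalence
will_not = population - will_develop

true_positives = will_develop * sensitivity        # correctly flagged
false_negatives = will_develop - true_positives    # future patients the test misses
false_positives = will_not * (1 - specificity)     # flagged but never develop the disease
true_negatives = will_not - false_positives

# Of everyone who tests positive, how many actually go on to develop the disease?
ppv = true_positives / (true_positives + false_positives)

print(f"Flagged as likely to develop Alzheimer's: {true_positives + false_positives:.0f}")
print(f"Of those, actually at risk: {true_positives:.0f} ({ppv:.0%})")
print(f"Future patients the test misses: {false_negatives:.0f}")
```

Under those assumed numbers, roughly half of the people the test flags would never develop the disease, and about 100 future patients would be told they are in the clear. That is the kind of uncertainty a person would have to live with after getting a result.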

Such tests could potentially be useful in two ways. The first is providing a definitive diagnosis to people who are already displaying symptoms. The second is informing people that they are at high risk, years before they may have their first symptom.

At what age should a person be tested? In part, the science will answer this question, because it depends on when the lipids appear in the body. But a genetic test would be available from conception. A couple could end a pregnancy when the fetus carries the suspect gene, or choose not to use an embryo created through assisted reproductive technology. Or the couple may choose to test their 10-year-old child: Should a child be forced to carry the burden of knowing his or her disease future?

The question in my mind is whether a person would want to know that they are likely to get the disease. On the one hand, you can prepare for it, take steps to try to delay it (though none is scientifically proven), and make arrangements for your well-being. You can make sure you get through your bucket list earlier. You can take supplements or seek nontraditional interventions, or you may enroll in clinical trials aimed at preventing the disease. You may have children at a younger age so that you can be there as they grow up.

If family members had the disease, knowing that you don’t have it could be a huge relief.

On the other hand, you may walk around waiting for the sword of Damocles to drop. You forgot your keys this morning—is that a sign of Alzheimer’s disease? You forgot your aunt’s birthday—is that a sign of Alzheimer’s disease? Your life could be one filled with anxiety.

If family members had the disease, knowing that you have it might cause you to take action to make sure you never live with it, such as ending your own life before the age at which family members were diagnosed. You may choose not to have children so that you do not pass on the disease.

This debate is similar to the one faced by families with Huntington’s disease. Since the early 1990s, a screening test has been available that shows whether a person carries the genetic mutation that causes Huntington’s. This disease has no cure or preventive treatment. According to the HD Foundation, most people choose not to have the screening because of the emotional toll of a positive finding and the potential risks to medical confidentiality if others found out. Estimates hold that only between 5% and 7% of people at risk have been tested. The HD Foundation also suggests not testing minors unless the minor shows symptoms of the disease.

The decision whether to be tested before developing the disease is a deeply personal one. For me, I would rather not, because the burden would outweigh any benefit. Then again, I do like surprises. However, as a diagnostic tool once symptoms develop, this test could be invaluable, saving time, money, and discomfort. The future is here. As more screening tests are developed, these sorts of questions will proliferate.

What will you choose?
