October 11, 2022 | Screenings for heart-related conditions have woven their way into routine health care, like the blood pressure checks that take place at many medical appointments. So why isn’t there a quick and easy method to regularly evaluate brain health?
It’s a question some are looking to address with the help of artificial intelligence (AI) and machine learning-based approaches. Linus Health and Cognetivity Neurosciences have developed iPad-delivered tests to proactively screen for signs of cognitive impairment and detect disorders like dementia early on.
Meanwhile, Beacon Biosignals is applying machine learning to EEG data to identify people with Alzheimer’s who could be at risk of accelerated disease progression.
Emerging iPad-Administered Cognitive Tests
This summer, Boston-based Linus Health announced the launch of its brain health platform. It combines a brain function assessment with a digital questionnaire, delivering results in under 10 minutes that are meant to offer a window into both current cognitive status and future risk of dementia.
The technology expands on the company’s DCTclock, an iPad-based version of the paper-based Clock Drawing Test whose AI examines a person’s clock drawings as well as their drawing process. The platform’s first portion is a neuropsychological assessment, explained David Bates, CEO of Linus Health, which gathers valuable information while the test is taken.
“We’re capturing voice features; we’re capturing speech; we’re capturing drawing, behavior, planning, motion—all these kinds of things,” he told Diagnostics World.
Rather than focusing solely on test scores, this type of process-based approach emphasizes other aspects, like how a patient arrives at an answer, which can yield useful insights and help distinguish between different forms of cognitive disease.
“Research over the last several decades employing an analysis of process and errors has been able to dissociate between dementia patients diagnosed with Alzheimer’s disease, vascular dementia associated with MRI-determined white matter alterations, and Parkinson’s disease,” he explained, in addition to distinguishing between subtypes of mild cognitive impairment.
However, a process-based approach is also labor intensive and relies on specialists. “There’s just simply not enough specialists in the world, and there never will be,” Bates said, and this is where AI comes into play.
One of the key ideas behind the technology is that early detection and intervention are critical. “In neurodegenerative diseases, we know the disease starts over a decade before the symptoms present,” Bates said. Detecting disease before symptoms appear enables possibilities like the promotion of brain resilience or the start of treatments.
It can also help patients and their family members make important decisions. “People want to be able to decide on their life and plan for their future even in the face of disease,” he noted. “And importantly, something can be done.”
The traditional process seen in today’s health care environment can be drawn out: A problem emerges, a patient is sent to a specialist, and an appointment is scheduled—sometimes many months down the road. Following that wait time and the eventual assessment, the primary care doctor can then begin carrying out the care plan. “We shortcut that year-long plus process in just a couple of minutes,” he said.
The Linus Health platform uses insights gained from the testing to offer personalized action plans intended to help primary care providers and their patients harness evidence-based interventions. These Brain Health Action Plans are based on each person’s unique response to the Life and Health Questionnaire.
The questions combine non-modifiable factors (like age, gender, and medical history) with modifiable factors—both of which influence brain health over the long haul. Non-modifiable factors need to be monitored because they can trigger specific diagnostic pathways, according to Bates, and modifiable factors are crucial because they represent an opportunity that can be acted upon.
One of the challenges, though, is that there are so many modifiable factors when it comes to brain health. So the questionnaire has clustered these factors (such as nutrition, exercise, and social connectivity) around eight “pillars” of brain health. The inputs are run through the company’s model, and an action plan highlights that person’s top four areas to focus on.
While the assessment might reveal that some people need to focus on nutrition, for instance, others might need to work on clarifying their purpose in life. Even if two people receive results indicating the same areas in need of attention, the action plans will likely differ in approach.
“I may want to start slow, one at a time, and you may want to tackle more areas simultaneously. You may feel that it matters more to you to start with nutrition, and I may feel like I want to prioritize my mental health more actively,” Bates explained.
“And even if we both felt compelled to engage in more cognitively stimulating activities immediately, for example, I might find it’s a great time to volunteer for the PTA at my son’s school, and you may pick up the guitar that has been in the attic since high school,” he added.
Cognetivity Neurosciences, based in Canada and the UK, is another company pursuing the potential of an iPad-administered AI brain health test. It has developed a CE-IVD marked, FDA-approved technology called the Integrated Cognitive Assessment (CognICA) that is designed to detect early signs of cognitive impairment.
According to Co-Founder and Chief Scientific Officer Seyed Khaligh-Razavi, it is “a quick, five-minute test that is primarily designed with the idea of measuring the speed of processing information in the brain, as opposed to looking solely at the memory.” The analogy they use is that instead of focusing on the hard drive of the brain, they measure its CPU performance.
To take the test, patients react to 100 images that appear on an iPad in rapid succession. Along the way, they simply select whether each image includes an animal or not. In some images, an animal’s head or body is plainly visible. In others, the animals are more challenging to spot.
The test is based on the notion that the human response to animals has been fine-tuned over time. “This idea of animal versus non-animal discrimination is a very simple idea,” said Khaligh-Razavi. Despite its simplicity, it has deep roots in how we have evolved over millions of years.
Humans (and more generally, primates) react strongly to animal images. “Our studies—and prior to that, previous studies—have shown in the literature that our brain is responsive to images that contain animals or not,” he explained. This processing happens automatically and engages brain areas that are affected in the early stages of Alzheimer's.
The ICA test is designed to be independent of culture and language, said Khaligh-Razavi, and so far it has been used in both primary and secondary care settings.
In primary care, it has been used mainly for referral but also for monitoring individuals with mild cognitive impairment. In secondary care settings, it has been used as an aid for diagnosis. The technology has been tested in a variety of disorders but primarily in patients with mild cognitive impairment and mild Alzheimer's disease, he added.
In a study published last year in Frontiers in Psychiatry (DOI: 10.3389/fpsyt.2021.706695), Cognetivity explored the use of its ICA with 230 participants, including 95 healthy individuals, 80 people with mild cognitive impairment, and 55 people with Alzheimer’s disease. The AI model detected cognitive impairment with an AUC of 81 percent for mild cognitive impairment patients and 88 percent for mild Alzheimer’s disease.
“Our findings suggest that this 5-min test can identify broad cognitive impairments across different stages of impairment,” lead author and Cognetivity Chief Medical Officer Chris Kalafatis and colleagues concluded. The test can support clinicians by helping diagnose mild cognitive impairment and Alzheimer’s disease, they added, and “is appropriate for large-scale screening of cognitive impairment.”
Applying New Technology to an Old One
Boston-based startup Beacon Biosignals is applying machine learning to a much older medical tool: the electroencephalogram (EEG). The test has long been used to detect abnormalities in brain waves using electrodes applied to the scalp. “EEG is really an ancient technology,” Medical Director Jay Pathmanathan told Diagnostics World.
It was first developed in the 1920s and has long been in clinical use. For most of that time, EEG has been interpreted by humans by looking at squiggly lines and trying to decipher the meaning hidden within an enormous amount of data. Just a 20-minute recording results in 100 pages of activity, he said, and each page has 20 channels of data.
Because of these constraints, he said, EEG’s utility was generally limited to significant abnormalities or severe conditions such as strokes, seizures, or tumors. And with the advent of imaging, its use narrowed even further. So why might EEG be helpful for a condition like Alzheimer’s disease or dementia?
When proteins such as beta-amyloid and tau accumulate, the result can be neuronal cell death and abnormal electrical activity. This activity, measurable by EEG as epileptiform discharges, can spark a cycle that causes accelerated cell death and deterioration.
In other words, brain cells communicate with each other using electrical signals, and EEG captures this activity. “It’s not able to see individual brain cells, but it can get the aggregate information in large areas of the brain,” Pathmanathan said. “And that electrical communication is going to be disturbed in any condition that causes any sort of damage or disruption to brain activity.”
The problem is that these changes in communication are subtle enough that they are too challenging to identify by visual inspection, so Beacon applies machine learning to analyze EEG in ways that were not possible in the past. The technology is trained to recognize patterns in data that humans may not be capable of spotting.
The company has recently been evaluating a similar but more specific notion. “Over the last five years, there’s been an incredible explosion of interest in using EEG in identifying patients with Alzheimer’s disease who may have a faster rate of decline or a more severe disease condition,” he explained.
Damage in the temporal lobe of the brain can cause EEG abnormalities that are visually identifiable by humans. “But it’s really hard to spot those patterns,” he said, so Beacon has trained a machine learning algorithm to do just that. Whereas a human might take hours to review 20 minutes of EEG and find that pattern, a machine can accomplish that same task in seconds.
At this summer’s Alzheimer’s Association International Conference, Pathmanathan presented research highlighting the ability to identify people with a more aggressive form of Alzheimer’s disease. The research looked at 90 people with an Alzheimer’s diagnosis and 39 with mild cognitive impairment. All EEGs were analyzed using the Beacon platform and by human reviewers.
The main finding was that machine learning could “identify abnormalities that have been shown to predict the rate of decline in Alzheimer’s disease,” he said, noting that the technology could detect those patterns far faster and much more efficiently than humans. Looking ahead, the company envisions the automated analysis of EEG data as a promising option for stratifying patients, tracking disease progression, and improving patient selection for clinical trials.
Paul Nicolaus is a freelance writer specializing in science, nature, and health. Learn more at www.nicolauswriting.com.