Our model first extracts behavioral features from gait, speech, and drawing data in a comprehensive fashion (Fig. A). For example, from gait data of normal walking, it extracts gait features in five domains: pace (e.g., gait speed), rhythm (e.g., step time), variability (e.g., step length variability), left-right asymmetry (e.g., the difference between left and right step times), and postural control (e.g., mediolateral fluctuation). Similarly, speech data are characterized by acoustic, prosodic, and linguistic features, and drawing data are characterized by kinematic, writing-pressure, and time-related features. In our study, we showed that all three behavioral modalities had features that significantly differentiated AD or MCI patients from CN participants. For example, patients with AD or MCI walk with lower gait speed, shorter step length, longer step time, greater variability, greater left-right asymmetry, and greater mediolateral fluctuation (Fig. B).
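To make the five gait domains concrete, here is a minimal sketch of how one feature per domain could be computed. This is not the paper's actual pipeline: the function name and inputs (per-step timings and lengths for each foot, plus a mediolateral acceleration trace from a wearable sensor) are hypothetical, and it assumes step events have already been segmented.

```python
import numpy as np

def gait_features(step_times_l, step_times_r, step_lengths,
                  distance_m, duration_s, ml_accel):
    """One illustrative feature per gait domain (all inputs hypothetical):
    left/right step times in seconds, step lengths in meters, total walked
    distance and duration, and a mediolateral acceleration trace."""
    step_times = np.concatenate([step_times_l, step_times_r])
    return {
        # Pace: average walking speed over the trial
        "gait_speed": distance_m / duration_s,
        # Rhythm: mean step time
        "mean_step_time": float(np.mean(step_times)),
        # Variability: coefficient of variation of step length
        "step_length_cv": float(np.std(step_lengths) / np.mean(step_lengths)),
        # Left-right asymmetry: difference between mean left and right step times
        "step_time_asymmetry": float(abs(np.mean(step_times_l) - np.mean(step_times_r))),
        # Postural control: magnitude of mediolateral fluctuation
        "ml_fluctuation": float(np.std(ml_accel)),
    }
```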
By effectively combining these behavioral features, our model classifies individuals into three diagnostic categories: AD, MCI, and CN. Models using two modalities achieved better classification performance than those using a single modality, and combining all three behavioral modalities consistently achieved more accurate classifications than combining any two (Fig. C). Our model improved accuracy by 11.1% compared with models based on previous studies using a single behavioral modality, and it achieved 93% accuracy for classifying AD, MCI, and CN (AuROC of 0.98). We also showed that the improvement in classification performance from combining multimodal behaviors results from their complementary nature with respect to the cognitive and clinical measures used in the diagnosis of AD and MCI. Regression analyses showed that the multimodal behavioral data of gait, speech, and drawing have different and complementary associations with cognitive impairments (Fig. D).
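As an illustration of this kind of multimodal combination, the sketch below fuses per-modality feature matrices by simple concatenation and cross-validates a three-class classifier, reporting accuracy and macro-averaged AuROC. The fusion strategy, classifier choice, and the `evaluate_fusion` function are assumptions for illustration only, not the model reported in the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def evaluate_fusion(X_gait, X_speech, X_drawing, y):
    """Feature-level fusion: concatenate per-modality feature matrices
    (rows = participants) and cross-validate a three-class classifier,
    with y in {0: CN, 1: MCI, 2: AD}."""
    X = np.hstack([X_gait, X_speech, X_drawing])  # simple feature-level fusion
    model = make_pipeline(StandardScaler(),
                          RandomForestClassifier(n_estimators=300, random_state=0))
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    proba = cross_val_predict(model, X, y, cv=cv, method="predict_proba")
    accuracy = accuracy_score(y, proba.argmax(axis=1))
    # Macro-averaged one-vs-rest AuROC across the three classes
    auroc = roc_auc_score(y, proba, multi_class="ovr", average="macro")
    return accuracy, auroc
```

Dropping one of the three matrices from the `np.hstack` call gives the corresponding two-modality model, which is how single-, two-, and three-modality combinations can be compared under the same evaluation protocol.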
Our results have important implications for developing screening tools in both clinical and free-living settings. First, in terms of clinical implications, these behavioral data can be easily acquired in routine clinical practice: speech and drawing data can be collected during standard cognitive tests, and a gait test can be readily incorporated into clinical workflows. Thus, our results suggest that combining such multimodal behavioral data can help clinicians accurately detect patients with AD and MCI. It could also serve as an easy-to-perform screening tool to select individuals who should be further examined with biomarkers in clinical practice and in clinical treatment trials.
Second, our study illustrates the potential value of daily behavioral data for screening for AD and MCI. Compared with other approaches using computerized assessment tools, such as digitized cognitive tests, our approach focusing on behavioral data may help promote future efforts toward continuous and passive monitoring tools for the early detection of AD from data collected in everyday life. Further, we showed that our model's performance was significantly higher than the baseline AuROC of 0.86 obtained using scores on the most standard cognitive test (the Mini-Mental State Examination, MMSE). Thus, our approach of combining multimodal daily behavioral data may help reliably identify AD and MCI as an easy-to-perform screening tool.
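For reference, a score-based baseline AuROC of this kind can be computed directly from test scores. The sketch below does so for a simplified binary screen (AD or MCI vs. CN); the function name and inputs are hypothetical, and this is not the exact baseline computation used in the study, which evaluated three diagnostic classes.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def mmse_baseline_auroc(mmse_scores, y_impaired):
    """Baseline AuROC for a simplified binary screen, where y_impaired
    is 1 for AD/MCI and 0 for CN. Lower MMSE scores indicate greater
    impairment, so the negated score serves as the ranking statistic."""
    return roc_auc_score(y_impaired, -np.asarray(mmse_scores))

# Toy usage: mmse_baseline_auroc([29, 28, 30, 24, 22, 26], [0, 0, 0, 1, 1, 1])
```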
There are two main exciting directions for extending the scope of application of our approach. The first is to investigate whether combining multiple behavioral modalities can improve the estimation of neuropathological changes. Because validated diagnostic biomarkers, such as those based on cerebrospinal fluid or PET imaging, are either invasive or expensive, there is an urgent need for non-invasive and inexpensive screening tools that allow early detection of neuropathological changes. The second is to investigate whether our multimodal behavioral approach can reliably identify AD and MCI from behavioral data collected in free-living situations. This would allow regular and timely screening and help lower the barrier to early detection of AD.
For more details, please see the original paper.[1] This work is part of our long-term joint effort with universities toward early detection of AD and other cognitive, physical, or mental problems through analysis of daily behaviors. Other results so far include detecting AD by analyzing conversational content during phone calls,[2] detecting mental fatigue by analyzing eye movements while watching videos,[3] and analyzing conversational speech with AI chatbots to detect MCI,[4] assess loneliness,[5] and predict the risk of future driving accidents.[6]
[1]: Yamada Y, Shinkawa K, Kobayashi M, Caggiano V, Nemoto M, Nemoto K, et al. Combining Multimodal Behavioral Data of Gait, Speech, and Drawing for Classification of Alzheimer’s Disease and Mild Cognitive Impairment. J Alzheimers Dis. 2021 Jan 1;84(1):315–327.
[2]: Yamada Y, Shinkawa K, Shimmei K. Atypical Repetition in Daily Conversation on Different Days for Detecting Alzheimer Disease: Evaluation of Phone-Call Data From Regular Monitoring Service. JMIR Ment Health. 2020 Jan 12;7(1).
[3]: Yamada Y, Kobayashi M. Detecting mental fatigue from eye-tracking data gathered while watching video: Evaluation in younger and older adults. Artif Intell Med. 2018 Sep 18;91:39–48.
[4]: Yamada Y, Shinkawa K, Kobayashi M, Nishimura M, Nemoto M, Tsukada E, et al. Tablet-Based Automatic Assessment for Early Detection of Alzheimer’s Disease Using Speech Responses to Daily Life Questions. Front Digit Health. 2021 Mar 17;3:653904.
[5]: Yamada Y, Shinkawa K, Nemoto M, Arai T. Automatic Assessment of Loneliness in Older Adults Using Speech Analysis on Responses to Daily Life Questions. Front Psychiatry. 2021 Dec;12:712251.
[6]: Yamada Y, Shinkawa K, Kobayashi M, Takagi H, Nemoto M, Nemoto K, et al. Using Speech Data From Interactions With a Voice Assistant to Predict the Risk of Future Accidents for Older Drivers: Prospective Cohort Study. J Med Internet Res. 2021 Apr 8;23(4).