Publication
IEEE TBioCAS
Paper
Multimodality sensor system for long-term sleep quality monitoring
Abstract
Sleep monitoring is an important issue that has drawn considerable attention in medicine and healthcare. Traditional approaches such as polysomnography are usually costly and often require subjects to stay overnight at clinics, so there has been a need for a low-cost system suitable for long-term sleep monitoring. In this paper, we propose a system that uses low-cost multimodality sensors, such as video, passive infrared, and heart-rate sensors, for sleep monitoring. We apply machine learning methods to automatically infer a person's sleep state, in particular to differentiate sleep and wake states. This information is useful for estimating sleep latency, efficiency, and duration, which are important for long-term monitoring of sleep quality in healthy individuals and in those diagnosed with a sleep-related disorder. Our experiments show that the proposed approach offers reasonable performance compared to an existing standard approach (i.e., actigraphy), and that multimodality data fusion can improve the robustness and accuracy of sleep state detection. © 2007 IEEE.
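To illustrate the general idea of multimodality data fusion for sleep/wake classification, the sketch below combines per-epoch features from three sensor streams (video motion, passive-infrared activity, heart rate) into a single feature vector and trains an off-the-shelf classifier. This is a minimal, hypothetical example on synthetic data, not the authors' published pipeline; the feature definitions, epoch length, and random-forest classifier are assumptions made only for illustration.

```python
# Hypothetical sketch: feature-level fusion of multimodal sensor epochs
# (video motion, PIR activity, heart rate) for binary sleep/wake
# classification. Synthetic data; illustrative assumptions only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def synthetic_epochs(n=2000):
    """Generate toy 30-second epochs: wake epochs show more motion
    and a higher heart rate than sleep epochs (purely synthetic)."""
    labels = rng.integers(0, 2, size=n)                      # 0 = sleep, 1 = wake
    video_motion = rng.normal(0.2 + 0.6 * labels, 0.2, size=n)
    pir_activity = rng.normal(0.1 + 0.5 * labels, 0.2, size=n)
    heart_rate   = rng.normal(58 + 12 * labels, 5.0, size=n)
    # Feature-level fusion: concatenate the per-modality features.
    X = np.column_stack([video_motion, pir_activity, heart_rate])
    return X, labels

X, y = synthetic_epochs()
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("sleep/wake accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In practice, a real system would replace the synthetic generator with features extracted from the actual sensor streams and would evaluate against a reference such as actigraphy or polysomnography, as described in the abstract.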