Publication
AAAI 2017
Conference paper
From semantic models to cognitive buildings
Abstract
Today's operation of buildings is based either on simple dashboards that do not scale to thousands of sensor data streams or on rules that provide only very limited fault information. In either case, considerable manual effort is required for diagnosing building operation problems related to energy usage or occupant comfort. We present a Cognitive Building demo that uses (i) semantic reasoning to model physical relationships of sensors and systems, (ii) machine learning to predict and detect anomalies in energy flow, occupancy and user comfort, and (iii) speech-enabled Augmented Reality interfaces for immersive interaction with thousands of devices. Our demo analyzes data from more than 3,300 sensors and shows how we can automatically diagnose building operation problems.
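The paper does not include code here; as a rough illustration of the kind of anomaly detection mentioned in (ii), the sketch below flags outliers in a single sensor stream using a rolling z-score. The function name, window size, and threshold are illustrative assumptions, and the method is a simple stand-in rather than the authors' actual approach.

```python
# Illustrative sketch only -- not the method from the paper.
# Flags readings whose rolling z-score exceeds a threshold,
# a minimal stand-in for anomaly detection on a building sensor stream.
import numpy as np

def detect_anomalies(readings, window=96, threshold=3.0):
    """Return indices of readings that deviate strongly from the
    recent rolling mean (window and threshold are assumed values)."""
    readings = np.asarray(readings, dtype=float)
    anomalies = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mu, sigma = history.mean(), history.std()
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Example: simulated 15-minute energy readings with one injected spike.
rng = np.random.default_rng(0)
stream = rng.normal(50.0, 2.0, size=200)
stream[150] = 80.0  # simulated fault
print(detect_anomalies(stream))  # -> [150]
```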