Publication
IJCNN 2013
Conference paper
Multi-sensory integration using sparse spatio-temporal encoding
Abstract
The external world consists of objects that stimulate multiple sensory pathways simultaneously, such as the auditory, visual and tactile pathways. Our brains receive and process these sensory streams to arrive at a coherent internal representation of the world. Though much attention has been paid to these streams individually, their integration is comparatively less well understood. In this paper we propose the principle of sparse spatio-temporal encoding as a foundation on which to build a framework for multi-sensory integration. We derive the dynamics that govern a network of oscillatory units that achieves phase synchronization and is capable of binding related attributes of objects. We simulate objects that produce simultaneous visual and auditory input activations, and demonstrate that our system can bind features across both of these sensory modalities. We examine the effect of varying a tuning function that governs the ability of the units to synchronize, and show that broadening this function reduces the ability of the network to disambiguate mixtures of objects. Thus, our model offers the potential to study brain disorders such as autism, which may arise from a disruption of synchronization. © 2013 IEEE.
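The abstract's central mechanism is a network of oscillatory units whose phases synchronize to bind related features. The paper's specific dynamics are not given here, but the general idea can be illustrated with a classical Kuramoto-style model, in which each unit's phase is pulled toward its neighbors by a coupling term; the coupling strength `K` plays a role loosely analogous to the tuning function mentioned above (all names and parameters below are illustrative, not taken from the paper):

```python
import numpy as np

def kuramoto_step(phases, omega, K, dt=0.01):
    """One Euler step of the Kuramoto phase dynamics:
    d(theta_i)/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)."""
    n = len(phases)
    diffs = phases[None, :] - phases[:, None]      # theta_j - theta_i for all pairs
    coupling = (K / n) * np.sin(diffs).sum(axis=1)  # mean-field pull toward neighbors
    return (phases + dt * (omega + coupling)) % (2 * np.pi)

def order_parameter(phases):
    """Magnitude of the mean phase vector: ~1 = synchronized, ~0 = incoherent."""
    return abs(np.exp(1j * phases).mean())

rng = np.random.default_rng(0)
n = 50
phases = rng.uniform(0, 2 * np.pi, n)   # random initial phases
omega = rng.normal(1.0, 0.05, n)        # near-identical natural frequencies
K = 2.0                                 # coupling strength (strong enough to synchronize)

for _ in range(5000):
    phases = kuramoto_step(phases, omega, K)

print(order_parameter(phases))
```

With strong coupling the order parameter climbs toward 1, i.e. the units phase-lock; weakening the coupling (analogous to broadening the tuning function in the paper) lets the frequency spread win and synchronization degrades.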