Publication
ICASSP 2019
Conference paper
A Bayesian Attention Neural Network Layer for Speaker Recognition
Abstract
Neural-network-based attention modeling has found utility in areas such as visual analysis, speech recognition and, more recently, speaker recognition. Attention represents a gating (or weighting) function on information and governs how the corresponding statistics are accumulated. In the context of speaker recognition, attention can be incorporated as a frame-weighted mean of an information stream. These weights can be made to sum to one (the standard approach) or be calculated in other ways. If the weights can be made to represent event observation probabilities, we can extend the approach to a Bayesian framework. More specifically, we combine prior information with the frame-weighted statistics to produce an adapted, or posterior, estimate of the mean. We evaluate the proposed method on NIST data.
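The posterior mean estimate described above can be sketched as a MAP-style adaptation: the attention weights act as event-observation counts, and the data-driven weighted statistics are interpolated with a prior mean. The function below is an illustrative sketch under that reading, not the paper's implementation; the relevance factor `r` and all names are assumptions.

```python
import numpy as np

def bayesian_attention_pool(frames, weights, prior_mean, r=1.0):
    """Illustrative MAP-style adapted mean with attention weights.

    frames:     (T, D) sequence of frame-level features
    weights:    (T,) nonnegative attention weights, read here as
                event-observation probabilities (need not sum to one)
    prior_mean: (D,) prior estimate of the mean
    r:          assumed relevance factor controlling the prior's pull
    """
    n = weights.sum()                # effective number of observed frames
    weighted_sum = weights @ frames  # (D,) first-order weighted statistic
    # Posterior (adapted) mean: interpolates between the frame-weighted
    # mean and the prior; more observed mass shifts it toward the data.
    return (weighted_sum + r * prior_mean) / (n + r)
```

With `r` near zero this reduces to the standard attention-weighted mean; a large `r` keeps the estimate near the prior when little frame evidence is accumulated.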