Publication
CNSM 2014
Conference paper
Mining temporal lag from fluctuating events for correlation and root cause analysis
Abstract
The importance of mining the time lags of hidden temporal dependencies from sequential data is highlighted in many domains, including system management, stock market analysis, climate monitoring, and more. Mining the time lags of temporal dependencies provides useful insights for understanding sequential data and predicting its evolving trend. Traditional methods mainly use a predefined time window to analyze sequential items, or employ statistical techniques to identify temporal dependencies in the data. However, it is challenging for existing methods to find the time lags of temporal dependencies in the real world, where time lags are fluctuating, noisy, and tend to be interleaved with each other. This paper introduces a parametric model to describe noisy time lags. An efficient expectation maximization approach is then proposed to find the time lag with maximum likelihood. This paper also contributes an approximation method for learning time lags that improves scalability without incurring a significant loss of accuracy. Extensive experiments on both synthetic and real data sets demonstrate the effectiveness and efficiency of the proposed methods.
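To give a rough flavor of the idea, the sketch below fits a fluctuating time lag between two event series with an EM-style loop. It is a minimal illustration under assumed simplifications, not the paper's exact algorithm: the lag is modeled as a single Gaussian N(mu, var), and every pairing of an A event with a B event is treated as a candidate match whose responsibility is proportional to the Gaussian likelihood of its lag.

```python
import math

def fit_lag_em(a_times, b_times, mu=0.0, var=1.0, iters=200):
    """Estimate the mean and variance of the time lag between event
    series A and B via an EM-style loop.

    Hypothetical simplification: each A event is assumed to trigger
    some B event after a Gaussian-distributed lag; which B event it
    triggered is the hidden variable."""
    for _ in range(iters):
        num_mu = num_var = denom = 0.0
        for a in a_times:
            # E-step: responsibility of each candidate B event for this
            # A event, proportional to the Gaussian density of the lag.
            w = [math.exp(-(b - a - mu) ** 2 / (2.0 * var)) for b in b_times]
            s = sum(w) or 1.0  # guard against all-zero underflow
            for b, wj in zip(b_times, w):
                r = wj / s
                lag = b - a
                num_mu += r * lag
                num_var += r * (lag - mu) ** 2
                denom += r
        # M-step: responsibility-weighted mean and variance of the lags.
        mu = num_mu / denom
        var = max(num_var / denom, 1e-9)
    return mu, var
```

On a toy input where B events trail A events by roughly 5 time units with small jitter, e.g. `fit_lag_em([0.0, 10.0, 20.0], [5.1, 15.0, 24.9])`, the loop concentrates responsibility on the correct pairings and the estimated mean lag settles near 5. The approximation method mentioned in the abstract would additionally prune unlikely pairings to avoid the quadratic number of candidate matches considered here.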