Paper · Systems & Control Letters
Learning low-complexity autoregressive models via proximal alternating minimization
Abstract
We consider the estimation of the state transition matrix in vector autoregressive models when time-sequence data is limited but non-sequence steady-state data is abundant. To leverage both sources of data, we formulate a least squares minimization problem regularized by a Lyapunov penalty. We impose cardinality or rank constraints to reduce the complexity of the autoregressive model. The resulting nonconvex, nonsmooth problem is solved using the proximal alternating linearized minimization (PALM) method. We prove that PALM converges globally to a critical point and that the estimation error decreases monotonically. Explicit formulas are obtained for the proximal operators to facilitate the implementation of PALM. We demonstrate the effectiveness of the developed method through numerical experiments.
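The abstract does not spell out the objective, the penalty, or the proximal operators, so the following Python sketch is an illustration only: it applies a proximal-gradient iteration (the single-block special case of PALM) to a sparsity-constrained least-squares fit with a discrete-time Lyapunov-style penalty. The objective form, the step-size heuristic, and all names (`palm_sparse_var`, `hard_threshold`, `P`, `W`, `gamma`, `s`) are assumptions made for this sketch, not the paper's formulation.

```python
import numpy as np

def hard_threshold(A, s):
    """Proximal map of the cardinality constraint ||A||_0 <= s:
    keep the s largest-magnitude entries of A, zero the rest."""
    out = np.zeros_like(A)
    if s > 0:
        keep = np.argsort(np.abs(A), axis=None)[-s:]
        out.flat[keep] = A.flat[keep]
    return out

def palm_sparse_var(X, Y, P, W, gamma=1.0, s=20, iters=500):
    """Proximal-gradient sketch (assumed formulation) for
        min_A  0.5*||Y - A X||_F^2 + 0.5*gamma*||A P A' - P + W||_F^2
        s.t.   ||A||_0 <= s,
    where X, Y hold lagged/current snapshots, P is a steady-state
    covariance estimate, and W a noise covariance (both symmetric)."""
    n = X.shape[0]
    A = np.zeros((n, n))
    # Heuristic step size: the Lyapunov term is quartic in A, so no
    # global Lipschitz constant exists; a line search would be safer.
    L = np.linalg.norm(X @ X.T, 2) + 4.0 * gamma * np.linalg.norm(P, 2) ** 2
    step = 1.0 / L
    for _ in range(iters):
        R = A @ P @ A.T - P + W                      # Lyapunov residual
        grad = (A @ X - Y) @ X.T + 2.0 * gamma * R @ A @ P
        A = hard_threshold(A - step * grad, s)       # prox = hard thresholding
    return A

# Toy usage on synthetic data (all values placeholders):
rng = np.random.default_rng(0)
n, T = 10, 50
A_true = 0.3 * hard_threshold(rng.standard_normal((n, n)), 30)
X = rng.standard_normal((n, T))
Y = A_true @ X + 0.01 * rng.standard_normal((n, T))
P = np.eye(n)
W = P - A_true @ P @ A_true.T   # consistent with P = A P A' + W
A_hat = palm_sparse_var(X, Y, P, W, gamma=0.1, s=30)
```

Here the hard-thresholding step plays the role of the proximal operator for the cardinality constraint; under a rank constraint, the analogous operator would be truncation of the singular value decomposition. The paper's explicit proximal formulas may differ.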