IEEE Trans. Inf. Theory
Convergence and consistency of regularized boosting with weakly dependent observations
Abstract
This paper studies the statistical convergence and consistency of regularized boosting methods, where the samples need not be independent and identically distributed but can come from stationary weakly dependent sequences. Consistency is proven for the composite classifiers that result from a regularization achieved by restricting the 1-norm of the base classifiers' weights. The less restrictive nature of sampling considered here is manifested in the consistency result through a generalized condition on the growth of the regularization parameter. The weaker the sample dependence, the faster the regularization parameter is allowed to grow with increasing sample size. A consistency result is also provided for data-dependent choices of the regularization parameter.
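The regularization described in the abstract, constraining the 1-norm of the base classifiers' weights by a level that may grow with the sample size, can be sketched as follows. This is an illustrative toy implementation using greedy AdaBoost-style rounds over decision stumps, not the paper's algorithm; the names `boost_l1`, `lam`, and `stump_predict` are invented for this sketch, and the final rescaling onto the 1-norm ball stands in for the paper's constrained optimization.

```python
import numpy as np

def stump_predict(X, j, t, s):
    # Decision stump base classifier: sign of s * (x_j - t), in {-1, +1}.
    return s * np.sign(X[:, j] - t + 1e-12)

def boost_l1(X, y, n_rounds, lam):
    """Toy greedy boosting with labels y in {-1, +1}.

    After boosting, the weight vector alpha is rescaled so that
    ||alpha||_1 <= lam, mimicking the 1-norm regularization in the
    abstract (lam plays the role of the regularization parameter,
    which the paper allows to grow with the sample size).
    """
    n = len(y)
    w = np.full(n, 1.0 / n)          # sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        # Exhaustive search for the stump with lowest weighted error.
        best = None
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                for s in (-1.0, 1.0):
                    pred = stump_predict(X, j, t, s)
                    err = np.sum(w * (pred != y))
                    if best is None or err < best[0]:
                        best = (err, j, t, s)
        err, j, t, s = best
        err = min(max(err, 1e-10), 1 - 1e-10)   # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)   # AdaBoost step size
        pred = stump_predict(X, j, t, s)
        w *= np.exp(-alpha * y * pred)          # reweight samples
        w /= w.sum()
        stumps.append((j, t, s))
        alphas.append(alpha)
    alphas = np.array(alphas)
    # 1-norm regularization: rescale onto the ball {||alpha||_1 <= lam}.
    norm1 = np.abs(alphas).sum()
    if norm1 > lam:
        alphas *= lam / norm1
    return stumps, alphas

def predict(X, stumps, alphas):
    # Composite classifier: sign of the weighted vote of the stumps.
    score = sum(a * stump_predict(X, j, t, s)
                for a, (j, t, s) in zip(alphas, stumps))
    return np.sign(score)
```

Note that rescaling the weights does not change the sign of the composite vote here; in the paper's setting the constraint instead shapes which composite classifiers are reachable, and its growth rate with the sample size is what the dependence condition governs.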