Publication
ICPR 2008
Conference paper
A least square kernel machine with box constraints
Abstract
In this paper, we present a least square kernel machine with box constraints (LSKMBC). Existing least square machines assume Gaussian hyperpriors and consequently express the optima of the regularized squared loss as a set of linear equations. The generalized LASSO framework deviates from the assumption of Gaussian hyperpriors and employs a more general Huber loss function. In our approach, we consider uniform priors and obtain the loss functional for a given margin, which is treated as a model selection parameter. The framework not only differs from existing least square kernel machines but also does not require the kernel to satisfy the Mercer condition. Experimentally, we validate the performance of the classifier and show that it outperforms SVM and LSSVM on certain real-life datasets. © 2008 IEEE.
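To make the contrast in the abstract concrete, the sketch below is an illustrative toy only, not the authors' LSKMBC algorithm: it compares the unconstrained linear-system solve used by least-square-style machines with a box-constrained variant of the same squared loss, solved here by projected gradient descent. The data, the kernel, the regularizer `lam`, and the box bounds [-1, 1] are all synthetic assumptions chosen for illustration.

```python
import numpy as np

# Synthetic two-class data (placeholder, not from the paper).
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.sign(X[:, 0] + X[:, 1])

# RBF similarity matrix; note the approach in the abstract does not
# require this matrix to satisfy the Mercer condition.
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-sq)

lam = 1e-2                      # illustrative regularization strength
A = K + lam * np.eye(len(y))

# 1) Unconstrained optimum of the regularized squared loss,
#    expressed as a set of linear equations (LSSVM-style).
alpha_lin = np.linalg.solve(A, y)

# 2) Box-constrained squared loss: minimize ||A a - y||^2 subject to
#    -1 <= a <= 1, via projected gradient descent with clipping.
alpha = np.zeros(len(y))
step = 1.0 / np.linalg.norm(A, 2) ** 2   # safe step from spectral norm
for _ in range(2000):
    grad = A.T @ (A @ alpha - y)
    alpha = np.clip(alpha - step * grad, -1.0, 1.0)
```

The clipping step keeps every coefficient inside the box at each iteration, which is the qualitative difference from the unconstrained solve above; the actual LSKMBC formulation in the paper derives its constrained loss from uniform priors rather than this generic projection scheme.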