Publication
ACM COLT 1998
Conference paper
Efficient learning of monotone concepts via quadratic optimization
Abstract
We consider a non-parametric regression model in which the regression function to be estimated is coordinate-wise monotone. Such functions can model, for example, the dependence of an option price on the strike price, the duration, and the price of the underlying asset. We propose a simple algorithm based on quadratic optimization, which estimates the regression function in polynomial time. Numerical results are provided, showing that the algorithm is quite robust to possible discontinuities of the regression function. Our main theoretical result shows that the expected VC-entropy of the space of coordinate-wise monotone functions grows subexponentially. As a result, the estimating function constructed by our algorithm converges to the true regression function as the number of observations goes to infinity, and the estimation procedure is consistent.
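The paper's algorithm handles the general multivariate case via quadratic optimization. As an illustrative sketch (not the paper's method), the one-dimensional special case of the underlying quadratic program, minimizing the sum of squared residuals subject to a monotonicity constraint f_1 ≤ f_2 ≤ … ≤ f_n, is classical isotonic regression, which the pool-adjacent-violators algorithm solves exactly:

```python
def isotonic_regression(y):
    """Solve min sum_i (f_i - y_i)^2 subject to f_1 <= ... <= f_n
    exactly, via the pool-adjacent-violators algorithm (PAVA).

    Each block stores (sum, count, mean); whenever a new point violates
    monotonicity, adjacent blocks are merged and replaced by their mean.
    """
    blocks = []  # list of [sum, count, mean]
    for v in y:
        blocks.append([v, 1, v])
        # Merge backwards while the monotone constraint is violated.
        while len(blocks) > 1 and blocks[-2][2] > blocks[-1][2]:
            s2, c2, _ = blocks.pop()
            s1, c1, _ = blocks.pop()
            s, c = s1 + s2, c1 + c2
            blocks.append([s, c, s / c])
    # Expand blocks back into a fitted value per observation.
    fit = []
    for s, c, m in blocks:
        fit.extend([m] * c)
    return fit


if __name__ == "__main__":
    print(isotonic_regression([3, 1, 2, 4]))
```

In higher dimensions the monotone constraints form a partial order over the observation points rather than a chain, and the resulting quadratic program is what the paper solves in polynomial time.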