Publication
NeurIPS 2023
Workshop paper
Stochastic FISTA Step Search Algorithm for Convex Optimization
Abstract
In this paper, we propose an accelerated stochastic step search algorithm which combines an accelerated method with a fully adaptive step size parameter for convex problems (Scheinberg et al., 2014) with the stochastic step search analysis of Paquette and Scheinberg (2020). Under appropriate conditions on the accuracy of the gradient and function value estimates, our algorithm achieves an expected iteration complexity of $\mathcal{O}(\epsilon^{-1/2})$ to reach an $\epsilon$-accurate solution, i.e., one satisfying $\mathbb{E}[f(x_T) - f^*] \le \epsilon$. This complexity matches the iteration complexity of the deterministic Nesterov accelerated and FISTA algorithms (Nesterov, 1983; Beck and Teboulle, 2009). This paper continues the line of work on stochastic adaptive algorithms studied in (Berahas et al., 2021; Blanchet et al., 2019; Paquette and Scheinberg, 2020) and is the first to develop an accelerated gradient descent type algorithm in this domain.
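To make the idea concrete, below is a minimal illustrative sketch of a FISTA-style loop with a backtracking step search driven by stochastic gradient and function estimates. It is not the authors' exact algorithm: the parameter names (`alpha0`, `gamma`), the sufficient-decrease test, and the step size growth rule are assumptions made for illustration only.

```python
import numpy as np

def stochastic_fista_step_search(grad_est, func_est, x0, alpha0=1.0,
                                 gamma=0.5, max_iter=100):
    """Illustrative sketch (assumed details) of an accelerated step search loop
    that uses stochastic gradient/function estimates, in the spirit of the paper;
    not the authors' exact method."""
    x = np.asarray(x0, dtype=float).copy()
    y = x.copy()
    t, alpha = 1.0, alpha0
    for _ in range(max_iter):
        g = grad_est(y)                        # stochastic gradient estimate at the extrapolated point
        while True:
            x_trial = y - alpha * g            # candidate gradient step
            # sufficient-decrease test with stochastic function estimates (assumed form)
            if func_est(x_trial) <= func_est(y) - 0.5 * alpha * float(np.dot(g, g)):
                break
            alpha *= gamma                     # step rejected: shrink the step size and retry
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_trial + ((t - 1.0) / t_next) * (x_trial - x)   # FISTA momentum/extrapolation
        x, t = x_trial, t_next
        alpha /= gamma                         # allow the step size to grow again (adaptive)
    return x
```

In this sketch, the inner `while` loop plays the role of the step search: a trial step is accepted only if the (possibly noisy) function estimates certify sufficient decrease, which is how an adaptive step size parameter can replace knowledge of the Lipschitz constant.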