Amadou Ba, Fearghal O'Donncha, et al.
INFORMS 2023
We provide a unified convergence analysis for a class of shuffling-type gradient methods for solving the finite-sum minimization problem commonly used in machine learning. This class covers several variants, including randomized reshuffling, single shuffling, and cyclic/incremental gradient schemes. We consider two settings: strongly convex and non-convex problems. Our main contribution is a set of new non-asymptotic and asymptotic convergence rates for this general class of methods in both settings. While our rate for the non-convex case is new, the rate in the strongly convex case matches (up to a constant) the best-known results; unlike existing work in this direction, however, we rely only on standard assumptions such as smoothness and strong convexity.
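Since the abstract only names the algorithm class, the following is a minimal sketch of a shuffling-type gradient method for the finite-sum problem min_w F(w) = (1/n) * sum_i f_i(w). The function name, the grad_i(i, w) interface, and the scheme labels are illustrative assumptions, not the paper's actual code.

```python
import numpy as np

def shuffling_gradient_method(grad_i, w0, n, lr=0.01, epochs=10, scheme="rr", seed=0):
    """Minimize F(w) = (1/n) * sum_i f_i(w) with a shuffling-type gradient method.

    grad_i(i, w) returns the gradient of the i-th component f_i at w
    (a hypothetical interface chosen for this sketch).
    scheme selects a variant named in the abstract:
      "rr" -- randomized reshuffling: a fresh permutation every epoch
      "ss" -- single shuffling: one random permutation, reused every epoch
      "ig" -- incremental/cyclic: the fixed order 0, 1, ..., n-1
    """
    rng = np.random.default_rng(seed)
    w = np.array(w0, dtype=float)
    fixed_perm = rng.permutation(n)  # used only by the "ss" scheme
    for _ in range(epochs):
        if scheme == "rr":
            order = rng.permutation(n)
        elif scheme == "ss":
            order = fixed_perm
        else:  # "ig"
            order = np.arange(n)
        for i in order:  # one pass over all n components per epoch
            w -= lr * grad_i(i, w)
    return w

# Usage on a toy least-squares problem, f_i(w) = 0.5 * (x_i @ w - y_i)**2:
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
y = X @ np.ones(5)
g = lambda i, w: (X[i] @ w - y[i]) * X[i]
w_hat = shuffling_gradient_method(g, np.zeros(5), n=200, lr=0.05, epochs=30)
```

All three variants share the same epoch-wise update and differ only in how the pass order is drawn, which is what makes a unified analysis of the class natural.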
Dhaval Patel, Dzung Phan, et al.
ICDE 2022
Robert Baseman
TechConnect 2024
Shubhi Asthana, Pawan Chowdhary, et al.
INFORMS 2020