Publication
NeurIPS 2024
Conference paper

Shuffling Gradient-Based Methods for Nonconvex-Concave Minimax Optimization

Abstract

This paper develops novel shuffling gradient-based methods for two classes of minimax problems: nonconvex-linear and nonconvex-strongly concave settings. The first algorithm addresses the nonconvex-linear minimax setting and achieves the state-of-the-art oracle complexity typically observed in nonconvex optimization. It also employs a new shuffling estimator for the "hyper-gradient," departing from standard shuffling techniques in optimization. The second method consists of two variants, a semi-shuffling and a full-shuffling scheme, which tackle the nonconvex-strongly concave minimax setting. We establish their oracle complexity bounds under standard assumptions, which, to the best of our knowledge, are the first for this specific setting. Numerical examples demonstrate the performance of our algorithms and compare them with two other methods. The results indicate that the new methods achieve performance comparable to SGD, supporting the potential of incorporating shuffling strategies into minimax algorithms.
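To make the shuffling idea concrete, below is a minimal Python sketch of a full-shuffling gradient descent-ascent loop on a toy minimax objective that is strongly concave in the maximization variable. The objective, step sizes, and the plain descent-ascent update are illustrative assumptions for exposition only; they do not reproduce the paper's algorithms, its hyper-gradient estimator, or its complexity guarantees.

import numpy as np

rng = np.random.default_rng(0)
n, dx, dy = 32, 5, 3                       # number of components and variable dimensions (assumed)
A = rng.normal(size=(n, dx, dx))           # per-component data for the x-part (assumed)
b = rng.normal(size=(n, dx))
C = rng.normal(size=(n, dy, dx))           # coupling matrices between x and y (assumed)
mu = 1.0                                   # strong-concavity modulus in y (assumed)

def grads(i, x, y):
    # Gradients of the toy component f_i(x, y) = 0.5*||A_i x - b_i||^2 + y^T C_i x - 0.5*mu*||y||^2
    gx = A[i].T @ (A[i] @ x - b[i]) + C[i].T @ y
    gy = C[i] @ x - mu * y
    return gx, gy

x, y = np.zeros(dx), np.zeros(dy)
eta_x, eta_y = 1e-3, 1e-2                  # step sizes (assumed, not tuned)
for epoch in range(50):
    perm = rng.permutation(n)              # draw a fresh permutation each epoch ("full shuffling")
    for i in perm:                         # one pass over components without replacement
        gx, gy = grads(i, x, y)
        x = x - eta_x * gx                 # descent step on the minimization variable
        y = y + eta_y * gy                 # ascent step on the maximization variable

avg_gx = sum(grads(i, x, y)[0] for i in range(n)) / n
print("norm of full gradient in x after training:", np.linalg.norm(avg_gx))

The only difference from a plain stochastic loop is that each epoch sweeps the components in a freshly permuted order without replacement, which is the defining feature of shuffling-based methods.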
