Publication
NeurIPS 2023
Conference paper
An Alternating Optimization Method for Bilevel Problems under the Polyak-Łojasiewicz Condition
Abstract
Bilevel optimization has recently regained interest owing to its applications in emerging machine learning fields such as hyperparameter optimization, meta-learning, and reinforcement learning. Recent results have shown that simple alternating (implicit) gradient-based algorithms can achieve the same convergence rate as single-level gradient descent (GD) for bilevel problems with a strongly convex lower-level objective. However, it remains unclear whether this result can be generalized to bilevel problems beyond this basic setting. In this paper, we propose a Generalized ALternating mEthod for bilevel opTimization (GALET) for bilevel problems with a nonconvex lower-level objective that satisfies the Polyak-Łojasiewicz (PL) condition. We first introduce a stationarity metric for the considered bilevel problems, which generalizes the existing metric. We then establish that GALET achieves an $\epsilon$-stationary metric for the considered problem within $\widetilde{\mathcal{O}}(\epsilon^{-1})$ iterations, which matches the iteration complexity of GD for single-level smooth nonconvex problems.
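To make the alternating scheme concrete, the following is a minimal sketch of alternating implicit-gradient descent on a toy bilevel problem whose strongly convex lower level automatically satisfies the PL condition. It is not the paper's GALET (which targets nonconvex PL lower levels and a generalized stationarity metric); the problem data (A, y_target), the helper names, and the step-size choices are illustrative assumptions.

import numpy as np

# Toy bilevel problem with a strongly convex (hence PL) lower level:
#   min_x  f(x, y*(x)) = 0.5 * ||y*(x) - y_target||^2
#   s.t.   y*(x) = argmin_y g(x, y),  g(x, y) = 0.5 * ||A @ y - x||^2
rng = np.random.default_rng(0)
p, d = 8, 5
A = rng.standard_normal((p, d))        # full column rank with probability 1
y_target = rng.standard_normal(d)

def grad_y_g(x, y):
    return A.T @ (A @ y - x)           # lower-level gradient in y

def grad_y_f(x, y):
    return y - y_target                # upper-level gradient in y (grad_x f = 0 here)

def hypergrad(x, y):
    # Implicit-function-theorem hypergradient evaluated at the current (x, y):
    #   df/dx = grad_x f - grad_xy g @ (grad_yy g)^{-1} @ grad_y f,
    # where grad_yy g = A.T @ A and grad_xy g = -A for this choice of g.
    v = np.linalg.solve(A.T @ A, grad_y_f(x, y))
    return A @ v

evals = np.linalg.eigvalsh(A.T @ A)    # ascending eigenvalues of the lower-level Hessian
beta = 1.0 / evals[-1]                 # inner step ~ 1/L of the lower level
alpha = 0.5 * evals[0]                 # conservative outer step for this toy problem

x, y = np.zeros(p), np.zeros(d)
for t in range(500):
    for _ in range(5):                 # alternate: a few lower-level GD steps...
        y -= beta * grad_y_g(x, y)
    x -= alpha * hypergrad(x, y)       # ...then one implicit-gradient step on x

print("upper-level loss:", 0.5 * np.linalg.norm(y - y_target) ** 2)

The key design choice mirrored from the alternating template is that the upper-level update reuses the current inner iterate y instead of solving the lower-level problem to optimality, so each outer iteration costs only a handful of gradient evaluations plus one linear-system (adjoint) solve.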