Bilevel Optimization with a Lower-level Contraction: Optimal Sample Complexity without Warm-start
Publication: 6390408
arXiv: 2202.03397
MaRDI QID: Q6390408
Author name not available
Publication date: 7 February 2022
Abstract: We analyse a general class of bilevel problems, in which the upper-level problem consists in the minimization of a smooth objective function and the lower-level problem is to find the fixed point of a smooth contraction map. This class of problems includes instances of meta-learning, equilibrium models, hyperparameter optimization and data poisoning adversarial attacks. Several recent works have proposed algorithms which warm-start the lower-level problem, i.e. they use the previous lower-level approximate solution as a starting point for the lower-level solver. This warm-start procedure allows one to improve the sample complexity in both the stochastic and deterministic settings, achieving in some cases the order-wise optimal sample complexity. However, there are situations, e.g. meta-learning and equilibrium models, in which the warm-start procedure is not well-suited or ineffective. In this work we show that, without warm-start, it is still possible to achieve order-wise (near) optimal sample complexity. In particular, we propose a simple method which uses (stochastic) fixed-point iterations at the lower level and projected inexact gradient descent at the upper level, and which reaches an ε-stationary point with order-wise (near) optimal sample complexity in both the stochastic and the deterministic setting. Finally, compared to methods using warm-start, our approach yields a simpler analysis that does not need to study the coupled interactions between the upper-level and lower-level iterates.
Has companion code repository: https://github.com/prolearner/hypertorch
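Below is a minimal NumPy sketch of the deterministic version of the scheme described in the abstract: fixed-point iterations at the lower level (started from scratch, i.e. without warm-start) and projected inexact gradient descent at the upper level, with the hypergradient approximated by adjoint fixed-point iterations. The ridge-regression instance, step sizes and iteration counts are illustrative assumptions; this is not code from the paper or from the hypertorch repository.

# Bilevel sketch: lower-level contraction + projected inexact hypergradient descent.
# Illustrative toy problem (ridge regression hyperparameter tuning), not the paper's code.
import numpy as np

rng = np.random.default_rng(0)

# Training and validation data for a toy ridge-regression lower-level problem.
A, b = rng.standard_normal((40, 10)), rng.standard_normal(40)   # training set
C, d = rng.standard_normal((20, 10)), rng.standard_normal(20)   # validation set

eta = 0.005         # lower-level step size (small enough to make phi a contraction)
alpha = 0.05        # upper-level step size
K, J = 200, 200     # lower-level / adjoint fixed-point iteration counts
lam_max = 10.0      # upper-level constraint: lam in [0, lam_max]


def phi(w, lam):
    # Lower-level contraction map: one gradient step on the ridge objective
    # 0.5*||A w - b||^2 + 0.5*lam*||w||^2.
    return w - eta * (A.T @ (A @ w - b) + lam * w)


def hypergradient(lam):
    # Approximate gradient of the validation loss f(lam) = 0.5*||C w*(lam) - d||^2.
    # 1) Lower level: fixed-point iterations started from zero (no warm-start).
    w = np.zeros(A.shape[1])
    for _ in range(K):
        w = phi(w, lam)

    # 2) Adjoint fixed-point iterations: q approximates (I - dphi/dw)^{-T} grad_w f.
    grad_w_f = C.T @ (C @ w - d)
    dphi_dw = np.eye(A.shape[1]) - eta * (A.T @ A + lam * np.eye(A.shape[1]))
    q = np.zeros_like(w)
    for _ in range(J):
        q = dphi_dw.T @ q + grad_w_f

    # 3) Implicit-function-theorem hypergradient: (dphi/dlam)^T q, with
    #    dphi/dlam = -eta * w for this ridge example.
    val_loss = 0.5 * np.sum((C @ w - d) ** 2)
    return -eta * (w @ q), val_loss


# Upper level: projected inexact gradient descent on lam, constrained to [0, lam_max].
lam = 1.0
for t in range(100):
    g, val_loss = hypergradient(lam)
    lam = np.clip(lam - alpha * g, 0.0, lam_max)
    if t % 20 == 0:
        print(f"iter {t:3d}  lam = {lam:.4f}  val loss = {val_loss:.4f}")

Because the lower-level map is a contraction, both the solution iterations and the adjoint iterations converge linearly, which is what makes restarting the lower level from scratch (no warm-start) viable in this sketch.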