Differentiating Nonsmooth Solutions to Parametric Monotone Inclusion Problems
From MaRDI portal
Publication:6136656
DOI: 10.1137/22M1541630 · zbMath: 1530.49026 · arXiv: 2212.07844 · OpenAlex: W4311599212 · MaRDI QID: Q6136656
Edouard Pauwels, Jérôme Bolte, Antonio Silveti-Falls
Publication date: 17 January 2024
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2212.07844
Keywords: maximal monotone operator; Clarke subdifferential; generalized gradient; generalized equation; monotone inclusion; implicit differentiation; conservative field; differentiating solutions
MSC classifications: Sensitivity, stability, well-posedness (49K40); Nonsmooth analysis (49J52); Set-valued and variational analysis (49J53); Optimality conditions for minimax problems (49K35)
Cites Work
- An inertial forward-backward-forward primal-dual splitting algorithm for solving monotone inclusion problems
- Inertial Douglas-Rachford splitting for monotone inclusion problems
- On the ergodic convergence rates of a first-order primal-dual algorithm
- A primal-dual splitting method for convex optimization involving Lipschitzian, proximable and linear composite terms
- Necessary and sufficient optimality conditions for mathematical programs with equilibrium constraints
- Finding best approximation pairs relative to two closed convex sets in Hilbert spaces
- Fifty years of maximal monotonicity
- Proto-differentiability of set-valued mappings and its applications in optimization
- Sensitivity analysis for nonsmooth generalized equations
- Automatic differentiation of iterative processes
- Error bounds in mathematical programming
- Sensitivity analysis of generalized equations
- From error bounds to the complexity of first-order descent methods for convex functions
- A first-order primal-dual algorithm for convex problems with applications to imaging
- Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning
- Perturbation techniques for convergence analysis of proximal gradient method and other first-order algorithms via variational analysis
- Sensitivity analysis of maximally monotone inclusions via the proto-differentiability of the resolvent operator
- The degrees of freedom of partly smooth regularizers
- Clarke generalized Jacobian of the projection onto the cone of positive semidefinite matrices
- On the maximal monotonicity of subdifferential mappings
- Convergence of the forward-backward algorithm: beyond the worst-case with the help of geometry
- Bilevel Optimization with Nonsmooth Lower Level Problems
- Clarke Subgradients of Stratifiable Functions
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- Strongly Regular Generalized Equations
- Generalized equations and their solutions, Part I: Basic theory
- Tangent Cones, Generalized Gradients and Mathematical Programming in Banach Spaces
- Sensitivity Analysis of Solutions to Generalized Equations
- Variational Analysis
- Lagrangian Duality and Related Multiplier Methods for Variational Inequality Problems
- Sensitivity Analysis for Mirror-Stratifiable Convex Functions
- Sparsifying Transform Learning With Efficient Optimal Updates and Convergence Guarantees
- First-Order Methods in Optimization
- Solving monotone inclusions via compositions of nonexpansive averaged operators
- Generalized Hessian Properties of Regularized Nonsmooth Functions
- A Primal-Dual Algorithm with Line Search for General Convex-Concave Saddle Point Problems
- Convergence of a Piggyback-Style Method for the Differentiation of Solutions of Standard Saddle-Point Problems
- Fixed Point Strategies in Data Science
- A Forward-Backward Splitting Method for Monotone Inclusions Without Cocoercivity
- The Strong Second-Order Sufficient Condition and Constraint Nondegeneracy in Nonlinear Semidefinite Programming and Their Implications
- Signal Recovery by Proximal Forward-Backward Splitting
- Local linear convergence analysis of Primal–Dual splitting methods
- Learning Consistent Discretizations of the Total Variation
- Learning Maximally Monotone Operators for Image Recovery
- Accelerated Bregman Primal-Dual Methods Applied to Optimal Transport and Wasserstein Barycenter Problems
- Convex analysis and monotone operator theory in Hilbert spaces
- LASSO Reloaded: A Variational Analysis Perspective with Applications to Compressed Sensing