Stochastic composition optimization of functions without Lipschitz continuous gradient
From MaRDI portal
Publication:6108982
DOI: 10.1007/s10957-023-02180-w
arXiv: 2207.09364
OpenAlex: W4323833499
MaRDI QID: Q6108982
Publication date: 26 July 2023
Published in: Journal of Optimization Theory and Applications (Search for Journal in Brave)
Full work available at URL: https://arxiv.org/abs/2207.09364
Cites Work
- Stochastic compositional gradient descent: algorithms for minimizing compositions of expected-value functions
- Projected subgradient methods with non-Euclidean distances for non-differentiable convex minimization and variational inequalities
- Algorithms and applications for approximate nonnegative matrix factorization
- Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems
- A simplified view of first order methods for optimization
- Mirror descent and nonlinear projected subgradient methods for convex optimization
- Analysis of biased stochastic gradient descent using sequential semidefinite programs
- Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
- Fastest rates for stochastic mirror descent methods
- Stochastic variance-reduced prox-linear algorithms for nonconvex composite optimization
- Optimal complexity and certification of Bregman first-order methods
- On linear convergence of non-Euclidean gradient methods without strong convexity and Lipschitz gradient continuity
- Statistical estimation of composite risk functionals and risk optimization problems
- The Ordered Subsets Mirror Descent Optimization Method with Applications to Tomography
- Convergence Analysis of a Proximal-Like Minimization Algorithm Using Bregman Functions
- Multigrid Methods and Sparse-Grid Collocation Techniques for Parabolic Optimal Control Problems with Random Coefficients
- Robust Stochastic Approximation Approach to Stochastic Programming
- First Order Methods Beyond Convexity and Lipschitz Gradient Continuity with Applications to Quadratic Inverse Problems
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- Multilevel Stochastic Gradient Methods for Nested Composition Optimization
- Mean-Variance Risk-Averse Optimal Control of Systems Governed by PDEs with Random Parameter Fields Using Quadratic Approximations
- MultiLevel Composite Stochastic Optimization via Nested Variance Reduction
- A Stochastic Subgradient Method for Nonsmooth Nonconvex Multilevel Composition Optimization
- Convex-Concave Backtracking for Inertial Bregman Proximal Gradient Algorithms in Nonconvex Optimization
- Stochastic Multilevel Composition Optimization Algorithms with Level-Independent Convergence Rates
- A Single Timescale Stochastic Approximation Method for Nested Stochastic Optimization
- Learning the parts of objects by non-negative matrix factorization
- Sample Average Approximation Method for Compound Stochastic Optimization Problems
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- A Bregman Forward-Backward Linesearch Algorithm for Nonconvex Composite Optimization: Superlinear Convergence to Nonisolated Local Minima
- Solving Stochastic Compositional Optimization is Nearly as Easy as Solving Stochastic Optimization
- Independent Component Analysis and Blind Signal Separation
- Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization