Complementary composite minimization, small gradients in general norms, and applications
Publication:6634528
DOI: 10.1007/s10107-023-02040-5
MaRDI QID: Q6634528
Cristóbal Guzmán, Jelena Diakonikolas
Publication date: 7 November 2024
Published in: Mathematical Programming. Series A. Series B
Mathematics Subject Classification:
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Large-scale problems in mathematical programming (90C06)
Cites Work
- Unnamed Item
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Gradient methods for minimizing composite functions
- First-order methods of smooth convex optimization with inexact oracle
- Fast first-order methods for composite convex optimization with backtracking
- On lower complexity bounds for large-scale smooth convex optimization
- Universal gradient methods for convex optimization problems
- On general minimax theorems
- On optimality of Krylov's information when solving linear operator equations
- Information-based complexity of linear operator equations
- Sharp uniform convexity and smoothness inequalities for trace norms
- Asymptotic analysis of the exponential penalty trajectory in linear programming
- Universal method for stochastic composite optimization problems
- The convex geometry of linear inverse problems
- A first-order primal-dual algorithm for convex problems with applications to imaging
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- Nonsmooth optimization using Taylor-like models: error bounds, convergence, and termination criteria
- On linear convergence of non-Euclidean gradient methods without strong convexity and Lipschitz gradient continuity
- Mirror Prox algorithm for multi-term composite minimization and semi-separable problems
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\) (with discussions and rejoinder)
- Deterministic and stochastic primal-dual subgradient algorithms for uniformly convex minimization
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Uniformly convex functions on Banach spaces
- Optimal methods of smooth convex minimization
- An unconstrained convex programming view of linear programming
- First-Order Methods in Optimization
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- The Approximate Duality Gap Technique: A Unified Theory of First-Order Methods
- DOI: 10.1162/153244302760200704
- Lower Bounds for Parallel and Randomized Convex Optimization
- Faster \(p\)-norm minimizing flows, via smoothed \(q\)-norm problems
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
- An homotopy method for \(\ell_p\) regression provably beyond self-concordance and in input-sparsity time
- Regularization and Variable Selection Via the Elastic Net
- Optimal Affine-Invariant Smooth Minimization Algorithms
- On first-order algorithms for \(\ell_1\)/nuclear norm minimization
- Convex Analysis
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- Accelerated Bregman Primal-Dual Methods Applied to Optimal Transport and Wasserstein Barycenter Problems
- On uniformly convex functions