Dual Space Preconditioning for Gradient Descent
Publication: 5857297
DOI: 10.1137/19M130858X
zbMath: 1462.90091
arXiv: 1902.02257
OpenAlex: W3151987817
MaRDI QID: Q5857297
Arnaud Doucet, Daniel Paulin, Chris J. Maddison, Yee Whye Teh
Publication date: 31 March 2021
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1902.02257
Keywords: convex optimization; first-order method; exponential penalty function; relative smoothness; nonlinear preconditioning; \(p\)-norm regression
Mathematics Subject Classification: Analysis of algorithms and problem complexity (68Q25); Numerical mathematical programming methods (65K05); Convex programming (90C25); Numerical methods involving duality (49M29)
Related Items (1)
Dualities for Non-Euclidean Smoothness and Strong Convexity under the Light of Generalized Conjugacy
Cites Work
- Strong convergence of an explicit numerical method for SDEs with nonglobally Lipschitz continuous coefficients
- On the convergence of the exponential multiplier method for convex programming
- Lectures on convex optimization
- A note on tamed Euler approximations
- Stable exponential-penalty algorithm with superlinear convergence
- Asymptotic analysis of the exponential penalty trajectory in linear programming
- Nonlinear rescaling and proximal-like methods in convex optimization
- A simplified view of first order methods for optimization
- A fast dual proximal gradient algorithm for convex minimization and applications
- Mirror descent and nonlinear projected subgradient methods for convex optimization
- Preconditioning techniques for large linear systems: A survey
- Implementable tensor methods in unconstrained convex optimization
- Forward-backward splitting with Bregman distances
- Nonlinear Preconditioning: How to Use a Nonlinear Schwarz Method to Precondition Newton's Method
- A General Framework for the Analysis of Sets of Constraints
- Applications of a Splitting Algorithm to Decomposition in Convex Programming and Variational Inequalities
- Asymptotic Analysis for Penalty and Barrier Methods in Convex and Linear Programming
- First Order Methods Beyond Convexity and Lipschitz Gradient Continuity with Applications to Quadratic Inverse Problems
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- Harder, Better, Faster, Stronger Convergence Rates for Least-Squares Regression
- Nonlinearly Preconditioned Inexact Newton Algorithms
- Faster p-norm minimizing flows, via smoothed q-norm problems
- An homotopy method for ℓp regression provably beyond self-concordance and in input-sparsity time
- Iterative Refinement for ℓp-norm Regression
- Preconditioning
- Online Learning Meets Optimization in the Dual
- Convex Analysis
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- Methods of conjugate gradients for solving linear systems