Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation
Publication: 5116548
DOI: 10.1137/19M1281368
zbMath: 1448.90070
arXiv: 1709.05191
MaRDI QID: Q5116548
Etienne de Klerk, Adrien B. Taylor, François Glineur
Publication date: 18 August 2020
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1709.05191
Keywords: semidefinite programming, gradient method, interior point methods, inexact search directions, performance estimation problems
MSC classification: Semidefinite programming (90C22); Abstract computational complexity for mathematical programming problems (90C60); Nonconvex programming, global optimization (90C26)
Related Items
- A frequency-domain analysis of inexact gradient methods
- On the Properties of Convex Functions over Open Sets
- Complexity Analysis of a Sampling-Based Interior Point Method for Convex Optimization
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization
- A note on the optimal convergence rate of descent methods with fixed step sizes for smooth strongly convex functions
- The exact worst-case convergence rate of the gradient method with fixed step lengths for \(L\)-smooth functions
- Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
- Conditions for linear convergence of the gradient method for non-convex optimization
- A Systematic Approach to Lyapunov Analyses of Continuous-Time Models in Convex Optimization
- Branch-and-bound performance estimation programming: a unified methodology for constructing optimal optimization methods
- Optimal step length for the maximal decrease of a self-concordant function by the Newton method
- Principled analyses and design of first-order methods with inexact proximal operators
- Analysis of optimization algorithms via sum-of-squares
- Optimal step length for the Newton method: case of self-concordant functions
Uses Software
Cites Work
- Optimized first-order methods for smooth convex minimization
- An extension theorem for convex functions of class \(C^{1,1}\) on Hilbert spaces
- An optimal variant of Kelley's cutting-plane method
- First-order methods of smooth convex optimization with inexact oracle
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- Inexact proximal Newton methods for self-concordant functions
- Lectures on convex optimization
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
- Efficient first-order methods for convex minimization: a constructive approach
- Performance of first-order methods for smooth convex minimization: a novel approach
- A Mathematical View of Interior-Point Methods in Convex Optimization
- Efficient Monte Carlo Procedures for Generating Points Uniformly Distributed over Bounded Regions
- Smooth Optimization with Approximate Gradient
- The geometry of logconcave functions and sampling algorithms
- Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints
- Quantitative estimates of the convergence of the empirical covariance matrix in log-concave ensembles
- A relationship between the second derivatives of a convex function and of its conjugate
- Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
- Simulated Annealing for Convex Optimization
- Convergence of methods of feasible directions in extremal problems