Analysis of the Frank-Wolfe method for convex composite optimization involving a logarithmically-homogeneous barrier
From MaRDI portal
Publication: 6038640
DOI: 10.1007/s10107-022-01820-9
zbMath: 1517.90108
arXiv: 2010.08999
OpenAlex: W3170692334
MaRDI QID: Q6038640
Publication date: 2 May 2023
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/2010.08999
Keywords: complexity analysis; barrier; Frank-Wolfe method; self-concordance; composite optimization; logarithmic homogeneity
Cites Work
- Nonlinear total variation based noise removal algorithms
- Gradient methods for minimizing composite functions
- Conditional gradient algorithms for norm-regularized smooth convex optimization
- Asymptotic optimality and asymptotic equipartition properties of log-optimum investment
- Conditional gradient algorithms with open loop step size rules
- Proximal minimization algorithm with \(D\)-functions
- Introductory lectures on convex optimization. A basic course.
- Conditional gradient type methods for composite nonlinear and stochastic optimization
- Complexity bounds for primal-dual methods minimizing the model of objective function
- Generalized stochastic Frank-Wolfe algorithm with stochastic "substitute" gradient for structured convex optimization
- Generalized self-concordant analysis of Frank-Wolfe algorithms
- A Mathematical View of Interior-Point Methods in Convex Optimization
- The Ordered Subsets Mirror Descent Optimization Method with Applications to Tomography
- CVXPY: A Python-Embedded Modeling Language for Convex Optimization
- Convex Optimization in Normed Spaces
- Duality Between Subgradient and Conditional Gradient Methods
- An Extended Frank-Wolfe Method with "In-Face" Directions, and Its Application to Low-Rank Matrix Completion
- The Equivalence of Two Extremum Problems
- An algorithm for maximizing expected log investment return
- Updating the Inverse of a Matrix
- Convergence Rates for Conditional Gradient Sequences Generated by Implicit Step Length Rules
- Rates of Convergence for Conditional Gradient Algorithms Near Singular and Nonsingular Extremals
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications
- Nonlinear Proximal Point Algorithms Using Bregman Functions, with Applications to Convex Programming
- Rounding of Polytopes in the Real Number Model of Computation
- Interior Proximal and Multiplier Methods Based on Second Order Homogeneous Kernels
- Computation of Minimum-Volume Covering Ellipsoids
- This is SPIRAL-TAP: Sparse Poisson Intensity Reconstruction ALgorithms—Theory and Practice
- Composite Self-Concordant Minimization
- Optimal and Efficient Designs of Experiments
- A Tight Upper Bound on the Rate of Convergence of Frank-Wolfe Algorithm
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- On the Minimum Volume Covering Ellipsoid of Ellipsoids
- Inexact model: a framework for optimization and variational inequalities
- New analysis and results for the Frank-Wolfe method