Inexact proximal memoryless quasi-Newton methods based on the Broyden family for minimizing composite functions
DOI: 10.1007/s10589-021-00264-9 · zbMath: 1469.90141 · OpenAlex: W3126432705 · MaRDI QID: Q2028458
Yasushi Narushima, Shummin Nakayama, Hiroshi Yabe
Publication date: 1 June 2021
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-021-00264-9
Keywords: nonsmooth optimization; Broyden family; memoryless quasi-Newton method; global and local convergence properties; inexact proximal method; proximal Newton-type method
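The title and keywords refer to the standard composite-minimization setting addressed by proximal Newton-type methods. As a generic sketch only (the notation g, h, and B_k is illustrative and not taken from the paper):

\[
\min_{x \in \mathbb{R}^n} f(x) := g(x) + h(x), \qquad g \ \text{smooth}, \quad h \ \text{convex, possibly nonsmooth},
\]

with a search direction obtained by approximately ("inexactly") solving the scaled proximal subproblem

\[
d_k \approx \operatorname*{arg\,min}_{d \in \mathbb{R}^n} \ \nabla g(x_k)^{\top} d + \tfrac{1}{2}\, d^{\top} B_k d + h(x_k + d),
\]

where B_k is a quasi-Newton approximation of \(\nabla^2 g(x_k)\), here built from a memoryless Broyden-family update.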
Related Items
- An inexact successive quadratic approximation method for a class of difference-of-convex optimization problems
- Inexact proximal DC Newton-type method for nonconvex composite functions
- Proximal quasi-Newton method for composite optimization over the Stiefel manifold
- An approximate Newton-type proximal method using symmetric rank-one updating formula for minimizing the nonsmooth composite functions
- Proximal gradient/semismooth Newton methods for projection onto a polyhedron via the duality-gap-active-set strategy
- A diagonally scaled Newton-type proximal method for minimization of the models with nonsmooth composite cost functions
- A proximal quasi-Newton method based on memoryless modified symmetric rank-one formula
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Practical inexact proximal quasi-Newton method with global complexity analysis
- Inexact proximal Newton methods for self-concordant functions
- Spectral scaling BFGS method
- Templates for convex cone problems with applications to sparse signal recovery
- Memoryless quasi-Newton methods based on spectral-scaling Broyden family for unconstrained optimization
- A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization
- Forward-backward quasi-Newton methods for nonsmooth optimization problems
- Optimization theory and methods. Nonlinear programming
- A sufficient descent three-term conjugate gradient method via symmetric rank-one update for large-scale optimization
- Proximal Newton-Type Methods for Minimizing Composite Functions
- Linear Convergence of Proximal Gradient Algorithm with Extrapolation for a Class of Nonconvex Nonsmooth Minimization Problems
- Quasi-Newton Algorithms with Updates from the Preconvex Part of Broyden's Family
- A generalized proximal point algorithm for certain non-convex minimization problems
- Conjugate Gradient Methods with Inexact Searches
- Continuous Characterizations of the Maximum Clique Problem
- Sparse Reconstruction by Separable Approximation
- Forward-Backward Envelope for the Sum of Two Nonconvex Functions: Further Properties and Nonmonotone Linesearch Algorithms
- First-Order Methods in Optimization
- Regression Shrinkage and Selection via The Lasso: A Retrospective
- A memoryless symmetric rank-one method with sufficient descent property for unconstrained optimization
- On measure functions for the self-scaling updating formulae for quasi-Newton methods
- Massive data discrimination via linear support vector machines
- On Quasi-Newton Forward-Backward Splitting: Proximal Calculus and Convergence
- Model Selection and Estimation in Regression with Grouped Variables
- A Formulation of Variable Metric Methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles