A NONMONOTONE ADMM-BASED DIAGONAL QUASI-NEWTON UPDATE WITH APPLICATION TO THE COMPRESSIVE SENSING PROBLEM
DOI: 10.3846/mma.2023.16993 · zbMath: 1529.90084 · OpenAlex: W4387811480 · MaRDI QID: Q6142230
Zohre Aminifard, Saman Babaie-Kafaki
Publication date: 25 January 2024
Published in: Mathematical Modelling and Analysis
Full work available at URL: https://doi.org/10.3846/mma.2023.16993
Keywords: large-scale optimization; simulated annealing; nonmonotone line search; smoothing technique; compressive sensing; quasi-Newton update; ADMM strategy
MSC: Methods of quasi-Newton type (90C53); Approximation methods and heuristics in mathematical programming (90C59); Image processing (compression, reconstruction, etc.) in information and communication theory (94A08)
Cites Work
- Smooth minimization of non-smooth functions
- A particle swarm-BFGS algorithm for nonlinear programming problems
- Scaling on diagonal quasi-Newton update for large-scale unconstrained optimization
- Smoothing methods for nonsmooth, nonconvex minimization
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- Multivariate spectral gradient method for unconstrained optimization
- A new structured quasi-Newton algorithm using partial information on Hessian
- A new gradient method via quasi-Cauchy relation which guarantees descent
- Robust regression: Asymptotics, conjectures and Monte Carlo
- Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
- Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model
- Diagonal approximation of the Hessian by finite differences for unconstrained optimization
- A modified descent Polak-Ribière-Polyak conjugate gradient method with global convergence property for nonconvex functions
- New quasi-Newton methods for unconstrained optimization problems
- Optimization theory and methods. Nonlinear programming
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- Sizing and Least-Change Secant Methods
- Algorithm 851
- From Sparse Solutions of Systems of Equations to Sparse Modeling of Signals and Images
- Two-Point Step Size Gradient Methods
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Self-Scaling Variable Metric (SSVM) Algorithms
- Optimal conditioning of self-scaling variable Metric algorithms
- A diagonal quasi-Newton updating method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- The Quasi-Cauchy Relation and Diagonal Updating
- A Nonmonotone Line Search Technique for Newton’s Method
- A Simulated Annealing-Based Barzilai–Borwein Gradient Method for Unconstrained Optimization Problems
- CUTEr and SifDec
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Benchmarking optimization software with performance profiles.
- A survey of quasi-Newton equations and quasi-Newton methods for optimization