Proximal Newton-Type Methods for Minimizing Composite Functions
Publication: 2934484
DOI: 10.1137/130921428
zbMath: 1306.65213
arXiv: 1206.1623
OpenAlex: W2963173886
MaRDI QID: Q2934484
Jason D. Lee, Yuekai Sun, Michael A. Saunders
Publication date: 12 December 2014
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1206.1623
Keywords: convergence; nonsmooth optimization; convex optimization; Newton-type methods; proximal gradient method; proximal mapping
MSC: Numerical mathematical programming methods (65K05); Convex programming (90C25); Methods of quasi-Newton type (90C53)
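The keywords above reference proximal mappings and the proximal gradient method, the building blocks that proximal Newton-type methods generalize. As background only (this is not the paper's algorithm), a minimal sketch of a proximal gradient step for an \(\ell_1\)-regularized least-squares problem; all function names and parameters here are chosen for illustration:

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal mapping of t*||.||_1: elementwise soft-thresholding."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def proximal_gradient(A, b, lam, step, n_iter=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient (ISTA).

    Each iteration takes a gradient step on the smooth part, then applies
    the proximal mapping of the nonsmooth part. Proximal Newton-type
    methods replace the scalar step with a second-order model of the
    smooth term.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)              # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x
```

For example, with `A` the identity the method reduces to soft-thresholding `b` directly, which makes the sparsity-inducing effect of the proximal mapping easy to see.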
Related Items
A Trust-region Method for Nonsmooth Nonconvex Optimization
An inexact successive quadratic approximation method for a class of difference-of-convex optimization problems
Further properties of the forward-backward envelope with applications to difference-of-convex programming
Sparse solutions to an underdetermined system of linear equations via penalized Huber loss
Sparse Approximations with Interior Point Methods
Local convergence of tensor methods
Scalable proximal methods for cause-specific hazard modeling with time-varying coefficients
Second order semi-smooth proximal Newton methods in Hilbert spaces
Adaptive Quadratically Regularized Newton Method for Riemannian Optimization
A flexible coordinate descent method
A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization
An Improved Fast Iterative Shrinkage Thresholding Algorithm for Image Deblurring
A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer
Hessian informed mirror descent
A New Homotopy Proximal Variable-Metric Framework for Composite Convex Minimization
A proximal iteratively regularized Gauss-Newton method for nonlinear inverse problems
A Proximal Quasi-Newton Trust-Region Method for Nonsmooth Regularized Optimization
An active set Newton-CG method for \(\ell_1\) optimization
A regularized semi-smooth Newton method with projection steps for composite convex programs
A hybrid quasi-Newton projected-gradient method with application to lasso and basis-pursuit denoising
Separating variables to accelerate non-convex regularized optimization
Forward-Backward Envelope for the Sum of Two Nonconvex Functions: Further Properties and Nonmonotone Linesearch Algorithms
COAP 2021 best paper prize
Newton acceleration on manifolds identified by proximal gradient methods
Inexact successive quadratic approximation for regularized optimization
Empirical risk minimization: probabilistic complexity and stepsize strategy
A derivative-free scaling memoryless DFP method for solving large scale nonlinear monotone equations
An inexact quasi-Newton algorithm for large-scale \(\ell_1\) optimization with box constraints
Generalized damped Newton algorithms in nonsmooth optimization via second-order subdifferentials
Optimal rates for estimation of two-dimensional totally positive distributions
Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization
Deep-plug-and-play proximal Gauss-Newton method with applications to nonlinear, ill-posed inverse problems
A Regularized Newton Method for \({\boldsymbol{\ell}}_{q}\)-Norm Composite Optimization Problems
An acceleration of proximal diagonal Newton method
Globally convergent coderivative-based generalized Newton methods in nonsmooth optimization
Composite Convex Minimization Involving Self-concordant-Like Cost Functions
LASSO Reloaded: A Variational Analysis Perspective with Applications to Compressed Sensing
Inexact proximal DC Newton-type method for nonconvex composite functions
Proximal quasi-Newton method for composite optimization over the Stiefel manifold
A globally convergent proximal Newton-type method in nonsmooth convex optimization
A proximal trust-region method for nonsmooth optimization with inexact function and gradient evaluations
Accelerating inexact successive quadratic approximation for regularized optimization through manifold identification
Testing and non-linear preconditioning of the proximal point method
Stochastic variable metric proximal gradient with variance reduction for non-convex composite optimization
Minimizing oracle-structured composite functions
An approximate Newton-type proximal method using symmetric rank-one updating formula for minimizing the nonsmooth composite functions
A Newton-type proximal gradient method for nonlinear multi-objective optimization problems
Proximal gradient/semismooth Newton methods for projection onto a polyhedron via the duality-gap-active-set strategy
A global two-stage algorithm for non-convex penalized high-dimensional linear regression problems
Inexact proximal Newton methods in Hilbert spaces
Global complexity analysis of inexact successive quadratic approximation methods for regularized optimization under mild assumptions
A Riemannian Proximal Newton Method
The Variable Metric Forward-Backward Splitting Algorithm Under Mild Differentiability Assumptions
Local convergence analysis of an inexact trust-region method for nonsmooth optimization
A proximal quasi-Newton method based on memoryless modified symmetric rank-one formula
A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems
A Multilevel Proximal Gradient Algorithm for a Class of Composite Optimization Problems
A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
Finite-sample analysis of \(M\)-estimators using self-concordance
Inexact proximal stochastic second-order methods for nonconvex composite optimization
A fractal shape optimization problem in branched transport
Sub-sampled Newton methods
A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property
Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates
DC programming and DCA: thirty years of developments
Accelerating the DC algorithm for smooth functions
PNKH-B: A Projected Newton--Krylov Method for Large-Scale Bound-Constrained Optimization
Inexact proximal Newton methods for self-concordant functions
IMRO: A Proximal Quasi-Newton Method for Solving $\ell_1$-Regularized Least Squares Problems
Analysis of continuous \(H^{-1}\)-least-squares methods for the steady Navier-Stokes system
Linear convergence of inexact descent method and inexact proximal gradient algorithms for lower-order regularization problems
Inexact proximal memoryless quasi-Newton methods based on the Broyden family for minimizing composite functions
Globalized inexact proximal Newton-type methods for nonconvex composite functions
Stochastic proximal quasi-Newton methods for non-convex composite optimization
Subspace quadratic regularization method for group sparse multinomial logistic regression
Projected Dynamical Systems on Irregular, Non-Euclidean Domains for Nonlinear Optimization
A second-order method for convex \(\ell_1\)-regularized optimization with active-set prediction
Regularized estimation for highly multivariate log Gaussian Cox processes
A Multilevel Framework for Sparse Optimization with Application to Inverse Covariance Estimation and Logistic Regression
On Quasi-Newton Forward-Backward Splitting: Proximal Calculus and Convergence
A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization
Fisher information regularization schemes for Wasserstein gradient flows
On the local convergence of a stochastic semismooth Newton method for nonsmooth nonconvex optimization
Distributed block-diagonal approximation methods for regularized empirical risk minimization
An Efficient Linearly Convergent Regularized Proximal Point Algorithm for Fused Multiple Graphical Lasso Problems
Generalized self-concordant functions: a recipe for Newton-type methods
Fused Multiple Graphical Lasso
Variational Asymptotic Preserving Scheme for the Vlasov--Poisson--Fokker--Planck System
An Inexact Semismooth Newton Method on Riemannian Manifolds with Application to Duality-Based Total Variation Denoising
Computation for latent variable model estimation: a unified stochastic proximal framework
One-Step Estimation with Scaled Proximal Methods
Prediction errors for penalized regressions based on generalized approximate message passing
Uses Software