A Riemannian dimension-reduced second-order method with application in sensor network localization
Publication:6562381
DOI: 10.1137/23M1567229
zbMATH Open: 1548.90458
MaRDI QID: Q6562381
Tianyun Tang, Yinyu Ye, Kim-Chuan Toh, Nachuan Xiao
Publication date: 26 June 2024
Published in: SIAM Journal on Scientific Computing
MSC classification: Programming involving graphs or networks (90C35) · Nonlinear programming (90C30) · Methods of quasi-Newton type (90C53)
Cites Work
- Title not available
- Title not available
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A Dai-Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Theory of semidefinite programming for sensor network localization
- On solving trust-region and other regularised subproblems in optimization
- Solving Euclidean distance matrix completion problems via semidefinite programming
- A nonlinear programming algorithm for solving semidefinite programs via low-rank factorization
- QSDPNAL: a two-phase augmented Lagrangian method for convex quadratic semidefinite programming
- Adaptive regularization with cubics on manifolds
- Trust-region methods on Riemannian manifolds
- Local minima and convergence in low-rank semidefinite programming
- Cubic regularization of Newton method and its global performance
- On the rank of extreme matrices in semidefinite programs and the multiplicity of optimal eigenvalues
- Robust low-rank matrix completion by Riemannian optimization
- Riemannian optimization for high-dimensional tensor completion
- Elliptic preconditioner for accelerating the self-consistent field iteration in Kohn-Sham density functional theory
- On the Convergence of the Self-Consistent Field Iteration in Kohn-Sham Density Functional Theory
- Manopt, a Matlab toolbox for optimization on manifolds
- A Broyden Class of Quasi-Newton Methods for Riemannian Optimization
- Low-Rank Optimization on the Cone of Positive Semidefinite Matrices
- On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems
- Newton's method on Riemannian manifolds and a geometric model for the human spine
- A Unified Theorem on SDP Rank Reduction
- Adaptive Quadratically Regularized Newton Method for Riemannian Optimization
- Estimating the Largest Eigenvalue by the Power and Lanczos Algorithms with a Random Start
- Computation of Ground States of the Gross-Pitaevskii Functional via Riemannian Optimization
- Solving the Trust-Region Subproblem using the Lanczos Method
- The Molecule Problem: Exploiting Structure in Global Optimization
- Global rates of convergence for nonconvex optimization on manifolds
- Solving Large-Scale Cubic Regularization by a Generalized Eigenvalue Problem
- An Inertial Newton Algorithm for Deep Learning
- Error estimates for iterative algorithms for minimizing regularized quadratic subproblems
- A new, globally convergent Riemannian conjugate gradient method
- Preconditioned Low-rank Riemannian Optimization for Linear Systems with Tensor Product Structure
- Finding stationary points on bounded-rank matrices: a geometric hurdle and a smooth remedy
- Solving graph equipartition SDPs on an algebraic variety
- A Feasible Method for Solving an SDP Relaxation of the Quadratic Knapsack Problem