Scalar correction method for finding least-squares solutions on Hilbert spaces and its applications
From MaRDI portal
Publication: 2016276
DOI: 10.1016/j.amc.2013.03.001
zbMath: 1290.65049
OpenAlex: W2142895088
MaRDI QID: Q2016276
Predrag S. Stanimirović, Sladjana Miljković, Dragan S. Djordjević, Marko B. Miladinović
Publication date: 20 June 2014
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2013.03.001
Keywords: unconstrained optimization; convergence rate; gradient method; generalized inverses; least-squares solutions
MSC classification: Theory of matrix inversion and generalized inverses (15A09); Numerical solutions to equations with linear operators (65J10)
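The publication's subject, as indicated by the keywords and MSC classes, is iterative computation of least-squares solutions via generalized (Moore-Penrose) inverses. As a minimal illustrative sketch (not the paper's scalar correction method), NumPy can verify that the pseudoinverse yields the minimal-norm least-squares solution of an overdetermined system:

```python
import numpy as np

# Overdetermined system A x ≈ b (3 equations, 2 unknowns); illustrative data.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
b = np.array([1.0, 2.0, 3.0])

# Least-squares solution via the Moore-Penrose pseudoinverse A^+:
# x = A^+ b minimizes ||A x - b|| (and among minimizers has minimal norm).
x = np.linalg.pinv(A) @ b

# Cross-check against NumPy's dedicated least-squares solver.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x, x_lstsq)
```

Iterative schemes such as the gradient and conjugate-gradient methods cited below approximate this same solution without forming the pseudoinverse explicitly.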
Related Items (1)
Cites Work
- Scalar correction method for solving large scale unconstrained minimization problems
- On a successive transformation of probability distribution and its application to the analysis of the optimum gradient method
- Report on test matrices for generalized inverses
- Conjugate gradient method for computing the Moore-Penrose inverse and rank of a matrix
- A classification of quasi-Newton methods
- Computing Moore-Penrose inverses of Toeplitz matrices by Newton's iteration
- Generalized inverses. Theory and applications.
- A self-correcting matrix iteration for the Moore-Penrose generalized inverse
- On the asymptotic directions of the s-dimensional optimum gradient method
- Some abstract concepts in numerical mathematics (applications of semi-ordering) [Einige abstrakte Begriffe in der numerischen Mathematik (Anwendungen der Halbordnung)]
- A Variable Metric Method for Approximating Generalized Inverses of Matrices
- R-linear convergence of the Barzilai and Borwein gradient method
- Two-Point Step Size Gradient Methods
- Pseudoinverses and conjugate gradients
- Iterative methods for computing generalized inverses related with optimization methods
- On the Barzilai and Borwein choice of steplength for the gradient method
- Steepest Descent for Singular Linear Operator Equations
- On the Convergence of the Conjugate Gradient Method for Singular Linear Operator Equations
- Adaptive two-point stepsize gradient algorithm
- On the nonmonotone line search
- Relaxed steepest descent and Cauchy-Barzilai-Borwein method
- Accelerated gradient descent methods with line search