Regularization tools for training large feed-forward neural networks using automatic differentiation∗
From MaRDI portal
Publication: 4227926
DOI: 10.1080/10556789808805701
zbMath: 0913.68177
OpenAlex: W2069077343
MaRDI QID: Q4227926
Per Lindström, Mårten Gulliksson, Jerry Eriksson, Per-Åke Wedin
Publication date: 25 May 1999
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/10556789808805701
- Learning and adaptive systems in artificial intelligence (68T05)
- Parallel algorithms in computer science (68W10)
Related Items (4)
- Local results for the Gauss-Newton method on constrained rank-deficient nonlinear least squares
- Variable projections neural network training
- Unnamed Item
- KKT conditions for rank-deficient nonlinear least-square problems with rank-deficient nonlinear constraints
Uses Software
Cites Work
- An implicit shift bidiagonalization algorithm for ill-posed systems
- A new linesearch algorithm for nonlinear least squares problems
- Inexact Newton Methods
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Restart procedures for the conjugate gradient method
- Unnamed Item
- Unnamed Item