Convergence analysis of Chauvin's PCA learning algorithm with a constant learning rate
From MaRDI portal
Publication: 2466667
DOI: 10.1016/j.chaos.2005.12.007 · zbMath: 1492.62110 · OpenAlex: W2033738158 · MaRDI QID: Q2466667
Publication date: 15 January 2008
Published in: Chaos, Solitons and Fractals
Full work available at URL: https://doi.org/10.1016/j.chaos.2005.12.007
MSC classifications:
- 62H25 Factor analysis and principal components; correspondence analysis
- 62G20 Asymptotic properties of nonparametric inference
- 68T05 Learning and adaptive systems in artificial intelligence
Cited works:
- Adaptive algorithms for first principal eigenvector computation
- On stochastic approximation of the eigenvectors and eigenvalues of the expectation of a random matrix
- A simplified neuron model as a principal component analyzer
- Exponential stability of continuous-time and discrete-time bidirectional associative memory networks with delays
- Generalized neural networks for spectral analysis: dynamics and Liapunov functions
- Global robust stability analysis of neural networks with discrete time delays
- Estimation of exponential convergence rate and exponential stability for neural networks with time-varying delay
- Analysis of recursive stochastic algorithms
- Global convergence of Lotka-Volterra recurrent neural networks with delays