Global asymptotic stability for Hopfield-type neural networks with diffusion effects
Publication: 940240
DOI: 10.1007/S10483-007-0309-X
zbMath: 1231.34071
OpenAlex: W1972576671
MaRDI QID: Q940240
Publication date: 1 September 2008
Published in: Applied Mathematics and Mechanics (English Edition)
Full work available at URL: https://doi.org/10.1007/s10483-007-0309-x
Cites Work
- A geometric approach to singular perturbation problems with applications to nerve impulse equations
- Absolute exponential stability of recurrent neural networks with Lipschitz-continuous activation functions and time delays
- Global exponential stability of Hopfield-type neural network and its applications
- Absolute exponential stability of neural networks with a general class of activation functions
- On an open problem related to the strict local minima of multilinear objective functions
- New conditions for global stability of neural networks with application to linear and quadratic programming problems
- Neural networks and physical systems with emergent collective computational abilities.
- Neurons with graded response have collective computational properties like those of two-state neurons.