Mathematical Research Data Initiative

Some Generalized Sufficient Convergence Criteria for Nonlinear Continuous Neural Networks

From MaRDI portal
Publication:5696508

DOI: 10.1162/0899766054026701
zbMath: 1102.68120
OpenAlex: W2041497887
Wikidata: Q51971019
Scholia: Q51971019
MaRDI QID: Q5696508

Jito Vanualailai, Shin-ichi Nakagiri

Publication date: 18 October 2005

Published in: Neural Computation

Full work available at URL: https://doi.org/10.1162/0899766054026701

zbMATH Keywords

direct method of Lyapunov, dynamical neural networks


Mathematics Subject Classification ID

Learning and adaptive systems in artificial intelligence (68T05)


Related Items

  • Practical exponential stability of nonlinear nonautonomous differential equations under perturbations
  • Long-Range Out-of-Sample Properties of Autoregressive Neural Networks



Cites Work

  • Absolute stability of global pattern formation and parallel memory storage by competitive neural networks
  • A comment on "Comments on 'Necessary and sufficient condition for absolute stability of neural networks'"
  • Stability properties of the Hopfield-type neural networks
  • Global Convergence Rate of Recurrently Connected Neural Networks
  • New conditions for global stability of neural networks with application to linear and quadratic programming problems
  • Neurons with graded response have collective computational properties like those of two-state neurons.
Retrieved from "https://portal.mardi4nfdi.de/w/index.php?title=Publication:5696508&oldid=30418071"
This page was last edited on 7 March 2024, at 05:38.