Modeling and contractivity of neural-synaptic networks with Hebbian learning
Publication: 6550233
DOI: 10.1016/j.automatica.2024.111636
zbMath: 1543.92002
MaRDI QID: Q6550233
Giovanni Russo, Francesco Bullo, Veronica Centorrino
Publication date: 5 June 2024
Published in: Automatica
Cites Work
- A simplified neuron model as a principal component analyzer
- On contraction analysis for non-linear systems
- Mathematical formulations of Hebbian learning
- Transformations by diagonal matrices in a normed space
- Matrix measures, stability and contraction theory for dynamical systems on time scales
- Mathematical Equivalence of Two Common Forms of Firing Rate Models of Neural Networks
- A Differential Lyapunov Framework for Contraction Analysis
- Dynamic properties of neural networks with adapting synapses
- On Logarithmic Norms
- New conditions for global stability of neural networks with application to linear and quadratic programming problems
- Hierarchical Selective Recruitment in Linear-Threshold Brain Networks—Part I: Single-Layer Dynamics and Selective Inhibition
- Unsupervised learning by competing hidden units
- Contraction Analysis of Time-Delayed Communications and Group Cooperation
- Hebbian Learning of Recurrent Connections: A Geometrical Perspective
- Learning representations by back-propagating errors
- Neurons with graded response have collective computational properties like those of two-state neurons
- A Contraction Approach to the Hierarchical Analysis and Design of Networked Systems
- Non-Euclidean Contraction Theory for Robust Nonlinear Stability