A nonlinear version of Halanay inequality and application to neural networks theory
MaRDI QID: Q5109758 (Publication: 5109758)
DOI: 10.7153/jmi-2020-14-16 · zbMath: 1439.93016 · OpenAlex: W3012272655
Publication date: 13 May 2020
Published in: Journal of Mathematical Inequalities
Full work available at URL: https://doi.org/10.7153/jmi-2020-14-16
Keywords: exponential stabilization; Hopfield neural network; nonlinear Halanay inequality; non-Lipschitz continuous activation functions
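For context only (the record itself does not reproduce any formulas): the classical, linear Halanay inequality that the title's nonlinear version presumably generalizes can be stated as follows; the paper's precise nonlinear comparison form is not given here.

% Classical (linear) Halanay inequality: if a nonnegative function v satisfies,
% for t >= t_0 and a fixed delay tau > 0,
\[
v'(t) \le -a\, v(t) + b \sup_{t-\tau \le s \le t} v(s), \qquad a > b \ge 0,
\]
% then v decays exponentially,
\[
v(t) \le \Big( \sup_{t_0-\tau \le s \le t_0} v(s) \Big)\, e^{-\gamma (t-t_0)},
\qquad t \ge t_0,
\]
% where gamma > 0 is the unique positive root of gamma = a - b e^{gamma tau}.
Such inequalities are the standard tool for proving exponential stability of delayed Hopfield-type neural networks, which matches the keywords above.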
Related Items (1)
Cites Work
- Global asymptotic stability of neural networks with discontinuous activations
- Global convergence of neural networks with mixed time-varying delays and discontinuous neuron activations
- Robust exponential stability for interval neural networks with delays and non-Lipschitz activation functions
- Delay-dependent stability criterion for delayed Hopfield neural networks
- On Nagumo's theorem
- Stability analysis for neural networks with inverse Lipschitzian neuron activations and impulses
- Global exponential stability of Hopfield neural networks with delays and inverse Lipschitz neuron activations
- On uniqueness criteria for systems of ordinary differential equations
- On a general nonlinear problem with distributed delays
- Some sufficient conditions for global exponential stability of delayed Hopfield neural networks
- Global asymptotic stability of Hopfield neural network involving distributed delays
- Exponential decay for a system of equations with distributed delays
- Novel criteria for global exponential periodicity and stability of recurrent neural networks with time-varying delays
- A general framework for global asymptotic stability analysis of delayed neural networks based on LMI approach
- Generalized Lyapunov approach for convergence of neural networks with discontinuous or non-Lipschitz activations
- Simplified LMI condition for global asymptotic stability of delayed neural networks
- Hölder continuous activation functions in neural networks
- On QUAD, Lipschitz, and Contracting Vector Fields for Consensus and Synchronization of Networks