A weak condition of globally asymptotic stability for neural networks (Q1031718)

scientific article; zbMATH DE number 5623830

    Statements

    30 October 2009
    The paper is concerned with a model of a neural network described by the differential equation \[ du/dt=-Du+Ag(u)+I, \eqno(*) \] where \(u(t), I\in\mathbb{R}^n\), \(D=\text{diag}\,(d_1,\dots,d_n)>0\), \(g\) is a continuous vector function, and \(A\) is a symmetric \((n\times n)\)-matrix. The author considers a general class of continuous activation functions \(g\) whose upper right Dini derivatives \(\mathcal{D}^+g_j\) satisfy the inequalities \(0<\mathcal{D}^+g_j(s)<\mathcal{D}^+g_j(0)\), \(s\neq 0\), \(j=1,\dots,n\). These functions may be neither bounded nor differentiable; nevertheless, many sigmoidal functions are included as special cases. The main result is the following theorem. Suppose that \(A\) is such that \(DG^{-1}_0-A\) is nonnegative definite, where \(G_0=\text{diag}\,(\mathcal{D}^+g_1(0),\ldots,\mathcal{D}^+g_n(0))\). If \((*)\) has an equilibrium \(u^0\), then \(u^0\) is the unique equilibrium of \((*)\) and it is globally asymptotically stable. Moreover, it is shown that differentiability of the activation functions yields exponential stability.
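    The stability criterion can be illustrated numerically. The following sketch uses made-up matrices \(D\) and \(A\) (not taken from the paper) and \(\tanh\) activations, for which \(\mathcal{D}^+g_j(0)=1\) and hence \(G_0=I\); it checks nonnegative definiteness of \(DG_0^{-1}-A\) and integrates \((*)\) with \(I=0\) by forward Euler to observe convergence to the equilibrium \(u^0=0\). The helper names are hypothetical.

```python
import numpy as np

def condition_holds(D, A, G0, tol=1e-12):
    """Check nonnegative definiteness of D G0^{-1} - A (all matrices here are symmetric)."""
    M = D @ np.linalg.inv(G0) - A
    return bool(np.all(np.linalg.eigvalsh(M) >= -tol))

def simulate(u0, D, A, steps=2000, dt=0.01):
    """Forward-Euler integration of du/dt = -D u + A tanh(u) (i.e. (*) with I = 0)."""
    u = np.array(u0, dtype=float)
    for _ in range(steps):
        u = u + dt * (-D @ u + A @ np.tanh(u))
    return u

# Example data (made up): tanh activations give G0 = I, so the condition is on D - A.
n = 3
D = np.diag([2.0, 1.5, 1.0])
G0 = np.eye(n)
A = np.array([[0.5, 0.2, 0.0],
              [0.2, 0.4, 0.1],
              [0.0, 0.1, 0.3]])   # symmetric coupling matrix

print(condition_holds(D, A, G0))                    # True: D - A is positive definite here
u_final = simulate([1.0, -2.0, 0.5], D, A)
print(np.linalg.norm(u_final) < 1e-3)               # True: the state has decayed to u^0 = 0
```

    Since \(D-A\) above is strictly diagonally dominant with a positive diagonal, it is in fact positive definite, so the theorem's hypothesis holds and trajectories from any initial state approach the unique equilibrium.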
    global asymptotic stability
    exponential stability
    nonnegative definite
    compactness
    neural network

    Identifiers