Discrete mathematics of neural networks. Selected topics (Q2723177)

scientific article; zbMATH DE number 1614057

    Statements

    3 July 2001
    Boolean functions
    feed-forward NN
    Discrete mathematics of neural networks. Selected topics (English)
    The book reveals interesting relationships between discrete mathematics and artificial neural networks (NNs). Feed-forward NNs implementing Boolean functions are the central topic; their construction, training modes, and limitations are discussed. The author gives a popular introduction to feed-forward NNs (Chapter 1) and Boolean functions (Chapter 2). Linear and polynomial threshold functions are discussed in Chapter 3, together with their properties and geometrical interpretation. Chapter 4 examines how large the class of Boolean functions formed by threshold functions is: lower and upper bounds on the number of threshold functions are discussed, and results on the number of polynomial threshold functions of a given degree are presented. It is argued in Chapter 5 that any Boolean threshold function can be represented by integer weights and an integer threshold; the question of how large these integers have to be is posed, and an exponential lower bound is given. The threshold order of a Boolean function is the least degree of an equivalent polynomial threshold function.

    Chapter 6 presents some positive and negative results on determining the threshold order of a given Boolean function, and an ``expected'' threshold order of a ``typical'' Boolean function is derived. Chapter 7 introduces a feed-forward architecture consisting of linear threshold units, called a ``threshold network'', and proves that every Boolean function can be computed by a 2-layer feed-forward threshold network. Methods for designing the threshold network corresponding to a given Boolean function are considered. Chapter 8 focuses on the number of examples that will guarantee exact learning; a set of such examples is called a ``specifying set'' for the Boolean function. Using a specifying set, the NN can be trained rather than constructed with pre-specified weights. Two training approaches are detailed: linear programming and the incremental approach. Chapter 10 introduces probabilistic learning for NNs, the PAC (Probably Approximately Correct) model. It is argued that the number of examples needed to train the NN is related to the Vapnik-Chervonenkis (VC) dimension of the NN. Chapter 11 gives bounds on the VC-dimension of several types of NNs: the perceptron, a class of polynomial threshold functions, threshold networks, and sigmoid networks. The complexity and efficiency of PAC learning algorithms are the topic of Chapter 12. Chapter 13 concludes the book with a brief account of Boltzmann machines, combinatorial optimization, and related topics.
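    As a minimal illustration of the central notions (an addition of this summary, not material quoted from the book): a linear threshold function on $\{0,1\}^n$ is a Boolean function of the form
    \[
    f(x)=\begin{cases}1 & \text{if } w_1x_1+\cdots+w_nx_n\ge\theta,\\ 0 & \text{otherwise,}\end{cases}
    \]
    for real weights $w_1,\dots,w_n$ and threshold $\theta$; a polynomial threshold function of degree $d$ replaces the linear form by a degree-$d$ polynomial. XOR separates the two notions: it is not a linear threshold function, but it has threshold order 2, since
    \[
    x_1\oplus x_2=1 \quad\Longleftrightarrow\quad x_1+x_2-2x_1x_2-\tfrac12>0.
    \]
    Equivalently, XOR is computed by a 2-layer threshold network whose two hidden units fire exactly on the inputs $(1,0)$ and $(0,1)$ and whose output unit takes their OR, a small instance of the universality result of Chapter 7.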
