Statistical mechanics of learning (Q2723579)

From MaRDI portal





scientific article; zbMATH DE number 1614845

    Statements

    8 July 2001
    intelligent behavior
    neural networks
    statistical mechanics
    learning rules
    on-line learning
    multilayer networks
    Statistical mechanics of learning (English)
    In the framework of artificial neural networks, the process of learning from examples can be made precise. The authors survey the important concepts developed over the last decade by applying the techniques of statistical mechanics to learning processes. The book provides an introduction both to the basic notions needed to study learning processes and to the techniques for obtaining quantitative results. A major part deals with the perceptron as a fundamental building block of neural networks. The concepts are supplemented with background material from mathematics and physics, together with numerous examples and exercises. While the basic computational techniques are presented in great detail, the more technical aspects are collected in the appendices. The discussion section at the end of every chapter includes a brief review of the literature.

    Contents: 1. Getting Started (artificial neural networks, general setup); 2. Perceptron Learning -- Basics (Gibbs learning, the annealed approximation, Gardner analysis); 3. A Choice of Learning Rules (Hebb rule, adaline rule, Bayes rule); 4. Augmented Statistical Mechanics Formulation (maximal stabilities, general statistical mechanics formulation, optimal potential); 5. Noisy Teachers (perfect learning, learning with errors); 6. The Storage Problem (Cover analysis, Galilean pastiche: Ising perceptron); 7. Discontinuous Learning (smooth networks, dynamics of discontinuous learning); 8. Unsupervised Learning (deceptions of randomness, clustering through competitive learning); 9. On-line Learning (perceptron with a smooth transfer function, unsupervised on-line learning); 10. Making Contact with Statistics (Sauer's lemma, Vapnik-Chervonenkis theorem, comparison with statistical mechanics); 11. Multifractals (the multifractal organization of internal representations); 12. Multilayer Networks (parity tree, committee tree, committee machine); 13. On-line Learning in Multilayer Networks (Bayesian on-line learning); 14. What Else? (support vector machines, complex optimization, error-correcting codes, game theory); Appendices.

    The book can be recommended both to students of artificial intelligence, statistics, and interdisciplinary subjects in psychology and philosophy, and to scientists and applied researchers interested in concepts of intelligent learning processes.
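    The teacher-student perceptron setting that runs through the book can be illustrated with a minimal numerical sketch (not taken from the book; the dimensions N and P below are arbitrary illustration choices). A student trained with the Hebb rule acquires an overlap R with the teacher's weight vector, and in the thermodynamic limit the generalization error is ε = arccos(R)/π:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 500   # input dimension (illustrative choice)
    P = 2000  # number of training examples, so alpha = P/N = 4

    # Teacher perceptron with random weights; labels are the teacher's outputs
    teacher = rng.standard_normal(N)
    X = rng.standard_normal((P, N))
    labels = np.sign(X @ teacher)

    # Hebb rule: student weights are the label-weighted sum of the examples
    student = (labels[:, None] * X).sum(axis=0) / np.sqrt(N)

    # Teacher-student overlap R and the resulting generalization error
    R = student @ teacher / (np.linalg.norm(student) * np.linalg.norm(teacher))
    eps = np.arccos(R) / np.pi
    print(f"overlap R = {R:.3f}, generalization error = {eps:.3f}")
    ```

    At this load (α = 4) the overlap is already large and the generalization error well below chance, in line with the quantitative picture the statistical mechanics analysis makes precise.
    
    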
