Numerical investigation of the multilayer perceptron retraining (Q1281242)
From MaRDI portal
scientific article; zbMATH DE number 1266902
Statements
Numerical investigation of the multilayer perceptron retraining (English)
23 March 1999
Neural networks with a multilayer perceptron (MLP) architecture are widely used in the mathematical processing of experimental data. In particular, they are now successfully applied in plasma diagnostics of tokamak systems, where voluminous databases must be handled in real time. An important property of MLPs is that they can be trained and retrained. Two methods of perceptron retraining, an iterative one and a direct one, are proposed in the article. The influence of the number of hidden layers in the MLP on its stability, after retraining, with respect to noise in the input signal is investigated.
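The abstract does not give the paper's retraining algorithms, but the idea of iterative retraining — continuing gradient descent from previously trained weights when the target data changes — and of checking stability against input noise can be sketched as follows. This is a minimal illustration with an assumed toy task (fitting a shifted sine), an assumed one-hidden-layer MLP, and a hypothetical `noise_sensitivity` helper; none of it is taken from the article itself.

```python
import numpy as np

# Minimal sketch (NOT the paper's algorithms): a one-hidden-layer MLP is
# trained on a toy task, then retrained iteratively from its existing
# weights on shifted targets, and its output stability under Gaussian
# input noise is measured.

rng = np.random.default_rng(0)

def init_mlp(n_in, n_hid, n_out):
    # Small random weights for a one-hidden-layer perceptron.
    return {
        "W1": rng.normal(0, 0.5, (n_in, n_hid)), "b1": np.zeros(n_hid),
        "W2": rng.normal(0, 0.5, (n_hid, n_out)), "b2": np.zeros(n_out),
    }

def forward(p, X):
    h = np.tanh(X @ p["W1"] + p["b1"])     # hidden-layer activations
    return h, h @ p["W2"] + p["b2"]        # linear output layer

def train(p, X, Y, lr=0.05, epochs=500):
    # Plain batch gradient descent on mean squared error.
    n = len(X)
    for _ in range(epochs):
        h, out = forward(p, X)
        err = out - Y                      # output-layer error
        dh = (err @ p["W2"].T) * (1 - h ** 2)   # backprop through tanh
        p["W2"] -= lr * (h.T @ err) / n
        p["b2"] -= lr * err.mean(axis=0)
        p["W1"] -= lr * (X.T @ dh) / n
        p["b1"] -= lr * dh.mean(axis=0)
    return p

def mse(p, X, Y):
    return float(np.mean((forward(p, X)[1] - Y) ** 2))

def noise_sensitivity(p, X, sigma=0.1, trials=50):
    # Mean output deviation caused by Gaussian noise on the input signal
    # (hypothetical stability measure, assumed for illustration).
    _, base = forward(p, X)
    devs = [np.mean(np.abs(forward(p, X + rng.normal(0, sigma, X.shape))[1] - base))
            for _ in range(trials)]
    return float(np.mean(devs))

# Toy task: learn y = sin(x), then retrain on a shifted target.
X = np.linspace(-2, 2, 64).reshape(-1, 1)
Y_old, Y_new = np.sin(X), np.sin(X + 0.5)

mlp = train(init_mlp(1, 8, 1), X, Y_old)
loss_before_retrain = mse(mlp, X, Y_new)
mlp = train(mlp, X, Y_new)   # iterative retraining: continue from the old weights
loss_after_retrain = mse(mlp, X, Y_new)
sens = noise_sensitivity(mlp, X)
```

Starting retraining from the previously learned weights, rather than reinitializing, is what makes the procedure a retraining rather than training from scratch; the noise-sensitivity measure illustrates the kind of stability question the abstract describes for varying hidden-layer counts.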
neural network