Fault diagnosis method based on information entropy and relative principal component analysis (Q1794150)
From MaRDI portal
scientific article; zbMATH DE number 6954241
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Fault diagnosis method based on information entropy and relative principal component analysis | scientific article; zbMATH DE number 6954241 | |
Statements
Fault diagnosis method based on information entropy and relative principal component analysis (English)
15 October 2018
Summary: In traditional Principal Component Analysis (PCA), the differing dimensions of the system's variables are neglected, so the selected Principal Components (PCs) often fail to be representative. Relative-transformation PCA can address this problem, but calculating the weight of each characteristic variable is not straightforward. To resolve this, the paper proposes a fault diagnosis method based on information entropy and Relative Principal Component Analysis. First, the algorithm calculates the information entropy of each characteristic variable in the original dataset using the information gain algorithm. Second, it standardizes the dimension of every variable in the dataset. It then allocates a weight to each standardized characteristic variable according to its information entropy. Finally, it applies the resulting relative principal component model to fault diagnosis. Simulation experiments on the Tennessee Eastman process and the Wine dataset demonstrate the feasibility and effectiveness of the new method.
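The summary outlines a four-step procedure: per-variable information entropy, standardization, entropy-based weighting, and PCA on the weighted (relative-transformed) data. The following is a minimal NumPy sketch of that pipeline, not the paper's implementation: the entropy-to-weight mapping (one minus normalized histogram entropy), the Hotelling's T² monitoring statistic, the function names, and the synthetic data are all illustrative assumptions.

```python
import numpy as np

def entropy_weights(X, n_bins=10):
    """Assumed weighting scheme: lower normalized histogram entropy -> higher weight."""
    n_features = X.shape[1]
    weights = np.empty(n_features)
    for j in range(n_features):
        counts, _ = np.histogram(X[:, j], bins=n_bins)
        p = counts / counts.sum()
        p = p[p > 0]
        h = -np.sum(p * np.log(p)) / np.log(n_bins)  # normalized entropy in [0, 1]
        weights[j] = 1.0 - h                          # information-gain-style weight
    return weights / weights.sum()

def fit_relative_pca(X_train, n_components, n_bins=10):
    """Standardize, apply entropy weights (relative transformation), then PCA via SVD."""
    mean, std = X_train.mean(axis=0), X_train.std(axis=0, ddof=1)
    Z = (X_train - mean) / std                # step 2: remove dimensional influence
    w = entropy_weights(X_train, n_bins)      # steps 1 and 3: entropy-based weights
    R = Z * w                                 # relative transformation of the data
    U, S, Vt = np.linalg.svd(R, full_matrices=False)
    P = Vt[:n_components].T                   # loadings of the retained relative PCs
    lam = (S[:n_components] ** 2) / (len(X_train) - 1)
    return {"mean": mean, "std": std, "w": w, "P": P, "lam": lam}

def t2_statistic(model, X):
    """Hotelling's T^2 in the relative-PC subspace (an assumed monitoring statistic)."""
    Z = (X - model["mean"]) / model["std"] * model["w"]
    T = Z @ model["P"]
    return np.sum((T ** 2) / model["lam"], axis=1)

# Usage sketch on synthetic data: flag samples whose T^2 exceeds a limit fitted on normal data.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_normal = rng.normal(size=(500, 8))
    X_fault = rng.normal(size=(100, 8))
    X_fault[:, 2] += 3.0                      # simulated mean-shift fault on variable 3
    model = fit_relative_pca(X_normal, n_components=3)
    limit = np.percentile(t2_statistic(model, X_normal), 99)
    alarms = t2_statistic(model, X_fault) > limit
    print(f"fault alarms: {alarms.mean():.0%} of faulty samples")
```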
principal component analysis
fault diagnosis
information entropy