An assessment of Hermite function based approximations of mutual information applied to independent component analysis (Q845379)
From MaRDI portal
scientific article; zbMATH DE number 5664145
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | An assessment of Hermite function based approximations of mutual information applied to independent component analysis | scientific article; zbMATH DE number 5664145 | |
Statements
An assessment of Hermite function based approximations of mutual information applied to independent component analysis (English)
29 January 2010
Summary: At the heart of many ICA techniques is a nonparametric estimate of an information measure, usually obtained via nonparametric density estimation, for example kernel density estimation. Although less popular than kernel density estimators, orthogonal functions can also be used for nonparametric density estimation (via a truncated series expansion whose coefficients are calculated from the observed data). While such estimators do not necessarily yield a valid density, unlike kernel density estimators, they are faster to calculate, in particular for a modified version of Rényi's entropy of order 2. In this paper, we compare the performance of ICA using Hermite series based estimates of Shannon's and Rényi's mutual information with that of Gaussian kernel based estimates. The comparisons also include ICA using the RADICAL estimate of Shannon's entropy and a FastICA estimate of negentropy.
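The orthogonal-series idea in the summary can be illustrated with a short sketch (not taken from the paper; the function names and the truncation order `K` are illustrative assumptions): a Hermite-function density estimate whose series coefficients are simple sample means, together with an order-2 Rényi entropy estimate, which by orthonormality of the basis reduces to a sum of squared coefficients.

```python
import numpy as np
from math import factorial, pi, sqrt
from numpy.polynomial.hermite import hermval


def hermite_functions(x, K):
    """Evaluate the orthonormal Hermite functions h_0..h_K at the points x.

    h_k(x) = H_k(x) * exp(-x^2 / 2) / sqrt(2^k * k! * sqrt(pi)),
    where H_k is the physicists' Hermite polynomial.
    """
    x = np.asarray(x, dtype=float)
    H = np.empty((K + 1, x.size))
    for k in range(K + 1):
        coeffs = np.zeros(k + 1)
        coeffs[k] = 1.0  # select H_k in the Hermite polynomial basis
        norm = sqrt(2.0 ** k * factorial(k) * sqrt(pi))
        H[k] = hermval(x, coeffs) * np.exp(-x ** 2 / 2.0) / norm
    return H


def hermite_density_estimate(sample, x_grid, K=8):
    # Truncated series f_hat(x) = sum_k a_k h_k(x), where each
    # coefficient a_k is estimated as the sample mean of h_k(X_i).
    a = hermite_functions(sample, K).mean(axis=1)
    return a @ hermite_functions(x_grid, K)


def renyi2_entropy_hermite(sample, K=8):
    # Order-2 Renyi entropy H2 = -log( integral of f^2 ). By
    # orthonormality of the h_k, that integral is just sum_k a_k^2,
    # which is why the series estimator is cheap for this quantity.
    a = hermite_functions(sample, K).mean(axis=1)
    return -np.log(np.sum(a ** 2))


rng = np.random.default_rng(0)
sample = rng.standard_normal(5000)
x = np.linspace(-4.0, 4.0, 81)
fhat = hermite_density_estimate(sample, x)
h2 = renyi2_entropy_hermite(sample)  # true value for N(0,1) is log(2*sqrt(pi))
```

Note the caveat from the summary: `fhat` need not be a valid density — for small samples or large `K` it can dip below zero, and it only integrates approximately to one.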
ICA
nonparametric estimation
Hermite functions
kernel density estimation