Minimax Mutual Information Approach for Independent Component Analysis
From MaRDI portal
Publication:4832466
DOI: 10.1162/089976604773717595
zbMath: 1089.68590
OpenAlex: W2111587777
Wikidata: Q48546917 (Scholia: Q48546917)
MaRDI QID: Q4832466
Deniz Erdogmus, Yadunandana N. Rao, Kenneth E. Hild II, Jose C. Principe
Publication date: 4 January 2005
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/089976604773717595
Cites Work
- Independent component analysis, a new concept?
- Estimation of distributions using orthogonal expansions
- Orthogonal Series Density Estimation and the Kernel Eigenvalue Problem
- Information Theory and Statistical Mechanics
- Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy
- On Estimation of a Probability Density Function and Mode
- Blind separation of instantaneous mixture of sources via the Gaussian mutual information criterion.