Information theoretic learning with adaptive kernels
DOI: 10.1016/j.sigpro.2010.06.023
zbMath: 1203.94001
OpenAlex: W2087558162
MaRDI QID: Q612575
Abhishek Singh, Jose C. Principe
Publication date: 29 December 2010
Published in: Signal Processing
Full work available at URL: https://doi.org/10.1016/j.sigpro.2010.06.023
Keywords: minimum error entropy; adaptive system training; information theoretic learning; kernel width; Kullback-Leibler divergence
MSC classification: Learning and adaptive systems in artificial intelligence (68T05); Measures of information, entropy (94A17); Information theory (general) (94A15); Software, source code, etc. for problems pertaining to information and communication theory (94-04)
Cites Work
- On Kullback-Leibler loss and density estimation
- Histogram bin width selection for time-dependent Poisson processes
- A Method for Selecting the Bin Size of a Time Histogram
- On the Choice of Smoothing Parameters for Parzen Estimators of Probability Density Functions
- Deterministic Nonperiodic Flow
- On Estimation of a Probability Density Function and Mode