Lower bounds on expected redundancy for nonparametric classes
Publication: 4880015
DOI: 10.1109/18.481802 · zbMath: 0843.62006 · OpenAlex: W2151308166 · MaRDI QID: Q4880015
Publication date: 2 June 1996
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/18.481802
Keywords: Shannon capacity · universal coding · minimax lower bounds · expected redundancy · mutual information quantity · nonparametric density classes
MSC classifications: Density estimation (62G07) · Communication, information (94A99) · Statistical aspects of information-theoretic topics (62B10) · Coding theorems (Shannon theory) (94A24) · Nonparametric inference (62G99)
Related Items (3)
Statistical Problem Classes and Their Links to Information Theory ⋮ Mutual information, metric entropy and cumulative relative entropy risk ⋮ Improved lower bounds for learning from noisy examples: An information-theoretic approach