
Universal Estimation of Information Measures for Analog Sources

From MaRDI portal
Publication:3589014

DOI: 10.1561/0100000021
zbMath: 1194.94174
OpenAlex: W2095324495
MaRDI QID: Q3589014

Sergio Verdú, Sanjeev R. Kulkarni, Qing Wang

Publication date: 10 September 2010

Published in: Foundations and Trends® in Communications and Information Theory

Full work available at URL: https://doi.org/10.1561/0100000021



Mathematics Subject Classification

Measures of information, entropy (94A17)


Related Items (2)

Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence
Optimal rates of entropy estimation over Lipschitz balls
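The related items above concern nearest-neighbor estimation of entropy and Kullback-Leibler divergence, the family of universal estimators this monograph surveys. As a minimal illustrative sketch (not the monograph's own algorithm), the classical Kozachenko–Leonenko k-nearest-neighbor estimator of differential entropy for an analog source can be written as follows; the function name and defaults are ours:

```python
import numpy as np
from scipy.special import digamma, gammaln

def kl_entropy(x, k=3):
    """Kozachenko-Leonenko k-NN estimate of differential entropy (in nats).

    x: (n, d) array of i.i.d. samples from an unknown continuous density.
    k: which nearest neighbor to use (small k trades variance for bias).
    """
    n, d = x.shape
    # Pairwise Euclidean distances; exclude each point from its own neighbors.
    dists = np.sqrt(((x[:, None, :] - x[None, :, :]) ** 2).sum(-1))
    np.fill_diagonal(dists, np.inf)
    # Distance from each sample to its k-th nearest neighbor.
    eps = np.sort(dists, axis=1)[:, k - 1]
    # Log-volume of the d-dimensional unit ball: pi^(d/2) / Gamma(d/2 + 1).
    log_cd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    # Kozachenko-Leonenko formula.
    return digamma(n) - digamma(k) + log_cd + d * np.mean(np.log(eps))
```

For a standard Gaussian in one dimension the true differential entropy is (1/2) log(2*pi*e) ≈ 1.419 nats, and the estimate converges to it as the sample size grows; the estimator requires no knowledge of the underlying density, which is what "universal" means here.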






This page was built for publication: Universal Estimation of Information Measures for Analog Sources

Retrieved from "https://portal.mardi4nfdi.de/w/index.php?title=Publication:3589014&oldid=16999967"
This page was last edited on 5 February 2024, at 03:14.