A new measure of entropy of continuous random variable
MaRDI QID: Q2323203
DOI: 10.1080/15598608.2016.1217444
zbMath: 1477.94036
OpenAlex: W2485621483
Publication date: 30 August 2019
Published in: Journal of Statistical Theory and Practice
Full work available at URL: https://doi.org/10.1080/15598608.2016.1217444
MSC classifications: Measures of information, entropy (94A17); Statistical aspects of information-theoretic topics (62B10)
Related Items (4)
- Kullback-Leibler divergence for Bayesian nonparametric model checking
- A test for independence via Bayesian nonparametric estimation of mutual information
- A Bayesian nonparametric estimation to entropy
- Estimation of entropy and extropy based on right censored data: a Bayesian non-parametric approach
Cites Work
- A Mathematical Theory of Communication
- Estimation of entropy using random sampling
- Two measures of sample entropy
- On the entropy estimators
- Goodness-of-Fit Test for Exponentiality Based on Kullback–Leibler Information
- Maximum Entropy Utility
- Improvement of goodness-of-fit test for normal distribution based on entropy and power comparison
- Entropy estimators‐improvements and comparisons
- A new estimator of entropy
- Correcting moments for goodness of fit tests based on two entropy estimates
- A new class of random vector entropy estimators and its applications in testing statistical hypotheses