New Entropy Estimator with an Application to Test of Normality
DOI: 10.1080/03610926.2011.608473 · zbMath: 1319.62080 · arXiv: 1110.3436 · OpenAlex: W2952302786 · MaRDI QID: Q2839082
Amor Keziou, Salim Bouzebda, Tewfik Lounis, Issam Elhattab
Publication date: 4 July 2013
Published in: Communications in Statistics - Theory and Methods
Full work available at URL: https://arxiv.org/abs/1110.3436
Keywords: entropy; kernel estimation; quantile density; strong approximations; test of normality; entropy test; spacing-based estimators; Vasicek's estimator
MSC: Asymptotic properties of parametric estimators (62F12); Density estimation (62G07); Asymptotic distribution theory in statistics (62E20); Parametric hypothesis testing (62F03); Measures of information, entropy (94A17)
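For context on the keywords "spacing-based estimators" and "Vasicek's estimator", the sketch below shows the classical Vasicek spacing-based entropy estimate and the entropy-based normality statistic built from it. This is a minimal illustration of the background technique, not the kernel quantile-density estimator proposed in the paper itself; the function names and the default window choice m ≈ sqrt(n) are illustrative assumptions.

```python
import numpy as np

def vasicek_entropy(x, m=None):
    """Vasicek's spacing-based estimate of Shannon entropy:
    H_mn = (1/n) * sum_i log( n/(2m) * (X_(i+m) - X_(i-m)) ),
    with order statistics clamped at the sample boundaries."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        m = max(1, int(round(np.sqrt(n))))          # heuristic window (assumption)
    upper = x[np.minimum(np.arange(n) + m, n - 1)]  # X_(i+m), clamped at X_(n)
    lower = x[np.maximum(np.arange(n) - m, 0)]      # X_(i-m), clamped at X_(1)
    return np.mean(np.log(n / (2.0 * m) * (upper - lower)))

def entropy_normality_statistic(x, m=None):
    """Ratio exp(H_mn) / (s * sqrt(2*pi*e)). Under normality this is close to 1,
    since the normal density maximizes entropy at log(sigma * sqrt(2*pi*e));
    values well below 1 point to a departure from normality."""
    x = np.asarray(x, dtype=float)
    s = np.std(x, ddof=0)                           # ML scale estimate
    return np.exp(vasicek_entropy(x, m)) / (s * np.sqrt(2.0 * np.pi * np.e))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    print(entropy_normality_statistic(rng.normal(size=200)))       # near 1
    print(entropy_normality_statistic(rng.exponential(size=200)))  # noticeably below 1
```

In practice the test rejects normality for small values of the statistic, with critical values obtained by simulation for each (n, m); the Monte Carlo comparisons cited below follow that approach.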
Cites Work
- A Mathematical Theory of Communication
- Uniform in bandwidth consistency of the kernel-type estimator of the Shannon's entropy
- A strong consistency of a nonparametric estimate of entropy under random censorship
- On entropy-based goodness-of-fit tests
- Asymptotic properties of nonparametric curve estimates
- On the estimation of the quantile density function
- Kernel approximations of a Wiener process
- Strong approximations of the quantile process
- Estimating densities, quantiles, quantile densities and density quantiles
- Two measures of sample entropy
- A law of the logarithm for kernel quantile density estimators
- Unified estimators of smooth quantile and quantile density functions
- Limit theorems for nonparametric sample entropy estimators
- Uniform consistency of generalized kernel estimators of quantile density
- Uniform-in-bandwidth consistency for kernel-type estimators of Shannon's entropy
- Uniform in bandwidth consistency of kernel-type function estimators
- Information Theory and Statistical Mechanics
- On the dimension and entropy of probability distributions
- Testing Exponentiality Based on Type II Censored Data and a New cdf Estimator
- A nonparametric estimation of the entropy for absolutely continuous distributions (Corresp.)
- On the entropy of continuous probability distributions (Corresp.)
- Nonparametric Statistical Data Modeling
- Entropy estimators‐improvements and comparisons
- A new estimator of entropy
- Correcting moments for goodness of fit tests based on two entropy estimates
- Monte Carlo comparison of four normality tests using different entropy estimates
- Elements of Information Theory
- Robust Statistics