Entropy-based test for generalised Gaussian distributions
From MaRDI portal
Publication:2143021
DOI: 10.1016/j.csda.2022.107502
OpenAlex: W3093171338
Wikidata: Q114191827 (Scholia: Q114191827)
MaRDI QID: Q2143021
Dafydd Evans, Mehmet Siddik Cadirci, Vitalii Makogin, Nikolai N. Leonenko
Publication date: 30 May 2022
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://doi.org/10.1016/j.csda.2022.107502
Keywords: maximum entropy principle; Shannon entropy; goodness-of-fit test; generalised Gaussian distribution; nearest neighbour estimator of entropy
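The keywords point to the Kozachenko–Leonenko nearest neighbour estimator of Shannon entropy, which underpins entropy-based goodness-of-fit tests of this kind. A minimal sketch of that classical estimator (not the specific test statistic of this paper) follows; the function name `kl_entropy` and the toy Gaussian sample are illustrative assumptions:

```python
import math
import random

def kl_entropy(points, d):
    """Kozachenko-Leonenko nearest-neighbour estimate of Shannon entropy (nats).

    H_hat = (d/n) * sum_i log(rho_i) + log(V_d) + log(n - 1) + gamma,
    where rho_i is the distance from point i to its nearest neighbour,
    V_d is the volume of the unit d-ball and gamma is Euler's constant.
    """
    n = len(points)
    vd = math.pi ** (d / 2) / math.gamma(d / 2 + 1)  # volume of the unit d-ball
    euler_gamma = 0.5772156649015329                 # Euler-Mascheroni constant
    log_sum = 0.0
    for i, p in enumerate(points):
        # Naive O(n^2) nearest-neighbour search; a k-d tree is used in practice.
        rho = min(math.dist(p, q) for j, q in enumerate(points) if j != i)
        log_sum += math.log(rho)
    return (d / n) * log_sum + math.log(vd) + math.log(n - 1) + euler_gamma

# Toy check on a standard 1-D Gaussian, whose true entropy is
# 0.5 * log(2 * pi * e) ≈ 1.4189 nats.
random.seed(0)
sample = [(random.gauss(0.0, 1.0),) for _ in range(2000)]
est = kl_entropy(sample, d=1)
true_h = 0.5 * math.log(2 * math.pi * math.e)
```

A goodness-of-fit test then compares such an estimate against the maximum entropy attained by the hypothesised family, here the generalised Gaussian.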
Related Items (1)
Cites Work
- Limit theory for point processes in manifolds
- On the Kozachenko-Leonenko entropy estimator
- Correction: A class of Rényi information estimators for multidimensional densities
- A maximum entropy characterization of symmetric Kotz type and Burr multivariate distributions
- Lectures on the nearest neighbor method
- A class of Rényi information estimators for multidimensional densities
- Entropy and the central limit theorem
- Sample estimate of the entropy of a random vector
- Consistency property of elliptical probability density functions
- Penalized principal logistic regression for sparse sufficient dimension reduction
- Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances
- Statistical estimation of the Shannon entropy
- Gaussian and non-Gaussian linear time series and random fields
- Concentration of information content for convex measures
- Rejoinder on: ``Tests for multivariate normality -- a critical review with emphasis on weighted \(L^2\)-statistics''
- Goodness-of-fit test in parametric mixed effects models based on estimation of the error distribution
- Infinite Shannon entropy
- A computationally efficient estimator for mutual information
- Improvement of goodness-of-fit test for normal distribution based on entropy and power comparison
- Moment-Entropy Inequalities for a Random Vector
- Entropy-Based Tests of Uniformity
- A multivariate generalization of the power exponential family of distributions
- Asymptotic moments of near-neighbour distance distributions
- Geometric k-nearest neighbor estimation of entropy and mutual information
- A new class of random vector entropy estimators and its applications in testing statistical hypotheses
- Demystifying fixed \(k\)-nearest neighbor information estimators
- Statistical estimation of conditional Shannon entropy
- Sharp Moment-Entropy Inequalities and Capacity Bounds for Symmetric Log-Concave Distributions
- The Entropy Per Coordinate of a Random Vector is Highly Constrained Under Convexity Conditions
- An analysis of variance test for normality (complete samples)
- Affine Moments of a Random Vector
- On Communication of Analog Data from a Bounded Source Space
- Nonparametric independence testing via mutual information