A new estimator of Kullback–Leibler information and its application in goodness of fit tests
From MaRDI portal
Publication:5107434
DOI: 10.1080/00949655.2019.1602870
OpenAlex: W2937021844
MaRDI QID: Q5107434
Publication date: 27 April 2020
Published in: Journal of Statistical Computation and Simulation
Full work available at URL: https://doi.org/10.1080/00949655.2019.1602870
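The entry carries no abstract, but the cited works below center on spacing-based entropy estimation and entropy/KL-based goodness-of-fit testing. As a point of reference only, here is a minimal Python sketch of the classical Vasicek estimator (the "Two measures of sample entropy" item in the citation list) — this is the standard building block for such tests, not the paper's new estimator; the function name and window parameter `m` are illustrative choices.

```python
import math

def vasicek_entropy(sample, m):
    """Classical spacing-based entropy estimator (Vasicek, 1976).

    H_mn = (1/n) * sum_{i=1..n} log( n/(2m) * (x_(i+m) - x_(i-m)) ),
    with order statistics clamped at the boundaries:
    x_(j) = x_(1) for j < 1 and x_(j) = x_(n) for j > n.

    Assumes no ties in the sample (a zero spacing would make the log undefined).
    """
    x = sorted(sample)
    n = len(x)
    total = 0.0
    for i in range(n):
        upper = x[min(i + m, n - 1)]   # x_(i+m), clamped to x_(n)
        lower = x[max(i - m, 0)]       # x_(i-m), clamped to x_(1)
        total += math.log(n * (upper - lower) / (2 * m))
    return total / n
```

Entropy-based goodness-of-fit tests of the kind surveyed in the citation list typically compare such an estimate against the maximum-entropy value attained under the null distribution (e.g. log(σ√(2πe)) for the normal family); the estimator is consistent as n → ∞ with m/n → 0, though negatively biased in finite samples.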
Related Items
- A comprehensive empirical power comparison of univariate goodness-of-fit tests for the Laplace distribution
- On differential Renyi's–Tsallis divergence measure and its applications
- A new goodness-of-fit test for the logistic distribution
- An estimation of Phi divergence and its application in testing normality
Cites Work
- A Mathematical Theory of Communication
- On entropy-based goodness-of-fit tests
- On the estimation of entropy
- Two measures of sample entropy
- Modified goodness-of-fit tests for the inverse Gaussian distribution
- Limit theorems for nonparametric sample entropy estimators
- An entropy characterization of the inverse Gaussian distribution and related goodness-of-fit test
- A maximum entropy type test of fit
- Goodness-of-fit tests for the inverse Gaussian distribution based on the empirical Laplace transform
- A new estimator of entropy and its application in testing normality
- Testing exponentiality using transformed data
- Monte Carlo comparison of seven normality tests
- Models of Statistic Distributions of Nonparametric Goodness-of-Fit Tests in Composite Hypotheses Testing for Double Exponential Law Cases
- Testing Goodness-of-Fit for Exponential Distribution Based on Cumulative Residual Entropy
- A location- and scale-free goodness-of-fit statistic for the exponential distribution based on maximum correlations
- Statistic Distribution Models for Some Nonparametric Goodness-of-Fit Tests in Testing Composite Hypotheses
- Entropy-Based Tests of Uniformity
- Quadratic statistics for the goodness-of-fit test of the inverse Gaussian distribution
- Goodness of fit for the inverse Gaussian distribution
- The Inverse Gaussian Distribution as a Lifetime Model
- Entropy estimators - improvements and comparisons
- A new estimator of entropy
- Tests of Fit for the Laplace Distribution, with Applications
- Monte Carlo comparison of four normality tests using different entropy estimates
- Goodness-of-fit tests for progressively Type-II censored data from location–scale distributions
- General treatment of goodness-of-fit tests based on Kullback–Leibler information
- Monte Carlo comparison of five exponentiality tests using different entropy estimates
- Testing goodness-of-fit for Laplace distribution based on maximum entropy
- Approximate Fiducial Bounds on Reliability for the Two Parameter Negative Exponential Distribution
- Interval Estimation for the Two-Parameter Double Exponential Distribution