General cumulative Kullback–Leibler information
DOI: 10.1080/03610926.2017.1321767
zbMath: 1392.62013
OpenAlex: W2610034597
MaRDI QID: Q4563468
Sangun Park, Ilmun Kim, Hadi Alizadeh Noughabi
Publication date: 1 June 2018
Published in: Communications in Statistics - Theory and Methods
Full work available at URL: https://doi.org/10.1080/03610926.2017.1321767
MSC classifications: Nonparametric hypothesis testing (62G10); Measures of information, entropy (94A17); Statistical aspects of information-theoretic topics (62B10)
Related Items (6)
- Fractional cumulative residual Kullback-Leibler information based on Tsallis entropy
- Cumulative ratio information based on general cumulative entropy
- A novel and effective method for quantifying complexity of nonlinear time series
- On reconsidering entropies and divergences and their cumulative counterparts: Csiszár's, DPD's and Fisher's type cumulative and survival measures
- Fractional cumulative residual entropy
- Cumulative past Fisher information measure and its extensions
Cites Work
- A Mathematical Theory of Communication
- On cumulative residual Kullback-Leibler information
- On cumulative entropies
- Likelihood-ratio tests for normality
- Generalized cumulative residual entropy for distributions with unrestricted supports
- Testing Goodness-of-Fit for Exponential Distribution Based on Cumulative Residual Entropy
- Cumulative Residual Entropy: A New Measure of Information
- Correcting moments for goodness of fit tests based on two entropy estimates
- Powerful Goodness-of-fit Tests Based on the Likelihood Ratio
- Nonparametric Likelihood Confidence Bands for a Distribution Function
- General treatment of goodness-of-fit tests based on Kullback–Leibler information
- An analysis of variance test for normality (complete samples)