Interpreting Kullback--Leibler divergence with the Neyman-Pearson Lemma
From MaRDI portal
DOI: 10.1016/j.jmva.2006.03.007
zbMath: 1101.62004
OpenAlex: W2152617511
Wikidata: Q125035877 (Scholia: Q125035877)
MaRDI QID: Q855917
Publication date: 7 December 2006
Published in: Journal of Multivariate Analysis
Full work available at URL: https://doi.org/10.1016/j.jmva.2006.03.007
Keywords: discriminant analysis; maximum likelihood; information geometry; ROC curve; exponential connection; testing hypotheses; mixture connection
Related Items
- Information criteria in classification: new divergence-based classifiers
- The gamma generalized normal distribution: a descriptor of SAR imagery
- Information theoretic novelty detection
- The essential dependence for a group of random vectors
- On shape properties of the receiver operating characteristic curve
- Entropic approach to multiscale clustering analysis
- On the issue of convergence of certain divergence measures related to finding most nearly compatible probability distribution under the discrete set-up
- Application of iterated Bernstein operators to distribution function and density approximation
- Symmetry of receiver operating characteristic curves and Kullback-Leibler divergences between the signal and noise populations
- Direct Learning of Sparse Changes in Markov Networks by Density Ratio Estimation
- Density estimation via the random forest method
- The contrast features selection with empirical data
- A method for quantitative fault diagnosability analysis of stochastic linear descriptor models
- On modeling count data: a comparison of some well-known discrete distributions
- Forecaster's dilemma: extreme events and forecast evaluation
- Genome Scanning Tests for Comparing Amino Acid Sequences Between Groups
- Distance-based tests for planar shape
- Influential observation in complex normal data for problems in allometry
- Robust image watermarking using non-regular wavelets
- Sparse Cholesky Factorization by Kullback--Leibler Minimization
Cites Work
- Second order efficiency of minimum contrast estimators in a curved exponential family
- Geometry of minimum contrast
- Differential-geometrical methods in statistics
- A class of logistic-type discriminant functions
- Information Geometry of U-Boost and Bregman Divergence
- Note on the Consistency of the Maximum Likelihood Estimate
- On Information and Sufficiency