Testing for Homogeneity in Mixture Using Weighted Relative Entropy
Publication: 3543739
DOI: 10.1080/03610910802305009
zbMath: 1153.62012
OpenAlex: W1964884967
MaRDI QID: Q3543739
Publication date: 4 December 2008
Published in: Communications in Statistics - Simulation and Computation
Full work available at URL: https://doi.org/10.1080/03610910802305009
Keywords: mixture; test for homogeneity; maximum-likelihood estimate; Kullback-Leibler (K-L) information; WE (weighted relative entropy) test
MSC classifications: Parametric hypothesis testing (62F03); Statistical aspects of information-theoretic topics (62B10); Asymptotic properties of parametric tests (62F05)
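The record above only names the method: a weighted relative entropy (WE) test for homogeneity in a mixture, built on Kullback-Leibler information and maximum-likelihood estimates. The paper's actual statistic is not reproduced here, so the following Python sketch is only a loose, hypothetical illustration of the underlying idea: it compares a single-normal fit against a two-component normal mixture fit through twice the log-likelihood ratio, which estimates the sample K-L information gained by allowing two components. All function names, starting values, and the chosen mixture family are assumptions for illustration, not the authors' WE test.

import numpy as np
from scipy import stats
from scipy.optimize import minimize

# Hypothetical illustration only: a K-L-information-style homogeneity check
# for a two-component normal mixture (not the paper's WE statistic).

def neg_loglik_mixture(params, x):
    # Negative log-likelihood of a two-component normal mixture with common scale.
    p = 1.0 / (1.0 + np.exp(-params[0]))      # mixing weight kept in (0, 1)
    mu1, mu2 = params[1], params[2]
    sigma = np.exp(params[3])                  # scale kept positive
    dens = p * stats.norm.pdf(x, mu1, sigma) + (1.0 - p) * stats.norm.pdf(x, mu2, sigma)
    return -np.sum(np.log(dens + 1e-300))

def homogeneity_statistic(x):
    # 2 * (mixture log-lik - single-normal log-lik): twice the estimated
    # sample K-L information gained by allowing a second component.
    loglik0 = np.sum(stats.norm.logpdf(x, np.mean(x), np.std(x)))   # homogeneous MLE fit
    start = np.array([0.0, np.quantile(x, 0.25), np.quantile(x, 0.75), np.log(np.std(x))])
    res = minimize(neg_loglik_mixture, start, args=(x,), method="Nelder-Mead")
    loglik1 = -res.fun
    return 2.0 * (loglik1 - loglik0)

rng = np.random.default_rng(0)
x = rng.normal(size=200)            # data generated under homogeneity
print(homogeneity_statistic(x))     # small values are consistent with a single component

Under homogeneity such likelihood-ratio-type statistics have non-standard limiting behaviour (the mixture parameters are not identifiable under the null), which is precisely the setting the cited works on modified and asymptotic likelihood ratio tests address.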
Related Items (2)
- Distributions of the Kullback-Leibler divergence with applications
- Improved inequalities for the Poisson and binomial distribution and upper tail quantile functions
Cites Work
- Multi-sample cluster analysis using Akaike's information criterion
- Estimating the dimension of a model
- Testing for the number of components in a mixture of normal distributions using moment estimators
- Asymptotic theory of the likelihood ratio test for the identification of a mixture
- Testing homogeneity in discrete mixtures
- A Modified Likelihood Ratio Test for Homogeneity in Finite Mixture Models
- How Many Clusters? Which Clustering Method? Answers Via Model-Based Cluster Analysis
- An Application of the Laplace Method to Finite Mixture Distributions
- Practical Bayesian Density Estimation Using Mixtures of Normals
- Theory & Methods: Testing for Homogeneity in an Exponential Mixture Model
- Testing Homogeneity in a Mixture Distribution via the L2 Distance Between Competing Models
- On Information and Sufficiency
- Likelihood ratio test for univariate Gaussian mixture
- A new look at the statistical model identification