Hypothesis Testing for Mixture Model Selection
From MaRDI portal
Publication:5222517
DOI: 10.1080/00949655.2015.1131282
OpenAlex: W2261930756
MaRDI QID: Q5222517
Antonio Punzo, Paul D. McNicholas, Ryan P. Browne
Publication date: 1 April 2020
Published in: Journal of Statistical Computation and Simulation
Full work available at URL: https://doi.org/10.1080/00949655.2015.1131282
Related Items (10)
- Multivariate response and parsimony for Gaussian cluster-weighted models
- Fitting insurance and economic data with outliers: a flexible approach based on finite mixtures of contaminated gamma distributions
- Model-based clustering via new parsimonious mixtures of heavy-tailed distributions
- Model-based time-varying clustering of multivariate longitudinal data with covariates and outliers
- Robust clustering in regression analysis via the contaminated Gaussian cluster-weighted model
- Testing equality of standardized generalized variances of \(k\) multivariate normal populations with arbitrary dimensions
- Asymmetric clusters and outliers: mixtures of multivariate contaminated shifted asymmetric Laplace distributions
- Unconstrained representation of orthogonal matrices with application to common principal components
- Mixtures of multivariate contaminated normal regression models
- Seeking outlying subsets under star-contoured errors
Uses Software
Cites Work
- Estimating common principal components in high dimensions
- Serial and parallel implementations of model-based clustering via parsimonious Gaussian mixture models
- Model-based clustering, classification, and discriminant analysis via mixtures of multivariate \(t\)-distributions
- Assessing the pattern of covariance matrices via an augmentation multiple testing procedure
- Orthogonal Stiefel manifold optimization for eigen-decomposed covariance parameter estimation in mixture models
- Choosing starting values for the EM algorithm for getting the highest likelihood in multivariate Gaussian mixture models
- Choosing initial values for the EM algorithm for finite mixtures
- Constrained monotone EM algorithms for finite mixture of multivariate Gaussians
- Model selection and Akaike's information criterion (AIC): The general theory and its analytical extensions
- Estimating the dimension of a model
- The distribution of the likelihood ratio for mixtures of densities from the one-parameter exponential family
- The model selection criterion AICu.
- Degeneracy in the maximum likelihood estimation of univariate Gaussian mixtures with EM.
- Model-based clustering via linear cluster-weighted models
- Clustering and classification via cluster-weighted factor analyzers
- Finite mixtures of unimodal beta and gamma densities and the \(k\)-bumps algorithm
- A likelihood-based constrained algorithm for multivariate normal mixture models
- Parsimonious mixtures of multivariate contaminated normal distributions
- The EM Algorithm and Extensions, 2E
- Regression and time series model selection in small samples
- Numerical Optimization
- Model-Based Gaussian and Non-Gaussian Clustering
- Model-Based Clustering, Discriminant Analysis, and Density Estimation
- An Algorithm for Simultaneous Orthogonal Transformation of Several Positive Definite Symmetric Matrices to Nearly Diagonal Form
- Advantages of the Closed Testing Method in Multiple Comparisons Procedures
- On the Spectral Decomposition in Normal Discriminant Analysis
- Identifiability of Finite Mixtures
- On the Identifiability of Finite Mixtures
- Closed Likelihood Ratio Testing Procedures to Assess Similarity of Covariance Matrices