An effective strategy for initializing the EM algorithm in finite mixture models
From MaRDI portal
Publication:2418282
DOI: 10.1007/s11634-016-0264-8 · zbMath: 1414.62256 · OpenAlex: W2478992108 · MaRDI QID: Q2418282
Semhar Michael, Volodymyr Melnykov
Publication date: 3 June 2019
Published in: Advances in Data Analysis and Classification. ADAC
Full work available at URL: https://doi.org/10.1007/s11634-016-0264-8
Related Items (3)
- Finite mixture of regression models for a stratified sample
- Mixture modeling of data with multiple partial right-censoring levels
- Matrix normal cluster-weighted models
Uses Software
Cites Work
- Initializing the EM algorithm in Gaussian mixture models with an unknown number of components
- Methods for merging Gaussian mixture components
- Choosing starting values for the EM algorithm for getting the highest likelihood in multivariate Gaussian mixture models
- Bayesian model averaging: a tutorial (with comments and a rejoinder)
- Model-based clustering of high-dimensional data: a review
- Semi-supervised model-based clustering with positive and negative constraints
- Studying Complexity of Model-based Clustering
- Finding Groups in Data
- Algorithms for Model-Based Gaussian Hierarchical Clustering
- How Many Clusters? Which Clustering Method? Answers Via Model-Based Cluster Analysis
- Model Selection and Accounting for Model Uncertainty in Graphical Models Using Occam's Window
- The multivariate skew-normal distribution
- Model-Based Clustering, Discriminant Analysis, and Density Estimation
- Finite mixture models
This page was built for publication: An effective strategy for initializing the EM algorithm in finite mixture models