Bayesian inference for infinite asymmetric Gaussian mixture with feature selection
From MaRDI portal
Publication: 2099966
DOI: 10.1007/s00500-021-05598-4
zbMath: 1498.62125
OpenAlex: W3127555970
MaRDI QID: Q2099966
Samr Ali, Nizar Bouguila, Ziyang Song
Publication date: 21 November 2022
Published in: Soft Computing
Full work available at URL: https://doi.org/10.1007/s00500-021-05598-4
Keywords: Gibbs sampling; MCMC; Metropolis-Hastings; feature selection; background subtraction; infinite asymmetric Gaussian mixture model
MSC classification: Classification and discrimination; cluster analysis (statistical aspects) (62H30); Bayesian inference (62F15); Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- Unnamed Item
- Unnamed Item
- Variable selection for model-based clustering using the integrated complete-data likelihood
- Bayesian learning of finite generalized Gaussian mixture models on images
- Mixtures of Dirichlet processes with applications to Bayesian nonparametric problems
- Modelling the role of variables in model-based cluster analysis
- Dynamic textures
- Discriminative variable selection for clustering with the sparse Fisher-EM algorithm
- Variable Selection for Model-Based High-Dimensional Clustering and Its Application to Microarray Data
- Modeling the shape of the scene: A holistic representation of the spatial envelope
- Variational inference for Dirichlet process mixtures