Regularized Estimation and Feature Selection in Mixtures of Gaussian-Gated Experts Models
From MaRDI portal
Publication:3305484
DOI: 10.1007/978-981-15-1960-4_3
zbMath: 1445.62325
arXiv: 1909.05494
OpenAlex: W3007594395
MaRDI QID: Q3305484
Faicel Chamroukhi, Florian Lecocq, Hien Duy Nguyen
Publication date: 7 August 2020
Published in: Communications in Computer and Information Science
Full work available at URL: https://arxiv.org/abs/1909.05494
Ridge regression; shrinkage estimators (Lasso) (62J07)
Classification and discrimination; cluster analysis (statistical aspects) (62H30)
Statistical aspects of big data and data science (62R07)
Cites Work
- Time series modeling by a regression approach based on a latent process
- Rejoinder to the comments on: \(\ell_{1}\)-penalization for mixture regression models
- Laplace mixture of linear experts
- Robust mixture of experts modeling using the \(t\) distribution
- Pathwise coordinate optimization
- Coordinate descent algorithms for lasso penalized regression
- New estimation and feature selection methods in mixture-of-experts models
- The EM Algorithm and Extensions, 2E