Stochastic complexities of general mixture models in variational Bayesian learning
From MaRDI portal
Publication: 872400
DOI: 10.1016/j.neunet.2006.05.030
zbMath: 1112.68111
OpenAlex: W2030085689
Wikidata: Q51935534
Scholia: Q51935534
MaRDI QID: Q872400
Sumio Watanabe, Kazuho Watanabe
Publication date: 27 March 2007
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2006.05.030
Keywords: free energy; Bayesian learning; exponential family; mixture model; variational Bayes; stochastic complexity; Kullback information; non-regular model
Related Items (4)
- Divergence measures and a general framework for local variational approximation
- An alternative view of variational Bayes and asymptotic approximations of free energy
- Comparing two Bayes methods based on the free energy functions in Bernoulli mixtures
- Stochastic complexity for mixture of exponential families in generalized variational Bayes
Cites Work
- Stochastic complexity and modeling
- Estimating the dimension of a model
- Singularities in mixture models and upper bounds of stochastic complexity
- Undamped oscillation of the sample autocovariance function and the effect of prewhitening operation
- Algebraic Analysis for Nonidentifiable Learning Machines
- Online Model Selection Based on the Variational Bayes
- Testing Homogeneity in Gamma Mixture Models
- Algorithmic Learning Theory
- On some inequalities for the gamma and psi functions