Sparse Partially Collapsed MCMC for Parallel Inference in Topic Models
Publication: 3391126
DOI: 10.1080/10618600.2017.1366913 · OpenAlex: W2533991995 · MaRDI QID: Q3391126
Mattias Villani, David Broman, Måns Magnusson, Leif Jonsson
Publication date: 28 March 2022
Published in: Journal of Computational and Graphical Statistics
Full work available at URL: https://arxiv.org/abs/1506.03784
Keywords: computational complexity, parallel computing, Bayesian inference, Gibbs sampling, massive datasets, latent Dirichlet allocation
Related Items (3)
- DOLDA: a regularized supervised topic model for high-dimensional multi-class regression
- Learning Topic Models: Identifiability and Finite-Sample Analysis
- GPU-accelerated Gibbs sampling: a case study of the horseshoe probit model
Uses Software
Cites Work
- First hitting time analysis of the independence Metropolis sampler
- Regression density estimation using smooth adaptive Gaussian mixtures
- Dirichlet and Related Distributions
- An Efficient Method for Generating Discrete Random Variables with General Distributions
- Partially Collapsed Gibbs Samplers
- The Collapsed Gibbs Sampler in Bayesian Computations with Applications to a Gene Regulation Problem
- A simple method for generating gamma variables
- Latent Dirichlet allocation (DOI: 10.1162/jmlr.2003.3.4-5.993)
- Gibbs Sampling Methods for Stick-Breaking Priors