Fast yet Simple Natural-Gradient Descent for Variational Inference in Complex Models
Publication: 6304127
arXiv: 1807.04489
MaRDI QID: Q6304127
Authors: Mohammad Emtiyaz Khan, Didrik Nielsen
Publication date: 12 July 2018
Abstract: Bayesian inference plays an important role in advancing machine learning, but faces computational challenges when applied to complex models such as deep neural networks. Variational inference circumvents these challenges by formulating Bayesian inference as an optimization problem and solving it using gradient-based optimization. In this paper, we argue in favor of natural-gradient approaches, which, unlike their gradient-based counterparts, can improve convergence by exploiting the information geometry of the solutions. We show how to derive fast yet simple natural-gradient updates by using a duality associated with exponential-family distributions. An attractive feature of these methods is that, by using natural gradients, they are able to extract accurate local approximations for individual model components. We summarize recent results for Bayesian deep learning showing the superiority of natural-gradient approaches over their gradient-based counterparts.
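The duality mentioned in the abstract can be stated, as a sketch in conventional exponential-family notation (not necessarily the paper's own symbols), as follows. For an approximation $q_\lambda(z) = h(z)\exp\{\langle\lambda,\phi(z)\rangle - A(\lambda)\}$ with expectation parameters $m = \mathbb{E}_{q_\lambda}[\phi(z)] = \nabla_\lambda A(\lambda)$ and Fisher information $F(\lambda) = \nabla_\lambda^2 A(\lambda)$, the chain rule gives $\nabla_\lambda \mathcal{L} = F(\lambda)\,\nabla_m \mathcal{L}$ for the variational objective $\mathcal{L}$, so the natural gradient reduces to an ordinary gradient in the dual (expectation-parameter) coordinates:
$$\widetilde{\nabla}_\lambda \mathcal{L} \;=\; F(\lambda)^{-1}\nabla_\lambda \mathcal{L} \;=\; \nabla_m \mathcal{L}, \qquad \lambda_{t+1} \;=\; \lambda_t + \rho_t\,\nabla_m \mathcal{L}\big|_{m=m_t}.$$
As a purely illustrative numerical sketch of this update (the toy model, data, and helper names below are assumptions made for this page, not taken from the paper), consider a conjugate Gaussian model with a Gaussian approximation q(z) = N(mu, s2); with step size 1 the update recovers the exact posterior in a single step, and with a smaller step size it converges geometrically.

import numpy as np

# Toy conjugate model (an assumption for illustration, not from the paper):
#   prior       p(z)     = N(0, 1)
#   likelihood  p(x_i|z) = N(x_i | z, 1)
# Variational family q(z) = N(mu, s2) with
#   natural parameters      lam = (mu / s2, -1 / (2 * s2))
#   expectation parameters  m   = (mu, mu**2 + s2)

rng = np.random.default_rng(0)
x = rng.normal(1.5, 1.0, size=20)   # toy data
N = x.size

def elbo_grads(mu, s2):
    # Analytic ELBO gradients w.r.t. (mu, s2) for this conjugate model.
    d_mu = np.sum(x - mu) - mu
    d_s2 = 0.5 * (1.0 / s2 - N - 1.0)
    return d_mu, d_s2

def mean_param_grads(mu, s2):
    # Chain rule to the expectation parameters m1 = mu, m2 = mu**2 + s2.
    d_mu, d_s2 = elbo_grads(mu, s2)
    return d_mu - 2.0 * mu * d_s2, d_s2

# Natural-gradient ascent: lam <- lam + rho * grad_m ELBO
mu, s2, rho = 0.0, 1.0, 0.5
for _ in range(50):
    g1, g2 = mean_param_grads(mu, s2)
    lam1, lam2 = mu / s2 + rho * g1, -0.5 / s2 + rho * g2
    s2 = -0.5 / lam2                 # map natural parameters back to (mu, s2)
    mu = lam1 * s2

print(mu, s2)   # ~ exact posterior: mean sum(x)/(N+1), variance 1/(N+1)

In this conjugate setting the expectation-parameter gradient is available in closed form, which is what makes the natural-gradient step as cheap as an ordinary gradient step.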
Has companion code repository: https://github.com/ssggreg/active_learning