Discriminative Bayesian filtering lends momentum to the stochastic Newton method for minimizing log-convex functions
From MaRDI portal
Publication:2693789
DOI: 10.1007/s11590-022-01895-5
OpenAlex: W4287200304
MaRDI QID: Q2693789
Publication date: 24 March 2023
Published in: Optimization Letters
Full work available at URL: https://arxiv.org/abs/2104.12949
Keywords: sequential Bayesian inference; discriminative Bayesian filtering; momentum in optimization; stochastic Newton method
MSC: Inference from stochastic processes and prediction (62M20); Convex programming (90C25); Newton-type methods (49M15); Stochastic programming (90C15)
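The keywords above name the ingredients of the publication: a stochastic (subsampled) Newton method augmented with a momentum term. As an illustrative sketch only, not the paper's actual algorithm, the following minimal example combines a minibatch Newton step with heavy-ball momentum on a synthetic least-squares problem (all names, the batch size, and the momentum coefficient `beta` are hypothetical choices for this demonstration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic, noiseless least-squares problem (illustrative stand-in)
n, d = 500, 5
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true


def stochastic_newton_momentum(A, b, iters=100, batch=50, beta=0.5, damping=1e-6):
    """Subsampled Newton iteration with a heavy-ball momentum term."""
    n, d = A.shape
    x = np.zeros(d)
    x_prev = x.copy()
    for _ in range(iters):
        idx = rng.choice(n, size=batch, replace=False)
        Ai, bi = A[idx], b[idx]
        g = Ai.T @ (Ai @ x - bi) / batch              # minibatch gradient
        H = Ai.T @ Ai / batch + damping * np.eye(d)   # damped minibatch Hessian
        step = np.linalg.solve(H, g)                  # Newton direction
        x_new = x - step + beta * (x - x_prev)        # Newton step + momentum
        x_prev, x = x, x_new
    return x


x_hat = stochastic_newton_momentum(A, b)
print(np.linalg.norm(x_hat - x_true))
```

With exact per-batch Newton steps the error obeys the linear recurrence e_{k+1} = beta (e_k - e_{k-1}), whose roots have modulus sqrt(beta) < 1, so the iterates contract toward the solution despite the momentum overshoot.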
Cites Work
- Stochastic global optimization as a filtering problem
- Incremental proximal methods for large scale convex optimization
- On stochastic approximation of the eigenvectors and eigenvalues of the expectation of a random matrix
- A simplified neuron model as a principal component analyzer
- A survey of truncated-Newton methods
- Sub-sampled Newton methods
- Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods
- Minimization of functions having Lipschitz continuous first partial derivatives
- Adaptive stochastic approximation by the simultaneous perturbation method
- Hybrid Deterministic-Stochastic Methods for Data Fitting
- Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence
- On the Use of Stochastic Hessian Information in Optimization Methods for Machine Learning
- Large-Scale Machine Learning with Stochastic Gradient Descent
- Acceleration of Stochastic Approximation by Averaging
- Gaussian filters for nonlinear filtering problems
- Adaptive Sampling Strategies for Stochastic Optimization
- A Mean-Field Optimal Control Formulation for Global Optimization
- Probabilistic Line Searches for Stochastic Optimization
- Optimization Methods for Large-Scale Machine Learning
- 10.1162/jmlr.2003.3.4-5.993
- Incremental Least Squares Methods and the Extended Kalman Filter
- The Discriminative Kalman Filter for Bayesian Filtering with Nonlinear and Non-Gaussian Observation Models
- An investigation of Newton-Sketch and subsampled Newton methods
- Robust Closed-Loop Control of a Cursor in a Person with Tetraplegia using Gaussian Process Regression
- A Stochastic Line Search Method with Expected Complexity Analysis
- Elements of Information Theory
- Some methods of speeding up the convergence of iteration methods
- Monte Carlo techniques to estimate the conditional expectation in multi-stage non-linear filtering
- A Stochastic Approximation Method
- Exact and inexact subsampled Newton methods for optimization