Tracking and Regret Bounds for Online Zeroth-Order Euclidean and Riemannian Optimization
DOI: 10.1137/21M1405551
Wikidata: Q114074079 (Scholia: Q114074079)
MaRDI QID: Q5072586
Iman Shames, Jonathan H. Manton, Alejandro I. Maass, Dragan Nešić, Chris Manzie
Publication date: 29 April 2022
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2010.00211
- Computational learning theory (68Q32)
- Convex programming (90C25)
- Derivative-free methods and methods using generalized derivatives (90C56)
- Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- An optimal approach to collaborative target tracking with performance guarantees
- Riemannian geometry for the statistical analysis of diffusion tensor data
- Direct search methods on reductive homogeneous spaces
- Computing the Karcher mean of symmetric positive definite matrices
- From Bayesian inference to MCMC and convex optimisation in Hadamard manifolds
- Combining Bayesian optimization and Lipschitz optimization
- Random gradient-free minimization of convex functions
- A Riemannian framework for tensor computing
- Non-Stationary Stochastic Optimization
- Low-Rank Matrix Completion by Riemannian Optimization
- Convex analysis and optimization in Hadamard spaces
- Manopt, a Matlab toolbox for optimization on manifolds
- Complete Dictionary Recovery Over the Sphere II: Recovery by Riemannian Trust-Region Method
- Online Learning and Online Convex Optimization
- Volume Growth and Escape Rate of Brownian Motion on a Cartan–Hadamard Manifold
- A Panoramic View of Riemannian Geometry
- Online Learning With Inexact Proximal Online Gradient Descent Algorithms
- Derivative-Free Methods for Policy Optimization: Guarantees for Linear Quadratic Systems
- Weakly Convex Optimization over Stiefel Manifold Using Riemannian Subgradient-Type Methods
- Global rates of convergence for nonconvex optimization on manifolds
- Tuning of multivariable model predictive controllers through expert bandit feedback
- Proximal Gradient Method for Nonsmooth Optimization over the Stiefel Manifold
- The Extrinsic Geometry of Dynamical Systems Tracking Nonlinear Matrix Projections
- Riemannian Gaussian Distributions on the Space of Symmetric Positive Definite Matrices
- An efficient method for finding the minimum of a function of several variables without calculating derivatives
- Stochastic Gradient Descent on Riemannian Manifolds
- Optimization algorithms exploiting unitary constraints
- Regret Analysis of Stochastic and Nonstochastic Multi-armed Bandit Problems
- Geometric Means in a Novel Vector Space Structure on Symmetric Positive-Definite Matrices
- Manifolds of Negative Curvature