Optimistic optimisation of composite objective with exponentiated update
From MaRDI portal
Publication: 6097136
DOI: 10.1007/s10994-022-06229-1
arXiv: 2208.04065
OpenAlex: W4292663683
MaRDI QID: Q6097136
Weijia Shao, Şahin Albayrak, Fikret Sivrikaya
Publication date: 12 June 2023
Published in: Machine Learning
Full work available at URL: https://arxiv.org/abs/2208.04065
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A generalized online mirror descent with applications to classification and regression
- Exponentiated gradient versus gradient descent for linear predictors
- The robustness of the \(p\)-norm algorithms
- Introductory lectures on convex optimization. A basic course.
- Scale-free online learning
- First-order and stochastic optimization methods for machine learning
- Optimal Rates for Zero-Order Convex Optimization: The Power of Two Function Evaluations
- Convexity and Optimization in Banach Spaces
- On the Generalization Ability of On-Line Learning Algorithms
- Improved Risk Tail Bounds for On-Line Algorithms
- Smoothed Low Rank and Sparse Matrix Recovery by Iteratively Reweighted Least Squares Minimization
- Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent