A one-bit, comparison-based gradient estimator
From MaRDI portal
Publication: 2155805
DOI: 10.1016/j.acha.2022.03.003
OpenAlex: W3122422392
MaRDI QID: Q2155805
Wotao Yin, HanQin Cai, Daniel McKenzie, Zhen-Liang Zhang
Publication date: 15 July 2022
Published in: Applied and Computational Harmonic Analysis
Full work available at URL: https://arxiv.org/abs/2010.02479
Keywords: reinforcement learning, hyperparameter tuning, zeroth-order optimization, one-bit compressed sensing, comparison-based optimization, normalized gradient descent
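The keywords above describe the paper's setting: estimating a gradient direction when the only feedback available is a one-bit comparison oracle that reports which of two nearby points has the smaller function value. A minimal sketch of this idea (not the authors' exact algorithm; the quadratic objective `f`, the query budget, and the step size are illustrative assumptions) averages random probe directions weighted by the oracle's sign, then takes normalized gradient descent steps:

```python
import numpy as np

# Hypothetical smooth objective for illustration; any smooth f works here.
def f(x):
    return np.sum(x ** 2)

# One-bit comparison oracle: only reveals sign(f(x + delta*u) - f(x)),
# i.e. whether a small step along u increases or decreases f.
def comparison_oracle(x, u, delta=1e-3):
    return np.sign(f(x + delta * u) - f(x))

def one_bit_gradient_estimate(x, num_queries=200, delta=1e-3, rng=None):
    """Estimate the normalized gradient direction of f at x using only
    one-bit comparisons along random unit directions. For small delta,
    sign(f(x + delta*u) - f(x)) ~ sign(grad_f(x) . u), and averaging
    sign * u over random directions points along grad_f(x)."""
    rng = np.random.default_rng() if rng is None else rng
    d = x.size
    g = np.zeros(d)
    for _ in range(num_queries):
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)
        g += comparison_oracle(x, u, delta) * u
    norm = np.linalg.norm(g)
    return g / norm if norm > 0 else g

# Normalized gradient descent driven only by one-bit comparisons.
rng = np.random.default_rng(0)
x = np.array([1.0, -2.0, 0.5])
for _ in range(50):
    x -= 0.1 * one_bit_gradient_estimate(x, rng=rng)
```

Because the oracle returns only a sign, the magnitude of the gradient is unrecoverable, which is why the update uses a normalized (direction-only) step, matching the "normalized gradient descent" keyword.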
Cites Work
- CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- Parallel distributed block coordinate descent methods based on pairwise comparison oracle
- Preference-based reinforcement learning: a formal framework and a policy iteration algorithm
- Restricted strong convexity and its applications to convergence analysis of gradient-type methods in convex optimization
- Adaptive stochastic approximation by the simultaneous perturbation method
- Bayesian Optimization in a Billion Dimensions via Random Embeddings
- Active Subspaces
- Robust 1-bit Compressed Sensing and Sparse Logistic Regression: A Convex Programming Approach
- Minimax Number of Strata for Online Stratified Sampling Given Noisy Samples
- Zeroth-Order Regularized Optimization (ZORO): Approximately Sparse Gradients and Adaptive Sampling
- A dimensionality reduction technique for unconstrained global optimization of functions with low effective dimensionality
- Derivative-free optimization methods