Recursive aggregation of estimators by the mirror descent algorithm with averaging
DOI: 10.1007/s11122-006-0005-2 · zbMath: 1123.62044 · arXiv: math/0505333 · OpenAlex: W2166365115 · MaRDI QID: Q2432961
Authors: Alexandre B. Tsybakov, Nicolas Vayatis, Anatoli B. Juditsky, Alexander Nazin
Publication date: 26 October 2006
Published in: Problems of Information Transmission
Full work available at URL: https://arxiv.org/abs/math/0505333
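For orientation, here is a minimal sketch of the kind of procedure the title describes: entropic mirror descent over the simplex of mixture weights, with averaging of the iterates. Everything below (the expert family, the squared loss, the temperature `beta`, and the t^{-1/2} step sizes) is an illustrative assumption for the demo, not the authors' exact scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: aggregate M fixed "experts" f_1..f_M for a regression
# target under squared loss, one noisy observation per step.
# All names and parameter choices here are illustrative assumptions.
M = 5
experts = [lambda x, k=k: np.sin((k + 1) * x) for k in range(M)]
target = lambda x: np.sin(3.0 * x)  # expert k=2 matches the target

def entropic_mirror_descent(n_steps=5000, beta=1.0):
    z = np.zeros(M)          # accumulated dual variable (sum of gradients)
    theta_bar = np.zeros(M)  # running average of the primal iterates
    for t in range(1, n_steps + 1):
        # entropic mirror map: dual variable -> point on the simplex
        w = np.exp(-(z - z.min()) / beta)  # shift by min for stability
        theta = w / w.sum()
        # observe one noisy sample of the regression target
        x = rng.uniform(-np.pi, np.pi)
        y = target(x) + 0.1 * rng.standard_normal()
        preds = np.array([f(x) for f in experts])
        # stochastic gradient of (theta . preds - y)^2 w.r.t. theta
        g = 2.0 * (theta @ preds - y) * preds
        z += g / np.sqrt(t)                # dual step ~ t^{-1/2} (assumed)
        theta_bar += (theta - theta_bar) / t  # iterate averaging
    return theta_bar

print(entropic_mirror_descent())  # weights should concentrate on expert 2
```

Returning the averaged weights `theta_bar` rather than the last iterate is the "with averaging" part of the title; aggregation-type risk bounds in this literature typically concern such averaged mixtures.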
Mathematics Subject Classification:
- Nonparametric regression and quantile regression (62G08)
- Asymptotic properties of nonparametric inference (62G20)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Nonparametric inference (62G99)
Related Items
- Algorithms of inertial mirror descent in convex problems of stochastic optimization
- Some multivariate risk indicators: Minimization by using a Kiefer–Wolfowitz approach to the mirror stochastic algorithm
- Unifying mirror descent and dual averaging
- A mirror descent algorithm for minimization of mean Poisson flow driven losses
- An adaptive multiclass nearest neighbor classifier
- Aggregation of estimators and stochastic optimization
- First-order methods for convex optimization
- An optimal method for stochastic composite optimization
- General oracle inequalities for model selection
- Noisy independent factor analysis model for density estimation and classification
- Mirror averaging with sparsity priors
- Aggregation for Gaussian regression
- Simultaneous adaptation to the margin and to complexity in classification
- On variance reduction for stochastic smooth convex optimization with multiplicative noise
- Saddle point mirror descent algorithm for the robust PageRank problem
- Learning by mirror averaging
- Prediction of time series by statistical learning: general losses and fast rates
- Stochastic Quasi-Newton Methods for Nonconvex Stochastic Optimization
- Randomized algorithm to determine the eigenvector of a stochastic matrix with application to the PageRank problem
- Iterative feature selection in least square regression estimation
- Sparse estimation by exponential weighting
- Generalized mirror averaging and \(D\)-convex aggregation
- On the exponentially weighted aggregate with the Laplace prior
- A MOM-based ensemble method for robustness, subsampling and hyperparameter tuning
- Aggregation by exponential weighting, sharp PAC-Bayesian bounds and sparsity
- On efficient randomized algorithms for finding the PageRank vector
- On the efficiency of a randomized mirror descent algorithm in online optimization problems
Cites Work
- Exponentiated gradient versus gradient descent for linear predictors
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Mirror descent and nonlinear projected subgradient methods for convex optimization
- Additive logistic regression: a statistical view of boosting (with discussion and a rejoinder by the authors)
- Functional aggregation for nonparametric regression
- On the Bayes-risk consistency of regularized boosting methods
- Statistical behavior and consistency of classification methods based on convex risk minimization
- Optimal aggregation of classifiers in statistical learning
- Boosting a weak learning algorithm by majority
- The Ordered Subsets Mirror Descent Optimization Method with Applications to Tomography
- On the Generalization Ability of On-Line Learning Algorithms
- Acceleration of Stochastic Approximation by Averaging
- Variational Analysis
- Proximal Minimization Methods with Generalized Bregman Functions
- Learning Theory and Kernel Machines
- A Second-Order Perceptron Algorithm
- Online Learning with Kernels
- Convexity, Classification, and Risk Bounds
- Relative loss bounds for multidimensional regression problems