Empirical minimization

From MaRDI portal
Publication:2494402

DOI: 10.1007/s00440-005-0462-3 · zbMath: 1142.62348 · OpenAlex: W3187217851 · Wikidata: Q105583449 · Scholia: Q105583449 · MaRDI QID: Q2494402

Shahar Mendelson, Peter L. Bartlett

Publication date: 26 June 2006

Published in: Probability Theory and Related Fields

Full work available at URL: https://doi.org/10.1007/s00440-005-0462-3



Related Items

Learning theory of minimum error entropy under weak moment conditions
Empirical variance minimization with applications in variance reduction and optimal control
Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
Classification with reject option
Noisy discriminant analysis with boundary assumptions
Regularization in kernel learning
Unnamed Item
Unnamed Item
Fast rates of minimum error entropy with heavy-tailed noise
Inverse statistical learning
Robust statistical learning with Lipschitz and convex loss functions
Posterior concentration and fast convergence rates for generalized Bayesian learning
Convergence rates for empirical barycenters in metric spaces: curvature, convexity and extendable geodesics
\(\ell _{1}\)-regularized linear regression: persistence and oracle inequalities
Sample average approximation with heavier tails. I: Non-asymptotic bounds with weak assumptions and stochastic constraints
Sample average approximation with heavier tails. II: Localization in stochastic convex optimization and persistence results for the Lasso
Empirical risk minimization for heavy-tailed losses
Concentration behavior of the penalized least squares estimator
User-friendly introduction to PAC-Bayes bounds
Statistical performance of support vector machines
Ranking and empirical minimization of \(U\)-statistics
On the optimality of the empirical risk minimization procedure for the convex aggregation problem
Relaxing the i.i.d. assumption: adaptively minimax optimal regret via root-entropic regularization
Robust regression using biased objectives
Robust classification via MOM minimization
Sharper lower bounds on the performance of the empirical risk minimization algorithm
Empirical risk minimization is optimal for the convex aggregation problem
Optimal upper and lower bounds for the true and empirical excess risks in heteroscedastic least-squares regression
Oracle inequalities for cross-validation type procedures
General oracle inequalities for model selection
On the optimality of the aggregate with exponential weights for low temperatures
ERM and RERM are optimal estimators for regression problems when malicious outliers corrupt the labels
General nonexact oracle inequalities for classes with a subexponential envelope
Obtaining fast error rates in nonconvex situations
Unnamed Item
Sharp oracle inequalities for least squares estimators in shape restricted regression
Unnamed Item
A high-dimensional Wilks phenomenon
On the optimality of sample-based estimates of the expectation of the empirical minimizer
Fast rates for estimation error and oracle inequalities for model selection
A statistical learning assessment of Huber regression
Unnamed Item
Estimation bounds and sharp oracle inequalities of regularized procedures with Lipschitz loss functions
Convergence rates of least squares regression estimators with heavy-tailed errors
Confidence sets with expected sizes for multiclass classification
Localized Gaussian width of \(M\)-convex hulls with applications to Lasso and convex aggregation
Unnamed Item
Learning rates for partially linear support vector machine in high dimensions
Unnamed Item
Local Rademacher complexities
Minimax fast rates for discriminant analysis with errors in variables
Suboptimality of constrained least squares and improvements via non-linear predictors



Cites Work