Finite-sample analysis of \(M\)-estimators using self-concordance
From MaRDI portal
Publication:2219231
DOI: 10.1214/20-EJS1780 · zbMath: 1490.62068 · arXiv: 1810.06838 · MaRDI QID: Q2219231
Francis Bach, Dmitrii M. Ostrovskii
Publication date: 19 January 2021
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1810.06838
Keywords: robustness, logistic regression, random design, empirical risk minimization, self-concordance, fast rates, \(M\)-estimators
MSC classifications: Asymptotic properties of parametric estimators (62F12); Estimation in multivariate analysis (62H12); Point estimation (62F10); Generalized linear models (logistic models) (62J12); Robustness and adaptive procedures (parametric inference) (62F35); Applications of mathematical programming (90C90)
Related Items
A Newton Frank-Wolfe method for constrained self-concordant minimization, A New Homotopy Proximal Variable-Metric Framework for Composite Convex Minimization, SCORE: approximating curvature information under self-concordant regularization, Generalized self-concordant analysis of Frank-Wolfe algorithms, Composite convex optimization with global and local inexact oracles
Cites Work
- High dimensional robust M-estimation: asymptotic variance via approximate message passing
- Random design analysis of ridge regression
- Concentration inequalities and moment bounds for sample covariance operators
- Optimal prediction for sparse linear models? Lower bounds for coordinate-separable M-estimators
- On verifiable sufficient conditions for sparse signal recovery via \(\ell_{1}\) minimization
- Support recovery without incoherence: a case for nonconvex regularization
- Approximating the moments of marginals of high-dimensional distributions
- Parametric estimation. Finite sample theory
- A tail inequality for quadratic forms of subgaussian random vectors
- On density estimation in the view of Kolmogorov's ideas in approximation theory
- A game of prediction with expert advice
- Introductory lectures on convex optimization. A basic course.
- Adaptive estimation of a quadratic functional by model selection.
- Self-concordant analysis for logistic regression
- The landscape of empirical risk for nonconvex losses
- Statistical consistency and asymptotic normality for high-dimensional robust \(M\)-estimators
- Robust covariance estimation under \(L_4\)-\(L_2\) norm equivalence
- Generalized self-concordant functions: a recipe for Newton-type methods
- Optimal rates for the regularized least-squares algorithm
- Simultaneous analysis of Lasso and Dantzig selector
- Central limit theorems and bootstrap in high dimensions
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Adaptivity of averaged stochastic gradient descent to local strong convexity for logistic regression
- Proximal Newton-Type Methods for Minimizing Composite Functions
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Asymptotic Statistics
- Finite Sample Analysis of Approximate Message Passing Algorithms
- The Generic Chaining
- Precise Error Analysis of Regularized \(M\)-Estimators in High Dimensions
- A modern maximum-likelihood theory for high-dimensional logistic regression
- Optimal errors and phase transitions in high-dimensional generalized linear models
- Composite Self-Concordant Minimization
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
- Robust Estimation of a Location Parameter
- Convex Analysis
- Convexity, Classification, and Risk Bounds
- APPROXIMATE CONFIDENCE INTERVALS
- Maximum Likelihood Estimation of Misspecified Models
- Probability theory. A comprehensive course
- Quasi-likelihood and/or robust estimation in high dimensions
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers