Precise statistical analysis of classification accuracies for adversarial training
From MaRDI portal
Publication:2091832
DOI: 10.1214/22-AOS2180
MaRDI QID: Q2091832
Adel Javanmard, Mahdi Soltanolkotabi
Publication date: 2 November 2022
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/2010.11213
Mathematics Subject Classification:
- Asymptotic properties of parametric estimators (62F12)
- Asymptotic distribution theory in statistics (62E20)
- Generalized linear models (logistic models) (62J12)
Related Items (1)
Cites Work
- High dimensional robust M-estimation: asymptotic variance via approximate message passing
- On the impact of predictor geometry on the performance on high-dimensional ridge-regularized generalized robust regression estimators
- On the distribution of the largest eigenvalue in principal components analysis
- The distribution of the Lasso: uniform control over sparse balls and adaptive parameter tuning
- Fundamental barriers to high-dimensional regression with convex penalties
- Surprises in high-dimensional ridgeless least squares interpolation
- The phase transition for the existence of the maximum likelihood estimate in high-dimensional logistic regression
- Optimal shrinkage of eigenvalues in the spiked covariance model
- Sparse PCA via Covariance Thresholding
- Phase transitions in semidefinite relaxations
- Precise Error Analysis of Regularized $M$-Estimators in High Dimensions
- Adversarial Risk via Optimal Transport and Optimal Couplings
- The Generalization Error of Random Features Regression: Precise Asymptotics and the Double Descent Curve
- Does SLOPE outperform bridge regression?
- On the Adversarial Robustness of Robust Estimators
- Algorithmic Analysis and Statistical Estimation of SLOPE via Approximate Message Passing
- Living on the edge: phase transitions in convex programs with random data
- Reconciling modern machine-learning practice and the classical bias–variance trade-off
- A modern maximum-likelihood theory for high-dimensional logistic regression
- The LASSO Risk for Gaussian Matrices
- The Noise-Sensitivity Phase Transition in Compressed Sensing
- Information-Theoretically Optimal Compressed Sensing via Spatial Coupling and Approximate Message Passing
- Sharp Time–Data Tradeoffs for Linear Inverse Problems
This page was built for publication: Precise statistical analysis of classification accuracies for adversarial training