AdaBoost and robust one-bit compressed sensing
From MaRDI portal
Publication:2102435
DOI: 10.4171/MSL/31 · MaRDI QID: Q2102435
Geoffrey Chinot, Felix Kuchelmeister, Sara van de Geer, Matthias Löffler
Publication date: 28 November 2022
Published in: Mathematical Statistics and Learning
Full work available at URL: https://arxiv.org/abs/2105.02083
Classification:
- 62H30 Classification and discrimination; cluster analysis (statistical aspects)
- 94A12 Signal theory (characterization, reconstruction, filtering, etc.)
Cites Work
- Moment inequalities for sums of dependent random variables under projective conditions
- Inequalities of Bernstein-Jackson-type and the degree of compactness of operators in Banach spaces
- A decision-theoretic generalization of on-line learning and an application to boosting
- Arcing classifiers (with discussion)
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Additive logistic regression: a statistical view of boosting (with discussion and a rejoinder by the authors)
- Adaptive estimation of a quadratic functional by model selection
- Empirical margin distributions and bounding the generalization error of combined classifiers
- Population theory for boosting ensembles
- Process consistency for AdaBoost
- On the Bayes-risk consistency of regularized boosting methods
- Improved boosting algorithms using confidence-rated predictions
- Non-Gaussian hyperplane tessellations and robust one-bit compressed sensing
- On the robustness of minimum norm interpolators and regularized empirical risk minimizers
- Surprises in high-dimensional ridgeless least squares interpolation
- A precise high-dimensional asymptotic theory for boosting and minimum-\(\ell_1\)-norm interpolated classifiers
- Stability and instance optimality for Gaussian measurements in compressed sensing
- One-bit compressed sensing with non-Gaussian measurements
- Reconstruction and subgaussian operators in asymptotic geometric analysis
- Boosting for high-dimensional linear models
- Boosting with early stopping: convergence and consistency
- Learning without Concentration
- One-Bit Compressed Sensing by Linear Programming
- The Rate of Convergence of AdaBoost
- One-Bit Compressive Sensing With Norm Estimation
- Robust 1-Bit Compressive Sensing via Binary Stable Embeddings of Sparse Vectors
- Robust 1-bit Compressed Sensing and Sparse Logistic Regression: A Convex Programming Approach
- The Integral of a Symmetric Unimodal Function over a Symmetric Convex Set and Some Probability Inequalities
- On the minimum of several random variables
- Mathematical Foundations of Infinite-Dimensional Statistical Models
- On sparse reconstruction from Fourier and Gaussian measurements
- Normal Approximation by Stein’s Method
- Atomic Decomposition by Basis Pursuit
- The Generalization Error of Random Features Regression: Precise Asymptotics and the Double Descent Curve
- Benign overfitting in linear regression
- A model of double descent for high-dimensional binary linear classification
- Reconciling modern machine-learning practice and the classical bias–variance trade-off
- One-bit compressive sensing of dictionary-sparse signals
- Soft margins for AdaBoost