Compressive statistical learning with random feature moments
From MaRDI portal
Publication: 2664824
DOI: 10.4171/MSL/20 · zbMath: 1478.62164 · arXiv: 1706.07180 · OpenAlex: W3127767495 · MaRDI QID: Q2664824
Rémi Gribonval, Gilles Blanchard, Nicolas Keriven, Yann Traonmilin
Publication date: 18 November 2021
Published in: Mathematical Statistics and Learning
Full work available at URL: https://arxiv.org/abs/1706.07180
Mathematics Subject Classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Computational learning theory (68Q32)
- Nonparametric estimation (62G05)
- Learning and adaptive systems in artificial intelligence (68T05)
Related Items
- Sketched learning for image denoising
- Compressive Learning for Patch-Based Image Denoising
- Statistical learning guarantees for compressive clustering and compressive mixture modeling
- On Design of Polyhedral Estimates in Linear Inverse Problems
- Applied harmonic analysis and data processing. Abstracts from the workshop held March 25--31, 2018
- Sparse mixture models inspired by ANOVA decompositions
Cites Work
- A mathematical introduction to compressive sensing
- Fast rates for empirical vector quantization
- Dimensionality reduction with subgaussian matrices: a unified theory
- Adaptive Dantzig density estimation
- Concentration inequalities and model selection. Ecole d'Eté de Probabilités de Saint-Flour XXXIII -- 2003.
- The restricted isometry property and its implications for compressed sensing
- NP-hardness of Euclidean sum-of-squares clustering
- A simple proof of the restricted isometry property for random matrices
- The empirical characteristic function and its applications
- Nonasymptotic upper bounds for the reconstruction error of PCA
- New analysis of manifold embeddings and signal recovery from compressive measurements
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Statistical properties of kernel principal component analysis
- Local Rademacher complexities
- Statistical learning guarantees for compressive clustering and compressive mixture modeling
- Generalization of GMM to a continuum of moment conditions
- PhaseLift: Exact and Stable Signal Recovery from Magnitude Measurements via Convex Programming
- On the asymptotic efficiency of GMM
- Hilbert space embeddings and metrics on probability measures
- Synopses for Massive Data: Samples, Histograms, Wavelets, Sketches
- On-Line Expectation–Maximization Algorithm for Latent Data Models
- Fundamental Performance Limits for Ideal Decoders in High-Dimensional Linear Inverse Problems
- Learning mixtures of spherical gaussians
- Compressed sensing and best $k$-term approximation
- A Hilbert Space Embedding for Distributions
- On the Eigenspectrum of the Gram Matrix and the Generalization Error of Kernel-PCA
- On coresets for k-means and k-median clustering
- A FAST k-MEANS IMPLEMENTATION USING CORESETS
- The complexity of the generalized Lloyd-Max problem (Corresp.)
- Stable low-rank matrix recovery via null space properties
- Deep Neural Networks with Random Gaussian Weights: A Universal Classification Strategy?
- An improved data stream summary: the count-min sketch and its applications
- Real Analysis and Probability
- Compressive learning with privacy guarantees
- Sketching for large-scale learning of mixture models
- Recipes for Stable Linear Embeddings From Hilbert Spaces to $\mathbb{R}^m$
- On the Equivalence between Kernel Quadrature Rules and Random Feature Expansions
- A unified framework for approximating and clustering data
- Stable signal recovery from incomplete and inaccurate measurements
- Polynomial Learning of Distribution Families
- Privacy Aware Learning
- Compressed sensing