Observed universality of phase transitions in high-dimensional geometry, with implications for modern data analysis and signal processing
From MaRDI portal
Publication: 3559946
DOI: 10.1098/rsta.2009.0152
zbMath: 1185.94029
arXiv: 0906.2530
OpenAlex: W2144006746
Wikidata: Q33508769
Scholia: Q33508769
MaRDI QID: Q3559946
Publication date: 8 May 2010
Published in: Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences
Full work available at URL: https://arxiv.org/abs/0906.2530
Keywords: combinatorial geometry; compressed sensing; robust linear models; high dimension low sample size datasets; high-throughput measurements
Inference from spatial processes (62M30); Signal theory (characterization, reconstruction, filtering, etc.) (94A12); Measures of information, entropy (94A17)
Related Items
Computing and analyzing recoverable supports for sparse reconstruction
Derandomized compressed sensing with nonuniform guarantees for \(\ell_1\) recovery
Image reconstruction from undersampled Fourier data using the polynomial annihilation transform
Compressive Sensing with Cross-Validation and Stop-Sampling for Sparse Polynomial Chaos Expansions
Book Review: A mathematical introduction to compressive sensing
Testable uniqueness conditions for empirical assessment of undersampling levels in total variation-regularized X-ray CT
Counting the faces of randomly-projected hypercubes and orthants, with applications
On the universality of noiseless linear estimation with respect to the measurement matrix
Recovering Structured Signals in Noise: Least-Squares Meets Compressed Sensing
Sharp MSE bounds for proximal denoising
The restricted isometry property of block diagonal matrices for group-sparse signal recovery
Revisiting compressed sensing: exploiting the efficiency of simplex and sparsification methods
High dimensional robust M-estimation: asymptotic variance via approximate message passing
Compressive sampling of polynomial chaos expansions: convergence analysis and sampling strategies
The essential ability of sparse reconstruction of different compressive sensing strategies
Sparse microwave imaging: principles and applications
Sparse SAR imaging based on \(L_{1/2}\) regularization
Expander \(\ell_0\)-decoding
Sparse high-dimensional regression: exact scalable algorithms and phase transitions
Performance comparisons of greedy algorithms in compressed sensing
PAC-Bayesian risk bounds for group-analysis sparse regression by exponential weighting
Characterizing the SLOPE trade-off: a variational perspective and the Donoho-Tanner limit
Sharp recovery bounds for convex demixing, with applications
General stochastic separation theorems with optimal bounds
Analytic approach to variance optimization under an \(\ell_1\) constraint
A simple homotopy proximal mapping algorithm for compressive sensing
A power analysis for Model-X knockoffs with \(\ell_p\)-regularized statistics
Geometry and applied statistics
Universality of approximate message passing with semirandom matrices
Sparse Legendre expansions via \(\ell_1\)-minimization
The Lasso with general Gaussian designs with applications to hypothesis testing
Knowledge elicitation via sequential probabilistic inference for high-dimensional prediction
Analysis \(\ell_1\)-recovery with frames and Gaussian measurements
Two are better than one: fundamental parameters of frame coherence
A Rice method proof of the null-space property over the Grassmannian
Correction of AI systems by linear discriminants: probabilistic foundations
Minimax risks for sparse regressions: ultra-high dimensional phenomenons
Flavors of Compressive Sensing
Concentration of the Frobenius Norm of Generalized Matrix Inverses
Asymptotic risk and phase transition of \(\ell_1\)-penalized robust estimator
The restricted isometry property for random block diagonal matrices
A discussion on practical considerations with sparse regression methodologies
An Introduction to Compressed Sensing
Replica approach to mean-variance portfolio optimization
Cross validation in LASSO and its acceleration
Analytic solution to variance optimization with no short positions
Statistical mechanics of complex economies
Sparse decomposition by iterating Lipschitzian-type mappings
Consistent parameter estimation for Lasso and approximate message passing
Deterministic matrices matching the compressed sensing phase transitions of Gaussian random matrices
The phase transition of matrix recovery from Gaussian measurements matches the minimax MSE of matrix denoising
Energy preserved sampling for compressed sensing MRI
A numerical exploration of compressed sampling recovery
High-dimensional regression with unknown variance
Sparse recovery from extreme eigenvalues deviation inequalities
Critical behavior and universality classes for an algorithmic phase transition in sparse reconstruction
Sparse classification: a scalable discrete optimization perspective
Partial gradient optimal thresholding algorithms for a class of sparse optimization problems
On a game of chance in Marc Elsberg's thriller ``GREED''
Universality in polytope phase transitions and message passing algorithms
Lah distribution: Stirling numbers, records on compositions, and convex hulls of high-dimensional random walks
Theory and applications of compressed sensing
Phase transitions in error correcting and compressed sensing by \(\ell_1\) linear programming
LASSO risk and phase transition under dependence
Empirical average-case relation between undersampling and sparsity in X-ray CT
Threshold phenomena for random cones
Cites Work