Scientific article
From MaRDI portal
Publication: 3796199
zbMath: 0651.46021 · MaRDI QID: Q3796199
Publication date: 1988
Title: unavailable (zbMATH Open Web Interface contents cannot be displayed due to conflicting licenses).
Keywords: distance between a randomly chosen k-codimensional subspace and convex sets situated at various locations; isoperimetric inequalities on the sphere; quantitative versions of Dvoretzky's theorem for subspaces of quotients of finite-dimensional Banach spaces; sharp versions of Milman's inequality.
Related Items
Average-case complexity without the black swans
Randomized numerical linear algebra: Foundations and algorithms
A geometrical stability condition for compressed sensing
When random proportional subspaces are also random quotients
Random sections of ellipsoids and the power of random information
The asymptotic distribution of the MLE in high-dimensional logistic models: arbitrary covariance
A precise high-dimensional asymptotic theory for boosting and minimum-\(\ell_1\)-norm interpolated classifiers
Recovering Structured Signals in Noise: Least-Squares Meets Compressed Sensing
Cosparsity in Compressed Sensing
Sharp MSE bounds for proximal denoising
Low \(M^*\)-estimates on coordinate subspaces
\(L_{p}\)-moments of random vectors via majorizing measures
Recovery analysis for weighted \(\ell_{1}\)-minimization using the null space property
High-dimensional change-point estimation: combining filtering with convex optimization
Generalized notions of sparsity and restricted isometry property. II: Applications
\(\ell^1\)-analysis minimization and generalized (co-)sparsity: when does recovery succeed?
Generalizing CoSaMP to signals from a union of low dimensional linear subspaces
The phase transition for the existence of the maximum likelihood estimate in high-dimensional logistic regression
Simple bounds for recovering low-complexity models
Generic error bounds for the generalized Lasso with sub-exponential data
Fast and Reliable Parameter Estimation from Nonlinear Observations
Sharp global convergence guarantees for iterative nonconvex optimization with random data
Automatic bias correction for testing in high-dimensional linear models
Robust analysis \(\ell_1\)-recovery from Gaussian measurements and total variation minimization
A unified approach to uniform signal recovery from nonlinear observations
Random sections of \(\ell_p\)-ellipsoids, optimal recovery and Gelfand numbers of diagonal operators
Essay on Kashin's remarkable 1977 decomposition theorem
Sampling rates for \(\ell^1\)-synthesis
\(\varepsilon\)-isometric dimension reduction for incompressible subsets of \(\ell_p\)
Terracini convexity
Noisy linear inverse problems under convex constraints: exact risk asymptotics in high dimensions
Universality of regularized regression estimators in high dimensions
Analysis \(\ell_1\)-recovery with frames and Gaussian measurements
A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression
Guarantees of total variation minimization for signal recovery
Stable low-rank matrix recovery via null space properties
Linear regression with sparsely permuted data
Estimates of covering numbers
An Introduction to Compressed Sensing
Toward a unified theory of sparse dimensionality reduction in Euclidean space
Dvoretzky's theorem and the complexity of entanglement detection
Gaussian averages of interpolated bodies and applications to approximate reconstruction
Persistent homology for low-complexity models
Logarithmic reduction of the level of randomness in some probabilistic geometric constructions
Gelfand numbers related to structured sparsity and Besov space embeddings with small mixed smoothness
Estimation in High Dimensions: A Geometric Perspective
Regular random sections of convex bodies and the random quotient-of-subspace theorem
Robust recovery of complex exponential signals from random Gaussian projections via low rank Hankel matrix reconstruction
The distribution of the Lasso: uniform control over sparse balls and adaptive parameter tuning
Regularization of non-normal matrices by Gaussian noise: the banded Toeplitz and twisted Toeplitz cases
Extremal problems and isotropic positions of convex bodies
Isolated calmness of solution mappings and exact recovery conditions for nuclear norm optimization problems
Empirical processes and random projections
Geometry of the \(L_q\)-centroid bodies of an isotropic log-concave measure
Precise statistical analysis of classification accuracies for adversarial training
On the Convergence Rate of Projected Gradient Descent for a Back-Projection Based Objective
Proof methods for robust low-rank matrix recovery
Robust Width: A Characterization of Uniformly Stable and Robust Compressed Sensing