Randomized Sketches of Convex Programs With Sharp Guarantees
From MaRDI portal
Publication: 2977306
DOI: 10.1109/TIT.2015.2450722
zbMath: 1359.90097
arXiv: 1404.7203
OpenAlex: W2963459305
MaRDI QID: Q2977306
Mert Pilanci, Martin J. Wainwright
Publication date: 28 April 2017
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://arxiv.org/abs/1404.7203
Related Items
On principal components regression, random projections, and column subsampling
Randomized numerical linear algebra: Foundations and algorithms
Side-constrained minimum sum-of-squares clustering: mathematical programming and random projections
Functional principal subspace sampling for large scale functional data analysis
Convexification with Bounded Gap for Randomly Projected Quadratic Optimization
Sharper Bounds for Regularized Data Fitting
Hierarchical inference for genome-wide association studies: a view on methodology with software
RidgeSketch: A Fast Sketching Based Solver for Large Scale Ridge Regression
Generic error bounds for the generalized Lasso with sub-exponential data
Noisy Euclidean Distance Realization: Robust Facial Reduction and the Pareto Frontier
Distributed learning for sketched kernel regression
M-IHS: an accelerated randomized preconditioning method avoiding costly matrix decompositions
Sketched approximation of regularized canonical correlation analysis
Randomized Quasi-Newton Updates Are Linearly Convergent Matrix Inversion Algorithms
Random projections for quadratic programs
Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence
Sketching for Principal Component Regression
Structured Random Sketching for PDE Inverse Problems
Reduced rank regression with matrix projections for high-dimensional multivariate linear regression model
Toward a unified theory of sparse dimensionality reduction in Euclidean space
Randomized sketches for kernel CCA
On b-bit min-wise hashing for large-scale regression and classification with sparse data
Low-rank matrix completion using nuclear norm minimization and facial reduction
Redundancy Techniques for Straggler Mitigation in Distributed Optimization and Learning
A stochastic subspace approach to gradient-free optimization in high dimensions
Approximate nonparametric quantile regression in reproducing kernel Hilbert spaces via random projection
On nonparametric randomized sketches for kernels with further smoothness
High-dimensional model recovery from random sketched data by exploring intrinsic sparsity
Tensor-Structured Sketching for Constrained Least Squares
ISLET: Fast and Optimal Low-Rank Tensor Regression via Importance Sketching