Database-friendly random projections: Johnson-Lindenstrauss with binary coins.
From MaRDI portal
Publication: 1401965
DOI: 10.1016/S0022-0000(03)00025-4 · zbMath: 1054.68040 · Wikidata: Q57254827 · Scholia: Q57254827 · MaRDI QID: Q1401965
Publication date: 19 August 2003
Published in: Journal of Computer and System Sciences
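The paper's titular construction replaces the Gaussian entries of a Johnson-Lindenstrauss projection matrix with "binary coins": entries drawn uniformly from \(\{+1, -1\}\), or a sparse three-valued variant \(\sqrt{3}\cdot\{+1 \text{ w.p. } 1/6,\ 0 \text{ w.p. } 2/3,\ -1 \text{ w.p. } 1/6\}\), both scaled by \(1/\sqrt{k}\). A minimal NumPy sketch (function name and parameters are illustrative, not from the paper):

```python
import numpy as np

def binary_coin_projection(X, k, sparse=True, seed=None):
    """Project the rows of X (shape n x d) down to k dimensions.

    sparse=True uses the three-valued distribution
    sqrt(3) * {+1 w.p. 1/6, 0 w.p. 2/3, -1 w.p. 1/6};
    sparse=False uses +/-1 with equal probability.
    Scaling by 1/sqrt(k) preserves squared Euclidean
    distances in expectation.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    if sparse:
        # Two thirds of the entries are zero, so the projection
        # needs roughly one third of the arithmetic of a dense one.
        R = np.sqrt(3.0) * rng.choice(
            [-1.0, 0.0, 1.0], size=(d, k), p=[1 / 6, 2 / 3, 1 / 6]
        )
    else:
        R = rng.choice([-1.0, 1.0], size=(d, k))
    return (X @ R) / np.sqrt(k)
```

Because the entries are small integers (up to scaling), the projection can be computed with additions and subtractions only, which is the "database-friendly" aspect: it maps naturally onto aggregate queries over stored tuples.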
Related Items (first 100 shown)
On principal components regression, random projections, and column subsampling ⋮ Random-walk based approximate \(k\)-nearest neighbors algorithm for diffusion state distance ⋮ High-dimensional clustering via random projections ⋮ Randomized approaches to accelerate MCMC algorithms for Bayesian inverse problems ⋮ Side-constrained minimum sum-of-squares clustering: mathematical programming and random projections ⋮ Efficient binary embedding of categorical data using BinSketch ⋮ Dimension reduction and construction of feature space for image pattern recognition ⋮ Derandomizing restricted isometries via the Legendre symbol ⋮ Gaussian random projections for Euclidean membership problems ⋮ Compressive Sensing with Redundant Dictionaries and Structured Measurements ⋮ Numerical bifurcation analysis of PDEs from lattice Boltzmann model simulations: a parsimonious machine learning approach ⋮ A Survey of Compressed Sensing ⋮ Kernels as features: on kernels, margins, and low-dimensional mappings ⋮ Sparser Johnson-Lindenstrauss Transforms ⋮ Variable selection in identification of a high dimensional nonlinear non-parametric system ⋮ The perfect marriage and much more: combining dimension reduction, distance measures and covariance ⋮ Practical non-interactive publicly verifiable secret sharing with thousands of parties ⋮ Entropy-randomized projection ⋮ Efficient clustering on Riemannian manifolds: a kernelised random projection approach ⋮ Simple bounds for recovering low-complexity models ⋮ Randomized projective methods for the construction of binary sparse vector representations ⋮ Estimates on compressed neural networks regression ⋮ Binary random projections with controllable sparsity patterns ⋮ A Sketch Algorithm for Estimating Two-Way and Multi-Way Associations ⋮ Vector data transformation using random binary matrices ⋮ Structural conditions for projection-cost preservation via randomized matrix multiplication ⋮ Distance geometry and data science ⋮ Sparsified randomization 
algorithms for low rank approximations and applications to integral equations and inhomogeneous random field simulation ⋮ Random projections for quadratic programs ⋮ Algorithmic paradigms for stability-based cluster validity and model selection statistical methods, with applications to microarray data analysis ⋮ A variant of the Johnson-Lindenstrauss lemma for circulant matrices ⋮ Dense fast random projections and Lean Walsh transforms ⋮ Random projections of linear and semidefinite problems with linear inequalities ⋮ Representation and coding of signal geometry ⋮ Time for dithering: fast and quantized random embeddings via the restricted isometry property ⋮ Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence ⋮ Optimal CUR Matrix Decompositions ⋮ It ain't where you're from, it's where you're at: hiring origins, firm heterogeneity, and wages ⋮ Robustness properties of dimensionality reduction with Gaussian random matrices ⋮ Hypercontractivity via tensor calculus ⋮ Limiting laws of coherence of random matrices with applications to testing covariance structure and construction of compressed sensing matrices ⋮ On variants of the Johnson–Lindenstrauss lemma ⋮ Acceleration of randomized Kaczmarz method via the Johnson-Lindenstrauss lemma ⋮ On deterministic sketching and streaming for sparse recovery and norm estimation ⋮ Streaming techniques and data aggregation in networks of tiny artefacts ⋮ Classification Scheme for Binary Data with Extensions ⋮ Approximating Spectral Clustering via Sampling: A Review ⋮ Learning intersections of halfspaces with a margin ⋮ Toward a unified theory of sparse dimensionality reduction in Euclidean space ⋮ Forecasting using random subspace methods ⋮ Bayesian compressed vector autoregressions ⋮ On using Toeplitz and circulant matrices for Johnson-Lindenstrauss transforms ⋮ Real-valued embeddings and sketches for fast distance and similarity estimation ⋮ On orthogonal projections for dimension reduction 
and applications in augmented target loss functions for learning problems ⋮ MULTIVARIATE CALIBRATION WITH SUPPORT VECTOR REGRESSION BASED ON RANDOM PROJECTION ⋮ Dimensionality reduction with subgaussian matrices: a unified theory ⋮ Randomized large distortion dimension reduction ⋮ Two-dimensional random projection ⋮ A performance driven methodology for cancelable face templates generation ⋮ A randomized method for solving discrete ill-posed problems ⋮ The Mailman algorithm: a note on matrix-vector multiplication ⋮ Efficient large scale global optimization through clustering-based population methods ⋮ Randomized interpolative decomposition of separated representations ⋮ Fuzzy \(c\)-means and cluster ensemble with random projection for big data clustering ⋮ Geometric component analysis and its applications to data analysis ⋮ Correlations between random projections and the bivariate normal ⋮ Fast and RIP-optimal transforms ⋮ Stochastic quasi-gradient methods: variance reduction via Jacobian sketching ⋮ MREKLM: a fast multiple empirical kernel learning machine ⋮ Formation of similarity-reflecting binary vectors with random binary projections ⋮ Efficient extreme learning machine via very sparse random projection ⋮ Random projections for conic programs ⋮ R3P-Loc: a compact multi-label predictor using ridge regression and random projection for protein subcellular localization ⋮ A stochastic subspace approach to gradient-free optimization in high dimensions ⋮ Simple Classification using Binary Data ⋮ Optimal Bounds for Johnson-Lindenstrauss Transformations ⋮ Learning the truth vector in high dimensions ⋮ Almost Optimal Explicit Johnson-Lindenstrauss Families ⋮ Randomized linear algebra for model reduction. I. 
Galerkin methods and error estimation ⋮ Frequent Directions: Simple and Deterministic Matrix Sketching ⋮ Bayesian random projection-based signal detection for Gaussian scale space random fields ⋮ Guided Projections for Analyzing the Structure of High-Dimensional Data ⋮ Compressed and Penalized Linear Regression ⋮ Optimal fast Johnson-Lindenstrauss embeddings for large data sets ⋮ Testing proximity to subspaces: approximate \(\ell_\infty\) minimization in constant time ⋮ Johnson-Lindenstrauss lemma for circulant matrices ⋮ Dimensionality reduction for \(k\)-distance applied to persistent homology ⋮ Sparse Learning for Large-Scale and High-Dimensional Data: A Randomized Convex-Concave Optimization Approach ⋮ On the strong restricted isometry property of Bernoulli random matrices ⋮ Structured matrix estimation and completion ⋮ High-dimensional model recovery from random sketched data by exploring intrinsic sparsity ⋮ Fast and memory-optimal dimension reduction using Kac's walk ⋮ Variance reduction in feature hashing using MLE and control variate method ⋮ Recent advances in text-to-pattern distance algorithms ⋮ Near-neighbor preserving dimension reduction via coverings for doubling subsets of \(\ell_1\) ⋮ Random projections as regularizers: learning a linear discriminant from fewer observations than dimensions ⋮ On Lipschitz extension from finite subsets
Cites Work
- The Johnson-Lindenstrauss lemma and the sphericity of some graphs
- Latent semantic indexing: A probabilistic analysis
- The geometry of graphs and some of its algorithmic applications
- Clustering for edge-cost minimization (extended abstract)
- Extensions of Lipschitz mappings into a Hilbert space
- Algorithmic derandomization via complexity theory
- Efficient Search for Approximate Nearest Neighbor in High Dimensional Spaces
- Learning mixtures of arbitrary Gaussians