Perturbation Bounds for Procrustes, Classical Scaling, and Trilateration, with Applications to Manifold Learning
Publication: 4969047
zbMath: 1498.68221
arXiv: 1810.09569
MaRDI QID: Q4969047
Bruno Pelletier, Ery Arias-Castro, Adel Javanmard
Publication date: 5 October 2020
Full work available at URL: https://arxiv.org/abs/1810.09569
MSC classifications:
- Statistics on manifolds (62R30)
- Learning and adaptive systems in artificial intelligence (68T05)
- Pattern recognition, speech recognition (68T10)
Related Items (4)
Infinite multidimensional scaling for metric measure spaces ⋮ Universally consistent estimation of the reach ⋮ Localization in 1D non-parametric latent space models from pairwise affinities ⋮ On the estimation of latent distances using graph distances
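For context, classical scaling (classical MDS), one of the three procedures whose perturbation bounds this publication studies, can be sketched as below. This is a generic textbook implementation written for illustration, not code from the paper: given a pairwise distance matrix, double-center the squared distances and take the top eigenpairs of the resulting Gram matrix.

```python
import numpy as np

def classical_scaling(D, d):
    """Embed n points in R^d from an n x n pairwise distance matrix D
    via classical scaling: double-center the squared distances to
    recover a Gram matrix, then take its top-d eigenpairs."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J           # Gram matrix of centered points
    w, V = np.linalg.eigh(B)              # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:d]         # indices of the top-d eigenvalues
    scale = np.sqrt(np.clip(w[idx], 0, None))
    return V[:, idx] * scale              # n x d embedding

# Collinear points embed exactly in one dimension.
X = np.array([[0.0], [1.0], [3.0]])
D = np.abs(X - X.T)
Y = classical_scaling(D, 1)
```

Up to a rigid motion (here, a possible sign flip), the recovered embedding `Y` reproduces the input distances exactly, which is the noiseless baseline that the paper's perturbation bounds quantify departures from.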
Uses Software
Cites Work
- Localization from incomplete noisy distance measurements
- Manifold estimation and singular deconvolution under Hausdorff loss
- Discrete Hessian eigenmaps method for dimensionality reduction
- Tight minimax rates for manifold estimation under Hausdorff loss
- User-friendly tail bounds for sums of random matrices
- Theory of semidefinite programming for sensor network localization
- Towards a theoretical foundation for Laplacian-based manifold methods
- Continuum isomap for manifold learnings
- Perturbation analysis of the orthogonal Procrustes problem
- The solution of orthogonal Procrustes problems for a family of orthogonally invariant norms
- Unconstrained and curvature-constrained shortest-path distances and their approximation
- Consistency of spectral clustering
- Diffusion maps
- From graph to manifold Laplacian: the convergence rate
- Multidimensional scaling. I: Theory and method
- Curvature Measures
- Empirical graph Laplacian approximation of Laplace–Beltrami operators: Large sample results
- The Effect of Coherence on Sampling from Matrices with Orthonormal Columns, and Preconditioned Least Squares Problems
- Randomized Approximation of the Gram Matrix: Exact Computation and Probabilistic Bounds
- Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data
- Learning Theory
- Some distance properties of latent root and vector methods used in multivariate analysis