A fast algorithm for manifold learning by posing it as a symmetric diagonally dominant linear system
DOI: 10.1016/j.acha.2015.10.004 · zbMath: 1376.94008 · OpenAlex: W1757232183 · Wikidata: Q57258910 · Scholia: Q57258910 · MaRDI QID: Q262960
Praneeth Vepakomma, Ahmed Elgammal
Publication date: 4 April 2016
Published in: Applied and Computational Harmonic Analysis
Full work available at URL: https://doi.org/10.1016/j.acha.2015.10.004
Keywords: linearly constrained optimization; fast manifold learning; fast SDD solver; fast sparse Cholesky decomposition; symmetric diagonally dominant (SDD) system
MSC classification: Learning and adaptive systems in artificial intelligence (68T05); Image processing (compression, reconstruction, etc.) in information and communication theory (94A08)
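The keywords above (fast SDD solver, fast sparse Cholesky decomposition, symmetric diagonally dominant system) centre on solving graph-Laplacian-type linear systems. The Python sketch below only illustrates that setting: it builds a k-nearest-neighbour graph Laplacian, which is SDD by construction, and solves a regularized system with conjugate gradients standing in for the specialized SDD solvers cited under "Cites Work". This is not the paper's algorithm; all function choices and parameters are illustrative assumptions.

```python
# Minimal sketch (not the paper's method): form an SDD matrix as a k-NN
# graph Laplacian and solve a linear system in it. Parameters are assumptions.
import numpy as np
from scipy.sparse import csr_matrix, identity
from scipy.sparse.csgraph import laplacian
from scipy.sparse.linalg import cg
from sklearn.neighbors import kneighbors_graph

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 10))           # toy high-dimensional data

# Symmetric k-NN affinity graph and its graph Laplacian (SDD by construction).
W = kneighbors_graph(X, n_neighbors=10, mode="connectivity")
W = 0.5 * (W + W.T)                          # symmetrize the adjacency
L = laplacian(csr_matrix(W))                 # L = D - W

# Small shift makes the singular Laplacian strictly positive definite,
# then solve (L + eps*I) y = b with conjugate gradients.
L_reg = L + 1e-6 * identity(L.shape[0])
b = rng.standard_normal(L.shape[0])
y, info = cg(L_reg, b, atol=1e-8)
print("converged:", info == 0)
```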
Cites Work
- Continuum Isomap for manifold learnings
- Multiple-Rank Modifications of a Sparse Cholesky Factorization
- Nearly Linear Time Algorithms for Preconditioning and Solving Symmetric, Diagonally Dominant Linear Systems
- Spectral Sparsification of Graphs
- Modifying a Sparse Cholesky Factorization
- Geometric diffusions as a tool for harmonic analysis and structure definition of data: Diffusion maps
- Principal Manifolds and Nonlinear Dimensionality Reduction via Tangent Space Alignment
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Solving SDD linear systems in nearly m log^{1/2} n time
- Row Modifications of a Sparse Cholesky Factorization
- Approaching Optimality for Solving SDD Linear Systems
- A general framework for graph sparsification
- Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data
- A Nearly-m log n Time Solver for SDD Linear Systems
- Graph Sparsification by Effective Resistances