Statistical embedding: beyond principal components
DOI: 10.1214/22-STS881 · arXiv: 2106.01858 · OpenAlex: W3169863281 · MaRDI QID: Q6181742
Anders Løland, Martin Jullum, Dag Tjøstheim
Publication date: 23 January 2024
Published in: Statistical Science
Full work available at URL: https://arxiv.org/abs/2106.01858
Keywords: visualization, reproducing kernel Hilbert space, random projection, multidimensional scaling, persistent homology, spectral embedding, network embedding, diffusion mapping, persistence diagram, local linear method, principal component, graph spectral theory, Mapper, ISOMAP, \(t\)-SNE, neighborhood sampling strategies, non-linear principal component, skip-Gram, statistical embedding, stochastic block modeling, topological data analysis and embedding
Cites Work
- Likelihood-based model selection for stochastic block models
- Asymptotic normality of maximum likelihood and its variational approximation for stochastic blockmodels
- Selecting the number of components in principal component analysis using cross-validation approximations
- Manifold estimation and singular deconvolution under Hausdorff loss
- Nonparametric ridge estimation
- Spectral clustering and the high-dimensional stochastic blockmodel
- Asymptotic theory for density ridges
- Self-organized formation of topologically correct feature maps
- Extremal properties of principal curves in the plane
- Computing persistent homology
- Principal component analysis
- Topological persistence and simplification
- Network vector autoregression
- Statistical dependence: beyond Pearson's \(\rho\)
- Pairwise local Fisher and naive Bayes: improving two standard discriminants
- Dynamic stochastic block models: parameter estimation and detection of changes in community structure
- Consistency of spectral clustering in stochastic block models
- Information criteria and statistical modeling
- Finding the homology of submanifolds with high confidence from random samples
- Diffusion maps
- Theoretical foundations of the potential function method in pattern recognition learning
- Multidimensional scaling. I: Theory and method
- A nonparametric view of network models and Newman–Girvan and other modularities
- Empirical Analysis of an Evolving Social Network
- Reducing the Dimensionality of Data with Neural Networks
- Extensions of Lipschitz mappings into a Hilbert space
- Topology and data
- Detection of Abnormal Behavior Via Nonparametric Estimation of the Support
- Latent Space Approaches to Social Network Analysis
- Community structure in social and biological networks
- Statistical Analysis and Parameter Selection for Mapper
- Networks
- Minimax Rates for Estimating the Dimension of a Manifold
- Co-clustering directed graphs to discover asymmetries and directional communities
- Principal Curves
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Statistical Modeling Using Local Gaussian Approximation
- Modularity Based Community Detection in Heterogeneous Networks
- Topological Data Analysis of Single-Cell Hi-C Contact Maps
- Predicting Clinical Outcomes in Glioblastoma: An Application of Topological and Functional Data Analysis
- Grouped Network Vector Autoregression
- Fast unfolding of communities in large networks
- Random-projection Ensemble Classification
- Nonlinear Estimators and Tail Bounds for Dimension Reduction in \(l_1\) Using Cauchy Random Projections
- Augmented Implicitly Restarted Lanczos Bidiagonalization Methods
- How to Draw a Graph
- Hypothesis Testing for Automated Community Detection in Networks
- Relations Between Two Sets of Variates
- Some recent trends in embeddings of time series and dynamic networks