On Consistency and Sparsity for Principal Components Analysis in High Dimensions
Publication: 5252135
DOI: 10.1198/jasa.2009.0121
zbMath: 1388.62174
OpenAlex: W2097714737
Wikidata: Q41809842
Scholia: Q41809842
MaRDI QID: Q5252135
Authors: Iain M. Johnstone, Arthur Yu Lu
Publication date: 29 May 2015
Published in: Journal of the American Statistical Association
Full work available at URL: http://europepmc.org/articles/pmc2898454
Related Items
Wald Statistics in high-dimensional PCA
Modeling High-Dimensional Time Series: A Factor Model With Dynamically Dependent Factors and Diverging Eigenvalues
Robust low-rank data matrix approximations
Statistical regression analysis of functional and shape data
Learning Latent Factors From Diversified Projections and Its Applications to Over-Estimated and Weak Factors
Asymptotic Theory of Eigenvectors for Random Matrices With Diverging Spikes
Optimally Weighted PCA for High-Dimensional Heteroscedastic Data
Data-guided Treatment Recommendation with Feature Scores
Sparse Functional Principal Component Analysis in High Dimensions
Cross-Validated Loss-based Covariance Matrix Estimator Selection in High Dimensions
Bayesian sparse covariance decomposition with a graphical structure
Vector diffusion maps and the connection Laplacian
Vast volatility matrix estimation for high-frequency financial data
Adjusting systematic bias in high dimensional principal component scores
Estimating Large Precision Matrices via Modified Cholesky Decomposition
Overview of object oriented data analysis
Estimation of low-rank matrices via approximate message passing
Asymmetry helps: eigenvalue and eigenvector analyses of asymmetrically perturbed low-rank matrices
Directed Principal Component Analysis
The Impact of Measurement Error on Principal Component Analysis
Asymptotically efficient estimation of smooth functionals of covariance operators
Selection of time instants and intervals with support vector regression for multivariate functional data
Order Determination for Spiked Type Models
Large covariance estimation through elliptical factor models
Matrix means and a novel high-dimensional shrinkage phenomenon
A Universal Test on Spikes in a High-Dimensional Generalized Spiked Model and Its Applications
Stability of principal components under normal and non-normal parent populations and different covariance structures scenarios
Free Energy Wells and Overlap Gap Property in Sparse PCA
Uncovering block structures in large rectangular matrices
Principal component analysis: a review and recent developments
Convergence rate of eigenvector empirical spectral distribution of large Wigner matrices
Large volatility matrix analysis using global and national factor models
Distributed Estimation for Principal Component Analysis: An Enlarged Eigenspace Analysis
Re-thinking high-dimensional mathematical statistics. Abstracts from the workshop held May 15--21, 2022
Bayesian sparse spiked covariance model with a continuous matrix shrinkage prior
Scalable Bayesian high-dimensional local dependence learning
Misspecified nonconvex statistical optimization for sparse phase retrieval
A communication-efficient and privacy-aware distributed algorithm for sparse PCA
Compressed spectral screening for large-scale differential correlation analysis with application in selecting glioblastoma gene modules
Sparse principal component analysis for high‐dimensional stationary time series
Covariance Estimation for Matrix-valued Data
Dynamic Principal Component Analysis in High Dimensions
Order determination for spiked-type models with a divergent number of spikes
Entrywise limit theorems for eigenvectors of signal-plus-noise matrix models with weak signals
Long random matrices and tensor unfolding
Coordinatewise Gaussianization: Theories and Applications
Regression on manifolds: estimation of the exterior derivative
Time Series Source Separation Using Dynamic Mode Decomposition
Detecting the large entries of a sparse covariance matrix in sub-quadratic time
Near-optimal estimation of simultaneously sparse and low-rank matrices from nested linear measurements
High Dimensional Change Point Estimation via Sparse Projection
Minimax sparse principal subspace estimation in high dimensions
Covariance and precision matrix estimation for high-dimensional time series
Sparse PCA: optimal rates and adaptive estimation
Simple components
Random matrix theory in statistics: a review
Oracle Inequalities for Local and Global Empirical Risk Minimizers
A test of sphericity for high-dimensional data and its application for detection of divergently spiked noise
Statistical Analysis of Trajectories of Multi-Modality Data
Near-Optimal Bounds for Phase Synchronization
A survey of high dimension low sample size asymptotics
Statistical challenges of high-dimensional data
An $\ell_{\infty}$ Eigenvector Perturbation Bound and Its Application to Robust Covariance Estimation
Two-Step Hypothesis Testing When the Number of Variables Exceeds the Sample Size
Computational and statistical tradeoffs via convex relaxation
Sparse estimation of large covariance matrices via a nested Lasso penalty
Biclustering with heterogeneous variance
Large Covariance Estimation by Thresholding Principal Orthogonal Complements
Sparse Principal Component Analysis in Hilbert Space
Eigenvectors from Eigenvalues Sparse Principal Component Analysis
Proximal Distance Algorithms: Theory and Examples
ECA: High-Dimensional Elliptical Component Analysis in Non-Gaussian Distributions
Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation
Forecasting high-dimensional realized volatility matrices using a factor model
Bayesian function-on-function regression for multilevel functional data
Optimal Sparse Singular Value Decomposition for High-Dimensional High-Order Data
Principal Component Analysis by Optimization of Symmetric Functions has no Spurious Local Optima
Clustering large number of extragalactic spectra of galaxies and quasars through canopies
Functional prediction of intraday cumulative returns
On the singular value distribution of large-dimensional data matrices whose columns have different correlations
A Generalized Least-Square Matrix Decomposition
Scale-Invariant Sparse PCA on High-Dimensional Meta-Elliptical Data
Variable selection of linear programming discriminant estimator
Penalized Orthogonal Iteration for Sparse Estimation of Generalized Eigenvalue Problem
Evaluating the performance of sparse principal component analysis methods in high-dimensional data scenarios
Principal Component Analysis of High-Frequency Data
Nonsparse Learning with Latent Variables
FLCRM: Functional linear cox regression model
On Cross-Validation for Sparse Reduced Rank Regression
Estimation of high-dimensional seemingly unrelated regression models
Rate-optimal posterior contraction for sparse PCA
A Unifying Tutorial on Approximate Message Passing
Asymptotic power of the sphericity test under weak and strong factors in a fixed effects panel data model
Do semidefinite relaxations solve sparse PCA up to the information limit?
Random projections as regularizers: learning a linear discriminant from fewer observations than dimensions
Testing equivalence of clustering
Overlapping community detection in networks via sparse spectral decomposition
Principal components in linear mixed models with general bulk
High-resolution signal recovery via generalized sampling and functional principal component analysis
TPRM: tensor partition regression models with applications in imaging biomarker detection
Physically interpretable machine learning algorithm on multidimensional non-linear fields
High-dimensional two-sample mean vectors test and support recovery with factor adjustment
Optimal sparse eigenspace and low-rank density matrix estimation for quantum systems
Statistical inference for principal components of spiked covariance matrices
Optimal Bayesian minimax rates for unconstrained large covariance matrices
Estimation of conditional mean operator under the bandable covariance structure
Non-asymptotic properties of spectral decomposition of large Gram-type matrices and applications
Perturbation theory for cross data matrix-based PCA
Global minimum variance portfolio optimisation under some model risk: a robust regression-based approach
Sparse PCA-based on high-dimensional Itô processes with measurement errors
Computational barriers to estimation from low-degree polynomials
Consistency of AIC and BIC in estimating the number of significant components in high-dimensional principal component analysis
Bayesian factor-adjusted sparse regression
Recent developments in high dimensional covariance estimation and its related issues, a review
Bayesian inference for spectral projectors of the covariance matrix
Sparse principal component analysis and iterative thresholding
Asymptotic performance of PCA for high-dimensional heteroscedastic data
Minimax bounds for sparse PCA with noisy high-dimensional data
Sparse SIR: optimal rates and adaptive estimation
Testing for principal component directions under weak identifiability
Using principal component analysis to estimate a high dimensional factor model with high-frequency data
Optimal detection of sparse principal components in high dimension
Optimal sparse volatility matrix estimation for high-dimensional Itô processes with measurement errors
Sparse-smooth regularized singular value decomposition
PCA consistency for the power spiked model in high-dimensional settings
Forecasting co-volatilities via factor models with asymmetry and long memory in realized covariance
Minimax estimation in sparse canonical correlation analysis
Sequential testing for structural stability in approximate factor models
Estimation of functionals of sparse covariance matrices
New asymptotic results in principal component analysis
tSSNALM: a fast two-stage semi-smooth Newton augmented Lagrangian method for sparse CCA
Convergence and prediction of principal component scores in high-dimensional settings
Coordinate-independent sparse sufficient dimension reduction and variable selection
Fundamental limits of detection in the spiked Wigner model
Nonasymptotic upper bounds for the reconstruction error of PCA
Limiting laws for divergent spiked eigenvalues and largest nonspiked eigenvalue of sample covariance matrices
Dimensionality reduction for binary data through the projection of natural parameters
Video denoising via empirical Bayesian estimation of space-time patches
Heterogeneity adjustment with applications to graphical model inference
Covariance estimation: the GLM and regularization perspectives
Minimax estimation of large precision matrices with bandable Cholesky factor
The spectral norm of random inner-product kernel matrices
Bayesian bandwidth test and selection for high-dimensional banded precision matrices
Combinatorial inference for graphical models
Maximum pairwise Bayes factors for covariance structure testing
Robust covariance estimation for approximate factor models
Accuracy of regularized D-rule for binary classification
Rate-optimal perturbation bounds for singular subspaces with applications to high-dimensional statistics
Near-optimal stochastic approximation for online principal component estimation
Dimension reduction big data using recognition of data features based on copula function and principal component analysis
On consistency and sparsity for sliced inverse regression in high dimensions
Asymptotics of the principal components estimator of large factor models with weakly influential factors
Computationally efficient banding of large covariance matrices for ordered data and connections to banding the inverse Cholesky factor
Sparse wavelet regression with multiple predictive curves
From simple structure to sparse components: a review
A very fast algorithm for matrix factorization
Simple algorithms for optimization on Riemannian manifolds with constraints
Partial estimation of covariance matrices
Wavelet estimation of the dimensionality of curve time series
Factor analysis via components analysis
Optimal rates of convergence for covariance matrix estimation
Effective PCA for high-dimension, low-sample-size data with singular value decomposition of cross data matrix
Estimating sufficient reductions of the predictors in abundant high-dimensional regressions
Integrative sparse principal component analysis
Distributed estimation of principal eigenspaces
Minimax posterior convergence rates and model selection consistency in high-dimensional DAG models based on sparse Cholesky factors
Panel models with interactive effects
Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach
Estimation of autocovariance matrices for high dimensional linear processes
Asymptotic properties of principal component analysis and shrinkage-bias adjustment under the generalized spiked population model
Robust high-dimensional factor models with applications to statistical machine learning
Subspace estimation from unbalanced and incomplete data matrices: \({\ell_{2,\infty}}\) statistical guarantees
Robust covariance and scatter matrix estimation under Huber's contamination model
On asymptotic normality of cross data matrix-based PCA in high dimension low sample size
Two sample tests for high-dimensional covariance matrices
Factor-Adjusted Regularized Model Selection
Feature screening in large scale cluster analysis
Sampled forms of functional PCA in reproducing kernel Hilbert spaces
Wavelet-domain regression and predictive inference in psychiatric neuroimaging
On the estimation of correlation in a binary sequence model
Testing and estimating change-points in the covariance matrix of a high-dimensional time series
Solving equations of random convex functions via anchored regression
Sparse principal component analysis with missing observations
Adaptive gPCA: a method for structured dimensionality reduction with applications to microbiome data
Approximated penalized maximum likelihood for exploratory factor analysis: an orthogonal case
A guide for sparse PCA: model comparison and applications
High-dimensional outlier detection using random projections
Random matrix theory and its applications
Feature extraction for functional time series: theory and application to NIR spectroscopy data
Consistency of the objective general index in high-dimensional settings
Testing independence between two spatial random fields
An \({\ell_p}\) theory of PCA and spectral clustering
Sparsistency and agnostic inference in sparse PCA
Optimal estimation and rank detection for sparse spiked covariance matrices
Notes on computational hardness of hypothesis testing: predictions using the low-degree likelihood ratio