The following pages link to Po-Ling Loh (Q682288):
Displaying 24 items.
- Support recovery without incoherence: a case for nonconvex regularization (Q682289)
- High-dimensional regression with noisy and missing data: provable guarantees with nonconvexity (Q693741)
- High-dimensional robust precision matrix estimation: cellwise corruption under \(\epsilon \)-contamination (Q1753147)
- Statistical consistency and asymptotic normality for high-dimensional robust \(M\)-estimators (Q2012209)
- Provable training set debugging for linear regression (Q2071505)
- Scale calibration for high-dimensional robust regression (Q2074316)
- Optimal rates for community estimation in the weighted stochastic block model (Q2176614)
- Structure estimation for discrete graphical models: generalized covariance matrices and their inverses (Q2443211)
- Faster Hoeffding Racing: Bernstein Races via Jackknife Estimates (Q2859218)
- High-dimensional learning of linear causal networks via inverse covariance estimation (Q2934130)
- Persistence of centrality in random growing trees (Q4601443)
- Permutation Tests for Infection Graphs (Q4999155)
- Estimating location parameters in sample-heterogeneous distributions (Q5044115)
- Book Review: High-dimensional statistics: A non-asymptotic viewpoint (Q5122998)
- Comment on “A Tuning-Free Robust and Efficient Approach to High-Dimensional Regression” (Q5146021)
- Teaching and Learning in Uncertainty (Q5151734)
- (Q5224823)
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima (Q5502126)
- Robust W-GAN-based estimation under Wasserstein contamination (Q5878255)
- Differentially private inference via noisy optimization (Q6183772)
- Robust W-GAN-Based Estimation Under Wasserstein Contamination (Q6358642)
- Communication-constrained hypothesis testing: optimality, robustness, and reverse data processing inequalities (Q6575635)
- Entropic regularization of neural networks: self-similar approximations (Q6592793)
- Proposers of the vote of thanks to Crane and Xu and contribution to the discussion of ``Root and community inference on the latent growth process of a network'' (Q6670525)