Robust Wasserstein profile inference and applications to machine learning
DOI: 10.1017/jpr.2019.49 · zbMath: 1436.62336 · arXiv: 1610.05627 · OpenAlex: W2537619949 · Wikidata: Q92196462 · Scholia: Q92196462 · MaRDI QID: Q5235055
Karthyek R. A. Murthy, Yang Kang, Jose H. Blanchet
Publication date: 7 October 2019
Published in: Journal of Applied Probability
Full work available at URL: https://arxiv.org/abs/1610.05627
Keywords: regularization; logistic regression; empirical likelihood; Wasserstein distance; support vector machine; distributionally robust optimization; square-root Lasso; limit characterization of optimal Wasserstein ball radius and regularization parameter
MSC: Ridge regression; shrinkage estimators (Lasso) (62J07) · Generalized linear models (logistic models) (62J12) · Learning and adaptive systems in artificial intelligence (68T05)
Related Items (35)
Uses Software
Cites Work
- On the rate of convergence in Wasserstein distance of the empirical measure
- Characterization of the equivalence of robustification and regularization in linear and matrix regression
- Empirical likelihood ratio confidence regions
- Extending the scope of empirical likelihood
- Empirical likelihood for linear models
- Matching random samples in many dimensions
- Smoothed empirical likelihood confidence intervals for quantiles
- Empirical likelihood and general estimating equations
- Mass transportation problems. Vol. 1: Theory. Vol. 2: Applications
- Weighted empirical likelihood inference.
- The empirical likelihood approach to quantifying uncertainty in sample average approximation
- Data-driven distributionally robust optimization using the Wasserstein metric: performance guarantees and tractable reformulations
- Asymptotics for Lasso-type estimators.
- The earth mover's distance as a metric for image retrieval
- Least angle regression. (With discussion)
- Simultaneous analysis of Lasso and Dantzig selector
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder)
- On sharpness of Tchebycheff-type inequalities
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Empirical likelihood ratio confidence intervals for a single functional
- Earth mover's distances on discrete surfaces
- Generalized Chebychev Inequalities: Theory and Applications in Decision Analysis
- Statistics of Robust Optimization: A Generalized Empirical Likelihood Approach
- Sample Out-of-Sample Inference Based on Wasserstein Distance
- Calibration of Distributionally Robust Empirical Optimization Models
- Recovering Best Statistical Guarantees via the Empirical Divergence-Based Distributionally Robust Optimization
- Regularization via Mass Transportation
- Quantifying Distributional Model Risk via Optimal Transport
- Robust Wasserstein profile inference and applications to machine learning
- Robust Regression and Lasso
- Empirical likelihood based inference with applications to some econometric models
- Optimal Transport
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers