On robust learning in the canonical change point problem under heavy tailed errors in finite and growing dimensions
Publication: 2136638
DOI: 10.1214/21-EJS1927
MaRDI QID: Q2136638
Ya'acov Ritov, Debarghya Mukherjee, Moulinath Banerjee
Publication date: 11 May 2022
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/2105.11591
Cites Work
- A smoothed least squares estimator for threshold regression models
- The median of the Poisson distribution
- Risk bounds for statistical learning
- Concentration inequalities and model selection. École d'Été de Probabilités de Saint-Flour XXXIII -- 2003
- Risk bounds for model selection via penalization
- Change point estimation using nonparametric regression
- Adaptive estimation of the intensity of inhomogeneous Poisson processes via concentration inequalities
- Minimax estimation of sharp change points
- Weak convergence and empirical processes. With applications to statistics
- Multi-threshold accelerated failure time model
- Convergence rates of least squares regression estimators with heavy-tailed errors
- Simultaneous analysis of Lasso and Dantzig selector
- Change-point estimation under adaptive sampling
- Introduction to empirical processes and semiparametric inference
- Inference under right censoring for transformation models with a change-point based on a covariate threshold
- On robust regression with high-dimensional predictors
- A New Lower Bound for Multiple Hypothesis Testing
- A Smoothed Maximum Score Estimator for the Binary Response Model
- High-Dimensional Classification by Sparse Logistic Regression
- On the Uniform Convergence of the Frequencies of Occurrence of Events to Their Probabilities
- Model selection and error estimation