Neyman-Pearson classification: parametrics and sample size requirement
From MaRDI portal
Publication:4969044
zbMath 1497.62159 · arXiv 1802.02557 · MaRDI QID Q4969044
Xin Tong, Lucy Xia, Jiacheng Wang, Yang Feng
Publication date: 5 October 2020
Full work available at URL: https://arxiv.org/abs/1802.02557
Keywords: classification; adaptive splitting; linear discriminant analysis (LDA); asymmetric error; Neyman-Pearson (NP) paradigm; minimum sample size requirement; NP oracle inequalities; NP umbrella algorithm
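The keywords reference the NP umbrella algorithm and a minimum sample size requirement. As a hedged illustration (a sketch of the published umbrella algorithm's order-statistic threshold rule, not necessarily the exact procedure analyzed in this paper), the idea is: given n held-out classification scores from class 0, pick the k-th order statistic as the threshold, where k is the smallest value whose violation probability P(Binomial(n, 1 − α) ≥ k) is at most δ. This guarantees the type I error exceeds α with probability at most δ, and it fails to produce any threshold when n < log δ / log(1 − α), which is the minimum sample size requirement. The function and variable names below are illustrative, not from the paper.

```python
from math import comb

def violation_prob(n, k, alpha):
    # P(type I error of the k-th order-statistic threshold exceeds alpha)
    # equals P(Binomial(n, 1 - alpha) >= k).
    return sum(comb(n, j) * (1 - alpha) ** j * alpha ** (n - j)
               for j in range(k, n + 1))

def np_umbrella_threshold(class0_scores, alpha=0.05, delta=0.05):
    """Return a score threshold whose type I error exceeds alpha with
    probability at most delta (order-statistic rule of the NP umbrella
    algorithm; classify as class 1 when score > threshold)."""
    t = sorted(class0_scores)
    n = len(t)
    for k in range(1, n + 1):
        if violation_prob(n, k, alpha) <= delta:
            return t[k - 1]  # k-th order statistic (1-indexed)
    # No order statistic meets the (alpha, delta) guarantee:
    # this is the minimum sample size requirement in action.
    raise ValueError("sample too small for the requested (alpha, delta)")
```

For α = δ = 0.05 the smallest workable class-0 sample size is ⌈log 0.05 / log 0.95⌉ = 59, in which case only the largest score qualifies as the threshold; with 58 scores no valid threshold exists.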
Classification and discrimination; cluster analysis (statistical aspects) (62H30); Learning and adaptive systems in artificial intelligence (68T05)
Related Items (3)
Asymmetric Error Control Under Imperfect Supervision: A Label-Noise-Adjusted Neyman–Pearson Umbrella Algorithm ⋮ Unnamed Item ⋮ nproc
Uses Software
Cites Work
- Regularized linear discriminant analysis and its application in microarrays
- Sparse linear discriminant analysis by thresholding for high dimensional data
- A tail inequality for quadratic forms of subgaussian random vectors
- Smooth discrimination analysis
- Measuring mass concentrations and estimating density contour clusters -- An excess mass approach
- Variance and covariance inequalities for truncated joint normal distribution via monotone likelihood ratio and log-concavity
- Isotropic local laws for sample covariance and generalized Wigner matrices
- Penalized Classification using Fisher’s Linear Discriminant
- A direct approach to sparse discriminant analysis in ultra-high dimensions
- A Direct Estimation Approach to Sparse Linear Discriminant Analysis
- Analysis to Neyman-Pearson classification with convex loss function
- A Neyman–Pearson Approach to Statistical Learning
- Neyman-Pearson classification, convexity and stochastic constraints
- A Road to Classification in High Dimensional Space: The Regularized Optimal Affine Discriminant
- Random forests
This page was built for publication: Neyman-Pearson classification: parametrics and sample size requirement