Ridge Regularization: An Essential Concept in Data Science
Publication: 6636560
DOI: 10.1080/00401706.2020.1791959
MaRDI QID: Q6636560
Publication date: 12 November 2024
Published in: Technometrics
Cites Work
- Matrix Completion and Low-Rank SVD via Fast Alternating Least Squares
- Computer Age Statistical Inference
- Generalized Cross-Validation as a Method for Choosing a Good Ridge Parameter
- Reconciling modern machine-learning practice and the classical bias–variance trade-off
- Regularization and Variable Selection Via the Elastic Net
- Model Selection and Estimation in Regression with Grouped Variables
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- Efficient quadratic regularization for expression arrays
- Random forests
- The Elements of Statistical Learning
Related Items (7)
- High-dimensional regression coefficient estimation by nuclear norm plus \(l_1\) norm penalization
- Screen then select: a strategy for correlated predictors in high-dimensional quantile regression
- Regularized parametric survival modeling to improve risk prediction models
- A tutorial on individualized treatment effect prediction from randomized trials with a binary endpoint
- Can't Ridge Regression Perform Variable Selection?
- Comment: Regularization via Bayesian Penalty Mixing
- Comment: Ridge Regression—Still Inspiring After 50 Years