Computational Method for Jackknifed Generalized Ridge Tuning Parameter based on Generalized Maximum Entropy
From MaRDI portal
Publication:3168356
DOI: 10.1080/03610918.2011.600503
zbMath: 1271.62155
OpenAlex: W1986394823
MaRDI QID: Q3168356
Publication date: 30 October 2012
Published in: Communications in Statistics - Simulation and Computation
Full work available at URL: https://doi.org/10.1080/03610918.2011.600503
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Linear regression; mixed models (62J05)
- Statistical aspects of information-theoretic topics (62B10)
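The classifications above place the paper in ridge regression and tuning-parameter selection. As background only, here is a minimal sketch of the ordinary ridge estimator with a jackknife-style (leave-one-out) choice of the ridge parameter k; this is a generic illustration, not the paper's generalized-maximum-entropy method, and the function names, data, and candidate grid are all invented for the example.

```python
import numpy as np

def ridge_coef(X, y, k):
    """Ordinary ridge estimator: (X'X + k I)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

def jackknife_choose_k(X, y, ks):
    """Pick the k from a candidate grid that minimizes
    leave-one-out squared prediction error."""
    n = X.shape[0]
    best_k, best_err = None, np.inf
    for k in ks:
        err = 0.0
        for i in range(n):
            mask = np.arange(n) != i          # drop observation i
            b = ridge_coef(X[mask], y[mask], k)
            err += (y[i] - X[i] @ b) ** 2     # prediction error on the held-out point
        if err < best_err:
            best_k, best_err = k, err
    return best_k

# Toy example with nearly collinear predictors, the setting where ridge helps.
rng = np.random.default_rng(0)
z = rng.normal(size=50)
X = np.column_stack([z + 0.01 * rng.normal(size=50),
                     z + 0.01 * rng.normal(size=50)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=50)
k = jackknife_choose_k(X, y, ks=[0.0, 0.01, 0.1, 1.0])
print("chosen k:", k)
```

The grid search over `ks` is the simplest possible selector; the paper's contribution is a computational method for this choice based on generalized maximum entropy, which is not reproduced here.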
Related Items (2)
- A jackknifed ridge estimator in probit regression model
- Ridge Regression and Generalized Maximum Entropy: An improved version of the Ridge–GME parameter estimator
Cites Work
- Applications of the jackknife procedure in ridge regression
- A weighted generalized maximum entropy estimator with a data-driven weight
- Comparative statics of the generalized maximum entropy estimator of the general linear model
- On the Choice of the Ridge Parameter: A Maximum Entropy Approach
- Generalized Maximum Entropy Estimators: Applications to the Portland Cement Dataset
- NOTES ON BIAS IN ESTIMATION
- On the almost unbiased ridge regression estimator
- On Some Ridge Regression Estimators: An Empirical Comparisons
- Penalized Maximum Likelihood Principle for Choosing Ridge Parameter
- Ridge regression: some simulations
- A Monte Carlo Evaluation of Some Ridge-Type Estimators
- A simulation study of ridge and other regression estimators
- Mean Squared Error Matrix Comparisons of Some Biased Estimators in Linear Regression
- Choosing Ridge Parameter for Regression Problems
- Performance of Some New Ridge Regression Estimators
- Characterization of Ridge Trace Behavior
- Imposing parameter inequality restrictions using the principle of maximum entropy