Selective inference for additive and linear mixed models
Publication: 2072385
DOI: 10.1016/j.csda.2021.107350
OpenAlex: W3199655981
Wikidata: Q108863848
Scholia: Q108863848
MaRDI QID: Q2072385
Philipp F. M. Baumann, David Rügamer, Sonja Greven
Publication date: 26 January 2022
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://arxiv.org/abs/2007.07930
Related Items (3)
- Uniformly valid inference based on the Lasso in linear mixed models
- Mixed-effect models with trees
- Semiparametric Regression with R
Uses Software
Cites Work
- Exact post-selection inference, with application to the Lasso
- Valid post-selection inference
- Flexible smoothing with \(B\)-splines and penalties. With comments and a rejoinder by the authors
- Selective inference after likelihood- or test-based model selection in linear models
- Uniform asymptotic inference and the bootstrap after model selection
- Inference for \(L_2\)-boosting
- A unifying approach to the estimation of the conditional Akaike information in generalized linear mixed models
- On generalized degrees of freedom with application in linear mixed models selection
- Coverage Properties of Confidence Intervals for Generalized Additive Model Components
- On the behaviour of marginal and conditional AIC in linear mixed models
- Robust linear mixed models using the skew t distribution with application to schizophrenia data
- Asymptotic post-selection inference for the Akaike information criterion