The EAS approach for graphical selection consistency in vector autoregression models
DOI: 10.1002/cjs.11726
arXiv: 1906.04812
OpenAlex: W2951104656
MaRDI QID: Q6059467
Yuying Xie, Jan Hannig, Jonathan P. Williams
Publication date: 2 November 2023
Published in: Canadian Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1906.04812
Keywords: empirical Bayes, generalized fiducial inference, large-sample properties, high-dimensional model selection, graph selection
Cites Work
- Regularized estimation in sparse high-dimensional time series models
- Bayesian variable selection with shrinking and diffusing priors
- The pseudo-marginal approach for efficient Monte Carlo computations
- Best subset selection via a modern optimization lens
- Bayesian stochastic search for VAR model restrictions
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- High-dimensional regression with noisy and missing data: provable guarantees with nonconvexity
- Strong selection consistency of Bayesian vector autoregressive models based on a pseudo-likelihood approach
- \(\ell_1\)-regularization of high-dimensional time-series models with non-Gaussian and heteroskedastic errors
- The formal definition of reference priors
- Fiducial theory and optimal inference
- Lasso guarantees for \(\beta \)-mixing heavy-tailed time series
- Estimating limits from Poisson counting data using Dempster-Shafer analysis
- Spectral analysis of high-dimensional time series
- Convex regularization for high-dimensional multiresponse tensor regression
- Nonpenalized variable selection in high-dimensional linear model settings via generalized fiducial inference
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\) (with discussions and rejoinder)
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Confidence, Likelihood, Probability
- Square‐Root LASSO for High‐Dimensional Sparse Linear Systems with Weakly Dependent Errors
- Low Rank and Structured Modeling of High-Dimensional Vector Autoregressions
- Bayesian Model Selection in High-Dimensional Settings
- Confidence Distribution, the Frequentist Distribution Estimator of a Parameter: A Review
- Learning High-Dimensional Generalized Linear Autoregressive Models
- High-Dimensional Posterior Consistency in Bayesian Vector Autoregressive Models
- Fiducial and Confidence Distributions for Real Exponential Families
- Oracle M‐Estimation for Time Series Models
- Inequalities for Gamma Function Ratios
- The p-value Function and Statistical Inference
- Joint Structural Break Detection and Parameter Estimation in High-Dimensional Nonstationary VAR Models
- A Gibbs Sampler for a Class of Random Convex Polytopes