Subsampling for heteroskedastic time series
DOI: 10.1016/S0304-4076(97)86569-4 · zbMath: 0904.62059 · OpenAlex: W2059047279 · Wikidata: Q60962280 · Scholia: Q60962280 · MaRDI QID: Q1372916
Joseph P. Romano, Michael Wolf, Dimitris N. Politis
Publication date: 4 November 1997
Published in: Journal of Econometrics
Full work available at URL: https://doi.org/10.1016/s0304-4076(97)86569-4
Keywords: time series; subsampling; heteroskedasticity; moving blocks bootstrap; central limit theorem for triangular arrays
Applications of statistics to economics (62P20); Time series, auto-correlation, regression, etc. in statistics (GARCH) (62M10); Asymptotic properties of nonparametric inference (62G20); Linear regression; mixed models (62J05); Nonparametric tolerance and confidence regions (62G15); Nonparametric statistical resampling methods (62G09)
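The keywords name the paper's central technique: subsampling inference for a time series statistic under heteroskedasticity. The following is a minimal illustrative sketch of the generic subsampling recipe (recompute the statistic on overlapping blocks and invert the empirical distribution of the centered, rescaled subsample statistics), not the paper's own implementation; the simulated data, block length, and AR parameters are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a heteroskedastic series: AR(1) with slowly varying noise scale
# (illustrative data only; these parameters are assumptions).
n = 500
sigma = 1.0 + 0.5 * np.sin(2 * np.pi * np.arange(n) / n)  # time-varying volatility
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + sigma[t] * rng.standard_normal()

# Subsampling CI for the mean: recompute the statistic on every
# overlapping block of length b, and use the empirical distribution of
# sqrt(b) * (theta_b - theta_n) to approximate that of
# sqrt(n) * (theta_n - theta).
b = 50                                  # block length: b -> inf, b/n -> 0
theta_n = x.mean()
sub_means = np.array([x[i:i + b].mean() for i in range(n - b + 1)])
roots = np.sqrt(b) * (sub_means - theta_n)

alpha = 0.05
q_lo, q_hi = np.quantile(roots, [alpha / 2, 1 - alpha / 2])
ci = (theta_n - q_hi / np.sqrt(n), theta_n - q_lo / np.sqrt(n))
print(f"mean estimate: {theta_n:.3f}, 95% subsampling CI: ({ci[0]:.3f}, {ci[1]:.3f})")
```

The appeal of subsampling over full-resample bootstraps is that each block is a genuine stretch of the observed process, so its validity needs only weak conditions (a limit law for the statistic plus mixing), which is what makes it robust to the heteroskedasticity the paper studies.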
Cites Work
- The Pricing of Options and Corporate Liabilities
- Heteroskedasticity and Autocorrelation Consistent Covariance Matrix Estimation
- On bootstrapping two-stage least-squares estimates in stationary linear models
- Bootstrap methods in statistics
- The use of subseries values for estimating the variance of a general statistic from a stationary sequence
- Edgeworth correction by bootstrap in autoregressions
- Bootstrap procedures under some non-i.i.d. models
- Bootstrapping regression models
- On the asymptotic accuracy of Efron's bootstrap
- On bootstrapping kernel spectral estimates
- A general resampling scheme for triangular arrays of \(\alpha\)-mixing random variables with application to the problem of spectral density estimation
- Bootstrap methods: another look at the jackknife
- Mixing: Properties and examples
- Blockwise bootstrapped empirical process for stationary sequences
- The moving blocks bootstrap and robust inference for linear least squares and quantile regressions
- Block length selection in the bootstrap for time series
- Jackknife, bootstrap and other resampling methods in regression analysis
- The jackknife and the bootstrap for general stationary observations
- Large sample confidence regions based on subsamples under minimal assumptions
- A Central Limit Theorem and a Strong Mixing Condition
- Some Limit Theorems for Random Functions. I
- Nonlinear Regression with Dependent Observations
- Calibrating Confidence Coefficients
- An Improved Heteroskedasticity and Autocorrelation Consistent Covariance Matrix Estimator
- The Stationary Bootstrap
- On blocking rules for the bootstrap with dependent data
- Contributions to Central Limit Theory for Dependent Variables
- The Invariance Principle for Stationary Processes
- Functional central limit theorems for strictly stationary processes satisfying the strong mixing condition
- Some Limit Theorems for Stationary Processes
- The bootstrap and Edgeworth expansion