Nonlinear time series. Theory, methods and applications with R examples (Q2871232)

From MaRDI portal

scientific article; zbMATH DE number 6248992

    Statements

    22 January 2014
    nonlinear time series
    inference
    Markov processes
    Nonlinear time series. Theory, methods and applications with R examples (English)
    The book is designed for researchers and postgraduate students who want to acquire advanced knowledge in nonlinear time series and their applications.

    The text is divided into four parts: Foundations; Markovian models; State space and hidden Markov models; and four appendices. Foundations comprises four chapters.

    In Chapter 1 the authors briefly review some aspects of stochastic processes and linear time series models, in both the univariate and multivariate cases. The chapter is a good review of stationarity, linear processes and Gaussianity.

    Chapter 2 focuses on the linear Gaussian state space model. The Kalman filter is treated in detail, and there are several interesting application examples.

    Chapter 3 is entitled ``Beyond linear models''. Its goal is to provide an introduction to nonlinear processes and models that may be used for data analysis. Real data examples illustrate why such models are needed.

    Chapter 4 deals with stochastic difference equations, also referred to as random coefficient autoregressions. This chapter gives a general framework for establishing properties of a variety of nonlinear models.

    Chapters 5 to 8, which form Part II, are dedicated to Markov models. Chapter 5 focuses on the construction of Markov models; the constructions are illustrated with several important time series models.

    Chapter 6 is entitled ``Stability and convergence''. The authors examine different types of ergodicity for Markov chains. The first is uniform ergodicity, the oldest and strongest form of convergence in the study of Markov chains. Since several important applications require results beyond uniform ergodicity, the chapter also analyzes V-geometric ergodicity, which constitutes the most useful framework for nonlinear time series.

    Chapter 7 analyzes whether the law of large numbers and the Lindeberg-Lévy central limit theorem extend to Markov chain models; the analysis focuses on uniformly ergodic Markov chains.

    Chapter 8 deals with inference for Markovian models. Consistency and asymptotic normality of the MLE of the parameter, also under misspecified models, are considered, and three examples of the Bayesian analysis of time series models are included. The approach taken to study the asymptotic distribution of the MLE uses only the most standard tools of asymptotic statistics.

    Chapters 9 to 13 form Part III. Chapter 9 is dedicated to non-Gaussian and nonlinear state space models. The nonlinear state space model (NLSS), or hidden Markov model (HMM), keeps the structure of the Gaussian linear state space model but removes the limitations of linearity and Gaussianity. The chapter considers several HMMs: finite-valued state spaces, nonlinear Gaussian state space models, and conditionally Gaussian state space models, among others.

    Chapter 10 is entitled ``Particle filtering'' and summarizes sequential Monte Carlo (SMC) methods. SMC refers to a class of methods designed to approximate a sequence of probability distributions by a set of particles, each carrying a non-negative weight and updated recursively. SMC methods combine the sequential importance sampling method with sampling importance resampling algorithms.

    Chapter 11 focuses on particle smoothing, that is, approximating the conditional distribution of the state \(X(t)\) at time \(t\) given all of the available observations up to time \(n\). The smoothing distribution may be thought of as a correction, or update, of the filter distribution that is enhanced by the additional observations from time \(t+1\) to \(n\).

    Chapter 12 deals with inference for NLSS. Maximum likelihood estimation and Bayesian inference for general NLSS are considered. The likelihood and its gradient are seldom available in closed form, and Monte Carlo techniques are required to approximate these quantities. Unlike gradient-based methods, the EM approach is designed for maximizing likelihood functions in incomplete data models, which makes it clearly of interest for MLE in NLSS.

    Chapter 13 is dedicated to the asymptotics of the MLE for nonlinear state space models. Strong consistency in misspecified and well-specified models is analyzed.

    The book concludes with four appendices: Some mathematical background, Martingales, Stochastic approximation, and Data augmentation; the last is closely related to the EM algorithm. This book is very suitable for mathematicians requiring a rigorous and complete introduction to nonlinear time series and their applications in several fields.
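The SMC recursion summarized in the review (propagate the particles through the state transition, reweight them by the observation density, then resample) can be sketched as a bootstrap particle filter. The following is a minimal Python illustration for a hypothetical scalar linear-Gaussian toy model; the function name, the model, and its parameters are illustrative assumptions, not taken from the book.

```python
import numpy as np

def bootstrap_filter(y, n_particles=1000, phi=0.9, sigma_x=1.0, sigma_y=1.0, seed=0):
    """Bootstrap particle filter for a hypothetical toy model (not from the book):
        X_t = phi * X_{t-1} + sigma_x * V_t,   Y_t = X_t + sigma_y * W_t,
    with V_t, W_t i.i.d. standard normal.
    Returns the filtered means E[X_t | Y_1, ..., Y_t].
    """
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, sigma_x, size=n_particles)  # initial particle cloud
    means = np.empty(len(y))
    for t, yt in enumerate(y):
        # 1. Propagate: sample each particle from the state transition.
        x = phi * x + sigma_x * rng.normal(size=n_particles)
        # 2. Reweight: non-negative importance weights proportional to the
        #    Gaussian observation density of yt given each particle.
        logw = -0.5 * ((yt - x) / sigma_y) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means[t] = np.sum(w * x)
        # 3. Resample (sampling importance resampling): draw a new particle
        #    set with probabilities proportional to the weights.
        x = rng.choice(x, size=n_particles, p=w)
    return means
```

Each pass through the loop is one recursive update of the weighted particle approximation; the resampling step keeps the particle system from degenerating to a few dominant weights.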
