Unconstrained recursive importance sampling (Q988764)

From MaRDI portal
scientific article

    Statements

    Unconstrained recursive importance sampling (English)
    18 August 2010
    A basic problem in numerical probability is to optimize the Monte Carlo computation of real quantities that admit a probabilistic representation as an expectation. The present paper proposes an unconstrained stochastic approximation method for finding the optimal change of measure that reduces the variance of Monte Carlo simulations. The scalar or process parameters are selected by a classical Robbins-Monro procedure without projection or truncation. Convergence is proved for a large class of multidimensional distributions as well as for diffusion processes, and the efficiency of the resulting algorithm is illustrated numerically.

    In the Introduction the basic notions, statements, algorithms and results for both the finite-dimensional and the infinite-dimensional settings are recalled, and the classical Robbins-Monro procedure is presented. Section 2 investigates, in the finite-dimensional setting, importance sampling by translation for log-concave probability distributions and by the Esscher transform. Section 2.1 contains the main result of the paper, Theorem 1, a slight extension of the Robbins-Monro convergence theorem, together with a programme for applying it. In Section 2.2 Theorem 1 is applied to the Gaussian distribution. In Section 2.3 a self-controlled variant of the algorithm is applied to the case of overly dissymmetric functions; Theorem 2 defines, under the stated assumptions, a recursive procedure and establishes the convergence of the constructed random variables. In Section 2.4 the parametrized exponential change of measure, the Esscher transform, is considered as a way to design an importance sampling procedure; Theorem 3 again defines, under the stated assumptions, a recursive procedure and establishes its convergence.

    Section 3 introduces the functional version of the algorithm, based on the Girsanov theorem, for a \(d\)-dimensional Itô process given as the solution of a stochastic differential equation. Theorem 4 presents an algorithm for constructing a recursive sequence and proves its convergence. Section 4 comments on the practical implementation of importance sampling by translation in the finite-dimensional setting: Section 4.1 considers a purely adaptive approach to variance reduction, Section 4.2 discusses the weak rate of convergence in terms of a central limit theorem, and Section 4.3 briefly considers the extension to more general sets of parameters.

    Section 5 reports numerical experiments. In Section 5.1 the two variance-reducing approaches to recursive importance sampling, the Robbins-Monro procedure in the translation case and the Esscher transform, are compared and illustrated graphically. In Section 5.2 a Down and In Call option on the solution of a given diffusion equation is considered; three different bases of \(L^2\) (the Legendre polynomials, the Karhunen-Loève basis and the Haar basis) are used within the Black-Scholes model, and the obtained results are compared. The paper ends with an appendix containing the proof of Theorem 1.
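    The core idea, in the one-dimensional Gaussian translation case of Section 2, can be sketched as follows. For \(X\sim\mathcal N(0,1)\) one has \(E[F(X)]=E[F(X+\theta)\,e^{-\theta X-\theta^2/2}]\) for every \(\theta\), and the second moment of the translated estimator, \(Q(\theta)=E[F(X+\theta)^2 e^{-2\theta X-\theta^2}]\), satisfies \(Q'(\theta)=e^{\theta^2}\,E[F(X-\theta)^2(2\theta-X)]\) after two changes of variable. Since the last expectation contains no exponential weight in the integrand, a Robbins-Monro recursion driven by it can run without projection or truncation, which is the unconstrained feature of the paper. The Python sketch below is a reviewer-style illustration of this mechanism only: the payoff, step sizes and iteration counts are hypothetical and not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def F(x):
        # Hypothetical payoff (not from the paper): a smoothed digital option
        # centred at 2, so that most of the mass of F lies in a region rarely
        # visited under the standard normal law.
        return 1.0 / (1.0 + np.exp(-(x - 2.0)))

    def robbins_monro_theta(n_iter=200_000, c=0.5, alpha=0.6):
        # Unconstrained Robbins-Monro search for the variance-minimising
        # translation theta.  The descent direction
        #     H(theta, x) = F(x - theta)^2 * (2*theta - x)
        # has expectation exp(-theta^2) * Q'(theta), where Q is the second
        # moment of the translated estimator; it carries no exponential weight
        # in x, so no projection or truncation step is required.
        theta = 0.0
        for n in range(1, n_iter + 1):
            x = rng.standard_normal()
            gamma = c / n**alpha      # RM steps: sum infinite, sum of squares finite
            theta -= gamma * F(x - theta) ** 2 * (2.0 * theta - x)
        return theta

    def translated_estimator(theta, n_samples=100_000):
        # Importance-sampling estimator based on the translation identity
        #     E[F(X)] = E[F(X + theta) * exp(-theta*X - theta^2/2)],  X ~ N(0,1).
        x = rng.standard_normal(n_samples)
        y = F(x + theta) * np.exp(-theta * x - 0.5 * theta**2)
        return y.mean(), y.std(ddof=1) / np.sqrt(n_samples)

    theta_star = robbins_monro_theta()
    print("estimated translation parameter:", theta_star)
    print("crude Monte Carlo (theta = 0)  :", translated_estimator(0.0))
    print("importance sampling            :", translated_estimator(theta_star))

    In higher dimensions \(\theta\) and \(X\) are vectors and the recursion is unchanged in form; choosing the step sizes and the number of iterations well, which is left heuristic in this sketch, is precisely the practical concern behind the self-controlled variant of Section 2.3 and the implementation remarks of Section 4.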
    stochastic algorithm
    importance sampling
    Esscher transform
    NIG distribution
    barrier options
    finite-dimensional settings
    infinite-dimensional settings
    recursive procedures
    change of measure
    variance reduction
    basis in \(L^2\)
    Monte Carlo simulation
    convergence
    diffusion processes
    Robbins-Monro algorithm
    log-concave probability distributions
    convergence theorem
    Gaussian distribution
    Girsanov theorem
    Itô process
    stochastic differential equation
    central limit theorem
    numerical experiments
    Black-Scholes model
