A Proof that Artificial Neural Networks Overcome the Curse of Dimensionality in the Numerical Approximation of Black–Scholes Partial Differential Equations
DOI: 10.1090/memo/1410
OpenAlex: W2890889625
MaRDI QID: Q5889064
Philipp Grohs, Fabian Hornung, Arnulf Jentzen, Philippe von Wurstemberger
Publication date: 26 April 2023
Published in: Memoirs of the American Mathematical Society
Full work available at URL: https://arxiv.org/abs/1809.02362
Mathematics Subject Classification:
- Research exposition (monographs, survey articles) pertaining to numerical analysis (65-02)
- Stochastic analysis (60Hxx)
- Numerical approximation and computational geometry (primarily algorithms) (65D99)
- Parabolic equations and parabolic systems (35Kxx)
Cites Work
- Breaking the curse of dimensionality in sparse polynomial approximation of parametric PDEs
- Deep learning-based numerical methods for high-dimensional parabolic partial differential equations and backward stochastic differential equations
- Infinite-dimensional quadrature and approximation of distributions
- Geometric theory of semilinear parabolic equations
- Monte Carlo complexity of global solution of integral equations
- Approximation and estimation bounds for artificial neural networks
- Multilayer feedforward networks are universal approximators
- Nesting Monte Carlo for high-dimensional non-linear PDEs
- Provable approximation properties for deep neural networks
- The Deep Ritz Method: a deep learning-based numerical algorithm for solving variational problems
- Degree of approximation by neural and translation networks with a single hidden layer
- Approximation of functions and their derivatives: A neural network implementation with applications
- Existence, uniqueness, and regularity for stochastic evolution equations with irregular initial values
- Topological properties of the set of functions generated by neural networks of fixed size
- Solving the Kolmogorov PDE by means of deep learning
- Proof that deep artificial neural networks overcome the curse of dimensionality in the numerical approximation of Kolmogorov partial differential equations with constant diffusion and nonlinear drift coefficients
- Multilevel Picard iterations for solving smooth semilinear parabolic heat equations
- DNN expression rate analysis of high-dimensional PDEs: application to option pricing
- A theoretical analysis of deep neural networks and parametric PDEs
- Universal approximations of invariant maps by neural networks
- Deep neural network approximations for solutions of PDEs based on Monte Carlo algorithms
- Optimal approximation of piecewise smooth functions using deep ReLU neural networks
- A proof that rectified deep neural networks overcome the curse of dimensionality in the numerical approximation of semilinear heat equations
- Error bounds for approximations with deep ReLU networks
- A machine learning framework for data driven acceleration of computations of differential equations
- Asymptotic expansion as prior knowledge in deep learning method for high dimensional BSDEs
- Machine learning approximation algorithms for high-dimensional fully nonlinear partial differential equations and second-order backward stochastic differential equations
- Loss of regularity for Kolmogorov equations
- Stochastic simulation and Monte Carlo methods. Mathematical foundations of stochastic simulation
- Runge-Kutta schemes for backward stochastic differential equations
- A regression-based Monte Carlo method to solve backward stochastic differential equations
- Space-time error estimates for deep neural network approximations for differential equations
- Stratified Regression Monte-Carlo Scheme for Semilinear PDEs and BSDEs with Large Scale Parallelization on GPUs
- Deep vs. shallow networks: An approximation theory perspective
- Multilevel Monte Carlo Path Simulation
- Viscosity Solutions of Hamilton-Jacobi Equations
- User’s guide to viscosity solutions of second order partial differential equations
- Universal approximation bounds for superpositions of a sigmoidal function
- Neural Networks for Localized Approximation
- Foundations of Modern Probability
- Convergence in Hölder norms with applications to Monte Carlo methods in infinite dimensions
- Solving high-dimensional partial differential equations using deep learning
- Solving parametric PDE problems with artificial neural networks
- Optimal Approximation with Sparsely Connected Deep Neural Networks
- Rectified deep neural networks overcome the curse of dimensionality for nonsmooth value functions in zero-sum games of nonlinear stiff systems
- Overcoming the curse of dimensionality in the numerical approximation of semilinear parabolic partial differential equations
- Decoupling on the Wiener Space, Related Besov Spaces, and Applications to BSDEs
- A general reinforcement learning algorithm that masters chess, shogi, and Go through self-play
- Approximation of high-dimensional parametric PDEs
- Numerical solution of parabolic equations in high dimensions
- Breaking the Curse of Dimensionality with Convex Neural Networks
- Deep optimal stopping
- A logical calculus of the ideas immanent in nervous activity
- Probability theory. A comprehensive course
- Approximation by superpositions of a sigmoidal function
- Error bounds for approximation with neural networks
- Stochastic Equations in Infinite Dimensions