Is $L^2$ Physics-Informed Loss Always Suitable for Training Physics-Informed Neural Network?
Publication: 6401133
arXiv: 2206.02016
MaRDI QID: Q6401133
Author name not available
Publication date: 4 June 2022
Abstract: The Physics-Informed Neural Network (PINN) approach is a new and promising way to solve partial differential equations using deep learning. The $L^2$ Physics-Informed Loss is the de-facto standard in training Physics-Informed Neural Networks. In this paper, we challenge this common practice by investigating the relationship between the loss function and the approximation quality of the learned solution. In particular, we leverage the concept of stability from the partial differential equation literature to study the asymptotic behavior of the learned solution as the loss approaches zero. With this concept, we study an important class of high-dimensional non-linear PDEs in optimal control, the Hamilton-Jacobi-Bellman (HJB) equation, and prove that for a general $L^p$ Physics-Informed Loss, a wide class of HJB equations is stable only if $p$ is sufficiently large. Therefore, the commonly used $L^2$ loss is not suitable for training PINNs on those equations, while the $L^\infty$ loss is a better choice. Based on this theoretical insight, we develop a novel PINN training algorithm that minimizes the $L^\infty$ loss for HJB equations, in a similar spirit to adversarial training. The effectiveness of the proposed algorithm is empirically demonstrated through experiments. Our code is released at https://github.com/LithiumDA/L_inf-PINN.
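To make the adversarial-training flavour of the approach concrete, the following is a minimal, hypothetical sketch of $L^\infty$-style PINN training on a toy 1D Poisson problem. It is not the authors' implementation (see the linked repository for that); the network architecture, step sizes, and the inner gradient-ascent search over collocation points are illustrative assumptions.

```python
# Hypothetical sketch of adversarial-style L^inf PINN training on a toy
# 1D Poisson problem u''(x) = f(x) on [0, 1] with u(0) = u(1) = 0.
# Not the authors' implementation; see https://github.com/LithiumDA/L_inf-PINN.
import math
import torch
import torch.nn as nn

torch.manual_seed(0)
net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(),
                    nn.Linear(64, 64), nn.Tanh(),
                    nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def f(x):
    # Source term chosen so that u(x) = sin(pi x) is the exact solution.
    return -(math.pi ** 2) * torch.sin(math.pi * x)

def residual(x):
    # PDE residual r(x) = u''(x) - f(x), with derivatives from autograd.
    x = x.requires_grad_(True)
    u = net(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
    return d2u - f(x)

x_boundary = torch.tensor([[0.0], [1.0]])
for step in range(2000):
    # Inner loop: gradient ascent on the collocation points to approximate
    # the worst-case residual, in the spirit of adversarial example search.
    x_adv = torch.rand(128, 1)
    for _ in range(5):
        x_adv = x_adv.detach().requires_grad_(True)
        worst = residual(x_adv).abs().sum()
        grad_x = torch.autograd.grad(worst, x_adv)[0]
        x_adv = (x_adv + 0.01 * grad_x.sign()).clamp(0.0, 1.0)

    # Outer step: minimize the (approximate) L^inf residual plus a
    # boundary-condition penalty.
    loss = residual(x_adv.detach()).abs().max() + (net(x_boundary) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The toy problem above only illustrates the min-max structure (inner maximization over collocation points, outer minimization of the worst-case residual); the paper's algorithm targets high-dimensional HJB equations, whose residual involves the Hamilton-Jacobi-Bellman operator.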
Has companion code repository: https://github.com/lithiumda/l_inf-pinn