Least-squares neural network (LSNN) method for scalar nonlinear hyperbolic conservation laws: discrete divergence operator
DOI: 10.1016/j.cam.2023.115298
arXiv: 2110.10895
MaRDI QID: Q6175199
Jingshuang Chen, Min Liu, Zhi-qiang Cai
Publication date: 21 July 2023
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://arxiv.org/abs/2110.10895
Keywords: least-squares method; discrete divergence operator; ReLU neural network; scalar nonlinear hyperbolic conservation law
MSC classifications:
Numerical methods for partial differential equations, initial value and time-dependent initial-boundary value problems (65Mxx)
Numerical methods for partial differential equations, boundary value problems (65Nxx)
Hyperbolic equations and hyperbolic systems (35Lxx)
Related Items (1)
Cites Work
- A maximum-principle preserving \(C^0\) finite element method for scalar conservation equations
- A class of discontinuous Petrov-Galerkin methods. I: The transport equation
- Efficient implementation of essentially nonoscillatory shock-capturing schemes
- Approximate Riemann solvers, parameter vectors, and difference schemes
- A posteriori error analysis for numerical approximations of Friedrichs systems
- A posteriori error analysis for stabilised finite element approximations of transport problems
- DGM: a deep learning algorithm for solving partial differential equations
- Adaptive two-layer ReLU neural network. I: Best least-squares approximation
- Adaptive two-layer ReLU neural network. II: Ritz approximation to elliptic PDEs
- Deep least-squares methods: an unsupervised learning-based numerical method for solving elliptic PDEs
- Least-squares ReLU neural network (LSNN) method for linear advection-reaction equation
- Self-adaptive deep neural network: numerical approximation to functions and PDEs
- Thermodynamically consistent physics-informed neural networks for hyperbolic systems
- A hybrid Hermite WENO scheme for hyperbolic conservation laws
- Physics-informed neural networks: a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
- Least-squares ReLU neural network (LSNN) method for scalar nonlinear hyperbolic conservation law
- Improved Least-squares Error Estimates for Scalar Hyperbolic Problems
- A Posteriori Error Estimation for Interior Penalty Finite Element Approximations of the Advection-Reaction Equation
- Riemann Problems and Jupyter Solutions
- Riemann Solvers, the Entropy Condition, and Difference Approximations
- TVB Runge-Kutta Local Projection Discontinuous Galerkin Finite Element Method for Conservation Laws II: General Framework
- On the Gibbs Phenomenon and Its Resolution
- Least-Squares Finite Element Methods and Algebraic Multigrid Solvers for Linear Hyperbolic PDEs
- DISCONTINUOUS GALERKIN METHODS FOR FIRST-ORDER HYPERBOLIC PROBLEMS
- Adaptive Petrov--Galerkin Methods for First Order Transport Equations
- Learning data-driven discretizations for partial differential equations
- Numerical Conservation Properties of H(div)-Conforming Least-Squares Finite Element Methods for the Burgers Equation