A differential game of unlimited duration
DOI: 10.1016/0021-8928(87)90077-3
zbMath: 0664.90104
OpenAlex: W2091580027
MaRDI QID: Q1115825
R. A. Adiatulina, Alexander Tarasyev
Publication date: 1987
Published in: Journal of Applied Mathematics and Mechanics
Full work available at URL: https://doi.org/10.1016/0021-8928(87)90077-3
Keywords: discrete approximation; differential inequalities; value function; directional derivative; continuous-time; conjugate derivatives; depreciating performance functional; stationary Hamilton-Jacobi equation; unlimited duration; viscous solutions
MSC classification
- Differential games (aspects of game theory) (91A23)
- Nonlinear systems in control theory (93C10)
- Control/observation systems governed by ordinary differential equations (93C15)
Related Items
- The solution of evolutionary games using the theory of Hamilton-Jacobi equations
- Numerical methods for construction of value functions in optimal control problems on an infinite horizon
- Approximation and regular perturbation of optimal control problems via Hamilton-Jacobi theory
- Differential games, partial-state stabilization, and model reference adaptive control
- Stability properties of the value function in an infinite horizon optimal control problem
- Estimate for the accuracy of a backward procedure for the Hamilton-Jacobi equation in an infinite-horizon optimal control problem
Cites Work
- Approximate solutions of the Bellman equation of deterministic control theory
- Properties of a differential game's potential
- On a discrete approximation of the Hamilton-Jacobi equation of dynamic programming
- Some Properties of Viscosity Solutions of Hamilton-Jacobi Equations
- Differential Games, Optimal Control and Directional Derivatives of Viscosity Solutions of Bellman’s and Isaacs’ Equations
- Viscosity Solutions of Hamilton-Jacobi Equations