Dynamic programming principle of control systems on manifolds and its relations to maximum principle
DOI: 10.1016/j.jmaa.2015.09.014 · zbMath: 1327.49042 · OpenAlex: W1429453323 · MaRDI QID: Q890544
Publication date: 10 November 2015
Published in: Journal of Mathematical Analysis and Applications
Full work available at URL: https://doi.org/10.1016/j.jmaa.2015.09.014
Keywords: Hamilton-Jacobi-Bellman equation; viscosity solution; Riemannian manifold; Pontryagin's maximum principle; dynamic programming principle
MSC classifications: Dynamic programming in optimal control and differential games (49L20); Control/observation systems governed by ordinary differential equations (93C15); Viscosity solutions to Hamilton-Jacobi equations in optimal control and differential games (49L25); Optimality conditions for problems involving ordinary differential equations (49K15); Manifolds and measure-geometric topics (49Q99)
Cites Work
- Viscosity solutions to second order partial differential equations on Riemannian manifolds
- New results on the relationship between dynamic programming and the maximum principle
- Maximum principle, dynamic programming and their connection in deterministic control
- A proof of Pontryagin's minimum principle using dynamic programming
- Optimal control problems on manifolds: A dynamic programming approach
- Semicontinuous solutions of Hamilton-Jacobi-Bellman equations with degenerate state constraints
- Control theory from the geometric viewpoint
- Discontinuous solutions of Hamilton-Jacobi-Bellman equation under state constraints
- Existence of neighboring feasible trajectories: applications to dynamic programming for state-constrained optimal control problems
- Value functions and transversality conditions for infinite-horizon optimal control problems
- On relations of the adjoint state to the value function for optimal control problems with state constraints
- The optimal control related to Riemannian manifolds and the viscosity solutions to Hamilton-Jacobi-Bellman equations
- A note on the value function for constrained control problems
- Deterministic state-constrained optimal control problems without controllability assumptions
- Sensitivity Interpretations of the Costate Variable for Optimal Control Problems with State Constraints
- Hamilton-Jacobi Equations with State Constraints
- Optimal Control with State-Space Constraint I
- Viscosity Solutions of Hamilton-Jacobi Equations
- The Relationship between the Maximum Principle and Dynamic Programming
- The Pontryagin Maximum Principle From Dynamic Programming and Viscosity Solutions to First-Order Partial Differential Equations
- Riemannian Geometry
- A Connection Between the Maximum Principle and Dynamic Programming for Constrained Control Problems