Discrete time schemes for optimal control problems with monotone controls
Publication: 747188
DOI: 10.1007/s40314-014-0149-4
zbMath: 1325.49034
arXiv: 1407.1790
OpenAlex: W2079464252
MaRDI QID: Q747188
Lisandro A. Parente, Laura S. Aragone, Eduardo A. Philipp
Publication date: 23 October 2015
Published in: Computational and Applied Mathematics
Full work available at URL: https://arxiv.org/abs/1407.1790
- Existence theories for optimal control problems involving ordinary differential equations (49J15)
- Discrete approximations in optimal control (49M25)
Related Items (1)
Cites Work
- Approximate solutions of the Bellman equation of deterministic control theory
- On a discrete approximation of the Hamilton-Jacobi equation of dynamic programming
- Optimal control problems with no turning back
- Some Properties of Viscosity Solutions of Hamilton-Jacobi Equations
- Viscosity Solutions for the Monotone Control Problem
- Viscosity Solutions of Hamilton-Jacobi Equations
- Minimizing a Quadratic Payoff with Monotone Controls
- Optimal control and viscosity solutions of Hamilton-Jacobi-Bellman equations
- A Bellman's equation for minimizing the maximum cost