A Necessary and Sufficient Condition for Optimality of Dynamic Programming Type, Making No a Priori Assumptions on the Controls
Publication: 4173875
DOI: 10.1137/0316038 · zbMath: 0392.49011 · OpenAlex: W2057141338 · MaRDI QID: Q4173875
R. M. Lewis, Richard B. Vinter
Publication date: 1978
Published in: SIAM Journal on Control and Optimization
Full work available at URL: https://doi.org/10.1137/0316038
Keywords: Lagrangian; Dynamic Programming; Necessary and Sufficient Condition; Bellman Partial Differential Equation; Reachable Set; Relaxed Controls
Optimality conditions for problems involving ordinary differential equations (49K15) Hamilton-Jacobi theories (49L99)
Related Items (11)
Hamilton–Jacobi theory for a generalized optimal stopping time problem
A partial history of the early development of continuous-time nonlinear stochastic systems theory
Extension of multidimensional control problems and duality [Erweiterung von mehrdimensionalen Steuerungsproblemen und Dualität]
Generalized Bellman-Hamilton-Jacobi optimality conditions for a control problem with a boundary condition
A supporting hyperplane derivation of the Hamilton-Jacobi-Bellman equation of dynamic programming
Relaxation of optimal control problems to equivalent convex programs
Fundamental theorem for optimal output feedback problem with quadratic performance criterion
The Semigroup Property of Value Functions in Lagrange Problems
Bounding Extreme Events in Nonlinear Dynamics Using Convex Optimization
Optimal control of piecewise deterministic Markov processes
Convex duality for finite-fuel problems in singular stochastic control