Positional solutions of Hamilton-Jacobi equations in control problems for discrete-continuous systems
DOI: 10.1134/S0005117911060051 · zbMath: 1230.49017 · OpenAlex: W2046286725 · MaRDI QID: Q647782
Vladimir Aleksandrovich Dykhta, Stepan Pavlovich Sorokin
Publication date: 24 November 2011
Published in: Automation and Remote Control
Full work available at URL: https://doi.org/10.1134/s0005117911060051
Keywords: sufficient optimality conditions; Hamilton-Jacobi inequalities; canonical global optimality theory; discrete-continuous (hybrid, impulse) systems
MSC classification: Nonlinear systems in control theory (93C10); Optimality conditions for problems involving ordinary differential equations (49K15); Control/observation systems governed by functional relations other than differential equations (such as hybrid and switching systems) (93C30)
Cites Work
- Semiconcave functions, Hamilton-Jacobi equations, and optimal control
- Optimal control: nonlocal conditions, computational methods, and the variational principle of maximum
- Exact description of reachable sets and global optimality conditions for dynamic systems
- Linear Lyapunov-Krotov functions and sufficient conditions for optimality in the form of the maximum principle
- Lyapunov-Krotov inequality and sufficient conditions in optimal control
- Nonconvex minimization problems
- Optimal control and viscosity solutions of Hamilton-Jacobi-Bellman equations
- Optimal control
- Liapunov functions and stability in control theory
- High-order conditions for a minimum on a set of sequences in the abstract problem with inequality and equality constraints