ME 601 Optimal Control
After a short review of static optimization and numerical methods for static optimization problems, students will be introduced to the principle of optimality and the Hamilton-Jacobi-Bellman equation in the context of dynamic programming. The calculus of variations will be studied in detail, with emphasis on necessary and sufficient conditions for an extremum. Constrained problems and Pontryagin's maximum principle will be discussed, along with the formulation of optimal control problems and performance measures. Special attention will be paid to linear quadratic regulator/tracking, minimum-time, and minimum control-effort problems. Finally, optimal controllers will be synthesized using direct and indirect numerical techniques.
SU Credits : 3.000
ECTS Credit : 10.000
Prerequisite : -
Corequisite : -