Open Loop Optimal Feedback Control for Continuous Linear Stochastic Systems


Book Description

The control of a continuous-time linear system with parameters and disturbance represented by stochastic processes is studied. The optimal open-loop control is shown to be a linear function of the expected value of the initial condition vector, and the function specifying the control, the control generation matrix, is shown to be the solution to a Fredholm integral equation. A computational procedure for the control generation matrix is derived from results by Kagiwada and Kalaba on the solution of Fredholm integral equations. A closed-loop control law, the open-loop optimal feedback (OLOF) control, is derived from the optimal open-loop control, and its control generation matrix is shown to be the solution to a Volterra integral equation. For the time-invariant, infinite-time system, the OLOF control generation matrix is shown to be a constant matrix. Some examples are worked to demonstrate the OLOF control and to compare it with the optimal open-loop control.
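The abstract hinges on numerically solving a Fredholm integral equation of the second kind. As an illustration only (a plain Nyström discretization with trapezoidal weights, not the Kagiwada–Kalaba imbedding method the report actually uses; the kernel and forcing below are hypothetical test data), here is how such an equation can be solved:

```python
import numpy as np

def solve_fredholm_nystrom(kernel, g, a, b, n=200, lam=1.0):
    """Solve f(s) = g(s) + lam * integral_a^b kernel(s, t) f(t) dt
    by Nystrom discretization with trapezoidal quadrature weights."""
    s = np.linspace(a, b, n)
    w = np.full(n, (b - a) / (n - 1))
    w[0] *= 0.5
    w[-1] *= 0.5
    K = kernel(s[:, None], s[None, :])            # n x n kernel matrix
    # Collocating at the nodes gives the linear system (I - lam*K*W) f = g.
    f = np.linalg.solve(np.eye(n) - lam * K * w, g(s))
    return s, f

# Test kernel K(s,t) = s*t on [0,1], with g chosen so the exact solution
# is f(s) = s:  f(s) = (2/3) s + integral_0^1 s*t f(t) dt.
s, f = solve_fredholm_nystrom(lambda u, v: u * v,
                              lambda u: (2.0 / 3.0) * u, 0.0, 1.0)
```

The same collocation idea extends entrywise to matrix-valued equations such as the one the control generation matrix satisfies.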







Linear Stochastic Control Systems


Book Description

Linear Stochastic Control Systems presents a thorough description of the mathematical theory and fundamental principles of linear stochastic control systems, covering both continuous-time and discrete-time systems. Reviews of modern probability theory, random processes, and Itô stochastic differential equations are provided. Discrete-time stochastic systems theory, optimal estimation and Kalman filtering, and optimal stochastic control theory are studied in detail, followed by a modern treatment of the same topics for continuous-time stochastic control systems. The text is written in an easy-to-understand style; the reader needs only a background in elementary real analysis and linear deterministic systems theory to comprehend the subject matter. This graduate textbook is also suitable for self-study, professional training, and as a handy research reference. It is self-contained and provides a step-by-step development of the theory, with many illustrative examples, exercises, and engineering applications.
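The optimal estimation topic the book covers can be illustrated with a minimal sketch, assuming a scalar discrete-time model (the function name and noise parameters below are illustrative, not taken from the text): a Kalman filter alternating a predict step and a correct step.

```python
import numpy as np

def kalman_filter_1d(ys, a=1.0, c=1.0, q=1e-4, r=0.25, x0=0.0, p0=1.0):
    """Scalar discrete-time Kalman filter for
    x_{k+1} = a x_k + w_k  (var q),   y_k = c x_k + v_k  (var r)."""
    x, p = x0, p0
    estimates = []
    for y in ys:
        # Time update (predict).
        x, p = a * x, a * p * a + q
        # Measurement update (correct).
        k = p * c / (c * p * c + r)          # Kalman gain
        x = x + k * (y - c * x)
        p = (1 - k * c) * p
        estimates.append(x)
    return np.array(estimates)

# Track a constant state x = 1.0 from noisy measurements.
rng = np.random.default_rng(0)
ys = 1.0 + rng.normal(scale=0.5, size=300)
xs = kalman_filter_1d(ys)
```

With a small process-noise variance q the filter behaves like a long-memory averager, so the final estimate lands close to the true constant.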




Stochastic Optimal Control


Book Description

Presents techniques for optimizing problems in dynamic systems with terminal and path constraints. Includes optimal feedback control, feedback control for linear systems, and regulator synthesis. Offers iterative methods for solving nonlinear control problems. Demonstrates how to apply optimal control in a practical fashion. Serves as a text for graduate controls courses as offered in aerospace, mechanical and chemical engineering departments.




Optimal Control and Estimation


Book Description

Graduate-level text provides introduction to optimal control theory for stochastic systems, emphasizing application of basic concepts to real problems. "Invaluable as a reference for those already familiar with the subject." — Automatica.




Stochastic Linear-Quadratic Optimal Control Theory: Open-Loop and Closed-Loop Solutions


Book Description

This book gathers the most essential results, including recent ones, on linear-quadratic optimal control problems, which represent an important aspect of stochastic control. It presents the results in the context of finite and infinite horizon problems, and discusses a number of new and interesting issues. Further, it precisely identifies, for the first time, the interconnections between three well-known, relevant issues – the existence of optimal controls, solvability of the optimality system, and solvability of the associated Riccati equation. Although the content is largely self-contained, readers should have a basic grasp of linear algebra, functional analysis and stochastic ordinary differential equations. The book is mainly intended for senior undergraduate and graduate students majoring in applied mathematics who are interested in stochastic control theory. However, it will also appeal to researchers in other related areas, such as engineering, management, finance/economics and the social sciences.
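A key consequence of the Riccati-equation connection discussed in the book is that, for the time-invariant infinite-horizon problem, the optimal closed-loop solution is a constant gain. A minimal deterministic sketch (the double-integrator plant and weights below are a standard textbook example, not from this book) computes that gain from the algebraic Riccati equation:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Double integrator: x1' = x2, x2' = u, with cost integral of x'Qx + u'Ru.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
Q = np.eye(2)
R = np.array([[1.0]])

P = solve_continuous_are(A, B, Q, R)    # stabilizing ARE solution
K = np.linalg.solve(R, B.T @ P)         # constant optimal gain, u = -K x
```

For this plant the gain is known in closed form, K = [1, sqrt(3)], and the closed-loop matrix A - BK is stable, which makes the example easy to check against the theory.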




Optimal Incomplete Feedback Control of Linear Stochastic Systems


Book Description

The problem of incomplete feedback control of stochastic linear systems is considered. The system is modeled by an uncertain-parameter linear differential equation driven by Gaussian white noise, with an incomplete observation that is a linear transformation of the states. The optimal control is the linear transformation of the observation which minimizes the expected value of a quadratic performance index. For both the finite- and infinite-time problems, necessary conditions that the optimal control law must satisfy are derived. Time-varying and constant gains are considered for the finite-time problem; for the infinite-time problem, only time-invariant gains are considered. The gradient derived for the infinite-time problem is applied to a flight control design problem: finding feedback gains to improve the lateral handling qualities of an F-4 aircraft at two different flight conditions. The resulting control laws give adequate handling qualities at both flight conditions.
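The computational core of such fixed-structure output-feedback problems is evaluating the quadratic cost for a candidate gain, which reduces to a Lyapunov equation; a gradient search then iterates over the gain. A hedged sketch of the cost evaluation (the plant, output matrix, and gain below are invented for illustration, and this is a generic formulation rather than the report's exact algorithm):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def output_feedback_cost(A, B, C, F, Q, R, X0):
    """Expected quadratic cost tr(P X0) for u = F y, y = C x, under
    x' = (A + B F C) x, cost = integral of x'Qx + u'Ru; X0 = E[x0 x0']."""
    Acl = A + B @ F @ C
    M = Q + C.T @ F.T @ R @ F @ C
    # Solve Acl' P + P Acl = -M  (valid only when Acl is stable).
    P = solve_continuous_lyapunov(Acl.T, -M)
    return float(np.trace(P @ X0))

A = np.array([[0.0, 1.0], [-2.0, -3.0]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])        # incomplete observation: position only
Q, R, X0 = np.eye(2), np.eye(1), np.eye(2)

J = output_feedback_cost(A, B, C, np.array([[-1.0]]), Q, R, X0)
```

A numerical optimizer can minimize this cost over the entries of F, which is the role the derived gradient plays in the report.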







On the Adaptive Control of Linear Systems Using the Open-Loop-Feedback-Optimal Approach


Book Description

The paper considers the suboptimal stochastic control of linear discrete-time dynamical systems with unknown or stochastically varying parameters. The suboptimal scheme is based upon the open-loop-feedback-optimal (O.L.F.O.) method. The state and parameter estimates are generated by an extended Kalman filter algorithm. Numerical results for first-order systems are presented.
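The estimation half of the O.L.F.O. scheme can be sketched for a first-order system: append the unknown parameter to the state and run an extended Kalman filter on the augmented model. This is an illustrative reconstruction under simplifying assumptions (noise-free simulation, made-up tuning constants), not the paper's exact algorithm:

```python
import numpy as np

def joint_state_param_ekf(ys, us, q=1e-3, r=1e-2, qa=1e-4):
    """EKF for x_{k+1} = a x_k + u_k + w_k, y_k = x_k + v_k,
    with the unknown gain a appended to the state z = [x, a]."""
    z = np.array([0.0, 0.0])              # initial guesses for [x, a]
    P = np.diag([1.0, 1.0])
    Qk = np.diag([q, qa])
    H = np.array([[1.0, 0.0]])
    for y, u in zip(ys, us):
        x, a = z
        # Predict, with Jacobian of f(z) = [a*x + u, a].
        Fj = np.array([[a, x], [0.0, 1.0]])
        z = np.array([a * x + u, a])
        P = Fj @ P @ Fj.T + Qk
        # Update on the scalar measurement y = x + v.
        S = H @ P @ H.T + r
        K = (P @ H.T) / S
        z = z + (K * (y - z[0])).ravel()
        P = (np.eye(2) - K @ H) @ P
    return z

# Noise-free first-order system with unknown gain a = 0.8 and a
# persistently exciting input.
a_true, x = 0.8, 1.0
us = np.sin(0.5 * np.arange(300))
ys = []
for u in us:
    x = a_true * x + u
    ys.append(x)
z_final = joint_state_param_ekf(np.array(ys), us)
```

In the full O.L.F.O. loop these estimates would be fed, at each step, into an open-loop optimal control computed over the remaining horizon.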




Linear Optimal Control Systems


Book Description

"This book attempts to reconcile modern linear control theory with classical control theory. One of the major concerns of this text is to present design methods, employing modern techniques, for obtaining control systems that stand up to the requirements that have been so well developed in the classical expositions of control theory. Therefore, among other things, an entire chapter is devoted to a description of the analysis of control systems, mostly following the classical lines of thought. In the later chapters of the book, in which modern synthesis methods are developed, the chapter on analysis is recurrently referred to. Furthermore, special attention is paid to subjects that are standard in classical control theory but are frequently overlooked in modern treatments, such as nonzero set point control systems, tracking systems, and control systems that have to cope with constant disturbances. Also, heavy emphasis is placed upon the stochastic nature of control problems because the stochastic aspects are so essential." --Preface.