Estimation and Control of Dynamical Systems


Book Description

This book provides a comprehensive presentation of classical and advanced topics in estimation and control of dynamical systems, with an emphasis on stochastic control. It covers many topics that are not easily found in a single text, such as connections between control theory and mathematical finance, as well as differential games. The book is self-contained and prioritizes concepts over full rigor, targeting scientists who want to use control theory in their research in applied mathematics, engineering, economics, and management science. Examples and exercises are included throughout, making the book useful for PhD courses and graduate courses in general. Dr. Alain Bensoussan holds the Lars Magnus Ericsson Chair at UT Dallas and is Director of the International Center for Decision and Risk Analysis, which develops risk management research as it pertains to large-investment industrial projects that involve new technologies, applications, and markets. He is also Chair Professor at City University of Hong Kong.




Continuous Time Dynamical Systems


Book Description

Optimal control deals with the problem of finding a control law for a given system such that a certain optimality criterion is achieved. An optimal control is a set of differential equations describing the paths of the control variables that minimize the cost functional. This book, Continuous Time Dynamical Systems: State Estimation and Optimal Control with Orthogonal Functions, considers different classes of systems with quadratic performance criteria. It then attempts to find the optimal control law for each class of systems using orthogonal functions that can optimize the given performance criteria. Illustrated throughout with detailed examples, the book covers topics including:
- Block-pulse functions and shifted Legendre polynomials
- State estimation of linear time-invariant systems
- Linear optimal control systems incorporating observers
- Optimal control of systems described by integro-differential equations
- Linear-quadratic-Gaussian control
- Optimal control of singular systems
- Optimal control of time-delay systems with and without reverse time terms
- Optimal control of second-order nonlinear systems
- Hierarchical control of linear time-invariant and time-varying systems
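For orientation, the "quadratic performance criteria" mentioned above refer to linear-quadratic problems of the following generic form; the notation here is illustrative and not necessarily the book's own:

```latex
% Generic continuous-time linear-quadratic (LQ) problem -- illustrative notation
\begin{aligned}
  \min_{u(\cdot)}\; J &= \tfrac{1}{2}\, x^{\mathsf{T}}(t_f)\, S\, x(t_f)
      + \tfrac{1}{2} \int_{0}^{t_f} \big( x^{\mathsf{T}} Q\, x + u^{\mathsf{T}} R\, u \big)\, dt,\\
  \text{subject to}\quad \dot{x}(t) &= A\, x(t) + B\, u(t), \qquad x(0) = x_0,
\end{aligned}
\qquad S, Q \succeq 0, \quad R \succ 0 .
```

Broadly speaking, orthogonal-function methods expand x(t) and u(t) in a truncated basis (such as block-pulse functions or shifted Legendre polynomials), turning the variational problem into a set of algebraic equations in the expansion coefficients.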




Estimation, Control, and the Discrete Kalman Filter


Book Description

In 1960, R. E. Kalman published his celebrated paper on recursive minimum variance estimation in dynamical systems [14]. This paper, which introduced an algorithm that has since been known as the discrete Kalman filter, produced a virtual revolution in the field of systems engineering. Today, Kalman filters are used in such diverse areas as navigation, guidance, oil drilling, water and air quality, and geodetic surveys. In addition, Kalman's work led to a multitude of books and papers on minimum variance estimation in dynamical systems, including one by Kalman and Bucy on continuous time systems [15]. Most of this work was done outside of the mathematics and statistics communities and, in the spirit of true academic parochialism, was, with a few notable exceptions, ignored by them. This text is my effort toward closing that chasm. For mathematics students, the Kalman filtering theorem is a beautiful illustration of functional analysis in action: Hilbert spaces being used to solve an extremely important problem in applied mathematics. For statistics students, the Kalman filter is a vivid example of Bayesian statistics in action. The present text grew out of a series of graduate courses given by me in the past decade. Most of these courses were given at the University of Massachusetts at Amherst.
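For readers new to the algorithm, the following is a minimal sketch of one cycle of the standard discrete Kalman filter recursion in generic notation; the matrices and the measurement are placeholders to be supplied by the user, and the sketch is not tied to this text's own development.

```python
import numpy as np

def kalman_step(x, P, z, A, H, Q, R):
    """One predict/update cycle of the standard discrete Kalman filter.

    x, P : previous state estimate and its error covariance
    z    : current measurement
    A, H : state-transition and measurement matrices
    Q, R : process- and measurement-noise covariances
    """
    # Predict: propagate the estimate and covariance through the system model
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q

    # Update: correct the prediction with the measurement via the Kalman gain
    S = H @ P_pred @ H.T + R                    # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)       # minimum-variance estimate
    P_new = (np.eye(len(x)) - K @ H) @ P_pred   # updated error covariance
    return x_new, P_new
```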




Optimal Estimation of Dynamic Systems


Book Description

Most newcomers to the field of linear stochastic estimation go through a difficult process in understanding and applying the theory. This book minimizes that process while introducing the fundamentals of optimal estimation. Optimal Estimation of Dynamic Systems explores topics that are important in the field of control, where the signals received are used to determine highly sensitive processes such as the flight path of a plane, the orbit of a space vehicle, or the control of a machine. The authors use dynamic models from mechanical and aerospace engineering to provide immediate results of estimation concepts with a minimal reliance on mathematical skills. The book documents the development of the central concepts and methods of optimal estimation theory in a manner accessible to engineering students, applied mathematicians, and practicing engineers. It includes rigorous theoretical derivations and a significant amount of qualitative discussion and judgement. It also presents prototype algorithms, giving detail and discussion to stimulate development of efficient computer programs and intelligent use of them. This book illustrates the application of optimal estimation methods to problems with varying degrees of analytical and numerical difficulty. It compares various approaches to help develop a feel for the absolute and relative utility of different methods, and provides many applications in the fields of aerospace, mechanical, and electrical engineering.
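As a flavor of the kind of prototype algorithm described above, here is a minimal batch weighted least-squares estimator, a classical starting point of optimal estimation; the linear measurement model y = H x + noise and the numerical values below are illustrative assumptions, not taken from the book.

```python
import numpy as np

def weighted_least_squares(H, y, W):
    """Weighted least-squares estimate  x_hat = (H^T W H)^{-1} H^T W y.

    H : (m, n) matrix mapping the unknown state to the measurements
    y : (m,)   measurement vector, modeled as y = H x + noise
    W : (m, m) weight matrix (e.g., the inverse measurement-noise covariance)
    """
    N = H.T @ W @ H                          # normal (information) matrix
    x_hat = np.linalg.solve(N, H.T @ W @ y)  # estimate
    P = np.linalg.inv(N)                     # estimate-error covariance when W = R^{-1}
    return x_hat, P

# Illustrative use: fit initial position and constant velocity to noisy data
t = np.linspace(0.0, 10.0, 50)
H = np.column_stack([np.ones_like(t), t])            # state = [position, velocity]
y = H @ np.array([2.0, 0.5]) + 0.1 * np.random.randn(t.size)
x_hat, P = weighted_least_squares(H, y, np.eye(t.size) / 0.1**2)
```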




Estimators for Uncertain Dynamic Systems


Book Description

When solving control and design problems in aerospace and naval engineering, energetics, economics, biology, etc., we need to know the state of the dynamic processes under investigation. The presence of inherent uncertainties in the description of these processes and of noise in the measurement devices leads to the necessity of constructing estimators for the corresponding dynamic systems. The estimators recover the required information about the system state from measurement data. An attempt to solve the estimation problems in an optimal way results in the formulation of different variational problems. The type and complexity of these variational problems depend on the process model, the model of uncertainties, and the estimation performance criterion. A solution of the variational problem determines an optimal estimator. However, there exist at least two reasons why we use nonoptimal estimators. The first reason is that the algorithms for solving the corresponding variational problems can be very difficult to implement numerically. For example, the dimension of these algorithms can be very high.
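To make the notion of a (not necessarily optimal) state estimator concrete, here is a minimal sketch of a Luenberger-type observer for a linear system, written as a single Euler integration step; the system matrices and the observer gain L are assumed given (e.g., from pole placement) and are not taken from the book.

```python
import numpy as np

def observer_step(x_hat, u, y, A, B, C, L, dt):
    """One Euler step of the observer  d/dt x_hat = A x_hat + B u + L (y - C x_hat).

    x_hat : current state estimate
    u, y  : current control input and measurement
    L     : observer gain; any gain that makes (A - L C) stable will do,
            whether or not it is optimal in a variational sense
    dt    : integration step size
    """
    innovation = y - C @ x_hat                     # measurement residual
    x_hat_dot = A @ x_hat + B @ u + L @ innovation
    return x_hat + dt * x_hat_dot
```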




Data-Driven Science and Engineering


Book Description

A textbook covering data-science and machine learning methods for modelling and control in engineering and science, with Python and MATLAB®.




Dynamic Systems in Management Science


Book Description

Dynamic Systems in Management Science addresses important gaps in the existing literature on operations research and management science by providing new, operational methods that have been tested in practical environments, along with a variety of new applications.




Materials Phase Change PDE Control & Estimation


Book Description

This monograph introduces breakthrough control algorithms for partial differential equation models with moving boundaries, the study of which is known as the Stefan problem. The algorithms can be used to improve the performance of various processes with phase changes, such as additive manufacturing. Using the authors' innovative design solutions, readers will also be equipped to apply estimation algorithms for real-world phase change dynamics, from polar ice to lithium-ion batteries. A historical treatment of the Stefan problem opens the book, situating readers in the larger context of the area. Following this, the chapters are organized into two parts. The first presents the design method and analysis of the boundary control and estimation algorithms. Part two then explores a number of applications, such as 3D printing via screw extrusion and laser sintering, and also discusses the experimental verifications conducted. A number of open problems are provided as well, offering readers multiple paths to explore in future research. Materials Phase Change PDE Control & Estimation is ideal for researchers and graduate students working on control and dynamical systems, and particularly those studying partial differential equations and moving boundaries. It will also appeal to industrial engineers and graduate students in engineering who are interested in this area.
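For orientation, a common textbook statement of the one-dimensional, one-phase Stefan problem is reproduced below; the notation is generic and need not coincide with the monograph's formulation.

```latex
% One-phase Stefan problem on the time-varying domain 0 < x < s(t) (generic notation)
\begin{aligned}
  \frac{\partial u}{\partial t}(x,t) &= \alpha\,\frac{\partial^{2} u}{\partial x^{2}}(x,t),
      && 0 < x < s(t),\\
  -k\,\frac{\partial u}{\partial x}(0,t) &= q_{c}(t)
      && \text{(heat flux applied at the fixed boundary)},\\
  u\big(s(t),t\big) &= u_{m}
      && \text{(melting temperature at the moving interface)},\\
  \dot{s}(t) &= -\beta\,\frac{\partial u}{\partial x}\big(s(t),t\big)
      && \text{(Stefan condition driving the interface)},
\end{aligned}
```

where u is the temperature in the melted phase, s(t) the interface position, and α, k, β physical constants; roughly speaking, boundary control and estimation then concern steering or reconstructing the interface position s(t) using only data available at the fixed boundary.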




Modeling, Analysis And Control Of Dynamical Systems With Friction And Impacts


Book Description

This book is aimed primarily at physicists and mechanical engineers specializing in the modeling, analysis, and control of discontinuous systems with friction and impacts. It fills a gap in the existing literature by offering an original contribution to the field of discontinuous mechanical systems based on mathematical and numerical modeling as well as the control of such systems. Each chapter provides the reader with both the theoretical background and the results of verified and useful computations, including solutions to problems of modeling and applying friction laws in numerical computations, results from finding and analyzing impact solutions, the analysis and control of dynamical systems with discontinuities, and more. The contents offer a smooth correspondence between science and engineering and will allow the reader to discover new ideas. Also emphasized is the unity of diverse branches of physics and mathematics towards understanding complex piecewise-smooth dynamical systems. The mathematical models presented will be important in numerical experiments, experimental measurements, and optimization problems found in applied mechanics.
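As a reminder of the kind of discontinuous laws involved, two standard textbook building blocks of such models are the set-valued Coulomb friction law and Newton's kinematic impact law; these are generic forms, not the book's specific models.

```latex
% Set-valued Coulomb dry friction and Newton's restitution law (generic forms)
F_{f} \in
\begin{cases}
  \{-\mu N \,\operatorname{sgn}(v)\}, & v \neq 0,\\[2pt]
  [-\mu N,\; \mu N], & v = 0,
\end{cases}
\qquad\qquad
v^{+} = -e\, v^{-}, \quad 0 \le e \le 1,
```

where μ is the friction coefficient, N the normal force, v the relative sliding velocity, and e the coefficient of restitution relating the normal velocity just after an impact to the one just before it.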




Nonlinear Dynamical Systems and Control


Book Description

Nonlinear Dynamical Systems and Control presents and develops an extensive treatment of stability analysis and control design of nonlinear dynamical systems, with an emphasis on Lyapunov-based methods. Dynamical system theory lies at the heart of mathematical sciences and engineering. The application of dynamical systems has crossed interdisciplinary boundaries from chemistry to biochemistry to chemical kinetics, from medicine to biology to population genetics, from economics to sociology to psychology, and from physics to mechanics to engineering. The increasingly complex engineering systems that require feedback control to obtain a desired system behavior also give rise to dynamical systems. Wassim Haddad and VijaySekhar Chellaboina provide an exhaustive treatment of nonlinear systems theory and control using the highest standards of exposition and rigor. This graduate-level textbook goes well beyond standard treatments by developing Lyapunov stability theory, partial stability, boundedness, input-to-state stability, input-output stability, finite-time stability, semistability, stability of sets and periodic orbits, and stability theorems via vector Lyapunov functions. A complete and thorough treatment of dissipativity theory, absolute stability theory, stability of feedback systems, optimal control, disturbance rejection control, and robust control for nonlinear dynamical systems is also given. This book is an indispensable resource for applied mathematicians, dynamical systems theorists, control theorists, and engineers.
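To anchor the Lyapunov-based emphasis, the basic Lyapunov stability theorem for an equilibrium x = 0 of \dot{x} = f(x) can be stated as follows; this is a standard formulation, independent of the book's notation.

```latex
% Basic Lyapunov theorem for \dot{x} = f(x), f(0) = 0, on a neighborhood \mathcal{D} of the origin
\begin{aligned}
  &V(0) = 0, \qquad V(x) > 0 \ \text{ for } x \in \mathcal{D}\setminus\{0\},\\
  &\dot{V}(x) = \nabla V(x)^{\mathsf{T}} f(x) \le 0 \ \text{ on } \mathcal{D}
      &&\Longrightarrow\quad x = 0 \text{ is Lyapunov stable},\\
  &\dot{V}(x) < 0 \ \text{ on } \mathcal{D}\setminus\{0\}
      &&\Longrightarrow\quad x = 0 \text{ is asymptotically stable}.
\end{aligned}
```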