Linear, Time-varying Approximations to Nonlinear Dynamical Systems


Book Description

Linear, Time-varying Approximations to Nonlinear Dynamical Systems introduces a new technique for analysing and controlling nonlinear systems. The method is general and requires only very mild conditions on the system nonlinearities, setting it apart from other well-known techniques such as those based on differential geometry. The authors cover many aspects of nonlinear systems, including stability theory, control design and extensions to distributed parameter systems. Many of the classical and modern control design methods applicable to linear, time-varying systems can be extended to nonlinear systems by this technique, so the resulting controllers are simple to implement with well-established classical methods. Further aspects of nonlinear systems, such as the spectral theory needed to generalise frequency-domain methods, can also be approached in this way.
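As a rough sketch of the kind of approximation the title refers to (the notation below is assumed for illustration and is not quoted from the book), a nonlinear system written in state-dependent-coefficient form can be attacked through a sequence of linear, time-varying problems:

\[
\dot{x}(t) = A\bigl(x(t)\bigr)\,x(t), \qquad x(0) = x_0,
\]
\[
\dot{x}^{[i]}(t) = A\bigl(x^{[i-1]}(t)\bigr)\,x^{[i]}(t), \qquad x^{[i]}(0) = x_0, \qquad i = 1, 2, \dots
\]

Each iterate is a genuinely linear, time-varying system, so classical linear analysis and design tools apply at every step, and under suitable conditions the sequence of solutions converges to the solution of the original nonlinear system.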




Stability and Control of Nonlinear Time-varying Systems


Book Description

This book presents special systems derived from industrial models, including systems with complex saturation nonlinearities and delay nonlinearities, together with typical analysis methods such as the classical Lyapunov and integral inequality methods. It provides constructive qualitative and stability conditions for linear systems with saturated inputs in both global and local contexts, offering practitioners concise model systems for modern saturation techniques with potential for future applications. The book is a valuable guide for researchers and graduate students in the fields of mathematics, control, and engineering.
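For concreteness, a minimal sketch of the class of saturated-input systems referred to above (the symbols and bounds here are illustrative assumptions, not taken from the book):

\[
\dot{x}(t) = A\,x(t) + B\,\mathrm{sat}\bigl(u(t)\bigr), \qquad
\mathrm{sat}(u) =
\begin{cases}
u_{\max}, & u > u_{\max},\\[2pt]
u, & u_{\min} \le u \le u_{\max},\\[2pt]
u_{\min}, & u < u_{\min}.
\end{cases}
\]

Global results ask for stability from all initial states, while local results restrict attention to a region in which the analysis, typically via Lyapunov functions or integral inequalities, remains valid.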




Calculus of Variations and Optimal Control Theory


Book Description

This textbook offers a concise yet rigorous introduction to calculus of variations and optimal control theory, and is a self-contained resource for graduate students in engineering, applied mathematics, and related subjects. Designed specifically for a one-semester course, the book begins with calculus of variations, preparing the ground for optimal control. It then gives a complete proof of the maximum principle and covers key topics such as the Hamilton-Jacobi-Bellman theory of dynamic programming and linear-quadratic optimal control. Calculus of Variations and Optimal Control Theory also traces the historical development of the subject and features numerous exercises, notes and references at the end of each chapter, and suggestions for further study.

- Offers a concise yet rigorous introduction
- Requires limited background in control theory or advanced mathematics
- Provides a complete proof of the maximum principle
- Uses consistent notation in the exposition of classical and modern topics
- Traces the historical development of the subject
- Solutions manual (available only to teachers)

Leading universities that have adopted this book include:

- University of Illinois at Urbana-Champaign, ECE 553: Optimum Control Systems
- Georgia Institute of Technology, ECE 6553: Optimal Control and Optimization
- University of Pennsylvania, ESE 680: Optimal Control Theory
- University of Notre Dame, EE 60565: Optimal Control
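As an example of the material such a course covers, here is a standard statement of the finite-horizon linear-quadratic problem and its solution via the Riccati equation (the notation is assumed here; the book's own conventions may differ):

\[
\min_{u}\; J = x(T)^{\top} Q_f\, x(T) + \int_0^T \bigl(x^{\top} Q x + u^{\top} R u\bigr)\,dt
\quad \text{subject to} \quad \dot{x} = A x + B u,
\]
\[
u^{*}(t) = -R^{-1} B^{\top} P(t)\, x(t), \qquad
-\dot{P} = A^{\top} P + P A - P B R^{-1} B^{\top} P + Q, \qquad P(T) = Q_f,
\]

with Q and Q_f positive semidefinite and R positive definite. The same problem also serves as the canonical worked example for both the maximum principle and the Hamilton-Jacobi-Bellman equation.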




Frequency Domain Criteria for Absolute Stability


Book Description

Frequency Domain Criteria for Absolute Stability presents some generalizations of the well-known Popov solution to the absolute stability problem posed by Lur'e and Postnikov in 1944. The book is divided into nine chapters that focus on the application of Lyapunov's direct method to generate frequency domain criteria for stability. The first eight chapters explore systems with a single nonlinear function or time-varying parameter; they discuss the development of stability criteria for these systems, the sufficiency theorems, and the associated Lyapunov functions. Applications of some of the theorems to a damped version of the Mathieu equation, and to a nonlinear equation derived from it, are also covered. The concluding chapter deals with systems with multiple nonlinearities or time-varying gains, outlining the basic definitions and tools as well as the derivation of stability criteria. This work will serve as a reference for research courses on stability problems related to the absolute stability problem of Lur'e and Postnikov. Engineers and applied mathematicians will also find this book invaluable.
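As a reminder of the flavour of the results generalized in this book, recall the classical Popov criterion for a Lur'e system (a standard formulation, assumed here for illustration):

\[
\dot{x} = A x - b\,\phi(\sigma), \qquad \sigma = c^{\top} x, \qquad 0 \le \sigma\,\phi(\sigma) \le k\,\sigma^{2},
\]

with A Hurwitz. The origin is absolutely stable if there exists q \ge 0 such that

\[
\operatorname{Re}\bigl[(1 + j\omega q)\, G(j\omega)\bigr] + \frac{1}{k} > 0 \quad \text{for all } \omega \ge 0,
\qquad G(s) = c^{\top}(sI - A)^{-1} b.
\]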




The Mind of an Engineer: Volume 2


Book Description

This book is a collection of chapters reflecting the experiences and achievements of some of the Fellows of the Indian National Academy of Engineering (INAE). It comprises essays on reminiscences, eureka moments, inspirations, challenges and opportunities in the journey of an engineering professional. The chapters look at the paths successful engineering professionals have taken towards self-realisation, the milestones they crossed, and the goals they reached. The book contains 38 chapters on diverse topics that truly reflect the way the mind of an engineer works.




Bounded Variation and Around


Book Description

The aim of this monograph is to give a thorough and self-contained account of functions of (generalized) bounded variation, the methods connected with their study, their relations to other important function classes, and their applications to various problems arising in Fourier analysis and nonlinear analysis. In the first part the basic facts about spaces of functions of bounded variation and related spaces are collected, the main ideas which are useful in studying their properties are presented, and a comparison of their importance and suitability for applications is provided, with a particular emphasis on illustrative examples and counterexamples. The second part is concerned with (sometimes quite surprising) properties of nonlinear composition and superposition operators in such spaces. Moreover, relations with Riemann-Stieltjes integrals, convergence tests for Fourier series, and applications to nonlinear integral equations are discussed. The only prerequisite for understanding this book is a modest background in real analysis, functional analysis, and operator theory. It is addressed to non-specialists who want to get an idea of the development of the theory and its applications in the last decades, as well as a glimpse of the diversity of the directions in which current research is moving. Since the authors try to take into account recent results and state several open problems, this book might also be a fruitful source of inspiration for further research.
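For readers new to the subject, the basic object of the first part is the (Jordan) variation of a function; a standard definition (notation assumed here) reads

\[
\operatorname{Var}(f;[a,b]) = \sup_{a = t_0 < t_1 < \dots < t_n = b} \sum_{i=1}^{n} \bigl|f(t_i) - f(t_{i-1})\bigr|,
\]

and f is of bounded variation on [a,b] precisely when this supremum is finite; generalized notions of bounded variation typically modify how these increments are measured or combined.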




Saturated Switching Systems


Book Description

Saturated Switching Systems treats the problem of actuator saturation, inherent in all dynamical systems, using two approaches: positive invariance, in which the controller is designed to work within a region of non-saturating linear behaviour; and the saturation technique, which allows saturation but guarantees asymptotic stability. The results obtained are extended from the linear systems for which they were first developed to switching systems with uncertainties, 2D switching systems, switching systems with Markovian jumping and switching systems of the Takagi-Sugeno type. The text represents a thoroughly referenced distillation of results obtained in this field during the last decade. The selected tools for analysis and design of stabilizing controllers are multiple Lyapunov functions and linear matrix inequalities. All the results are illustrated with numerical examples and figures, many of them modelled using MATLAB®. Saturated Switching Systems will be of interest to academic researchers in control systems and to professionals working in any of the many fields where systems are affected by saturation, including chemical and pharmaceutical batch processing, manufacturing (for example in steel rolling), air-traffic control, and the automotive and aerospace industries.
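As an informal sketch of the analysis machinery mentioned above (the mode indices, matrices and conditions here are illustrative assumptions, not the book's own statements), multiple Lyapunov functions for a switched linear system lead to linear matrix inequality conditions of the following type:

\[
\dot{x}(t) = A_{\sigma(t)}\, x(t), \qquad \sigma(t) \in \{1,\dots,N\}, \qquad V_i(x) = x^{\top} P_i\, x,
\]
\[
P_i \succ 0, \qquad A_i^{\top} P_i + P_i A_i \prec 0 \quad \text{for each mode } i,
\]

together with a non-increase condition on the active Lyapunov function at each switching instant. Such conditions are convex in the unknowns P_i and can be checked numerically, for instance with MATLAB LMI solvers.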




Constructions of Strict Lyapunov Functions


Book Description

Converse Lyapunov function theory guarantees the existence of strict Lyapunov functions in many situations, but the functions it provides are often abstract and nonexplicit, and therefore may not lend themselves to engineering applications. Even when a system is known to be stable, one often still needs an explicit Lyapunov function, because once an appropriate strict Lyapunov function has been constructed, many robustness and stabilization problems can be solved through standard feedback designs or robustness arguments. Non-strict Lyapunov functions, by contrast, are often readily constructed. This book contains a broad repertoire of Lyapunov constructions for nonlinear systems, focusing on methods for transforming non-strict Lyapunov functions into strict ones. Their explicitness and simplicity make the resulting functions suitable for feedback design and for quantifying the effects of uncertainty. Readers will benefit from the authors’ mathematical rigor and unifying, design-oriented approach, as well as the numerous worked examples.
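To fix the terminology used above (standard definitions, assumed here rather than quoted from the book), consider \dot{x} = f(x) with an equilibrium at the origin and a continuously differentiable, positive definite function V:

\[
\text{non-strict:}\quad \dot{V}(x) = \nabla V(x)\cdot f(x) \le 0,
\qquad
\text{strict:}\quad \dot{V}(x) \le -W(x)
\]

for some positive definite function W. A non-strict V certifies stability but usually yields asymptotic convergence only through auxiliary arguments such as LaSalle's invariance principle, whereas a strict V gives a quantitative decay estimate that can absorb uncertainties and feed directly into feedback design.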




Stochastic H2/H∞ Control: A Nash Game Approach


Book Description

H∞ control has been one of the most important robust control approaches since the 1980s. This book extends the area to nonlinear stochastic H2/H∞ control and studies the more complex, and practically more useful, mixed H2/H∞ controller synthesis rather than pure H∞ control alone. In contrast to the commonly used convex optimization methods, the book applies a Nash game approach to give necessary and sufficient conditions for the existence and uniqueness of the mixed H2/H∞ control. Researchers will benefit from the detailed exposition of stochastic mixed H2/H∞ control theory, while practitioners can apply the efficient algorithms presented to their practical problems.
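As a sketch of how the game-theoretic formulation works (the cost definitions and sign conventions below are one common choice and are assumptions here, not a quotation of the book's setup), the mixed design can be posed as a two-player Nash game between the control u and the disturbance w, with J_1 capturing the H∞ constraint and J_2 the H2 performance; one seeks a pair (u*, w*) such that

\[
J_1(u^{*}, w^{*}) \ge J_1(u^{*}, w) \quad \text{for all admissible } w,
\qquad
J_2(u^{*}, w^{*}) \le J_2(u, w^{*}) \quad \text{for all admissible } u,
\]

so that w* is the worst-case disturbance against u*, and u* is H2-optimal in the presence of that worst case. Working out these two coupled optimality conditions typically leads to coupled Riccati-type equations whose solvability characterizes the existence of the mixed controller.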