Modern Trends in Controlled Stochastic Processes


Book Description

This book presents state-of-the-art solution methods and applications of stochastic optimal control. It is a collection of extended papers discussed at the traditional Liverpool workshop on controlled stochastic processes, with participants from both the East and the West. New problems are formulated, and progress of ongoing research is reported. Topics covered in this book include theoretical results and numerical methods for Markov and semi-Markov decision processes, optimal stopping of Markov processes, stochastic games, problems with partial information, optimal filtering, robust control, Q-learning, and self-organizing algorithms. Real-life case studies and applications, e.g., queueing systems, forest management, control of water resources, marketing science, and healthcare, are presented. Scientific researchers and postgraduate students interested in stochastic optimal control, as well as practitioners, will find this book appealing and a valuable reference.




Modern Trends in Controlled Stochastic Processes


Book Description

World-leading experts give their accounts of the modern mathematical models in the field: Markov decision processes, controlled diffusions, piecewise deterministic processes, etc., with a wide range of performance functionals. One of the aims is to give a general view of the state of the art. The authors use dynamic programming, the convex analytic approach, several numerical methods, the index-based approach, and so on. Most chapters either contain well-developed examples or are entirely devoted to the application of mathematical control theory to real-life problems from such fields as insurance, portfolio optimization, and information transmission. The book will enable researchers, academics, and research students to get a sense of novel results, concepts, models, methods, and applications of controlled stochastic processes.




Modern Trends in Controlled Stochastic Processes: Theory and Applications


Book Description

World-leading experts give their accounts of the modern mathematical models in the field: Markov decision processes, controlled diffusions, etc., with a wide range of performance functionals. One of the aims is to give a general view of the state of the art. The authors use dynamic programming, the convex analytic approach, several approximate and numerical methods, the index-based approach, and so on. Most chapters either contain well-developed examples or are entirely devoted to the application of mathematical control theory to real-life problems from such fields as insurance, portfolio optimization, control of water resources, information transmission, quality control, and pollution control. The book will enable researchers, academics, and research students to get a sense of novel results, concepts, models, methods, and applications of controlled stochastic processes.




Stochastic Processes, Finance and Control: A Festschrift in Honor of Robert J Elliott


Book Description

This book consists of a series of new, peer-reviewed papers in stochastic processes, analysis, filtering, and control, with particular emphasis on mathematical finance, actuarial science, and engineering. Contributors include colleagues, collaborators, and former students of Robert Elliott, many of whom are world-leading experts who have made fundamental and significant contributions to these areas. The book provides important new insights and results by eminent researchers, which will be of interest to researchers and practitioners. The topics are diverse in their applications and offer contemporary approaches to the problems considered. The areas covered are rapidly evolving; this volume will contribute to their development and present the current state of the art in stochastic processes, analysis, filtering, and control. Contributing authors include: H Albrecher, T Bielecki, F Dufour, M Jeanblanc, I Karatzas, H-H Kuo, A Melnikov, E Platen, G Yin, Q Zhang, C Chiarella, W Fleming, D Madan, R Mamon, J Yan, V Krishnamurthy.




Optimization, Control, and Applications of Stochastic Systems


Book Description

This volume provides a general overview of discrete- and continuous-time Markov control processes and stochastic games, along with a look at the range of applications of stochastic control and some of its recent theoretical developments. These topics include various aspects of dynamic programming, approximation algorithms, and infinite-dimensional linear programming. In all, the work comprises 18 carefully selected papers written by experts in their respective fields. Optimization, Control, and Applications of Stochastic Systems will be a valuable resource for all practitioners, researchers, and professionals in applied mathematics and operations research who work in the areas of stochastic control, mathematical finance, queueing theory, and inventory systems. It may also serve as a supplemental text for graduate courses in optimal control and dynamic games.




Stochastic Analysis, Filtering, and Stochastic Optimization


Book Description

This volume is a collection of research works honoring the late Professor Mark H.A. Davis, whose pioneering work in the areas of stochastic processes, filtering, and stochastic optimization spans more than five decades. Invited authors include his dissertation advisor, past collaborators, colleagues, mentees, and graduate students of Professor Davis, as well as scholars who have worked in these areas. Their contributions cover topics in piecewise deterministic processes, pathwise stochastic calculus, martingale methods in stochastic optimization, filtering, mean-field games, and time inconsistency, as well as impulse, singular, risk-sensitive, and robust stochastic control.




Selected Topics on Continuous-time Controlled Markov Chains and Markov Games


Book Description

This book concerns continuous-time controlled Markov chains, also known as continuous-time Markov decision processes. They form a class of stochastic control problems in which a single decision-maker wishes to optimize a given objective function. The book is also concerned with Markov games, in which two decision-makers (or players) each try to optimize their own objective function. Both decision-making processes appear in a large number of applications in economics, operations research, engineering, and computer science, among other areas. An extensive, self-contained, up-to-date analysis of basic optimality criteria (such as discounted and average reward) and advanced optimality criteria (e.g., bias, overtaking, sensitive discount, and Blackwell optimality) is presented. Particular emphasis is placed on applications of the results: algorithmic and computational issues are discussed, and applications to population models and epidemic processes are shown. The book is addressed to students and researchers in the fields of stochastic control and stochastic games. It may also interest undergraduate and beginning graduate students, because the reader is not assumed to have an advanced mathematical background: a working knowledge of calculus, linear algebra, probability, and continuous-time Markov chains should suffice to understand the contents of the book.
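As a flavor of the discounted-reward criterion mentioned above, the following sketch runs value iteration on a toy discrete-time Markov decision process (the discrete-time analogue of the continuous-time models treated in the book). The two-state, two-action chain, its rewards, and the discount factor are illustrative assumptions, not taken from the book:

```python
import numpy as np

# Hypothetical 2-state, 2-action MDP.
# P[a, s, s'] = probability of moving from state s to s' under action a.
P = np.array([
    [[0.9, 0.1], [0.2, 0.8]],
    [[0.5, 0.5], [0.6, 0.4]],
])
# R[a, s] = one-step reward for taking action a in state s.
R = np.array([
    [1.0, 0.0],
    [0.5, 0.8],
])
gamma = 0.9  # discount factor

# Value iteration: repeatedly apply the Bellman optimality operator.
V = np.zeros(2)
for _ in range(500):
    Q = R + gamma * (P @ V)   # Q[a, s]: action-value estimates
    V_new = Q.max(axis=0)     # optimal value update
    if np.max(np.abs(V_new - V)) < 1e-10:
        V = V_new
        break
    V = V_new

policy = Q.argmax(axis=0)     # greedy policy: best action in each state
```

Because the Bellman operator is a contraction with modulus `gamma`, the iteration converges geometrically to the unique fixed point, which is the optimal discounted value function; the greedy policy extracted from it is discount-optimal for this toy model.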




Continuous-Time Markov Decision Processes


Book Description

This book offers a systematic and rigorous treatment of continuous-time Markov decision processes, covering both theory and possible applications to queueing systems, epidemiology, finance, and other fields. Unlike most books on the subject, much attention is paid to problems with functional constraints and to the realizability of strategies. Three major methods of investigation are presented, based on dynamic programming, linear programming, and reduction to discrete-time problems. Although the main focus is on models with total (discounted or undiscounted) cost criteria, models with average cost criteria and with impulsive controls are also discussed in depth. The book is self-contained: a separate chapter is devoted to Markov pure jump processes, the appendices collect the requisite background on real analysis and applied probability, and all the statements in the main text are proved in detail. Researchers and graduate students in applied probability, operational research, statistics, and engineering will find this monograph interesting, useful, and valuable.




Markov Decision Processes with Applications to Finance


Book Description

The theory of Markov decision processes focuses on controlled Markov chains in discrete time. The authors establish the theory for general state and action spaces and at the same time illustrate its application through numerous examples, mostly taken from the fields of finance and operations research. By using a structural approach, they avoid many measure-theoretic technicalities. They cover problems with finite and infinite horizons, as well as partially observable Markov decision processes, piecewise deterministic Markov decision processes, and stopping problems. The book presents Markov decision processes in action and includes various state-of-the-art applications with a particular view towards finance. It is useful for upper-level undergraduates, Master's students, and researchers in both applied probability and finance, and provides exercises (without solutions).




Advances in Dynamic and Mean Field Games


Book Description

This contributed volume considers recent advances in dynamic games and their applications, based on presentations given at the 17th Symposium of the International Society of Dynamic Games, held July 12-15, 2016, in Urbino, Italy. Written by experts in their respective disciplines, these papers cover various aspects of dynamic game theory, including mean-field games, stochastic and pursuit-evasion games, and computational methods for dynamic games. Topics covered include: pedestrian flow in crowded environments; models for climate change negotiations; Nash equilibria for dynamic games involving Volterra integral equations; differential games in healthcare markets; linear-quadratic Gaussian dynamic games; and aircraft control in wind shear conditions. Advances in Dynamic and Mean Field Games presents state-of-the-art research in a wide spectrum of areas and, as such, serves as a testament to the continued vitality and growth of the field of dynamic games and their applications. It will be of interest to an interdisciplinary audience of researchers, practitioners, and graduate students.