A Markov Chain for Rental Systems with Time-Dependency and Substitution of Assets


Book Description

Society currently experiences a growing interest in renting products instead of exclusively owning them. Simultaneously, the increasing availability of data allows rental companies to analyze their operations more effectively. As a result, we aim to support rental companies by providing a method for evaluating and optimizing the occupancy of their assets. Our approach addresses rental systems with time-dependent customer behavior, where customers make substitute rentals during shortages. Specifically, we provide a continuous-time Markov chain assuming that arrivals occur according to Poisson processes and that rental times follow phase-type distributions. We further provide a heuristic optimization algorithm for minimizing the inventory subject to an upper bound on the shortage probability. We tested our approach in several numerical experiments and in an application to a company case. The experiments indicated that our model adequately reflects the occupancy distribution of assets and minimizes the capacities to near-optimality. The company case resulted in a reduction of the inventory while simultaneously reducing the maximum shortage probability.
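As a rough illustration of the kind of capacity question described above, here is a minimal sketch (not the authors' model): it drops the time-dependency, the substitution of assets, and the phase-type rental times, keeps Poisson arrivals with exponential rental times, and searches for the smallest fleet size whose shortage (blocking) probability in the resulting Erlang loss system stays below a target. All function names and parameter values are illustrative.

```python
import math

def erlang_b(c: int, offered_load: float) -> float:
    """Blocking (shortage) probability of an M/M/c/c loss system,
    computed with the numerically stable recursive Erlang B formula."""
    b = 1.0
    for k in range(1, c + 1):
        b = offered_load * b / (k + offered_load * b)
    return b

def minimal_capacity(arrival_rate: float, mean_rental_time: float,
                     max_shortage_prob: float) -> int:
    """Smallest number of assets whose shortage probability stays below the bound."""
    offered_load = arrival_rate * mean_rental_time  # expected rentals in progress if capacity were unlimited
    c = 1
    while erlang_b(c, offered_load) > max_shortage_prob:
        c += 1
    return c

if __name__ == "__main__":
    # Hypothetical figures: 4 rental requests per day, 2-day average rental, at most 5% shortages.
    print(minimal_capacity(arrival_rate=4.0, mean_rental_time=2.0,
                           max_shortage_prob=0.05))
```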




Introduction to Markov Chains


Book Description

Besides the investigation of general chains, the book contains chapters concerned with eigenvalue techniques, conductance, stopping times, the strong Markov property, couplings, strong uniform times, Markov chains on arbitrary finite groups (including a crash course in harmonic analysis), random generation and counting, Markov random fields, Gibbs fields, the Metropolis sampler, and simulated annealing. With 170 exercises.
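The Metropolis sampler named among the topics can be summarized in a few lines. The sketch below is a generic textbook version on a finite state space with a symmetric proposal, not code from the book; the weights and step counts are illustrative.

```python
import random

def metropolis_sample(target_weights, n_steps, rng):
    """Run a Metropolis chain on {0, ..., n-1} whose stationary distribution
    is proportional to target_weights, using a symmetric uniform proposal."""
    n = len(target_weights)
    state = rng.randrange(n)
    for _ in range(n_steps):
        proposal = rng.randrange(n)  # symmetric proposal: uniform over all states
        accept_prob = min(1.0, target_weights[proposal] / target_weights[state])
        if rng.random() < accept_prob:
            state = proposal
    return state

# Final states of many independent runs should appear roughly in proportion 1:2:3:4.
counts = [0, 0, 0, 0]
for seed in range(2000):
    counts[metropolis_sample([1, 2, 3, 4], 200, random.Random(seed))] += 1
print(counts)
```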




Markov Chains


Book Description

Markov chains are a fundamental class of stochastic processes. They are widely used to solve problems in a large number of domains, such as operational research, computer science, communication networks and manufacturing systems. The success of Markov chains is mainly due to their simplicity of use, the large number of available theoretical results and the quality of algorithms developed for the numerical evaluation of many metrics of interest. The author presents the theory of both discrete-time and continuous-time homogeneous Markov chains. He carefully examines the explosion phenomenon, the Kolmogorov equations, the convergence to equilibrium and the passage time distributions to a state and to a subset of states. These results are applied to birth-and-death processes. He then proposes a detailed study of the uniformization technique by means of Banach algebra. This technique is used for the transient analysis of several queueing systems.

Contents: 1. Discrete-Time Markov Chains; 2. Continuous-Time Markov Chains; 3. Birth-and-Death Processes; 4. Uniformization; 5. Queues.

About the Author: Bruno Sericola is a Senior Research Scientist at Inria Rennes – Bretagne Atlantique in France. His main research activity is in performance evaluation of computer and communication systems, dependability analysis of fault-tolerant systems and stochastic models.
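Uniformization, the technique this book studies in detail, reduces transient analysis of a continuous-time chain to a Poisson-weighted sum of powers of a discrete-time transition matrix. The sketch below is a plain textbook implementation, not the Banach-algebra treatment described here; the generator, initial distribution, and truncation tolerance are illustrative.

```python
import numpy as np

def transient_distribution(Q, pi0, t, tol=1e-12):
    """Transient distribution at time t of a CTMC with generator Q,
    computed by uniformization (truncated Poisson mixture of DTMC steps)."""
    Q = np.asarray(Q, dtype=float)
    pi = np.asarray(pi0, dtype=float)
    lam = max(-np.diag(Q))                 # uniformization rate
    P = np.eye(Q.shape[0]) + Q / lam       # uniformized discrete-time chain
    weight = np.exp(-lam * t)              # Poisson(lam*t) probability of 0 jumps
    result = weight * pi
    k, accumulated = 0, weight
    while 1.0 - accumulated > tol:         # stop once almost all Poisson mass is covered
        k += 1
        pi = pi @ P
        weight *= lam * t / k
        accumulated += weight
        result += weight * pi
    return result

# Two-state birth-and-death example: failure rate 1.0, repair rate 3.0.
Q = [[-1.0, 1.0],
     [3.0, -3.0]]
print(transient_distribution(Q, [1.0, 0.0], t=0.5))
```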




Discrete-Time Markov Chains


Book Description

Focusing on discrete-time Markov chains, the contents of this book are an outgrowth of some of the authors' recent research. The motivation stems from existing and emerging applications in optimization and control of complex hybrid Markovian systems in manufacturing, wireless communication, and financial engineering. Much effort in this book is devoted to designing system models arising from these applications, analyzing them via analytic and probabilistic techniques, and developing feasible computational algorithms so as to reduce the inherent complexity. This book presents results including asymptotic expansions of probability vectors, structural properties of occupation measures, exponential bounds, aggregation and decomposition and the associated limit processes, and the interface of discrete-time and continuous-time systems. One of its salient features is a diverse range of applications in filtering, estimation, control, optimization, Markov decision processes, and financial engineering. This book will be an important reference for researchers in the areas of applied probability, control theory, and operations research, as well as for practitioners who use optimization techniques. Parts of the book can also be used in a graduate course on applied probability, stochastic processes, and their applications.




Markov Processes for Stochastic Modeling


Book Description

This book presents an algebraic development of the theory of countable state space Markov chains with discrete- and continuous-time parameters. A Markov chain is a stochastic process characterized by the Markov property that the distribution of the future depends only on the current state, not on the whole history. Despite its simple form of dependency, the Markov property has enabled us to develop a rich system of concepts and theorems and to derive many results that are useful in applications. In fact, the areas that can be modeled, with varying degrees of success, by Markov chains are vast and are still expanding. The aim of this book is a discussion of the time-dependent behavior, called the transient behavior, of Markov chains. From the practical point of view, when modeling a stochastic system by a Markov chain, there are many instances in which time-limiting results such as stationary distributions have no meaning. Or, even when the stationary distribution is of some importance, it is often dangerous to use the stationary result alone without knowing the transient behavior of the Markov chain. Not many books have paid much attention to this topic, despite its obvious importance.
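The point that stationary results alone can mislead is easy to see numerically: in a slowly mixing chain, the distribution at moderate times remains far from the stationary one. The example below is an illustrative two-state chain, not taken from the book.

```python
import numpy as np

# A two-state chain that mixes slowly: the stationary distribution is (0.5, 0.5),
# but started in state 0 the chain stays close to state 0 for a long time.
P = np.array([[0.99, 0.01],
              [0.01, 0.99]])

pi0 = np.array([1.0, 0.0])
for n in (1, 10, 100, 1000):
    print(n, pi0 @ np.linalg.matrix_power(P, n))   # transient distribution after n steps

# Stationary distribution for comparison: left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
stationary = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
print("stationary:", stationary / stationary.sum())
```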




Interactive Markov Chains


Book Description

Markov Chains are widely used as stochastic models to study a broad spectrum of system performance and dependability characteristics. This monograph is devoted to compositional specification and analysis of Markov chains. Based on principles known from process algebra, the author systematically develops an algebra of interactive Markov chains. By presenting a number of distinguishing results, of both theoretical and practical nature, the author substantiates the claim that interactive Markov chains are more than just another formalism: among other results, an algebraic theory of interactive Markov chains is developed, algorithms to mechanize compositional aggregation are presented, and state spaces of several million states resulting from the study of an ordinary telephone system are analyzed.




Markov Chains


Book Description

A long time ago I started writing a book about Markov chains, Brownian motion, and diffusion. I soon had two hundred pages of manuscript and my publisher was enthusiastic. Some years and several drafts later, I had a thousand pages of manuscript, and my publisher was less enthusiastic. So we made it a trilogy: Markov Chains; Brownian Motion and Diffusion; Approximating Countable Markov Chains; familiarly - MC, B & D, and ACM. I wrote the first two books for beginning graduate students with some knowledge of probability; if you can follow Sections 10.4 to 10.9 of Markov Chains you're in. The first two books are quite independent of one another, and completely independent of the third. This last book is a monograph which explains one way to think about chains with instantaneous states. The results in it are supposed to be new, except where there are specific disclaimers; it's written in the framework of Markov Chains. Most of the proofs in the trilogy are new, and I tried hard to make them explicit. The old ones were often elegant, but I seldom saw what made them go. With my own, I can sometimes show you why things work. And, as I will argue in a minute, my demonstrations are easier technically. If I wrote them down well enough, you may come to agree.




Essentials of Stochastic Processes


Book Description

Building upon the previous editions, this textbook is a first course in stochastic processes taken by undergraduate and graduate students (MS and PhD students from math, statistics, economics, computer science, engineering, and finance departments) who have had a course in probability theory. It covers Markov chains in discrete and continuous time, Poisson processes, renewal processes, martingales, and option pricing. One can only learn a subject by seeing it in action, so there are a large number of examples and more than 300 carefully chosen exercises to deepen the reader's understanding. Drawing on teaching experience and student feedback, the author has added many new examples and problems with solutions that use the TI-83 to eliminate the tedious details of solving linear equations by hand, and the collection of exercises is much improved, with many more biological examples. Material from previous editions that is too advanced for this first course in stochastic processes has been eliminated, while the treatment of other topics useful for applications has been expanded. In addition, the ordering of topics has been improved; for example, the difficult subject of martingales is delayed until its usefulness can be applied in the treatment of mathematical finance.




Basics of Applied Stochastic Processes


Book Description

Stochastic processes are mathematical models of random phenomena that evolve according to prescribed dynamics. Processes commonly used in applications are Markov chains in discrete and continuous time, renewal and regenerative processes, Poisson processes, and Brownian motion. This volume gives an in-depth description of the structure and basic properties of these stochastic processes. A main focus is on equilibrium distributions, strong laws of large numbers, and ordinary and functional central limit theorems for cost and performance parameters. Although these results differ for various processes, they have a common trait of being limit theorems for processes with regenerative increments. Extensive examples and exercises show how to formulate stochastic models of systems as functions of a system’s data and dynamics, and how to represent and analyze cost and performance measures. Topics include stochastic networks, spatial and space-time Poisson processes, queueing, reversible processes, simulation, Brownian approximations, and varied Markovian models. The technical level of the volume is between that of introductory texts that focus on highlights of applied stochastic processes, and advanced texts that focus on theoretical aspects of processes.




An Introduction to Stochastic Modeling


Book Description

An Introduction to Stochastic Modeling provides information pertinent to the standard concepts and methods of stochastic modeling. This book presents the rich diversity of applications of stochastic processes in the sciences. Organized into nine chapters, this book begins with an overview of diverse types of stochastic models, which predict a set of possible outcomes weighted by their likelihoods or probabilities. This text then provides exercises in the applications of simple stochastic analysis to appropriate problems. Other chapters consider the study of general functions of independent, identically distributed, nonnegative random variables representing the successive intervals between renewals. This book discusses as well the numerous examples of Markov branching processes that arise naturally in various scientific disciplines. The final chapter deals with queueing models, which aid the design process by predicting system performance. This book is a valuable resource for students of engineering and management science. Engineers will also find this book useful.
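As a small illustration of how a queueing model predicts system performance, here is a standard M/M/1 calculation using textbook formulas, not an example from the book; the arrival and service rates are made up for illustration.

```python
def mm1_performance(arrival_rate: float, service_rate: float) -> dict:
    """Steady-state performance measures of an M/M/1 queue (textbook formulas)."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: arrival rate must be below service rate")
    rho = arrival_rate / service_rate          # server utilization
    L = rho / (1.0 - rho)                      # mean number in system
    W = L / arrival_rate                       # mean time in system (Little's law)
    Wq = W - 1.0 / service_rate                # mean waiting time in queue
    return {"utilization": rho, "mean_in_system": L,
            "mean_time_in_system": W, "mean_wait_in_queue": Wq}

# A design question: how much does response time grow if demand rises 20%?
print(mm1_performance(arrival_rate=8.0, service_rate=10.0))
print(mm1_performance(arrival_rate=9.6, service_rate=10.0))
```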