Stochastic Monotonicity and Queueing Applications of Birth-Death Processes


Book Description

A stochastic process {X(t): 0 ≤ t < ∞} with discrete state space S ⊂ ℤ is said to be stochastically increasing (decreasing) on an interval T if the probabilities Pr{X(t) ≥ i}, i ∈ S, are increasing (decreasing) with t on T. Stochastic monotonicity is a basic structural property of process behaviour. It gives rise to meaningful bounds for various quantities such as the moments of the process, and provides the mathematical groundwork for approximation algorithms. Obviously, stochastic monotonicity becomes a more tractable subject for analysis if the processes under consideration are such that stochastic monotonicity on an interval
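The defining property can be checked numerically. The sketch below (not from the book; the truncated state space, rates λ = 1, μ = 1.5, and the two time points are illustrative assumptions) builds a birth-death generator, computes the transient distribution by uniformization, and verifies that a birth-death process started empty has tail probabilities Pr{X(t) ≥ i} that increase with t, i.e. it is stochastically increasing:

```python
import numpy as np

# Truncated birth-death generator on {0, ..., N}; lam, mu are assumed rates.
N, lam, mu = 20, 1.0, 1.5
Q = np.zeros((N + 1, N + 1))
for i in range(N + 1):
    if i < N:
        Q[i, i + 1] = lam   # birth rate
    if i > 0:
        Q[i, i - 1] = mu    # death rate
    Q[i, i] = -Q[i].sum()   # diagonal makes rows sum to zero

def transient_dist(t, terms=200):
    """Distribution of X(t) given X(0) = 0, via uniformization:
    exp(Qt) = sum_n Poisson(Lam*t; n) * P^n with P = I + Q/Lam."""
    Lam = np.abs(np.diag(Q)).max()
    P = np.eye(N + 1) + Q / Lam       # uniformized (stochastic) matrix
    p = np.zeros(N + 1); p[0] = 1.0   # start in state 0
    out = np.zeros(N + 1)
    weight = np.exp(-Lam * t)         # Poisson pmf at n = 0
    for n in range(terms):
        out += weight * p
        p = p @ P
        weight *= Lam * t / (n + 1)
    return out

def tail(t):
    """Pr{X(t) >= i} for i = 0, ..., N."""
    return np.cumsum(transient_dist(t)[::-1])[::-1]

# Started from 0, every tail probability grows with t.
assert np.all(tail(2.0) >= tail(1.0) - 1e-9)
```

Uniformization is used here only as a convenient way to compute the transient probabilities on a truncated chain; any matrix-exponential routine would do.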




Functional Relations, Random Coefficients, and Nonlinear Regression with Application to Kinetic Data


Book Description

These notes on regression give an introduction to some of the techniques that I have found useful when working with various data sets in collaboration with Dr. S. Keiding (Copenhagen) and Dr. J.W.L. Robinson (Lausanne). The notes are based on some lectures given at the Institute of Mathematical Statistics, University of Copenhagen, 1978-81, for graduate students, and assume a familiarity with statistical theory corresponding to the book by C.R. Rao: "Linear Statistical Inference and its Applications", Wiley, New York (1973). The mathematical tools needed for the algebraic treatment of the models are some knowledge of finite-dimensional vector spaces with an inner product and the notion of orthogonal projection. For the analytic treatment I need characteristic functions and weak convergence as the main tools. The most important statistical concepts are the general linear model for Gaussian variables and the general methods of maximum likelihood estimation as well as the likelihood ratio test. All these topics are presented in the above-mentioned book by Rao, and the reader is referred to that for details. For convenience a short appendix is added where the fundamental concepts from linear algebra are discussed.




Optimal Sequentially Planned Decision Procedures


Book Description

Learning from experience, making decisions on the basis of the available information, and proceeding step by step to a desired goal are fundamental behavioural qualities of human beings. Nevertheless, it was not until the early 1940s that such a statistical theory - namely Sequential Analysis - was created, which allows us to investigate this kind of behaviour in a precise manner. A. Wald's famous sequential probability ratio test (SPRT; see Example 1.8) turned out to have an enormous influence on the development of this theory. On the one hand, Wald's fundamental monograph "Sequential Analysis" ([Wa]*) is essentially centered around this test. On the other hand, important properties of the SPRT - e.g. Bayes optimality, minimax properties, "uniform" optimality with respect to expected sample sizes - gave rise to the development of a general statistical decision theory. As a consequence, the SPRTs played a dominating role in the further development of sequential analysis and, more generally, in theoretical statistics.




Convergence of Stochastic Processes


Book Description

A more accurate title for this book might be: An Exposition of Selected Parts of Empirical Process Theory, With Related Interesting Facts About Weak Convergence, and Applications to Mathematical Statistics. The high points are Chapters II and VII, which describe some of the developments inspired by Richard Dudley's 1978 paper. There I explain the combinatorial ideas and approximation methods that are needed to prove maximal inequalities for empirical processes indexed by classes of sets or classes of functions. The material is somewhat arbitrarily divided into results used to prove consistency theorems and results used to prove central limit theorems. This has allowed me to put the easier material in Chapter II, with the hope of enticing the casual reader to delve deeper. Chapters III through VI deal with more classical material, as seen from a different perspective. The novelties are: convergence for measures that don't live on Borel σ-fields; the joys of working with the uniform metric on D[0, 1]; and finite-dimensional approximation as the unifying idea behind weak convergence. Uniform tightness reappears in disguise as a condition that justifies the finite-dimensional approximation. Only later is it exploited as a method for proving the existence of limit distributions. The last chapter has a heuristic flavor. I didn't want to confuse the martingale issues with the martingale facts.




The Analysis of Directional Time Series: Applications to Wind Speed and Direction


Book Description

Given a series of wind speeds and directions from the port of Fremantle, the aim of this monograph is to detect general weather patterns and seasonal characteristics. To separate the daily land and sea breeze cycle and other short-term disturbances from the general wind, the series is divided into a daily and a longer-term, synoptic component. The latter is related to the atmospheric pressure field, while the former is studied in order (i) to isolate particular short-term events such as calms, storms and oscillating winds, and (ii) to determine the land and sea breeze cycle which dominates the weather pattern for most of the year. All these patterns are described in detail and are related to the synoptic component of the data. Two time series models for directional data and a new measure of angular association are introduced to provide the basis for certain parts of the analysis.




Interacting Particle Systems


Book Description

At what point in the development of a new field should a book be written about it? This question is seldom easy to answer. In the case of interacting particle systems, important progress continues to be made at a substantial pace. A number of problems which are nearly as old as the subject itself remain open, and new problem areas continue to arise and develop. Thus one might argue that the time is not yet ripe for a book on this subject. On the other hand, this field is now about fifteen years old. Many important problems have been solved, and the analysis of several basic models is almost complete. The papers written on this subject number in the hundreds. It has become increasingly difficult for newcomers to master the proliferating literature, and for workers in allied areas to make effective use of it. Thus I have concluded that this is an appropriate time to pause and take stock of the progress made to date. It is my hope that this book will not only provide a useful account of much of this progress, but that it will also help stimulate the future vigorous development of this field.




Series Approximation Methods in Statistics


Book Description

This book was originally compiled for a course I taught at the University of Rochester in the fall of 1991, and is intended to give advanced graduate students in statistics an introduction to Edgeworth and saddlepoint approximations, and related techniques. Many other authors have also written monographs on this subject, and so this work is narrowly focused on two areas not recently discussed in theoretical textbooks. These areas are, first, a rigorous consideration of Edgeworth and saddlepoint expansion limit theorems, and second, a survey of the more recent developments in the field. In presenting expansion limit theorems I have drawn heavily on the notation of McCullagh (1987) and on the theorems presented by Feller (1971) on Edgeworth expansions. For saddlepoint notation and results I relied most heavily on the many papers of Daniels, and a review paper by Reid (1988). Throughout this book I have tried to maintain consistent notation and to present theorems in such a way as to make a few theoretical results useful in as many contexts as possible. This was not only in order to present as many results with as few proofs as possible, but more importantly to show the interconnections between the various facets of asymptotic theory. Special attention is paid to regularity conditions. The reasons they are needed and the parts they play in the proofs are both highlighted.




Lectures on Random Voronoi Tessellations


Book Description

Tessellations are subdivisions of d-dimensional space into non-overlapping "cells". Voronoi tessellations are produced by first considering a set of points (known as nuclei) in d-space, and then defining each cell as the set of points which are closer to a given nucleus than to any other. A random Voronoi tessellation is produced by supposing that the location of each nucleus is determined by some random process. Such tessellations provide models for natural phenomena as diverse as the growth of crystals, the territories of animals, and the development of regional market areas, and they arise in subjects such as computational geometry and astrophysics. This volume provides an introduction to random Voronoi tessellations by presenting a survey of the main known results and the directions in which research is proceeding. Throughout the volume, mathematical and rigorous proofs are given, making this an essentially self-contained account in which no background knowledge of the subject is assumed.
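The nearest-nucleus construction described above is easy to state in code. The following minimal sketch (not from the book; the dimension, nucleus count, and uniform random nuclei are illustrative assumptions) assigns query points to the Voronoi cell of their nearest nucleus by brute-force distance comparison:

```python
import numpy as np

# Nuclei placed by a random process (here: uniform on the unit square).
rng = np.random.default_rng(0)
d, n_nuclei = 2, 5
nuclei = rng.uniform(0.0, 1.0, size=(n_nuclei, d))

def voronoi_cell(points, nuclei):
    """Index of the nucleus whose Voronoi cell contains each query point."""
    # Pairwise squared distances, shape (n_points, n_nuclei).
    d2 = ((points[:, None, :] - nuclei[None, :, :]) ** 2).sum(axis=-1)
    return d2.argmin(axis=1)

# Sanity check: every nucleus lies in its own cell.
assert np.array_equal(voronoi_cell(nuclei, nuclei), np.arange(n_nuclei))
```

For large point sets one would replace the brute-force comparison with a spatial index (e.g. a k-d tree), but the definition of the cells is exactly this argmin.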




Differential-Geometrical Methods in Statistics


Book Description

From the reviews: "In this Lecture Note volume the author describes his differential-geometric approach to parametric statistical problems, summarizing the results he had published in a series of papers in the last five years. The author provides a geometric framework for a special class of test and estimation procedures for curved exponential families. ... The material and ideas presented in this volume are important and it is recommended to everybody interested in the connection between statistics and geometry ..." (Metrika) "More than a hundred references are given showing the growing interest in differential geometry with respect to statistics. The book can only be strongly recommended to a geodesist since it offers many new insights into statistics on a familiar ground." (Manuscripta Geodaetica)




Causation, Prediction, and Search


Book Description

This book is intended for anyone, regardless of discipline, who is interested in the use of statistical methods to help obtain scientific explanations or to predict the outcomes of actions, experiments or policies. Much of G. Udny Yule's work illustrates a vision of statistics whose goal is to investigate when and how causal influences may be reliably inferred, and their comparative strengths estimated, from statistical samples. Yule's enterprise has been largely replaced by Ronald Fisher's conception, in which there is a fundamental cleavage between experimental and non-experimental inquiry, and statistics is largely unable to aid in causal inference without randomized experimental trials. Every now and then members of the statistical community express misgivings about this turn of events, and, in our view, rightly so. Our work represents a return to something like Yule's conception of the enterprise of theoretical statistics and its potential practical benefits. If intellectual history in the 20th century had gone otherwise, there might have been a discipline to which our work belongs. As it happens, there is not. We develop material that belongs to statistics, to computer science, and to philosophy; the combination may not be entirely satisfactory for specialists in any of these subjects. We hope it is nonetheless satisfactory for its purpose.