Parametric Statistical Inference


Book Description

Parametric Statistical Inference: Basic Theory and Modern Approaches presents the developments and modern trends in statistical inference to students who do not have advanced mathematical and statistical preparation. The topics discussed in the book are basic and common to many fields of statistical inference and thus serve as a springboard for in-depth study. The book is organized into eight chapters. Chapter 1 provides an overview of how the theory of statistical inference is presented in subsequent chapters. Chapter 2 briefly discusses statistical distributions and their properties. Chapter 3 is devoted to the problem of sufficient statistics and the information in samples, and Chapter 4 presents some basic results from the theory of testing statistical hypotheses. In Chapter 5, the classical theory of estimation is developed. Chapter 6 discusses the efficiency of estimators and some large-sample properties, while Chapter 7 covers confidence intervals. Finally, Chapter 8 is about the decision-theoretic and Bayesian approaches to testing and estimation. Senior undergraduate and graduate students in statistics and mathematics, and those who have taken an introductory course in probability, will benefit greatly from this book.




A Course in Statistics with R


Book Description

Integrates the theory and applications of statistics using R. A Course in Statistics with R has been written to bridge the gap between theory and applications and to explain how mathematical expressions are converted into R programs. The book has been designed primarily as a companion for a Master's student during each semester of the course, but it will also help applied statisticians revisit the underpinnings of the subject. With this dual goal in mind, the book begins with R basics and quickly covers visualization and exploratory analysis. Probability and statistical inference, covering the classical, nonparametric, and Bayesian schools, is developed with definitions, motivations, mathematical expressions, and R programs in a way that helps the reader understand both the mathematical development and the R implementation. Linear regression models, experimental designs, multivariate analysis, and categorical data analysis are treated in a way that makes effective use of visualization techniques and the related statistical methods underlying them through practical applications, helping the reader achieve a clear understanding of the associated statistical models. Key features: integrates R basics with statistical concepts; provides graphical presentations inclusive of mathematical expressions; aids understanding of the limit theorems of probability with and without the simulation approach; presents detailed algorithmic development of statistical models from scratch; and includes practical applications with over 50 data sets.
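
As a rough, hedged illustration of the theory-to-R translation this description refers to (it is not an excerpt from the book), the short R sketch below simulates the central limit theorem for standardized sample means; the exponential distribution, sample sizes, and replication count are arbitrary choices.

set.seed(123)

## Standardized means of Exp(1) samples should approach N(0, 1) as n grows.
clt_sim <- function(n, reps = 5000) {
  means <- replicate(reps, mean(rexp(n, rate = 1)))  # reps sample means of size n
  (means - 1) / (1 / sqrt(n))                        # Exp(1) has mean 1 and sd 1
}

for (n in c(5, 30, 200)) {
  z <- clt_sim(n)
  cat(sprintf("n = %3d: mean = %6.3f, sd = %5.3f, P(Z <= 1.96) = %5.3f\n",
              n, mean(z), sd(z), mean(z <= 1.96)))
}
## As n increases, the summaries approach 0, 1, and pnorm(1.96), about 0.975.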




Parametric Statistical Inference


Book Description

Two unifying components of statistics are the likelihood function and the exponential family. These are brought together for the first time as the central themes in this book on statistical inference, written for advanced undergraduate and graduate students in mathematical statistics.




A History of Parametric Statistical Inference from Bernoulli to Fisher, 1713-1935


Book Description

This book offers a detailed history of parametric statistical inference. Covering the period from James Bernoulli to R.A. Fisher, it examines: binomial statistical inference; statistical inference by inverse probability; the central limit theorem and linear minimum variance estimation by Laplace and Gauss; error theory, skew distributions, correlation, and sampling distributions; and the Fisherian Revolution. Lively biographical sketches of many of the main characters are featured throughout, including Laplace, Gauss, Edgeworth, Fisher, and Karl Pearson. Also examined are the roles played by de Moivre, James Bernoulli, and Lagrange.




Examples in Parametric Inference with R


Book Description

This book discusses examples in parametric inference with R. Combining basic theory with modern approaches, it presents the latest developments and trends in statistical inference for students who do not have an advanced mathematical and statistical background. The topics discussed in the book are fundamental and common to many fields of statistical inference and thus serve as a point of departure for in-depth study. The book is divided into eight chapters: Chapter 1 provides an overview of topics on sufficiency and completeness, while Chapter 2 briefly discusses unbiased estimation. Chapter 3 focuses on the study of moment and maximum likelihood estimators, and Chapter 4 presents bounds for the variance. In Chapter 5, topics on consistent estimators are discussed. Chapter 6 discusses Bayes estimation, while Chapter 7 studies most powerful tests. Lastly, Chapter 8 examines unbiased and other tests. Senior undergraduate and graduate students in statistics and mathematics, and those who have taken an introductory course in probability, will greatly benefit from this book. Students are expected to know matrix algebra, calculus, probability, and distribution theory before beginning this course. Presenting a wealth of relevant solved and unsolved problems, the book is an excellent resource for teachers and instructors, who can assign homework problems from the exercises, and students will find the solved examples highly beneficial in working the exercise problems.
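
As a hedged sketch of the kind of worked example the title suggests (not taken from the book), the R code below computes the maximum likelihood estimate of an exponential rate parameter both in closed form and by numerical maximization of the log-likelihood; the simulated data and the true rate of 2 are arbitrary assumptions for illustration.

set.seed(1)
x <- rexp(100, rate = 2)             # simulated sample; true rate chosen arbitrarily

## Closed form: for the Exp(rate) model the MLE of the rate is 1 / sample mean.
rate_mle_closed <- 1 / mean(x)

## Numerical check: maximize the log-likelihood directly with optimize().
loglik <- function(rate) sum(dexp(x, rate = rate, log = TRUE))
rate_mle_numeric <- optimize(loglik, interval = c(1e-6, 50), maximum = TRUE)$maximum

c(closed_form = rate_mle_closed, numerical = rate_mle_numeric)
## The two values agree up to numerical tolerance.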




A First Course on Parametric Inference


Book Description

"After a brief historical perspective, A First Course on Parametric Inference, discusses the basic concept of sufficient statistic and the classical approach based on minimum variance unbiased estimator. There is a separate chapter on simultaneous estimation of several parameters. Large sample theory of estimation, based on consistent asymptotically normal estimators obtained by method of moments, percentile and the method of maximum likelihood is also introduced. The tests of hypotheses for finite samples with classical Neyman-Pearson theory is developed pointing out its connection with Bayesian approach. The hypotheses testing and confidence interval techniques are developed leading to likelihood ratio tests, score tests and tests based on maximum likelihood estimators."--BOOK JACKET.




Parametric and Nonparametric Inference from Record-Breaking Data


Book Description

This book provides a comprehensive look at statistical inference from record-breaking data in both parametric and nonparametric settings, and it treats nonparametric function estimation from such data in detail. Its main purpose is to fill the void in the literature on general inference from record values. Statisticians, mathematicians, and engineers will find the book useful as a research reference, and it can also serve as part of a graduate-level statistics or mathematics course.




All of Nonparametric Statistics


Book Description

This text provides the reader with a single book in which to find accounts of a number of up-to-date topics in nonparametric inference. The book is aimed at Master's or PhD level students in statistics, computer science, and engineering, and it is also suitable for researchers who want to get up to speed quickly on modern nonparametric methods. It covers a wide range of topics, including the bootstrap, the nonparametric delta method, nonparametric regression, density estimation, orthogonal function methods, minimax estimation, nonparametric confidence sets, and wavelets. The book takes a dual approach, mixing methodology and theory.




Introduction to Empirical Processes and Semiparametric Inference


Book Description

Kosorok's text provides a self-contained introduction to empirical processes and semiparametric inference. These powerful research techniques are surprisingly useful for developing methods of statistical inference for complex models and for understanding the properties of such methods. It is an authoritative text that covers all the bases, while also offering a friendly and gradual introduction to the area. The book can be used both as a research reference and as a textbook.




Parametric and Nonparametric Inference for Statistical Dynamic Shape Analysis with Applications


Book Description

This book considers specific inferential issues arising from the analysis of dynamic shapes, attempting to solve the problems at hand using probability models and nonparametric tests. The models are simple to understand and interpret, and they provide a useful tool for describing the global dynamics of landmark configurations. However, because of the non-Euclidean nature of shape spaces, distributions in shape spaces are not straightforward to obtain. The book explores the use of the Gaussian distribution in the configuration space, with similarity transformations integrated out. Specifically, it works with the offset-normal shape distribution as a probability model for statistical inference on a sample of a temporal sequence of landmark configurations. This enables inference for Gaussian processes from configurations onto the shape space. The book is divided into two parts: the first three chapters cover material on the offset-normal shape distribution, and the remaining chapters cover the theory of NonParametric Combination (NPC) tests. The chapters offer a collection of applications bound together by the theme of the book. They refer to the analysis of data from the FG-NET (Face and Gesture Recognition Research Network) database of facial expressions. For these data, it may be desirable to describe the dynamics of the expressions, to test whether there is a difference between the dynamics of two facial expressions, or to test which landmarks are most informative in explaining the pattern of an expression.