Advances on Methodological and Applied Aspects of Probability and Statistics


Book Description

This is one of two volumes that set forth invited papers presented at the International Indian Statistical Association Conference. This volume emphasizes advancements in the methodology and applications of probability and statistics. The chapters, representing the ideas of vanguard researchers in the field, cover several different subspecialties, including applied probability, models and applications, estimation and testing, robust inference, regression and design, and sample size methodology. The text also fully describes the applications of these new ideas to industry, ecology, biology, health, economics, and management. Researchers and graduate students in mathematical analysis, as well as probability and statistics professionals in industry, will learn much from this volume.




Advances on Theoretical and Methodological Aspects of Probability and Statistics


Book Description

At the International Indian Statistical Association Conference, held at McMaster University in Ontario, Canada, participants focused on advancements in the theory and methodology of probability and statistics. This is one of two volumes containing invited papers from the meeting. The 32 chapters deal with different topics of interest, including stochastic processes and inference, distributions and characterizations, Bayesian inference, selection methods, regression methods, and methods in health research. The text is ideal for applied mathematicians, statisticians, and researchers in the field.




Decision Sciences


Book Description

This handbook is an endeavour to cover many current, relevant, and essential topics related to decision sciences in a scientific manner. Using this handbook, graduate students, researchers, and practitioners from engineering, statistics, sociology, economics, and related fields will find a new and refreshing paradigm shift in how these topics can be put to beneficial use. Starting from the basics and moving to advanced concepts, the authors hope to make readers well aware of the different theoretical and practical ideas that are the focus of study in decision sciences today. It includes an excellent bibliography/reference/journal list, information about a variety of datasets, illustrated pseudo-codes, and discussion of future trends in research. Covering topics such as optimization, networks and games, multi-objective optimization, inventory theory, statistical methods, artificial neural networks, time series analysis, simulation modeling, decision support systems, data envelopment analysis, and queueing theory, this reference book is an attempt to make this area more meaningful for a variety of readers. Noteworthy features of this handbook are in-depth coverage of different topics, solved practical examples, unique datasets for a variety of examples in the areas of decision sciences, in-depth analysis of problems through colored charts and 3D diagrams, and discussions about software.




Flowgraph Models for Multistate Time-to-Event Data


Book Description

A unique introduction to the innovative methodology of statistical flowgraphs. This book offers a practical, application-based approach to flowgraph models for time-to-event data. It clearly shows how this innovative new methodology can be used to analyze data from semi-Markov processes without prior knowledge of stochastic processes, opening the door to interesting applications in survival analysis and reliability as well as stochastic processes. Unlike other books on multistate time-to-event data, this work emphasizes reliability and not just biostatistics, illustrating each method with medical and engineering examples. It demonstrates how flowgraphs bring together applied probability techniques and combine them with data analysis and statistical methods to answer questions of practical interest. Bayesian methods of data analysis are emphasized. Coverage includes:
* Clear instructions on how to model multistate time-to-event data using flowgraph models
* An emphasis on computation, real data, and Bayesian methods for problem solving
* Real-world examples for analyzing data from stochastic processes
* The use of flowgraph models to analyze complex stochastic networks
* Exercise sets to reinforce the practical approach of this volume
Flowgraph Models for Multistate Time-to-Event Data is an invaluable resource/reference for researchers in biostatistics/survival analysis, systems engineering, and in fields that use stochastic processes, including anthropology, biology, psychology, computer science, and engineering.




Probability and Statistical Inference


Book Description

Probability and Statistical Inference: From Basic Principles to Advanced Models covers aspects of probability, distribution theory, and inference that are fundamental to a proper understanding of data analysis and statistical modelling. It presents these topics in an accessible manner without sacrificing mathematical rigour, bridging the gap between the many excellent introductory books and the more advanced, graduate-level texts. The book introduces and explores techniques that are relevant to modern practitioners, while being respectful to the history of statistical inference. It seeks to provide a thorough grounding in both the theory and application of statistics, with even the more abstract parts placed in the context of a practical setting. Features:
• Complete introduction to mathematical probability, random variables, and distribution theory.
• Concise but broad account of statistical modelling, covering topics such as generalised linear models, survival analysis, time series, and random processes.
• Extensive discussion of the key concepts in classical statistics (point estimation, interval estimation, hypothesis testing) and the main techniques in likelihood-based inference.
• Detailed introduction to Bayesian statistics and associated topics.
• Practical illustration of some of the main computational methods used in modern statistical inference (simulation, bootstrap, MCMC).
This book is for students who have already completed a first course in probability and statistics and now wish to deepen and broaden their understanding of the subject. It can serve as a foundation for advanced undergraduate or postgraduate courses. Our aim is to challenge and excite the more mathematically able students, while providing explanations of statistical concepts that are more detailed and approachable than those in advanced texts. This book is also useful for data scientists, researchers, and other applied practitioners who want to understand the theory behind the statistical methods used in their fields.




Hands-on Intermediate Econometrics Using R: Templates For Learning Quantitative Methods And R Software (Second Edition)


Book Description

How can one learn both applied statistics (econometrics) and the free, open-source software R? This book allows students to gain a sense of accomplishment by copying and pasting the many hands-on templates provided here. The textbook is essential for anyone wishing to have a practical understanding of an extensive range of topics in econometrics. No other text provides software snippets for learning so many new statistical tools with hands-on examples. The explicit knowledge of the inputs and outputs of each new method allows the student to know which algorithm is worth studying. The book offers sufficient theoretical and algorithmic details about a vast range of statistical techniques. The second edition's preface lists the following topics, generally absent from other textbooks: (i) iteratively reweighted least squares; (ii) pillar charts to represent 3D data; (iii) stochastic frontier analysis (SFA); (iv) model selection with Mallows' Cp criterion; (v) the Hodrick-Prescott (HP) filter; (vi) automatic ARIMA models; (vii) nonlinear Granger causality using kernel regressions and bootstrap confidence intervals; (viii) the new Keynesian Phillips curve (NKPC); (ix) market-neutral pairs trading using two cointegrated stocks; (x) artificial neural networks (ANN) for product-specific forecasting; (xi) vector AR and VARMA models; (xii) new tools for diagnosing the endogeneity problem; (xiii) the elegant set-up of k-class estimators and identification; (xiv) probit-logit models and Heckman selection bias correction; (xv) receiver operating characteristic (ROC) curves and the areas under them; (xvi) the confusion matrix; (xvii) quantile regression; (xviii) the elastic net estimator; (xix) generalized correlations; (xx) the maximum entropy bootstrap for time series; (xxi) convergence concepts quantified; (xxii) generalized partial correlation coefficients; and (xxiii) panel data and duration (survival) models.




Mathematical Reviews


Book Description




Small Area Estimation and Microsimulation Modeling


Book Description

Small Area Estimation and Microsimulation Modeling is the first practical handbook that comprehensively presents modern statistical SAE methods within the framework of ultramodern spatial microsimulation modeling, while providing a novel approach to creating synthetic spatial microdata. Along with describing the necessary theories and their advantages and limitations, the authors illustrate the practical application of the techniques to a large number of substantive problems, including how to build up models, organize and link data, create synthetic microdata, conduct analyses, produce informative tables and graphs, and evaluate how the findings effectively support decision-making processes in government and non-government organizations. Features:
• Covers both theoretical and applied aspects for real-world comparative research and regional statistics production
• Thoroughly explains how microsimulation modeling technology can be constructed using available datasets for reliable small area statistics
• Provides SAS code that allows readers to utilize these latest technologies in their own work
This book is designed for advanced graduate students, academics, professionals, and applied practitioners who are interested in small area estimation and/or microsimulation modeling and who deal with vital issues in the social and behavioural sciences, applied economics and policy analysis, government and/or social statistics, health sciences, business, psychology, environmental and agricultural modeling, computational statistics and data simulation, spatial statistics, transport and urban planning, and geospatial modeling. Dr Azizur Rahman is a Senior Lecturer in Statistics and convenor of the Graduate Program in Applied Statistics at Charles Sturt University, and an Adjunct Associate Professor of Public Health and Biostatistics at the University of Canberra. His research encompasses small area estimation, applied economics, microsimulation modeling, Bayesian inference, and public health. He has more than 60 scholarly publications, including two books. Dr Rahman’s research is funded by the Australian federal and state governments, and he serves on a range of editorial boards, including that of the International Journal of Microsimulation (IJM). Professor Ann Harding, AO, is an Emeritus Professor of Applied Economics and Social Policy at the National Centre for Social and Economic Modelling (NATSEM) of the University of Canberra. She was the founder and inaugural Director of this world-class research centre for more than sixteen years, a co-founder of the International Microsimulation Association (IMA), and served as the inaugural elected president of the IMA from 2004 to 2011. She is a Fellow of the Academy of the Social Sciences in Australia. She has more than 300 publications, including several books on microsimulation modeling.




Advanced Intelligent Computing Theories and Applications. With Aspects of Theoretical and Methodological Issues


Book Description

The International Conference on Intelligent Computing (ICIC) was formed to provide an annual forum dedicated to the emerging and challenging topics in artificial intelligence, machine learning, bioinformatics, and computational biology. It aims to bring together researchers and practitioners from both academia and industry to share ideas, problems and solutions related to the multifaceted aspects of intelligent computing. ICIC 2008, held in Shanghai, China, September 15–18, 2008, constituted the 4th International Conference on Intelligent Computing. It built upon the success of ICIC 2007, ICIC 2006 and ICIC 2005, held in Qingdao, Kunming and Hefei, China, in 2007, 2006 and 2005, respectively. This year, the conference concentrated mainly on the theories and methodologies as well as the emerging applications of intelligent computing. Its aim was to unify the picture of contemporary intelligent computing techniques as an integral concept that highlights the trends in advanced computational intelligence and bridges theoretical research with applications. Therefore, the theme for this conference was “Emerging Intelligent Computing Technology and Applications”. Papers focusing on this theme were solicited, addressing theories, methodologies, and applications in science and technology.