Discrete Choice Methods with Simulation


Book Description

This book describes the new generation of discrete choice methods, focusing on the many advances that are made possible by simulation. Researchers use these statistical methods to examine the choices that consumers, households, firms, and other agents make. Each of the major models is covered: logit, generalized extreme value, or GEV (including nested and cross-nested logits), probit, and mixed logit, plus a variety of specifications that build on these basics. Simulation-assisted estimation procedures are investigated and compared, including maximum simulated likelihood, the method of simulated moments, and the method of simulated scores. Procedures for taking draws from densities are described, including variance-reduction techniques such as antithetics and Halton draws. Recent advances in Bayesian procedures are explored, including the use of the Metropolis-Hastings algorithm and its variant, Gibbs sampling. The second edition adds chapters on endogeneity and expectation-maximization (EM) algorithms. No other book incorporates all these fields, which have arisen over the past 25 years. The procedures are applicable in many fields, including energy, transportation, environmental studies, health, labor, and marketing.
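To give a flavor of the variance-reduction techniques the description mentions, here is a minimal sketch of antithetic draws and a Halton sequence. It uses only the Python standard library; the specific function names and the base-2 choice are illustrative assumptions, not taken from the book.

```python
import random

def halton(n, base=2):
    """First n points of the Halton sequence in the given base.

    Halton points fill (0, 1) far more evenly than pseudo-random
    uniforms, which reduces simulation variance for a given n.
    """
    seq = []
    for i in range(1, n + 1):
        f, r = 1.0, 0.0
        while i > 0:
            f /= base
            r += f * (i % base)
            i //= base
        seq.append(r)
    return seq

def antithetic_uniforms(n):
    """n uniforms built from n//2 draws plus their mirror images 1 - u.

    Each draw and its antithetic partner are negatively correlated,
    so averages computed over the pair have lower variance.
    """
    half = [random.random() for _ in range(n // 2)]
    return half + [1.0 - u for u in half]

# The first Halton points in base 2: 0.5, 0.25, 0.75, 0.125, ...
points = halton(8)
pairs = antithetic_uniforms(10)
```

In simulation-based estimation, these (0, 1) draws would then be pushed through an inverse CDF to obtain draws from the mixing density.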




Quantitative Techniques for Competition and Antitrust Analysis


Book Description

This book combines practical guidance and theoretical background for analysts using empirical techniques in competition and antitrust investigations. Peter Davis and Eliana Garcés show how to integrate empirical methods, economic theory, and broad evidence about industry in order to provide high-quality, robust empirical work that is tailored to the nature and quality of data available and that can withstand expert and judicial scrutiny. Davis and Garcés describe the toolbox of empirical techniques currently available, explain how to establish the weight of pieces of empirical work, and make some new theoretical contributions. The book consistently evaluates empirical techniques in light of the challenge faced by competition analysts and academics--to provide evidence that can stand up to the review of experts and judges. The book's integrated approach will help analysts clarify the assumptions underlying pieces of empirical work, evaluate those assumptions in light of industry knowledge, and guide future work aimed at understanding whether the assumptions are valid. Throughout, Davis and Garcés work to expand the common ground between practitioners and academics.




The Economics of New Goods


Book Description

New goods are at the heart of economic progress. The eleven essays in this volume include historical treatments of new goods and their diffusion; practical exercises in measurement addressed to recent and ongoing innovations; and real-world methods of devising quantitative adjustments for quality change. The lead article in Part I contains a striking analysis of the history of light over two millennia. Other essays in Part I develop new price indexes for automobiles back to 1906; trace the role of the air conditioner in the development of the American South; and treat the germ theory of disease as an economic innovation. In Part II, essays measure the economic impact of more recent innovations, including anti-ulcer drugs, new breakfast cereals, and computers. Part III explores methods and defects in the treatment of quality change in the official price data of the United States, Canada, and Japan. This pathbreaking volume will interest anyone who studies economic growth, productivity, and the American standard of living.




Price Index Concepts and Measurement


Book Description

Although inflation is much feared for its negative effects on the economy, how to measure it is a matter of considerable debate that has important implications for interest rates, monetary supply, and investment and spending decisions. Underlying many of these issues is the concept of the Cost-of-Living Index (COLI) and its controversial role as the methodological foundation for the Consumer Price Index (CPI). Price Index Concepts and Measurement brings together leading experts to address the many questions involved in conceptualizing and measuring inflation. They evaluate the accuracy of the COLI, a Cost-of-Goods Index, and a variety of other methodological frameworks as the bases for consumer price construction.
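As a concrete illustration of the fixed-basket approach that a cost-of-living framework is often contrasted with, the sketch below computes a Laspeyres price index, which prices the base-period basket at current prices. All goods, prices, and quantities are hypothetical illustration values, not data from the book.

```python
# Hypothetical base-period prices and quantities, and new-period prices.
base_prices = {"bread": 2.0, "milk": 1.5, "fuel": 3.0}
base_qty    = {"bread": 10,  "milk": 8,   "fuel": 5}
new_prices  = {"bread": 2.2, "milk": 1.5, "fuel": 3.6}

def laspeyres(p0, p1, q0):
    """Cost of the base basket at new prices, relative to base prices."""
    cost_base = sum(p0[g] * q0[g] for g in q0)   # 47.0 here
    cost_new  = sum(p1[g] * q0[g] for g in q0)   # 52.0 here
    return cost_new / cost_base

index = 100 * laspeyres(base_prices, new_prices, base_qty)  # ~110.6
```

Because the basket is held fixed, a Laspeyres index ignores substitution away from goods whose prices rise; capturing that substitution is precisely what distinguishes a cost-of-living index and drives much of the debate the volume addresses.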




An Introduction to Stochastic Modeling


Book Description

An Introduction to Stochastic Modeling provides information pertinent to the standard concepts and methods of stochastic modeling. This book presents the rich diversity of applications of stochastic processes in the sciences. Organized into nine chapters, the book begins with an overview of diverse types of stochastic models, which predict a set of possible outcomes weighted by their likelihoods or probabilities. The text then provides exercises in the application of simple stochastic analysis to appropriate problems. Other chapters consider the study of general functions of independent, identically distributed, nonnegative random variables representing the successive intervals between renewals. The book also discusses the numerous examples of Markov branching processes that arise naturally in various scientific disciplines. The final chapter deals with queueing models, which aid the design process by predicting system performance. This book is a valuable resource for students of engineering and management science; engineers will also find it useful.
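The opening idea above, a model that predicts a set of possible outcomes weighted by probabilities, can be sketched with the simplest kind of stochastic model, a two-state Markov chain. The states and transition probabilities below are hypothetical illustration values, not an example from the book.

```python
import random

# Transition probabilities: P[state][next_state]. Rows sum to 1.
P = {"sunny": {"sunny": 0.9, "rainy": 0.1},
     "rainy": {"sunny": 0.5, "rainy": 0.5}}

def simulate(start, steps):
    """Run the chain and count visits to each state."""
    state = start
    counts = {"sunny": 0, "rainy": 0}
    for _ in range(steps):
        state = "sunny" if random.random() < P[state]["sunny"] else "rainy"
        counts[state] += 1
    return counts

# The stationary distribution solves pi = pi * P; for these numbers
# pi(sunny) = 5/6, and long simulations approach that frequency.
counts = simulate("sunny", 10_000)
```

The same simulate-and-count pattern extends directly to the renewal, branching, and queueing models the later chapters treat.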




Handbook of Industrial Organization


Book Description

Handbook of Industrial Organization, Volume 4 highlights new advances in the field, with each chapter written by an international group of authors. Part of the renowned Handbooks in Economics series, the volume's chapters are contributed by some of the leading experts in their fields, making it a source, reference, and teaching supplement for industrial organization courses and for industrial economists.




Qualitative Choice Analysis


Book Description

This book addresses two significant research areas in an interdependent fashion. It is first of all a comprehensive but concise text that covers the recently developed and widely applicable methods of qualitative choice analysis, illustrating the general theory through simulation models of automobile demand and use. It is also a detailed study of automobile demand and use, presenting forecasts based on these powerful new techniques. The book develops the general principles that underlie qualitative choice models that are now being applied in numerous fields in addition to transportation, such as housing, labor, energy, communications, and criminology. The general form, derivation, and estimation of qualitative choice models are explained, and the major models - logit, probit, and GEV - are discussed in detail. Continuous/discrete models are also introduced; in these, qualitative choice methods and standard regression techniques are combined to analyze situations that neither alone can accurately forecast. Summarizing previous research on auto demand, the book shows how qualitative choice methods can be used by applying them to specific auto-related decisions as the aggregate of individuals' choices. The simulation model that is constructed is a significant improvement over older models, and should prove more useful to agencies and organizations requiring accurate forecasting of auto demand and use for planning and policy development. The book concludes with an actual case study based on a model designed for the investigations of the California Energy Commission. Kenneth Train is Visiting Associate Professor in Economics at the University of California, Berkeley, and Director of Economic Research at Cambridge Systematics, Inc., also in Berkeley. Qualitative Choice Analysis is included in The MIT Press Transportation Studies Series, edited by Marvin L. Manheim.




Statistical Rethinking


Book Description

Statistical Rethinking: A Bayesian Course with Examples in R and Stan builds readers' knowledge of and confidence in statistical modeling. Reflecting the need for even minor programming in today's model-based statistics, the book pushes readers to perform step-by-step calculations that are usually automated. This unique computational approach ensures that readers understand enough of the details to make reasonable choices and interpretations in their own modeling work. The text presents generalized linear multilevel models from a Bayesian perspective, relying on a simple logical interpretation of Bayesian probability and maximum entropy. It covers everything from the basics of regression to multilevel models. The author also discusses measurement error, missing data, and Gaussian process models for spatial and network autocorrelation. By using complete R code examples throughout, the book provides a practical foundation for performing statistical inference. Designed for both PhD students and seasoned professionals in the natural and social sciences, it prepares them for more advanced or specialized statistical modeling. The book is accompanied by an R package (rethinking), available on the author's website and GitHub; the package's two core functions (map and map2stan) allow a variety of statistical models to be constructed from standard model formulas.
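The "step-by-step calculations that are usually automated" can be illustrated with grid approximation of a posterior, one of the hand-computation techniques this style of course builds on. This sketch is written in Python rather than the book's R, and the data (6 successes in 9 trials) are hypothetical illustration values.

```python
import math

def binom_pmf(k, n, p):
    """Binomial probability of k successes in n trials."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

# Step 1: a grid of candidate parameter values on [0, 1].
grid = [i / 100 for i in range(101)]
# Step 2: a flat prior over the grid.
prior = [1.0] * len(grid)
# Step 3: the likelihood of the data at each grid point.
like = [binom_pmf(6, 9, p) for p in grid]
# Step 4: multiply and normalize to get the posterior.
unnorm = [pr * li for pr, li in zip(prior, like)]
posterior = [u / sum(unnorm) for u in unnorm]

# With a flat prior, the posterior mode sits at the sample
# proportion 6/9 (up to grid resolution).
mode = grid[posterior.index(max(posterior))]
```

Doing these four steps by hand is exactly the kind of exercise the book automates later with its model-formula interface.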




Ecological Inference


Book Description

Drawing upon the recent explosion of research in the field, a diverse group of scholars surveys the latest strategies for solving ecological inference problems: the process of trying to infer individual behavior from aggregate data. The uncertainties and information lost in aggregation make ecological inference one of the most difficult areas of statistical inference, yet such inferences are required in many academic fields, as well as by legislatures and the courts in redistricting, by businesses in marketing research, and by governments in policy analysis. This wide-ranging collection of essays offers many fresh and important contributions to the study of ecological inference.




Handbook of Econometrics


Book Description

As conceived by the founders of the Econometric Society, econometrics is a field that uses economic theory and statistical methods to address empirical problems in economics. It is a tool for empirical discovery and policy analysis. The chapters in this volume embody this vision and either implement it directly or provide the tools for doing so. This vision is not shared by those who view econometrics as a branch of statistics rather than as a distinct field of knowledge that designs methods of inference from data based on models of human choice ...