Empirical Model Discovery and Theory Evaluation


Book Description

A synthesis of the authors' groundbreaking econometric research on automatic model selection, which uses powerful computational algorithms and theory evaluation. Economic models of empirical phenomena are developed for a variety of reasons, the most obvious of which is the numerical characterization of available evidence, in a suitably parsimonious form. Another is to test a theory, or evaluate it against the evidence; still another is to forecast future outcomes. Building such models involves a multitude of decisions, and the large number of features that need to be taken into account can overwhelm the researcher. Automatic model selection, which draws on recent advances in computation and search algorithms, can create, and then empirically investigate, a vastly wider range of possibilities than even the greatest expert. In this book, leading econometricians David Hendry and Jurgen Doornik report on their several decades of innovative research on automatic model selection. After introducing the principles of empirical model discovery and the role of model selection, Hendry and Doornik outline the stages of developing a viable model of a complicated evolving process. They discuss the discovery stages in detail, considering both the theory of model selection and the performance of several algorithms. They describe extensions for tackling outliers and multiple breaks, leading to the general case of more candidate variables than observations. Finally, they briefly consider selecting models specifically for forecasting.




Empirical Modeling and Data Analysis for Engineers and Applied Scientists


Book Description

This textbook teaches advanced undergraduate and first-year graduate students in Engineering and Applied Sciences to gather and analyze empirical observations (data) in order to aid in making design decisions. While science is about discovery, the primary paradigm of engineering and "applied science" is design. Scientists are in the discovery business and want, in general, to understand the natural world rather than to alter it. In contrast, engineers and applied scientists design products, processes, and solutions to problems. That said, statistics, as a discipline, is mostly oriented toward the discovery paradigm. Young engineers come out of their degree programs having taken courses such as "Statistics for Engineers and Scientists" without any clear idea as to how they can use statistical methods to help them design products or processes. Many seem to think that statistics is only useful for demonstrating that a device or process actually does what it was designed to do. Statistics courses emphasize creating predictive or classification models (predicting nature or classifying individuals), and statistics is often used to prove or disprove phenomena rather than to aid in the design of a product or process. In industry, however, chemical engineers use designed experiments to optimize petroleum extraction; manufacturing engineers use experimental data to optimize machine operation; industrial engineers might use data to determine the optimal number of operators required in a manual assembly process. This text teaches engineering and applied science students to incorporate empirical investigation into such design processes. Much of the discussion in this book is about models: not whether the models truly represent reality, but whether they adequately represent reality with respect to the problems at hand. Many ideas focus on how to gather data in the most efficient way possible to construct adequate models.
The book includes chapters on subjects not often seen together in a single text (e.g., measurement systems, mixture experiments, logistic regression, Taguchi methods, simulation). The techniques and concepts introduced cover a wide variety of design situations familiar to engineers and applied scientists, and encourage the incorporation of experimentation and empirical investigation into the design process. Software is integrally linked to the statistical analyses, with fully worked examples in each chapter using several packages: SAS, R, JMP, Minitab, and MS Excel. Discussion questions appear at the end of each chapter. The fundamental learning objective of this textbook is for the reader to understand how experimental data can be used to make design decisions and to be familiar with the most common types of experimental designs and analysis methods.




Dynamic Econometrics


Book Description

The main problem in econometric modelling of time series is discovering sustainable and interpretable relationships between observed economic variables. The primary aim of this book is to develop an operational econometric approach which allows constructive modelling. Professor Hendry deals with methodological issues (model discovery, data mining, and progressive research strategies); with major tools for modelling (recursive methods, encompassing, super exogeneity, invariance tests); and with practical problems (collinearity, heteroscedasticity, and measurement errors). He also includes an extensive study of US money demand. The book is self-contained, with the technical background covered in appendices. It is thus suitable for first-year graduate students, and includes solved examples and exercises to facilitate its use in teaching. About the Series: Advanced Texts in Econometrics is a distinguished and rapidly expanding series in which leading econometricians assess recent developments in such areas as stochastic probability, panel and time series data analysis, modelling, and cointegration. In both hardback and affordable paperback, each volume explains the nature and applicability of a topic in greater depth than is possible in introductory textbooks or single journal articles. Each definitive work is formatted to be as accessible and convenient as possible for those who are not familiar with the detailed primary literature.




The Palgrave Companion to Oxford Economics


Book Description

The University of Oxford has been and continues to be one of the most important global centres for economics. With six chapters on themes in Oxford economics and 24 chapters on the lives and work of Oxford economists, this volume shows how economics became established at the University, how it produced some of the world’s best-known economists, including Francis Ysidro Edgeworth, Roy Harrod and David Hendry, and how it remains a global force for the very best in teaching and research in economics. With original contributions from a stellar cast, this volume provides economists – especially those interested in macroeconomics and the history of economic thought – with the first in-depth analysis of Oxford economics.




Dynamic Econometrics For Empirical Macroeconomic Modelling


Book Description

For Masters and PhD students in Economics. In this textbook, the duality between the equilibrium concept used in dynamic economic theory and the stationarity of economic variables is explained and used in the presentation of single-equation models and systems of equations such as VARs, recursive models, and simultaneous equations models. The book also contains chapters on: exogeneity, in the context of estimation, policy analysis, and forecasting; automatic (computer-based) variable selection, and how it can aid in the specification of an empirical macroeconomic model; and, finally, a common framework for model-based economic forecasting. Supplementary materials and notes are available on the publisher's website.




Linear Models and Time-Series Analysis


Book Description

A comprehensive and timely edition on an emerging new trend in time series Linear Models and Time-Series Analysis: Regression, ANOVA, ARMA and GARCH sets a strong foundation, in terms of distribution theory, for the linear model (regression and ANOVA), univariate time series analysis (ARMAX and GARCH), and some multivariate models associated primarily with modeling financial asset returns (copula-based structures and the discrete mixed normal and Laplace). It builds on the author's previous book, Fundamental Statistical Inference: A Computational Approach, which introduced the major concepts of statistical inference. Attention is explicitly paid to application and numeric computation, with examples of Matlab code throughout. The code offers a framework for discussion and illustration of numerics, and shows the mapping from theory to computation. The topic of time series analysis is on firm footing, with numerous textbooks and research journals dedicated to it. With respect to the subject/technology, many chapters in Linear Models and Time-Series Analysis cover firmly entrenched topics (regression and ARMA). Several others are dedicated to very modern methods, as used in empirical finance, asset pricing, risk management, and portfolio optimization, in order to address the severe change in performance of many pension funds, and changes in how fund managers work. 
Covers traditional time series analysis with new guidelines. Provides access to cutting-edge topics at the forefront of financial econometrics and industry. Includes the latest developments and topics, such as financial returns data, notably also in a multivariate context. Written by a leading expert in time series analysis, extensively classroom tested, and supplemented with a tutorial on SAS and a companion website containing numerous Matlab programs; solutions to most exercises are provided in the book. Linear Models and Time-Series Analysis: Regression, ANOVA, ARMA and GARCH is suitable for advanced masters students in statistics and quantitative finance, as well as doctoral students in economics and finance. It is also useful for quantitative financial practitioners in large financial institutions and smaller finance outlets.




A Macroeconometric Model for Saudi Arabia


Book Description

This Open Access Brief presents the KAPSARC Global Energy Macroeconometric Model (KGEMM). KGEMM is a policy analysis tool for examining the impacts of domestic policy measures and global economic and energy shocks on the Kingdom of Saudi Arabia. The model has eight blocks (real sector, fiscal, monetary, external sector, price, labor and wages, energy, and population and age cohorts) that interact with each other to represent the Kingdom’s macroeconomy and energy linkages. It captures New Keynesian demand-side features anchored to medium-run equilibrium and long-run aggregate supply. It applies a cointegration and equilibrium correction modeling (ECM) methodology to time series data to estimate the model’s behavioral equations in the framework of Autometrics, a general-to-specific econometric modeling strategy. Hence, the model combines a ‘theory-driven’ approach with a ‘data-driven’ approach. The Brief begins with an introduction to the theoretical framework of the model and the KGEMM methodology and then walks the reader through the structure of the model and its behavioral equations. The book closes with simulations showing the application of the model. Providing a detailed introduction to a cutting-edge, robust predictive model, this Brief will be of great use to researchers and policymakers interested in macroeconomics, energy economics, econometrics, and, more specifically, the economy of Saudi Arabia.




The Leading Economic Indicators and Business Cycles in the United States


Book Description

In a time of unprecedented economic uncertainty, this book provides empirical guidance to the economy and what to expect in the near and distant future. Beginning with a historic look at major contributions to economic indicators and business cycles, from Wesley Clair Mitchell (1913) to Burns and Mitchell (1946), to Moore (1961) and Zarnowitz (1992), this book explores time series forecasting and economic cycles through the leading economic indicators, which are currently maintained and enhanced by The Conference Board. Given their highly statistically significant relationships with GDP and the unemployment rate, these indicators are particularly useful for practitioners seeking to predict business cycles.




Milton Friedman


Book Description

Milton Friedman is widely regarded as one of the most influential economists of the twentieth century. Although he made many important contributions to both economic theory and policy - most clearly demonstrated by his development of and support for monetarism - he was also active in various spheres of public policy, where he more often than not pursued his championing of the free market and liberty. This volume assesses the importance of the full range of Friedman's ideas, from his work on methodology in economics, his highly innovative consumption theory, and his extensive research on monetary economics, to his views on contentious social and political issues such as education, conscription, and drugs. It also presents personal recollections of Friedman by some of those who knew him, both as students and colleagues, and offers new evidence on Friedman's interactions with other noted economists, including George Stigler and Lionel Robbins. The volume provides readers with an up-to-date account of Friedman's work and continuing influence, and will help to inform and stimulate further research across a variety of areas, including macroeconomics, the history of economic thought, and the development and different uses of public policy. With contributions from a stellar cast, this book will be invaluable to academics and students alike.




Models of Discovery


Book Description

We respect Herbert A. Simon as an established leader of empirical and logical analysis in the human sciences while we happily think of him as also the loner; of course he works with many colleagues but none can match him. He has been writing fruitfully and steadily for four decades in many fields, among them psychology, logic, decision theory, economics, computer science, management, production engineering, information and control theory, operations research, confirmation theory, and we must have omitted several. With all of them, he is at once the technical scientist and the philosophical critic and analyst. When writing of decisions and actions, he is at the interface of philosophy of science, decision theory, philosophy of the specific social sciences, and inventory theory (itself, for him, at the interface of economic theory, production engineering and information theory). When writing on causality, he is at the interface of methodology, metaphysics, logic and philosophy of physics, systems theory, and so on. Not that the interdisciplinary is his orthodoxy; we are delighted that he has chosen to include in this book both his early and little-appreciated treatment of straightforward philosophy of physics - the axioms of Newtonian mechanics, and also his fine papers on pure confirmation theory.