The Oxford Handbook of Applied Nonparametric and Semiparametric Econometrics and Statistics


Book Description

This volume, edited by Jeffrey Racine, Liangjun Su, and Aman Ullah, contains the latest research on nonparametric and semiparametric econometrics and statistics. Chapters by leading international econometricians and statisticians highlight the interface between econometrics and statistical methods for nonparametric and semiparametric procedures.





Time Series Analysis: Methods and Applications


Book Description

The field of statistics not only affects all areas of scientific activity, but also many other matters such as public policy. It is branching rapidly into so many different subjects that a series of handbooks is the only way of comprehensively presenting the various aspects of statistical methodology, applications, and recent developments. The Handbook of Statistics is a series of self-contained reference books. Each volume is devoted to a particular topic in statistics, with Volume 30 dealing with time series. The series is addressed to the entire community of statisticians and scientists in various disciplines who use statistical methodology in their work. At the same time, special emphasis is placed on applications-oriented techniques, with the applied statistician in mind as the primary audience.

- Comprehensively presents the various aspects of statistical methodology
- Discusses a wide variety of diverse applications and recent developments
- Contributors are internationally renowned experts in their respective areas




Econometric Modelling with Time Series


Book Description

"Maximum likelihood estimation is a general method for estimating the parameters of econometric models from observed data. The principle of maximum likelihood plays a central role in the exposition of this book, since a number of estimators used in econometrics can be derived within this framework. Examples include ordinary least squares, generalized least squares and full-information maximum likelihood. In deriving the maximum likelihood estimator, a key concept is the joint probability density function (pdf) of the observed random variables, yt. Maximum likelihood estimation requires that the following conditions are satisfied. (1) The form of the joint pdf of yt is known. (2) The specification of the moments of the joint pdf is known. (3) The joint pdf can be evaluated for all values of the parameters, θ. Parts ONE and TWO of this book deal with models in which all these conditions are satisfied. Part THREE investigates models in which these conditions are not satisfied and considers four important cases. First, if the distribution of yt is misspecified, resulting in both conditions 1 and 2 being violated, estimation is by quasi-maximum likelihood (Chapter 9). Second, if condition 1 is not satisfied, a generalized method of moments estimator (Chapter 10) is required. Third, if condition 2 is not satisfied, estimation relies on nonparametric methods (Chapter 11). Fourth, if condition 3 is violated, simulation-based estimation methods are used (Chapter 12). To highlight the role of probability distributions in maximum likelihood estimation, the book then emphasizes the link between observed sample data and the probability distribution from which they are drawn"-- publisher.
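The maximum likelihood principle described above can be illustrated with a minimal sketch for i.i.d. normal data, where the form of the joint pdf is known (condition 1) and the estimates have a closed form. The function names here (`normal_neg_log_likelihood`, `normal_mle`) are illustrative, not taken from the book:

```python
import math

def normal_neg_log_likelihood(params, data):
    """Negative log-likelihood of i.i.d. normal data (the joint pdf's form is known)."""
    mu, sigma = params
    n = len(data)
    return (n / 2) * math.log(2 * math.pi * sigma ** 2) \
        + sum((y - mu) ** 2 for y in data) / (2 * sigma ** 2)

def normal_mle(data):
    """Closed-form MLEs for the normal model: sample mean and (biased) sample std dev."""
    n = len(data)
    mu_hat = sum(data) / n
    sigma_hat = math.sqrt(sum((y - mu_hat) ** 2 for y in data) / n)
    return mu_hat, sigma_hat

data = [2.1, 1.9, 2.4, 2.0, 1.6]
mu_hat, sigma_hat = normal_mle(data)

# The closed-form estimate attains a lower negative log-likelihood than
# nearby perturbed values of the mean parameter:
assert all(
    normal_neg_log_likelihood((mu_hat, sigma_hat), data)
    <= normal_neg_log_likelihood((mu_hat + d, sigma_hat), data)
    for d in (-0.1, 0.1)
)
```

When the pdf is misspecified or cannot be evaluated, this direct approach breaks down, which is exactly what motivates the quasi-maximum likelihood, GMM, nonparametric, and simulation-based alternatives the blurb lists.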




Essays in Honor of Peter C. B. Phillips


Book Description

This volume honors Professor Peter C.B. Phillips' many contributions to the field of econometrics. The topics include non-stationary time series, panel models, financial econometrics, predictive tests, IV estimation and inference, difference-in-difference regressions, stochastic dominance techniques, and information matrix testing.




Essays in Nonlinear Time Series Econometrics


Book Description

This edited collection concerns nonlinear economic relations that involve time. It is divided into four broad themes that all reflect the work and methodology of Professor Timo Teräsvirta, one of the leading scholars in the field of nonlinear time series econometrics. The themes are: testing for linearity and functional form; specification testing and estimation of nonlinear time series models in the form of smooth transition models; model selection and econometric methodology; and applications within the area of financial econometrics. All these research fields include contributions that represent the state of the art in econometrics, such as testing for neglected nonlinearity in neural network models, time-varying GARCH and smooth transition models, STAR models and common factors in volatility modeling, semi-automatic general-to-specific model selection for nonlinear dynamic models, high-dimensional data analysis for parametric and semi-parametric regression models with dependent data, commodity price modeling, financial analysts' earnings forecasts based on asymmetric loss functions, local Gaussian correlation and dependence for asymmetric return dependence, and the use of bootstrap aggregation to improve forecast accuracy. Each chapter represents original scholarly work, and reflects the intellectual impact that Timo Teräsvirta has had, and will continue to have, on the profession.




Essays in Honor of Joon Y. Park


Book Description

Volumes 45a and 45b of Advances in Econometrics honor Professor Joon Y. Park, who has made numerous and substantive contributions to the field of econometrics over a career spanning four decades since the 1980s, and counting.




Nonparametric Econometrics


Book Description

Nonparametric Econometrics is a primer for those who wish to familiarize themselves with nonparametric econometrics. While the underlying theory for many of these methods can be daunting for practitioners, this monograph presents a range of nonparametric methods that can be deployed in a fairly straightforward manner. Nonparametric methods are statistical techniques that do not require a researcher to specify functional forms for objects being estimated. The methods surveyed are known as kernel methods, which are becoming increasingly popular for applied data analysis. The appeal of nonparametric methods stems from the fact that they relax the parametric assumptions imposed on the data generating process and let the data determine an appropriate model. Nonparametric Econometrics focuses on a set of touchstone topics while making liberal use of examples for illustrative purposes. The author provides settings in which the user may wish to model a dataset comprised of continuous, discrete, or categorical data (nominal or ordinal), or any combination thereof. Recent developments are considered, including some where the variables involved may in fact be irrelevant, which alters the behavior of the estimators and optimal bandwidths in a manner that deviates substantially from conventional approaches.
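The kernel methods described above can be sketched with a minimal Rosenblatt–Parzen density estimator, which lets the data determine the shape of the density without specifying a functional form. This is a generic illustration of the technique, not code from the monograph; the rule-of-thumb bandwidth shown is Silverman's, one common default:

```python
import math

def gaussian_kernel(u):
    """Standard normal density, used as the kernel weight."""
    return math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)

def kde(x, data, h):
    """Kernel density estimate at point x with bandwidth h."""
    n = len(data)
    return sum(gaussian_kernel((x - xi) / h) for xi in data) / (n * h)

def silverman_bandwidth(data):
    """Silverman's rule-of-thumb bandwidth for a Gaussian kernel."""
    n = len(data)
    mean = sum(data) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
    return 1.06 * sd * n ** (-1 / 5)

# Two well-separated clusters; no parametric form is imposed on the density.
data = [0.9, 1.1, 1.0, 4.8, 5.2, 5.0]
h = silverman_bandwidth(data)

# The estimated density is higher near each cluster than between them:
assert kde(1.0, data, h) > kde(3.0, data, h)
assert kde(5.0, data, h) > kde(3.0, data, h)
```

The bandwidth h plays the role the blurb alludes to: optimal choices change when some variables are irrelevant, which is one of the recent developments the monograph surveys.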




Nonlinear Time Series


Book Description

Useful in the theoretical and empirical analysis of nonlinear time series data, semiparametric methods have received extensive attention in the economics and statistics communities over the past twenty years. Recent studies show that semiparametric methods and models may be applied to solve dimensionality reduction problems arising from using fully nonparametric models.




Statistical and Econometric Methods for Transportation Data Analysis, Second Edition


Book Description

The complexity, diversity, and random nature of transportation problems necessitate a broad analytical toolbox. Describing tools commonly used in the field, Statistical and Econometric Methods for Transportation Data Analysis, Second Edition provides an understanding of a broad range of analytical tools required to solve transportation problems. It includes a wide breadth of examples and case studies covering applications in various aspects of transportation planning, engineering, safety, and economics. After a solid refresher on statistical fundamentals, the book focuses on continuous dependent variable models and count and discrete dependent variable models. Along with an entirely new section on other statistical methods, this edition offers a wealth of new material.

New to the Second Edition:

- A subsection on Tobit and censored regressions
- An explicit treatment of frequency domain time series analysis, including Fourier and wavelet analysis methods
- A new chapter presenting logistic regression, commonly used to model binary outcomes
- A new chapter on ordered probability models
- New chapters on random-parameter models and Bayesian statistical modeling
- New examples and data sets

Each chapter clearly presents fundamental concepts and principles and includes numerous references for those seeking additional technical details and applications. To reinforce a practical understanding of the modeling techniques, the data sets used in the text are offered on the book’s CRC Press web page. PowerPoint and Word presentations for each chapter are also available for download.