Allowing for Jump Measurements in Volatility


Book Description

Following recent advances in the non-parametric realized volatility approach, we separately measure the discontinuous jump part of the quadratic variation process for individual stocks and incorporate it into heterogeneous autoregressive volatility models. We analyze the distributional properties of the jump measures vis-à-vis the corresponding realized volatility ones, and compare them to those of aggregate US market index series. We also demonstrate important gains in the forecasting accuracy of high-frequency volatility models.
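As a concrete illustration of the kind of decomposition this description refers to, the sketch below uses the standard realized-variance/bipower-variation split of quadratic variation together with a HAR-style regression that includes a jump regressor. It is a minimal sketch on simulated 5-minute returns, not the authors' exact estimator or data; all function names, window lengths, and parameters are illustrative assumptions.

```python
# Minimal sketch (not the authors' exact specification): split daily quadratic
# variation into a continuous and a jump part using realized variance (RV) and
# bipower variation (BV), then use the pieces as HAR-style regressors.
import numpy as np

def realized_variance(intraday_returns):
    """RV_t = sum of squared intraday returns."""
    return np.sum(intraday_returns ** 2)

def bipower_variation(intraday_returns):
    """BV_t = (pi/2) * sum |r_i||r_{i-1}|, robust to jumps."""
    r = np.abs(intraday_returns)
    return (np.pi / 2.0) * np.sum(r[1:] * r[:-1])

def jump_component(intraday_returns):
    """J_t = max(RV_t - BV_t, 0): the discontinuous part of quadratic variation."""
    return max(realized_variance(intraday_returns) - bipower_variation(intraday_returns), 0.0)

# --- toy example on simulated 5-minute returns (78 per trading day) ---
rng = np.random.default_rng(0)
n_days, n_intraday = 500, 78
returns = rng.normal(0.0, 0.001, size=(n_days, n_intraday))
returns[rng.random(n_days) < 0.05, 0] += 0.02          # occasional jumps

rv = np.array([realized_variance(r) for r in returns])
jumps = np.array([jump_component(r) for r in returns])

# HAR-RV-J style regression: RV_{t+1} on daily, weekly, monthly RV and the jump.
rv_d = rv[21:-1]
rv_w = np.array([rv[t - 4:t + 1].mean() for t in range(21, n_days - 1)])
rv_m = np.array([rv[t - 21:t + 1].mean() for t in range(21, n_days - 1)])
j_d = jumps[21:-1]
y = rv[22:]
X = np.column_stack([np.ones_like(rv_d), rv_d, rv_w, rv_m, j_d])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("HAR-RV-J coefficients:", beta)
```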




Forecasting in the Presence of Structural Breaks and Model Uncertainty


Book Description

Forecasting in the presence of structural breaks and forecasting under model uncertainty are active areas of research with implications for practical forecasting problems. This book addresses the forecasting of variables from both macroeconomics and finance and considers various methods of dealing with model instability and model uncertainty when forming forecasts.
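As one hedged illustration of the kind of methods mentioned (not necessarily those in the book), the sketch below contrasts an expanding-window AR(1) forecast, a rolling-window forecast that eventually discards pre-break data, and an equal-weight combination of the two on a simulated series with a break in its mean. The window length and data-generating process are assumptions made purely for illustration.

```python
# Illustrative sketch (assumed, not the book's specific method): handle a
# structural break with a rolling estimation window, and model uncertainty by
# averaging forecasts from competing models with equal weights.
import numpy as np

rng = np.random.default_rng(1)
n = 400
y = np.empty(n)
y[0] = 0.0
for t in range(1, n):
    mean = 0.0 if t < 200 else 1.0                     # break in the mean at t = 200
    y[t] = mean + 0.5 * (y[t - 1] - mean) + rng.normal(scale=0.5)

def ar1_forecast(history):
    """One-step-ahead AR(1) forecast estimated by OLS on the supplied history."""
    x, target = history[:-1], history[1:]
    X = np.column_stack([np.ones_like(x), x])
    b, *_ = np.linalg.lstsq(X, target, rcond=None)
    return b[0] + b[1] * history[-1]

window = 60
errors = {"expanding": [], "rolling": [], "combined": []}
for t in range(window, n - 1):
    f_exp = ar1_forecast(y[: t + 1])                    # uses all past data
    f_rol = ar1_forecast(y[t + 1 - window : t + 1])     # eventually drops pre-break data
    f_cmb = 0.5 * f_exp + 0.5 * f_rol                   # equal-weight combination
    for name, f in [("expanding", f_exp), ("rolling", f_rol), ("combined", f_cmb)]:
        errors[name].append((y[t + 1] - f) ** 2)

for name, e in errors.items():
    print(f"{name:10s} MSFE: {np.mean(e):.4f}")
```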




Portfolio Construction, Measurement, and Efficiency


Book Description

This volume, inspired by and dedicated to the work of the pioneering investment analyst Jack Treynor, addresses the issues of portfolio risk and return and how investment portfolios are measured. Over a career spanning more than fifty years, the primary questions Treynor addressed were: Is there an observable risk-return trade-off? How can stock selection models be integrated with risk models to enhance client returns? Do managed portfolios earn positive and statistically significant excess returns, and can mutual fund managers time the market? Since the publication of a pair of seminal Harvard Business Review articles in the mid-1960s, Treynor has developed thinking that has greatly influenced security selection, portfolio construction and measurement, and market efficiency. Key publications addressed topics such as the Capital Asset Pricing Model and the integration of stock selection models with risk models. Treynor also served as editor of the Financial Analysts Journal, for which he wrote many columns across a wide spectrum of topics. This volume showcases original essays by leading researchers and practitioners exploring the topics that have interested Treynor while applying the most current methodologies. Such topics include the origins of portfolio theory, market timing, and portfolio construction in equity markets. The result not only reinforces Treynor's lasting contributions to the field but also suggests new areas for research and analysis.
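For readers unfamiliar with the measures behind these questions, the sketch below estimates a CAPM-style regression for a hypothetical managed portfolio and reports Jensen's alpha, beta, and the Treynor ratio (mean excess return per unit of market beta). The simulated returns and parameter values are purely illustrative assumptions, not results from the volume.

```python
# Illustrative sketch (assumptions, not the volume's own empirics): estimate a
# CAPM regression for a managed portfolio and report Jensen's alpha, beta, and
# the Treynor ratio.
import numpy as np

rng = np.random.default_rng(2)
n_months = 120
market_excess = rng.normal(0.006, 0.04, n_months)           # simulated market excess returns
true_alpha, true_beta = 0.001, 0.9
fund_excess = true_alpha + true_beta * market_excess + rng.normal(0.0, 0.02, n_months)

X = np.column_stack([np.ones(n_months), market_excess])
(alpha, beta), *_ = np.linalg.lstsq(X, fund_excess, rcond=None)

treynor_ratio = fund_excess.mean() / beta                    # mean excess return per unit of beta
print(f"alpha = {alpha:.4f}, beta = {beta:.3f}, Treynor ratio = {treynor_ratio:.4f}")
```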




Financial Modelling with Jump Processes


Book Description

WINNER of a Riskbook.com Best of 2004 Book Award! During the last decade, financial models based on jump processes have acquired increasing popularity in risk management and option pricing. Much has been published on the subject, but the technical nature of most papers makes them difficult for nonspecialists to understand, and the mathematical tools required for applications can be intimidating.
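To make the subject concrete, the following is a minimal sketch of a Merton-style jump-diffusion, i.e. geometric Brownian motion augmented with compound Poisson jumps, one of the canonical jump-process models in this literature. The parameters and the simple discretization are illustrative assumptions, not taken from the book.

```python
# Minimal sketch of a Merton-style jump-diffusion: geometric Brownian motion
# plus compound Poisson jumps. Parameters and discretization are illustrative.
import numpy as np

rng = np.random.default_rng(3)
S0, mu, sigma = 100.0, 0.05, 0.2              # spot, drift, diffusion volatility
lam, jump_mu, jump_sigma = 0.5, -0.1, 0.15    # jump intensity and log-jump size distribution
T, n_steps = 1.0, 252
dt = T / n_steps

log_s = np.log(S0)
path = [S0]
for _ in range(n_steps):
    diffusion = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * rng.normal()
    n_jumps = rng.poisson(lam * dt)                       # usually 0, occasionally 1
    jump = rng.normal(jump_mu, jump_sigma, n_jumps).sum() if n_jumps else 0.0
    log_s += diffusion + jump
    path.append(np.exp(log_s))

print(f"terminal price after one simulated year: {path[-1]:.2f}")
```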




Handbook of Computational Finance


Book Description

Any financial asset that is openly traded has a market price. Except for extreme market conditions, market price may be more or less than a “fair” value. Fair value is likely to be some complicated function of the current intrinsic value of tangible or intangible assets underlying the claim and our assessment of the characteristics of the underlying assets with respect to the expected rate of growth, future dividends, volatility, and other relevant market factors. Some of these factors that affect the price can be measured at the time of a transaction with reasonably high accuracy. Most factors, however, relate to expectations about the future and to subjective issues, such as current management, corporate policies and market environment, that could affect the future financial performance of the underlying assets. Models are thus needed to describe the stochastic factors and environment, and their implementations inevitably require computational finance tools.
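As a minimal example of the computational tools the description alludes to, the sketch below values a European call by Monte Carlo when the underlying's stochastic environment is modeled as geometric Brownian motion. The model choice and all parameter values are assumptions made for illustration, not content taken from the handbook.

```python
# Minimal example of a computational-finance tool of the kind the handbook
# surveys: Monte Carlo valuation of a European call when the underlying is
# modeled as geometric Brownian motion. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(4)
S0, K, r, sigma, T = 100.0, 105.0, 0.03, 0.25, 1.0
n_paths = 200_000

z = rng.standard_normal(n_paths)
S_T = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)   # terminal prices
payoff = np.maximum(S_T - K, 0.0)                                      # call payoff
price = np.exp(-r * T) * payoff.mean()                                 # discounted average
stderr = np.exp(-r * T) * payoff.std(ddof=1) / np.sqrt(n_paths)
print(f"Monte Carlo call value: {price:.3f} +/- {1.96 * stderr:.3f}")
```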




Handbook of Quantitative Finance and Risk Management


Book Description

Quantitative finance is a combination of economics, accounting, statistics, econometrics, mathematics, stochastic processes, and computer science and technology. Increasingly, the tools of financial analysis are being applied to assess, monitor, and mitigate risk, especially in the context of globalization, market volatility, and economic crisis. This three-volume handbook, comprising over 100 chapters, is the most comprehensive resource in the field to date, integrating the most current theory, methodology, policy, and practical applications. Showcasing contributions from an international array of experts, the Handbook of Quantitative Finance and Risk Management is unparalleled in the breadth and depth of its coverage. Volume 1 presents an overview of quantitative finance and risk management research, covering the essential theories, policies, and empirical methodologies used in the field; its chapters provide in-depth discussion of portfolio theory and investment analysis. Volume 2 covers options and option pricing theory and risk management. Volume 3 presents a wide variety of models and analytical tools. Throughout, the handbook offers illustrative case examples, worked equations, and extensive references; additional features include chapter abstracts, keywords, and author and subject indices. From "arbitrage" to "yield spreads," the Handbook of Quantitative Finance and Risk Management will serve as an essential resource for academics, educators, students, policymakers, and practitioners.




Uncertainty Analysis in Econometrics with Applications


Book Description

Unlike uncertain dynamical systems in the physical sciences, where models for prediction are essentially given by physical laws, uncertain dynamical systems in economics require statistical models. In this context, modeling and optimization surface as basic ingredients for fruitful applications. This volume concentrates on the current methodology of copulas and maximum entropy optimization. It contains the main research presentations at the Sixth International Conference of the Thailand Econometrics Society, held at the Faculty of Economics, Chiang Mai University, Thailand, on January 10-11, 2013, and consists of keynote addresses and theoretical and applied contributions. The contributions are centered on the theme of Copulas and Maximum Entropy Econometrics. The method of copulas is applied to a variety of economic problems where multivariate model building and correlation analysis are needed. As for the art of choosing a copula in practical problems, the principle of maximum entropy offers a potential way to do so. The state of the art of Maximum Entropy Econometrics is presented in the first keynote address, while the second keynote address focuses on testing stationarity in economic time series data.
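As a minimal, hedged illustration of the copula idea discussed in the volume (not any specific contribution in it), the sketch below samples from a Gaussian copula to bind two different marginals while controlling their dependence, and compares the empirical Kendall's tau with its theoretical value. It assumes NumPy and SciPy are available; the correlation parameter and marginal distributions are arbitrary choices.

```python
# Minimal sketch of the copula method: a Gaussian copula binds two arbitrary
# marginals (exponential and Student-t) while controlling their dependence.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
rho, n = 0.7, 20_000

# 1. Correlated standard normals, then map to uniforms (the copula sample).
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)
u = stats.norm.cdf(z)

# 2. Push the uniforms through any marginal quantile functions we like.
x = stats.expon(scale=2.0).ppf(u[:, 0])     # exponential marginal
y = stats.t(df=4).ppf(u[:, 1])              # heavy-tailed Student-t marginal

tau, _ = stats.kendalltau(x, y)
print(f"Kendall's tau of the coupled variables: {tau:.3f}")
print(f"theoretical tau of a Gaussian copula:   {2 / np.pi * np.arcsin(rho):.3f}")
```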




FinTech as a Disruptive Technology for Financial Institutions


Book Description

Financial institutions are tasked with keeping businesses of all sizes financially sound while also providing accessible banking options to everyday individuals. FinTech, or financial technology, is an emerging disruptive technology in financial transactions that will change banking behavior for stakeholders and enable better traceability of funds against specific assets. FinTech as a Disruptive Technology for Financial Institutions is an essential reference source that discusses applications of FinTech in financial institutions in small, medium, and large businesses and through cultural and religious filters. Featuring research on topics such as machine learning, market development, cryptocurrency, financial security, blockchain, and financial technology, this book is ideally designed for bankers, business managers, economists, computer scientists, academicians, researchers, financial professionals, and students.
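The traceability claim rests on the append-only, hash-linked ledger idea behind blockchain. The toy sketch below (not any real FinTech platform or protocol) shows how each block's commitment to its predecessor's hash makes tampering with an earlier transfer detectable.

```python
# Toy illustration of a hash-linked ledger: each block commits to the previous
# block's hash, so altering an earlier transfer breaks every later link.
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 over the block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, transfer):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transfer": transfer})
    return chain

def verify(chain):
    """True iff every block still points at the hash of its predecessor."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))

ledger = []
append_block(ledger, {"from": "alice", "to": "bob", "asset": "bond-123", "amount": 10})
append_block(ledger, {"from": "bob", "to": "carol", "asset": "bond-123", "amount": 10})
print("ledger valid:", verify(ledger))            # True

ledger[0]["transfer"]["amount"] = 1_000           # tamper with an earlier transfer
print("after tampering:", verify(ledger))         # False
```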




Commodities


Book Description

Since a major source of income for many countries comes from exporting commodities, price discovery and information transmission between commodity futures markets are key issues for continued economic development. Commodities: Fundamental Theory of Futures, Forwards, and Derivatives Pricing, Second Edition covers the fundamental theory of, and derivatives pricing for, major commodity markets, as well as the interaction between commodity prices, the real economy, and other financial markets. After a thoroughly updated and extensive theoretical and practical introduction, this new edition of the book is divided into five parts – the fifth of which is entirely new material covering cutting-edge developments:

– Oil Products considers the structural changes in the demand and supply for hedging services that are increasingly determining the price of oil.
– Other Commodities examines markets related to agricultural commodities, including natural gas, wine, soybeans, corn, gold, silver, copper, and other metals.
– Commodity Prices and Financial Markets investigates the contemporary aspects of the financialization of commodities, including stocks, bonds, futures, currency markets, index products, and exchange traded funds.
– Electricity Markets supplies an overview of the current and future modelling of electricity markets.
– Contemporary Topics discusses rough volatility, order book trading, cryptocurrencies, text mining for price dynamics, and flash crashes.
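A core formula in the fundamental theory of futures and forwards referred to in the title is the cost-of-carry relation. The sketch below is a minimal illustration with assumed parameter values, not an excerpt from the book.

```python
# Minimal sketch of the cost-of-carry relation behind forward/futures pricing:
# F = S * exp((r + u - y) * T), where u is the storage cost rate and y the
# convenience yield. All parameter values are illustrative assumptions.
import math

def forward_price(spot, r, storage_cost, convenience_yield, maturity):
    """Theoretical forward price of a storable commodity under cost-of-carry."""
    carry = r + storage_cost - convenience_yield
    return spot * math.exp(carry * maturity)

# e.g. a commodity at $80, 4% rates, 2% storage cost, 5% convenience yield, 6 months out
print(f"6-month forward: {forward_price(80.0, 0.04, 0.02, 0.05, 0.5):.2f}")
```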




Models for Data Analysis


Book Description

The 49th Scientific Meeting of the Italian Statistical Society was held in June 2018 in Palermo, with more than 450 attendees. There were plenary sessions as well as specialized, solicited, and contributed sessions. This volume collects a selection of twenty extended contributions covering a wide range of applied and theoretical issues, reflecting modern trends in the statistical sciences. To mention only some topics, there are papers on modern textual analysis, sensory analysis, social inequalities, themes in demography, modern modelling of functional data and high-dimensional data, and many other topics. This volume is addressed to academics, PhD students, professionals, and researchers in applied and theoretical statistical models for data analysis.