Long-Memory Time Series


Book Description

A self-contained, contemporary treatment of the analysis of long-range dependent data. Long-Memory Time Series: Theory and Methods provides an overview of the theory and methods developed to deal with long-range dependent data and describes the applications of these methodologies to real-life time series. Systematically organized, it begins with the foundational essentials, proceeds to the analysis of methodological aspects (Estimation Methods, Asymptotic Theory, Heteroskedastic Models, Transformations, Bayesian Methods, and Prediction), and then extends these techniques to more complex data structures. To facilitate understanding, the book:

- Assumes a basic knowledge of calculus and linear algebra and explains the more advanced statistical and mathematical concepts
- Features numerous examples that accelerate understanding and illustrate various consequences of the theoretical results
- Proves all theoretical results (theorems, lemmas, corollaries, etc.) or refers readers to resources with further demonstration
- Includes detailed analyses of computational aspects related to the implementation of the methodologies described, including algorithm efficiency, arithmetic complexity, CPU times, and more
- Includes proposed problems at the end of each chapter to help readers solidify their understanding and practice their skills

A valuable real-world reference for researchers and practitioners in time series analysis, econometrics, finance, and related fields, this book is also excellent for a beginning graduate-level course in long-memory processes or as a supplemental textbook for those studying advanced statistics, mathematics, economics, finance, engineering, or physics. A companion Web site is available for readers to access the S-Plus and R data sets used within the text.




Time Series Analysis with Long Memory in View


Book Description

Provides a simple exposition of the basic time series material, and insights into underlying technical aspects and methods of proof. Long memory time series are characterized by a strong dependence between distant events. This book introduces readers to the theory and foundations of univariate time series analysis with a focus on long memory and fractional integration, which are embedded into the general framework. It presents the general theory of time series, including some issues that are not treated in other books on time series, such as ergodicity, persistence versus memory, asymptotic properties of the periodogram, and Whittle estimation. Further chapters address the general functional central limit theory, parametric and semiparametric estimation of the long memory parameter, and locally optimal tests. Intuitive and easy to read, Time Series Analysis with Long Memory in View offers chapters that cover: Stationary Processes; Moving Averages and Linear Processes; Frequency Domain Analysis; Differencing and Integration; Fractionally Integrated Processes; Sample Means; Parametric Estimators; Semiparametric Estimators; and Testing. It also discusses further topics. This book:

- Offers beginning-of-chapter examples as well as end-of-chapter technical arguments and proofs
- Contains many new results on long memory processes that have not appeared in existing textbooks
- Takes a basic mathematics (calculus) approach to the topic of time series analysis with long memory
- Contains 25 illustrative figures as well as lists of notations and acronyms

Time Series Analysis with Long Memory in View is an ideal text for first-year PhD students, researchers, and practitioners in statistics, econometrics, and any application area that uses time series over a long period. It would also benefit researchers, undergraduates, and practitioners in those areas who require a rigorous introduction to time series analysis.




Time Series with Long Memory


Book Description

Long memory time series are characterized by a strong dependence between distant events.
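The "strong dependence between distant events" can be made concrete with a small numerical sketch (textbook formulas assumed here, not material from this particular book): a short-memory AR(1) process has correlations that vanish geometrically, while a long-memory process with memory parameter d has correlations decaying hyperbolically, like k^(2d-1).

```python
# Illustrative sketch, assuming standard model formulas (not estimates from data):
# compare geometric (short-memory) and hyperbolic (long-memory) autocorrelation decay.
phi = 0.5   # AR(1) coefficient: rho(k) = phi**k
d = 0.3     # long-memory parameter: rho(k) ~ C * k**(2*d - 1); constant C omitted

for k in (1, 10, 100, 1000):
    rho_short = phi ** k            # essentially zero well before lag 100
    rho_long = k ** (2 * d - 1)     # decays like k**(-0.4): still sizable at lag 1000
    print(f"lag {k:>4}: AR(1) {rho_short:.2e}   long memory ~ {rho_long:.2e}")
```

Because the hyperbolic terms k^(-0.4) are not summable, the autocovariances of such a process are not absolutely summable, which is the usual formal definition of long memory.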




Long-Memory Processes


Book Description

Long-memory processes are known to play an important part in many areas of science and technology, including physics, geophysics, hydrology, telecommunications, economics, finance, climatology, and network engineering. In the last 20 years enormous progress has been made in understanding the probabilistic foundations and statistical principles of such processes. This book provides a timely and comprehensive review, including a thorough discussion of mathematical and probabilistic foundations and statistical methods, emphasizing their practical motivation and mathematical justification. Proofs of the main theorems are provided and data examples illustrate practical aspects. This book will be a valuable resource for researchers and graduate students in statistics, mathematics, econometrics and other quantitative areas, as well as for practitioners and applied researchers who need to analyze data in which long memory, power laws, self-similar scaling or fractal properties are relevant.




Modeling Financial Time Series with S-PLUS


Book Description

The field of financial econometrics has exploded over the last decade. This book represents an integration of theory, methods, and examples using the S-PLUS statistical modeling language and the S+FinMetrics module to facilitate the practice of financial econometrics. It is the first book to show the power of S-PLUS for the analysis of time series data. It is written for researchers and practitioners in the finance industry, academic researchers in economics and finance, and advanced MBA and graduate students in economics and finance. Readers are assumed to have a basic knowledge of S-PLUS and a solid grounding in basic statistics and time series concepts. This Second Edition is updated to cover S+FinMetrics 2.0 and includes new chapters on copulas, nonlinear regime switching models, continuous-time financial models, generalized method of moments, semi-nonparametric conditional density models, and the efficient method of moments.

Eric Zivot is an associate professor and Gary Waterman Distinguished Scholar in the Economics Department, and adjunct associate professor of finance in the Business School, at the University of Washington. He regularly teaches courses on econometric theory, financial econometrics, and time series econometrics, and is the recipient of the Henry T. Buechel Award for Outstanding Teaching. He is an associate editor of Studies in Nonlinear Dynamics and Econometrics. He has published papers in the leading econometrics journals, including Econometrica, Econometric Theory, the Journal of Business and Economic Statistics, the Journal of Econometrics, and the Review of Economics and Statistics.

Jiahui Wang is an employee of Ronin Capital LLC. He received a Ph.D. in Economics from the University of Washington in 1997. He has published in leading econometrics journals such as Econometrica and the Journal of Business and Economic Statistics, and is the Principal Investigator of National Science Foundation SBIR grants. In 2002 Dr. Wang was selected as one of the "2000 Outstanding Scholars of the 21st Century" by the International Biographical Centre.




Statistics for Long-Memory Processes


Book Description

Statistics for Long-Memory Processes covers the diverse statistical methods and applications for data with long-range dependence. Presenting material that previously appeared only in journals, the author provides a concise and effective overview of probabilistic foundations, statistical methods, and applications. The material emphasizes basic principles and practical applications and provides an integrated perspective of both theory and practice. The book explores data sets from a wide range of disciplines, such as hydrology, climatology, telecommunications engineering, and high-precision physical measurement. The data sets are conveniently compiled in the index, which allows readers to view statistical approaches in a practical context. The book also supplies S-PLUS programs for the major methods discussed, allowing the practitioner to apply long-memory processes in daily data analysis. For newcomers to the area, the first three chapters provide the basic knowledge necessary for understanding the remainder of the material. To promote selective reading, the author presents the chapters independently. Combining essential methodologies with real-life applications, this outstanding volume is an indispensable reference for statisticians and scientists who analyze data with long-range dependence.




Large Sample Inference For Long Memory Processes


Book Description

Box and Jenkins (1970) popularized the idea of obtaining a stationary time series by differencing the given, possibly nonstationary, time series. Numerous time series in economics are found to have this property. Subsequently, Granger and Joyeux (1980) and Hosking (1981) found examples of time series whose fractional difference becomes a short memory process, in particular a white noise, while the initial series has unbounded spectral density at the origin, i.e., exhibits long memory. Further examples of data following long memory were found in hydrology and in network traffic data, while in finance the phenomenon of strong dependence was established by the dramatic empirical success of long memory processes in modeling the volatility of asset prices and power transforms of stock market returns. At present there is a need for a text from which an interested reader can methodically learn about some basic asymptotic theory and techniques found useful in the analysis of statistical inference procedures for long memory processes. This text makes an attempt in this direction. The authors provide, in a concise style, a graduate-level text summarizing theoretical developments both for short and long memory processes and their applications to statistics. The book also contains some real data applications and mentions some unsolved inference problems for interested researchers in the field.
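The fractional difference mentioned above has a concrete form: (1 - B)^d expands as a binomial series in the backshift operator B. A minimal sketch of the filter, assuming the standard coefficient recursion (this is not code from the book, and the function names are illustrative):

```python
# Sketch of the fractional differencing filter (1 - B)^d of Granger-Joyeux/Hosking.
# The expansion (1 - B)^d = sum_k pi_k B^k has coefficients obeying the standard
# recursion pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k.
def frac_diff_weights(d, n):
    """First n coefficients pi_0, ..., pi_{n-1} of the (1 - B)^d expansion."""
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (k - 1 - d) / k)
    return w

def frac_diff(x, d):
    """Apply (1 - B)^d to a series x, truncating the filter at the sample start."""
    w = frac_diff_weights(d, len(x))
    return [sum(w[j] * x[t - j] for j in range(t + 1)) for t in range(len(x))]

weights = frac_diff_weights(0.4, 5)   # [1.0, -0.4, -0.12, -0.064, -0.0416]
```

For 0 < d < 1/2 the weights decay hyperbolically (like k^(-d-1)), so the filter has infinite memory even though each coefficient is small; applying it with the right d turns a long-memory series into an approximately short-memory one, which is the fractional differencing idea described above.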







Time Series


Book Description

The goal of this text is to develop the skills and an appreciation for the richness and versatility of modern time series analysis as a tool for analyzing dependent data. A useful feature of the presentation is the inclusion of nontrivial data sets illustrating the richness of potential applications to problems in the biological, physical, and social sciences as well as medicine. The text presents a balanced and comprehensive treatment of both time and frequency domain methods with an emphasis on data analysis. Numerous examples using data illustrate solutions to problems such as discovering natural and anthropogenic climate change, evaluating pain perception experiments using functional magnetic resonance imaging, and the analysis of economic and financial problems. The text can be used for a one-semester or one-quarter introductory time series course where the prerequisites are an understanding of linear regression, basic calculus-based probability skills, and math skills at the high school level. All of the numerical examples use the R statistical package without assuming that the reader has previously used the software.

Robert H. Shumway is Professor Emeritus of Statistics, University of California, Davis. He is a Fellow of the American Statistical Association and has won the American Statistical Association Award for Outstanding Statistical Application. He is the author of numerous texts and served on editorial boards such as the Journal of Forecasting and the Journal of the American Statistical Association.

David S. Stoffer is Professor of Statistics, University of Pittsburgh. He is a Fellow of the American Statistical Association and has won the American Statistical Association Award for Outstanding Statistical Application. He is currently on the editorial boards of the Journal of Forecasting, the Annals of the Institute of Statistical Mathematics, and the Journal of Time Series Analysis. He served as a Program Director in the Division of Mathematical Sciences at the National Science Foundation and as an Associate Editor for the Journal of the American Statistical Association and the Journal of Business & Economic Statistics.