Complexity, Aftershock Sequences, and Uncertainty in Earthquake Statistics


Book Description

Earthquake statistics is a growing field of research with direct application to probabilistic seismic hazard evaluation. The earthquake process is a complex spatio-temporal phenomenon and has been thought to be an example of the self-organised criticality (SOC) paradigm, in which events occur as cascades over a wide range of sizes, each determined by fine details of the rupture process. As a consequence, deterministic prediction of specific event sizes, locations, and times may well continue to remain elusive. However, probabilistic forecasting, based on statistical patterns of occurrence, is a much more realistic goal at present and is being actively explored and tested in global initiatives. This thesis focuses on the temporal statistics of earthquake populations, exploring the uncertainties in various commonly used procedures for characterising seismicity and explaining the origins of these uncertainties.

Unlike many other SOC systems, earthquakes cluster in time and space through aftershock triggering. A key point of the thesis is to show that the earthquake inter-event time distribution is fundamentally bimodal: it is a superposition of a gamma component from correlated (co-triggered) events and an exponential component from independent events. Volcano-tectonic earthquakes at Italian and Hawaiian volcanoes exhibit a similar bimodality, which, in this case, may arise as the sum of contributions from accelerating and decelerating rates of events preceding and succeeding volcanic activity. Many authors, motivated by universality in the scaling laws of critical point systems, have sought to demonstrate a universal data collapse in the form of a gamma distribution, but I show how this gamma form is instead an emergent property of the crossover between the two components. The relative size of these two components depends on how the data are selected, so there is no universal form.

The mean earthquake rate (or, equivalently, the mean inter-event time) for a given region takes time to converge to an accurate value, and it is important to characterise this sampling uncertainty. As a result of temporal clustering and the non-independence of events, the convergence is found to be much slower than the Gaussian rate of the central limit theorem. The rate of this convergence varies systematically with the spatial extent of the region under consideration: the larger the region, the closer to Gaussian convergence. This can be understood in terms of the increasing independence of the inter-event times with increasing region size, as aftershock sequences overlap in time to a greater extent. On the other hand, within this high-overlap regime, a maximum likelihood inversion of parameters for an epidemic-type statistical model suffers from lower accuracy and a systematic bias; specifically, the background rate is overestimated. This is because the effect of temporal overlap is to mask the correlations and make the time series look more like a Poisson process of independent events. This is an important result with practical relevance to studies that use such inversions, for example to infer temporal variations in background rate for time-dependent hazard estimation.
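The bimodality argument lends itself to a simple numerical illustration. The sketch below is not taken from the thesis; the mixture fraction and all distribution parameters are arbitrary assumptions. It draws inter-event times from a superposition of an exponential component (independent events) and a gamma component (correlated events), then fits a single gamma distribution to the mixture, which is roughly how an apparently universal gamma shape could emerge from the crossover between the two components.

```python
# Minimal illustrative sketch (not code from the thesis): draw inter-event
# times from a two-component mixture -- an exponential part for independent
# background events and a gamma part for correlated, co-triggered events --
# then fit a single gamma distribution to the mixture, as many scaling
# studies do. All parameter values below are arbitrary choices.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

n = 100_000
frac_correlated = 0.4          # fraction of inter-event times from triggered events (assumed)
mean_background_dt = 1.0       # mean inter-event time of the independent (Poisson) part
gamma_shape, gamma_scale = 0.3, 0.1   # short waiting times for the correlated part (assumed)

is_corr = rng.random(n) < frac_correlated
dt = np.where(
    is_corr,
    rng.gamma(gamma_shape, gamma_scale, size=n),   # correlated (aftershock) component
    rng.exponential(mean_background_dt, size=n),   # independent (background) component
)

# Fit a single gamma distribution to the mixed sample.
shape_fit, loc_fit, scale_fit = stats.gamma.fit(dt, floc=0.0)
print(f"single-gamma fit: shape={shape_fit:.3f}, scale={scale_fit:.3f}")

# Compare the fitted single gamma with the true two-component mixture density
# at a few points: the single gamma only approximately tracks the crossover.
x = np.array([0.01, 0.1, 1.0, 3.0])
mix_pdf = (frac_correlated * stats.gamma.pdf(x, gamma_shape, scale=gamma_scale)
           + (1 - frac_correlated) * stats.expon.pdf(x, scale=mean_background_dt))
fit_pdf = stats.gamma.pdf(x, shape_fit, scale=scale_fit)
for xi, m, f in zip(x, mix_pdf, fit_pdf):
    print(f"dt={xi:5.2f}  mixture pdf={m:8.4f}  single-gamma pdf={f:8.4f}")
```

Changing the assumed mixture fraction changes the apparent shape of the single-gamma fit, consistent with the thesis's point that the form depends on how the data are selected rather than being universal.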







Improving Uncertainty Quantification and Visualization for Spatiotemporal Earthquake Rate Models for the Pacific Northwest


Book Description

The Pacific Northwest (PNW) has substantial earthquake risk, both from the offshore Cascadia megathrust fault and from other fault systems that produce earthquakes beneath the region's population centers. Forecasts of aftershocks following large earthquakes are thus highly desirable and require statistical models of a catalog of the PNW's past earthquakes and aftershock sequences. This is complicated by the fact that the PNW contains multiple tectonic regimes hypothesized to have different aftershock dynamics, as well as two types of earthquake clustering (aftershock sequences and swarms). The Epidemic-Type Aftershock Sequence (ETAS) model is a top-performing spatiotemporal point process model which describes the dynamics of earthquakes and aftershocks in a seismic region using a set of parameters. Typically, maximum likelihood estimation is used to fit ETAS to an earthquake catalog; however, the ETAS likelihood suffers from flatness near its optima, parameter correlation, and numerical instability, making likelihood-based estimates less reliable. We present a Bayesian procedure for ETAS estimation, such that parameter estimates and uncertainty can be robustly quantified, even for small and complex catalogs like the PNW's. The procedure is conditional on knowing which earthquakes triggered which aftershocks; this latent structure and the ETAS parameters are estimated iteratively, using a Gibbs sampler to conditionally estimate the posterior distributions of each part of the model.

We simulate several synthetic catalogs and test the modelling procedure, showing well-mixed posterior distributions centered on the true parameter values. We also use the procedure to model the continental PNW, using a new catalog formed by algorithmically combining US and Canadian data sources and then identifying and removing earthquake swarms. While MLEs are unstable and depend on both the optimization procedure and its initial values, Bayesian estimates are insensitive to these choices. Bayesian estimates also fit the catalog better than MLEs do. We use the Bayesian method to quantify the uncertainty in ETAS estimates when including swarms in the model or modelling across different tectonic regimes, as well as the uncertainty arising from catalog measurement error.

Seismicity rate estimates, and the earthquake forecasts they yield, vary spatially and are usually represented as heat maps. While the visualization literature suggests that displaying forecast uncertainty improves understanding in users of forecast maps, research on uncertainty visualization (UV) is missing from earthquake science. In a pre-registered online experiment, we test the effectiveness of three UV techniques for displaying uncertainty in aftershock forecasts. Participants completed two map-reading tasks and a comparative judgment task, which demonstrated how successful a visualization was in reaching two key communication goals: indicating where many aftershocks and no aftershocks are likely (sure bets) and where the forecast is low but the uncertainty is high enough to imply potential risk (surprises). All visualizations performed equally well in communicating sure-bet situations, but the visualization mapping the lower and upper bounds of an uncertainty interval was substantially better than the other map designs at communicating potential surprises. We discuss the implications of these experimental results for the communication of uncertainty in aftershock forecast maps.
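The latent triggering structure at the heart of such a Gibbs sampler can be sketched compactly. The temporal-only example below is not the authors' code; the kernel choice, the toy catalog, and the parameter values (mu, K, alpha, c, p, m0) are illustrative assumptions. It shows the conditional step in which each event is probabilistically assigned a parent, either the background process or one specific earlier event, given the current ETAS parameters; the parameter updates then condition on these assignments.

```python
# Minimal temporal-only sketch of the latent-structure step in Gibbs-style
# ETAS estimation (illustrative, not the authors' implementation): given
# current parameter values, each event is assigned a "parent" -- the
# background process or one earlier event -- by sampling from the
# normalized triggering contributions.
import numpy as np

def sample_parents(times, mags, mu, K, alpha, c, p, m0, rng):
    """For each event i, sample parent = 0 (background) or k (1-based index of the triggering event)."""
    parents = np.zeros(len(times), dtype=int)
    for i in range(len(times)):
        prev = np.arange(i)                       # indices of earlier events
        dt = times[i] - times[prev]
        # Omori-Utsu temporal kernel scaled by the productivity of each earlier event
        trig = K * np.exp(alpha * (mags[prev] - m0)) * (dt + c) ** (-p)
        weights = np.concatenate(([mu], trig))    # background rate first
        probs = weights / weights.sum()
        choice = rng.choice(len(weights), p=probs)
        parents[i] = 0 if choice == 0 else prev[choice - 1] + 1
    return parents

rng = np.random.default_rng(0)
times = np.array([0.0, 0.5, 0.6, 5.0, 5.1, 20.0])   # days (toy catalog, assumed)
mags = np.array([4.5, 3.0, 2.8, 4.0, 2.9, 3.2])
# Illustrative parameter values, as might be drawn at one Gibbs iteration:
parents = sample_parents(times, mags, mu=0.1, K=0.05, alpha=1.0, c=0.01, p=1.1,
                         m0=2.5, rng=rng)
print(parents)   # 0 = background, k > 0 = triggered by event k (1-based)
```

Conditioning on sampled parent assignments is what keeps each parameter update tractable: background events and each triggered "family" can be treated separately within a sweep, which is the practical advantage over optimizing the flat, correlated ETAS likelihood directly.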




Complexity of Seismic Time Series


Book Description

Complexity of Seismic Time Series: Measurement and Application applies the tools of nonlinear dynamics to seismic analysis, allowing for the revelation of new details in micro-seismicity, new perspectives on seismic noise, and new tools for the prediction of seismic events. The book summarizes both advances and applications in the field, thus meeting the needs of both fundamental and practical seismology. Bridging the classical field and the modern concepts of complexity science, this book covers the theory and application of advanced nonlinear time series tools to investigate Earth's vibrations, making it a valuable resource for seismologists, hazard managers and engineers. Key features:

- Covers the topic of Earth's vibrations, involving many different aspects of theoretical and observational seismology
- Identifies applications of advanced nonlinear time series tools for the characterization of these Earth signals
- Merges the needs of geophysics with the applications of complexity theory
- Describes different methodologies for analysing problems, not only in the context of the geosciences but also those associated with complex systems across disciplines




Statistical Seismology


Book Description

Statistical Seismology aims to bridge the gap between physics-based and statistics-based models. This volume provides a combination of reviews, methodological studies, and applications, which point to promising efforts in this field. The volume will be useful to students and professional researchers alike who are interested in using stochastic modeling to probe the nature of earthquake phenomena and as an essential ingredient for earthquake forecasting.




Treatise on Geophysics: Earthquake seismology


Book Description

The Treatise on Geophysics is the only comprehensive, state-of-the-art, and integrated summary of the present state of geophysics. Offering an array of articles from some of the top scientists around the world, this 11-volume work deals with all major parts of solid-Earth geophysics, including a volume on the terrestrial planets and moons in our Solar System. This major reference work will aid researchers, advanced undergraduate and graduate students, as well as professionals engaged in cutting-edge research.




Statistical Methods and Modeling of Seismogenesis


Book Description

The study of earthquakes is a multidisciplinary field, an amalgam of geodynamics, mathematics, engineering and more. The overriding commonality among them all is the presence of natural randomness. Stochastic studies (probability, stochastic processes and statistics) can be of different types, for example the black-box approach (one state), the white-box approach (multi-state), the simulation of different aspects, and so on. This book has the advantage of bringing together a group of international authors, known for their earthquake-specific approaches, to cover a wide array of these aspects. A variety of topics are presented, including statistical nonparametric and parametric methods, a multi-state system approach, earthquake simulators, post-seismic activity models, time series Markov models with regression, scaling properties and multifractal approaches, self-correcting models, the linked stress release model, Markovian arrival models, Poisson-based detection techniques, change-point detection techniques on seismicity models, and, finally, semi-Markov models for earthquake forecasting.




Living on an Active Earth


Book Description

The destructive force of earthquakes has stimulated human inquiry since ancient times, yet the scientific study of earthquakes is a surprisingly recent endeavor. Instrumental recordings of earthquakes were not made until the second half of the 19th century, and the primary mechanism for generating seismic waves was not identified until the beginning of the 20th century. From this recent start, a range of laboratory, field, and theoretical investigations have developed into a vigorous new discipline: the science of earthquakes. As a basic science, it provides a comprehensive understanding of earthquake behavior and related phenomena in the Earth and other terrestrial planets. As an applied science, it provides a knowledge base of great practical value for a global society whose infrastructure is built on the Earth's active crust. This book describes the growth and origins of earthquake science and identifies research and data collection efforts that will strengthen the scientific and social contributions of this exciting new discipline.




Integrated Approaches to Earthquake Forecasting


Book Description

A comprehensive study of seismic hazard and earthquake triggering is crucial for effective mitigation of earthquake risks. The destructive nature of earthquakes motivates researchers to work on forecasting despite the apparent randomness of earthquake occurrences. Understanding their underlying mechanisms and patterns is vital, given their potential for widespread devastation and loss of life. This thesis combines methodologies, including Coulomb stress calculations and aftershock analysis, to shed light on earthquake complexities, ultimately enhancing seismic hazard assessment. The Coulomb failure stress (CFS) criterion is widely used to predict the spatial distributions of aftershocks following large earthquakes. However, uncertainties associated with CFS calculations arise from non-unique slip inversions and unknown fault networks, particularly due to the choice of the assumed aftershock (receiver) mechanisms. Recent studies have proposed alternative stress quantities and deep neural network approaches as superior to CFS with predefined receiver mechanisms. [...].
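For readers unfamiliar with the CFS criterion, the sketch below illustrates the basic calculation on a single predefined receiver fault: the coseismic stress-change tensor is resolved into a shear component along the assumed slip direction and a normal component, and the two are combined with an effective friction coefficient. This is a generic illustration, not code from the thesis; the stress tensor, fault geometry, and friction value are arbitrary assumptions.

```python
# Minimal sketch of the standard Coulomb failure stress change on a receiver
# fault: dCFS = d_tau_slip + mu_eff * d_sigma_n, using the tension-positive
# convention so that a positive normal-stress change unclamps the fault.
# The stress-change tensor and receiver geometry below are arbitrary examples.
import numpy as np

def coulomb_stress_change(d_sigma, n_hat, s_hat, mu_eff=0.4):
    """dCFS on a receiver fault with unit normal n_hat and unit slip direction s_hat.

    d_sigma : 3x3 symmetric stress-change tensor (tension positive), e.g. in MPa.
    mu_eff  : effective friction coefficient (assumed value).
    """
    traction = d_sigma @ n_hat        # traction change acting on the fault plane
    d_sigma_n = traction @ n_hat      # normal stress change (positive = unclamping)
    d_tau = traction @ s_hat          # shear stress change resolved onto the slip direction
    return d_tau + mu_eff * d_sigma_n

# Illustrative stress-change tensor (MPa) in east-north-up coordinates and a
# vertical receiver fault striking north, with slip taken along strike (north).
d_sigma = np.array([[ 0.5, -0.2,  0.0],
                    [-0.2, -0.3,  0.0],
                    [ 0.0,  0.0, -0.2]])
n_hat = np.array([1.0, 0.0, 0.0])   # fault normal points east
s_hat = np.array([0.0, 1.0, 0.0])   # slip direction points north

print(f"dCFS = {coulomb_stress_change(d_sigma, n_hat, s_hat):.3f} MPa")
```

The sensitivity to the assumed receiver mechanism is visible directly in this formulation: changing n_hat and s_hat changes the sign and magnitude of dCFS for the same stress-change tensor, which is one of the uncertainty sources the thesis discusses.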




Geocomplexity and the Physics of Earthquakes


Book Description

Published by the American Geophysical Union as part of the Geophysical Monograph Series, Volume 120. Earthquakes in urban centers are capable of causing enormous damage. The January 16, 1995 Kobe, Japan earthquake was only a magnitude 6.9 event and yet produced an estimated $200 billion loss. Despite an active earthquake prediction program in Japan, this event was a complete surprise. Similar scenarios are possible in Los Angeles, San Francisco, Seattle, and other urban centers around the Pacific plate boundary. The development of forecast or prediction methodologies for these great damaging earthquakes has been complicated by the fact that the largest events repeat at irregular intervals of hundreds to thousands of years, resulting in a limited historical record that has frustrated phenomenological studies. The papers in this book describe an emerging alternative approach, based on a new understanding of earthquake physics arising from the construction and analysis of numerical simulations. With these simulations, earthquake physics can now be investigated in numerical laboratories. Simulation data from numerical experiments can be used to develop theoretical understanding that can subsequently be applied to observed data. These methods have been enabled by the information technology revolution, in which fundamental advances in computing and communications are placing vast computational resources at our disposal.