Earthquake Statistical Analysis through Multi-state Modeling


Book Description

Earthquake occurrence modeling is a rapidly developing research area. This book deals with its critical issues, ranging from theoretical advances to practical applications. The introductory chapter outlines state-of-the-art earthquake modeling approaches based on stochastic models. Chapter 2 presents seismogenesis in association with the evolving stress field. Chapters 3 to 5 present earthquake occurrence modeling by means of hidden (semi-)Markov models and discuss the associated characteristic measures and related estimation aspects. Further comparisons, the most important results, and concluding remarks are provided in Chapters 6 and 7.
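
As a purely illustrative aside (not taken from the book), the short Python sketch below shows the forward recursion that underlies likelihood evaluation in a two-state hidden Markov model of earthquake occurrence; the hidden states might stand for quiescent and loaded stress regimes, and the observations for discretized magnitude classes. All parameter values are hypothetical. Hidden semi-Markov models extend this setup by also modeling the sojourn-time distribution in each hidden state.

import numpy as np

# Two hidden stress regimes (0 = quiescent, 1 = loaded); values are illustrative only.
A = np.array([[0.9, 0.1],          # hidden-state transition probabilities
              [0.3, 0.7]])
B = np.array([[0.8, 0.15, 0.05],   # emission probabilities: P(magnitude class | state)
              [0.4, 0.40, 0.20]])
pi = np.array([0.6, 0.4])          # initial state distribution

obs = [0, 0, 1, 2, 1]              # observed magnitude classes (0 = smallest)

# Forward algorithm: alpha[i] = P(observations so far, current hidden state = i)
alpha = pi * B[:, obs[0]]
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]

print("Likelihood of the observed sequence:", alpha.sum())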




Statistical Methods and Modeling of Seismogenesis


Book Description

The study of earthquakes is a multidisciplinary field, an amalgam of geodynamics, mathematics, engineering and more. The overriding commonality among them all is the presence of natural randomness. Stochastic studies (probability, stochastic processes and statistics) can take different forms, for example, the black box approach (one state), the white box approach (multi-state), the simulation of different aspects, and so on. This book brings together a group of international authors, known for their earthquake-specific approaches, to cover a wide array of these aspects. A variety of topics are presented, including statistical nonparametric and parametric methods, a multi-state system approach, earthquake simulators, post-seismic activity models, time series Markov models with regression, scaling properties and multifractal approaches, self-correcting models, the linked stress release model, Markovian arrival models, Poisson-based detection techniques, change point detection techniques on seismicity models, and, finally, semi-Markov models for earthquake forecasting.




Earthquakes


Book Description

This book is the first comprehensive and methodologically rigorous analysis of earthquake occurrence. Models based on the theory of stochastic multidimensional point processes are employed to approximate the earthquake occurrence pattern and evaluate its parameters. The author shows that most of these parameters have universal values. These results help explain the classical earthquake distributions: Omori's law and the Gutenberg-Richter relation. In place of the Poisson distribution, the author derives a negative-binomial distribution for earthquake numbers, and then determines a fractal correlation dimension for the spatial distribution of earthquake hypocenters. The book also investigates the disorientation of earthquake focal mechanisms and shows that it follows the rotational Cauchy distribution. These statistical and mathematical advances make it possible to produce quantitative forecasts of earthquake occurrence, in which the earthquake rate in time, space, and focal-mechanism orientation is evaluated.
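
For reference, the two classical relations mentioned above are usually written (in standard notation, not necessarily that of the book) as

    Gutenberg-Richter law:   \log_{10} N(\geq M) = a - b M
    Omori-Utsu law:          n(t) = \frac{K}{(c + t)^{p}}

where N(\geq M) is the number of events with magnitude at least M, n(t) is the aftershock rate at time t after a mainshock, and a, b, K, c, p are empirical parameters, with b and p typically close to 1.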




Earthquake Data in Engineering Seismology


Book Description

This book addresses current activities in strong-motion networks around the globe, covering issues related to designing and maintaining these arrays and disseminating the information they record. The book is divided into three principal sections. The first section covers recent developments in regional and global ground-motion predictive models, discussing the similarities and differences of ground-motion estimates from these models and their application to design spectra, as well as other novel procedures for predicting engineering parameters in seismic regions with sparse data. The second section introduces the methodologies implemented in the recently established global and regional strong-motion databanks in Europe to maintain and disseminate the archived accelerometric data. The final section describes major strong-motion arrays around the world and their historical development. The last three chapters of this section introduce projects carried out within the context of arrays deployed for seismic risk studies in metropolitan areas. Audience: this timely book will be of particular interest to researchers who use accelerometric data extensively to conduct studies in earthquake engineering and engineering seismology.




An Introduction to the Theory of Point Processes


Book Description

Point processes and random measures find wide applicability in telecommunications, earthquakes, image analysis, spatial point patterns, and stereology, to name but a few areas. The authors have substantially reshaped the material of their 1988 first edition and now present their Introduction to the Theory of Point Processes in two volumes, subtitled Elementary Theory and Methods and General Theory and Structure. Volume One contains the introductory chapters from the first edition, together with an informal treatment of some of the later material, intended to make it more accessible to readers primarily interested in models and applications. The main new material in this volume concerns marked point processes and processes evolving in time, where the conditional intensity methodology provides a basis for model building, inference, and prediction. The abundant examples serve both a didactic purpose and illustrate further applications of the ideas and models that form the main substance of the text.
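
As a standard illustration of the conditional intensity methodology (a textbook example, not a summary of this book's own models), a temporal self-exciting point process of ETAS type has conditional intensity

    \lambda(t \mid \mathcal{H}_t) = \mu + \sum_{i : t_i < t} \frac{K \, e^{\alpha (M_i - M_0)}}{(t - t_i + c)^{p}}

where \mu is the background rate, the sum runs over past events with occurrence times t_i and magnitudes M_i, and M_0 is a reference (completeness) magnitude. Model fitting and prediction rest on the log-likelihood over an observation window [0, T],

    \log L = \sum_{i} \log \lambda(t_i \mid \mathcal{H}_{t_i}) - \int_{0}^{T} \lambda(s \mid \mathcal{H}_s) \, ds.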




Characterization of Modern and Historical Seismic–Tsunamic Events, and Their Global–Societal Impacts


Book Description

Earthquakes and tsunamis are devastating geohazards with significant societal impacts. Recent occurrences have shown that their impact on the stability of nations and societies and on world geopolitics is immense, potentially triggering a tipping point for a major downturn in the global economy. This Special Publication presents the most current information on the causes and effects of selected modern and historical earthquake and tsunami events, and on effective practices of risk assessment and disaster management implemented by various governments, international organizations and intergovernmental agencies. The findings reported here show that the human casualties and property losses resulting from earthquakes and tsunamis vary greatly around the globe, and that increased community, national and global resilience is essential to empowering societal preparedness for such geohazards. It is clear that all stakeholders, including scientists, policymakers, governments, media and world organizations, must work together to disseminate accurate, objective and timely information on geohazards, and to develop effective legislation for risk reduction and realistic hazard mitigation and management measures in today's globally connected world.




Big Data and Knowledge Sharing in Virtual Organizations


Book Description

Knowledge in its pure state is tacit in nature, difficult to formalize and communicate, but it can be converted into codified form and shared through both social interactions and the use of IT-based applications and systems. Although there seem to be considerable synergies between the resulting huge volumes of data and the convertible knowledge, there is still debate on how the increasing amount of data captured by corporations can improve decision making and foster innovation through effective knowledge-sharing practices. Big Data and Knowledge Sharing in Virtual Organizations provides innovative insights into the influence of big data analytics and artificial intelligence, and into the tools, methods, and techniques for knowledge-sharing processes in virtual organizations. The content of this publication examines cloud computing, machine learning, and knowledge sharing. It is designed for government officials and organizations, policymakers, academicians, researchers, technology developers, and students.