A Data-driven Building Seismic Response Prediction Framework: from Simulation and Recordings to Statistical Learning


Book Description

The structural seismic resilience community has grown rapidly over the past three decades. Extensive probabilistic techniques have been developed to address uncertainties in ground motions and building systems, with the goal of reducing the structural damage, economic loss, and social impact of buildings subjected to seismic hazards. These techniques depend on seismic structural responses, which are typically obtained through nonlinear response history analysis, a process limited by model accuracy and computational effort. This book proposes an alternative data-driven framework that reconstructs structural responses from limited available sources using machine learning techniques. The framework can potentially support "real-time" interpolation of monitoring data to enable rapid damage assessment, and it can reduce the computational effort of regional seismic hazard assessment. It also provides statistical insight into the uncertainties of seismic building responses from both structural and earthquake engineering perspectives.
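One concrete instance of such response reconstruction (a hedged sketch on invented data, not the book's actual method) is to train a regressor on simulated response histories so it can estimate the drift of an uninstrumented story from sensors elsewhere in the building:

```python
# Illustrative sketch only: all signals are synthetic and the feature/target
# setup is a hypothetical stand-in for simulated building response data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# Synthetic "simulation" data: drifts at 3 instrumented stories (features)
# and the drift at one uninstrumented story (target).
n = 5000
sensors = rng.standard_normal((n, 3))                     # measured story drifts
target = sensors @ np.array([0.5, 1.2, -0.3])             # unmeasured story drift
target += 0.05 * np.sin(3.0 * sensors[:, 0])              # mild nonlinearity
target += 0.01 * rng.standard_normal(n)                   # measurement noise

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(sensors[:4000], target[:4000])                  # train on "simulations"
print("held-out R^2:", model.score(sensors[4000:], target[4000:]))
```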




Data-driven Frameworks for Hybrid Analysis of Structures Under Seismic Loading


Book Description

Numerical simulation and hybrid simulation are extensively used in earthquake engineering to evaluate the response of structures under seismic loading. Despite past advances in computing power and the development of efficient integration algorithms, numerical simulation techniques suffer from high computational cost and from the uncertainty associated with the definition of constitutive material models, boundary conditions, and mesh density, particularly for highly nonlinear, large, or complex structures. Hybrid simulation, on the other hand, can yield biased results when only one or a limited number of potentially critical components (e.g., seismic fuses) are physically tested due to laboratory or cost constraints. Recent progress in machine learning algorithms and their engineering applications has motivated novel simulation techniques that leverage data across various fields, including earthquake engineering, where complexities arising from the stochastic nature of the phenomenon can be tackled by using available experimental and numerical data to develop more reliable simulation models and dynamic analysis frameworks. Furthermore, to better exploit their potential, data-driven models can be efficiently incorporated into physics-based and experimental techniques, leading to improved seismic response assessment methods.

This M.Sc. thesis proposes two new hybrid analysis frameworks that integrate emerging data-driven techniques into the conventional structural response assessment techniques, namely numerical simulation and hybrid testing, to perform nonlinear structural analysis under seismic loading. The first framework, referred to as the hybrid data-driven and physics-based simulation (HyDPS) technique, combines the well-understood components of the structure, modeled numerically, with the critical components of the structure (e.g., seismic fuses), simulated using the proposed data-driven PI-SINDy model. The data-driven model is developed for steel buckling-restrained braces from experimental data and mathematically estimates the underlying relationship between displacement history and restoring force. The second framework incorporates the data-driven model into the conventional seismic hybrid simulation framework: the experimental test data of one critical component (the physical twin), e.g., a steel buckling-restrained brace, produced during hybrid simulation is used in real time to predict the nonlinear cyclic response of the other critical components of the system (the digital twins) that are not physically tested. This framework features a novel multi-element seismic hybrid simulation technique achieved by recursively updating the force-deformation response of the digital twin. The performance of the proposed data-driven hybrid analysis frameworks is verified using past experimental test data and nonlinear response history analyses performed under representative earthquake ground motion accelerations. The results reveal that integrating data-driven techniques into conventional seismic analysis methods, namely numerical simulation and hybrid simulation, yields a more efficient simulation tool for examining the seismic response of structural systems.
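The PI-SINDy model mentioned above builds on sparse identification of nonlinear dynamics (SINDy), which regresses a measured quantity onto a library of candidate functions and keeps only the dominant terms. The sketch below illustrates the general idea on invented displacement/force data; the candidate library, coefficients, and signals are all hypothetical, and this is not the thesis's implementation:

```python
# A SINDy-style sketch (illustrative; not the thesis's PI-SINDy model):
# identify a sparse restoring-force model f(u, v) from a displacement history.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "experimental" data: displacement u(t), velocity v(t), force f(t).
t = np.linspace(0.0, 10.0, 2000)
u = np.sin(1.5 * t) * (1.0 + 0.1 * t)                 # displacement history
v = np.gradient(u, t)                                 # velocity
f = 2.0 * u + 0.3 * u**3 + 0.5 * np.sign(v)           # "true" restoring force
f = f + 0.01 * rng.standard_normal(f.size)            # measurement noise

# Candidate function library: the force is assumed sparse in these terms.
Theta = np.column_stack([u, v, u**3, np.sign(v), u * np.abs(v)])
names = ["u", "v", "u^3", "sign(v)", "u|v|"]

def stlsq(Theta, f, threshold=0.05, n_iter=10):
    """Sequentially thresholded least squares, the core SINDy regression."""
    xi = np.linalg.lstsq(Theta, f, rcond=None)[0]
    for _ in range(n_iter):
        small = np.abs(xi) < threshold
        xi[small] = 0.0
        if (~small).any():
            xi[~small] = np.linalg.lstsq(Theta[:, ~small], f, rcond=None)[0]
    return xi

xi = stlsq(Theta, f)
terms = " + ".join(f"{c:+.3f}*{n}" for c, n in zip(xi, names) if c != 0.0)
print("identified model: f(u, v) =", terms)
```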




Machine-learning-based Models, Methods, and Software for Intensity, Vulnerability, and Risk Assessment of Central U.S. Induced Earthquakes


Book Description

Since 2009, the Central U.S. has been subjected to a new type of seismic hazard attributed to human activities from the petroleum industry. The number of earthquakes in the Central U.S. has risen from an average of 25 per year in 2008 to 365 in 2017. These earthquakes can adversely affect the safety of infrastructure in the region, considering that most of it was designed with minimal to no seismic detailing due to the region's historically low seismicity. The objective of this dissertation is threefold: 1) to characterize the seismic demand of these earthquakes by developing region-specific ground motion models; 2) to evaluate the vulnerability of the built environment (in particular, bridge portfolios and residential buildings with masonry façades) to these recent earthquakes by developing fragility functions; and 3) to integrate the ground motion and fragility models with other region-specific information to investigate regional consequences (e.g., potential economic loss) of future seismic events. This information is now used by the Texas Department of Transportation to inform decision-making for post-earthquake response and planning for future events. For each objective, the study combines machine learning with structural and earthquake engineering knowledge in a data-driven, state-of-the-art framework to develop prediction models that are more reliable than the conventional methods in the literature. The dissertation comparatively investigates the advantages of machine learning techniques over conventional methods in developing each model (i.e., the ground motion and fragility models). Moreover, it investigates the seismic characteristics, vulnerability, and risk associated with these earthquakes compared with other seismic hazards in the U.S., including similar-magnitude natural earthquakes in the Western U.S., New Madrid seismic hazards (the historical seismic hazard of interest in the Central U.S.), and estimates from HAZUS (the Federal Emergency Management Agency's software for disaster risk assessment). As part of this study, open-source application software named ShakeRisk is developed for risk, reliability, and resilience assessment of the built environment under natural hazards. ShakeRisk provides a platform that integrates artificial intelligence, systems engineering, and structural and earthquake engineering to simulate civil infrastructure responses at both the structural and system scales in a reliable and computationally efficient way. Because ShakeRisk adopts clean architecture principles and an object-oriented design, it can be readily extended with new features (e.g., new data sources, models, analyses, and user interfaces), and existing ones can be customized, without modifying existing code.
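The fragility functions referenced above are conventionally expressed as lognormal cumulative distribution functions of an intensity measure such as peak ground acceleration (PGA). A minimal sketch of that standard form, with hypothetical parameter values rather than results from the dissertation:

```python
# Minimal lognormal fragility sketch (hypothetical parameters, not values
# from the dissertation): P(DS >= ds | PGA = x) = Phi(ln(x / theta) / beta).
import numpy as np
from scipy.stats import norm

theta = 0.35   # median capacity, in g (hypothetical)
beta = 0.5     # lognormal dispersion (hypothetical)

def fragility(pga):
    """Probability of reaching or exceeding the damage state at a given PGA."""
    return norm.cdf(np.log(np.asarray(pga) / theta) / beta)

for pga in (0.1, 0.2, 0.35, 0.6):
    print(f"P(DS | PGA = {pga:.2f} g) = {fragility(pga):.2f}")
```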




Modeling Multidimensional and Multi-scale Seismic Site Response Using a Data-driven 3D Vs Model


Book Description

Subsurface spatial variability is known to significantly influence the frequency content and amplitude of seismic ground shaking. A significant amount of seismic site response research over the past decade has focused on our ability to replicate recorded ground motions at borehole array sites, where both the input (rock) and output (surface) ground motions are known. Viewed in aggregate, these studies have found that approximately 50% of borehole array sites are poorly modeled by one-dimensional (1D) ground response analyses (GRAs) based on a single shear wave velocity (Vs) profile, with individual studies reporting values between approximately 30% and 80%. When 1D GRAs fail to accurately predict recorded site response, the site is often considered too complex to be effectively modeled as 1D. While three-dimensional (3D) numerical GRAs are possible and believed to be more accurate, a 3D subsurface model is rarely available for these analyses. The lack of affordable and reliable site characterization methods to quantify spatial variability in subsurface conditions, particularly the Vs measurements needed for GRAs, has pushed researchers to adopt stochastic approaches such as Vs randomization and spatially correlated random fields. However, these stochastically generated models require the assumption of generic, or guessed, input parameters, introducing significant uncertainties into the site response predictions. This research describes a new geostatistical approach for building pseudo-3D Vs models as a means to rationally account for spatial variability in GRAs, increase model accuracy, and reduce uncertainty. The proposed approach distinguishes itself from previous studies in three key ways: (1) it requires only a single, accurately measured Vs profile down to engineering bedrock; (2) it relies mainly on estimates of the fundamental site frequency (f0; a key parameter governing site effects) obtained from simple horizontal-to-vertical spectral ratio (H/V) noise measurements (f0,H/V); and (3) it creates models that can be used to ensure proper incorporation of site-specific spatial variability in 1D, 2D, and 3D GRAs. At the two sites investigated in this research, the H/V geostatistical approach generates pseudo-3D Vs models that reliably capture important subsurface features present in geologic cross-sections. Furthermore, the 1D GRA predictions associated with the H/V geostatistical approach were more accurate than those associated with common and recently proposed strategies for accounting for Vs variability. One of the most significant contributions of this research is its insight into the lateral area that influences seismic site response. The H/V geostatistical approach enables predicting site response as a function of the spatial variability across different footprints. The results show that 1D GRAs are significantly improved when an area of at least 400 m × 400 m (i.e., 0.16 km²) is incorporated, and even larger areas could produce better results; this size might therefore be considered a minimum area over which to account for spatial variability in GRAs. These results are supported by two-dimensional (2D) GRAs, which show that incorporating variability along at least 600 m was needed to appropriately model the decreased amplification at the fundamental mode caused by wave scattering, while a lateral extent of 1700 m was needed to more accurately model other observed complex phenomena. These results and insights work toward more accurate and reliable seismic hazard assessment and risk mitigation.
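The fundamental site frequency at the heart of this approach has a simple physical interpretation: for a uniform soil layer of thickness H and shear wave velocity Vs over stiff bedrock, f0 = Vs / (4H). The sketch below applies this quarter-wavelength relation to a layered profile via the time-averaged velocity; the layer values are hypothetical and this is an illustration of the general relation, not data or code from the research:

```python
# Quarter-wavelength estimate of fundamental site frequency, f0 = Vs / (4 H).
# Layer thicknesses and velocities below are hypothetical, for illustration.
import numpy as np

thicknesses = np.array([5.0, 10.0, 15.0])   # layer thicknesses, m
vs = np.array([180.0, 300.0, 450.0])        # layer shear wave velocities, m/s

H = thicknesses.sum()                        # depth to engineering bedrock, m
travel_time = np.sum(thicknesses / vs)       # one-way shear wave travel time, s
vs_avg = H / travel_time                     # time-averaged Vs over depth H

f0 = vs_avg / (4.0 * H)
print(f"time-averaged Vs = {vs_avg:.0f} m/s, f0 ≈ {f0:.2f} Hz")
```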




Bulletin of the Atomic Scientists


Book Description

The Bulletin of the Atomic Scientists is the premier public resource on scientific and technological developments that impact global security. Founded by Manhattan Project scientists, the Bulletin is best known for its iconic "Doomsday Clock," which stimulates solutions for a safer world.




Seismic Hazard and Risk Analysis


Book Description

Seismic hazard and risk analyses underpin the loadings prescribed by engineering design codes, the decisions by asset owners to retrofit structures, the pricing of insurance policies, and many other activities. This book provides a comprehensive overview of the principles and procedures behind seismic hazard and risk analysis, enabling readers to understand best practices and future research directions. Early chapters cover the essential elements and concepts of seismic hazard and risk analysis, while later chapters shift focus to more advanced topics. Each chapter includes worked examples and problem sets for which full solutions are provided online. Appendices provide relevant background in probability and statistics. Computer codes are also available online to help replicate specific calculations and demonstrate the implementation of various methods. This is a valuable reference for upper-level students and practitioners in civil engineering, and for earth scientists interested in engineering seismology.
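At the heart of probabilistic seismic hazard analysis is the integration of ground motion exceedance probabilities over each source's magnitude (and distance) distribution. The following sketch is a rough illustration only: a single point source at a fixed distance, a truncated Gutenberg-Richter magnitude distribution, and an invented toy ground motion model; none of these values or code come from the book.

```python
# Toy PSHA sketch: annual rate of exceeding a PGA level from one point source.
# All parameter values and the ground motion model are illustrative inventions.
import numpy as np
from scipy.stats import norm

nu = 0.05                        # annual rate of events with M >= m_min
m_min, m_max, b = 5.0, 8.0, 1.0  # truncated Gutenberg-Richter parameters
r = 20.0                         # source-to-site distance, km

beta = b * np.log(10.0)
def f_m(m):
    """Truncated Gutenberg-Richter magnitude PDF on [m_min, m_max]."""
    return beta * np.exp(-beta * (m - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))

def gmm(m, r):
    """Toy ground motion model: median ln(PGA in g) and log-standard deviation."""
    return -4.0 + 1.0 * m - 1.5 * np.log(r + 10.0), 0.6

ms = np.linspace(m_min, m_max, 500)
dm = ms[1] - ms[0]
mu, sigma = gmm(ms, r)
for x in (0.05, 0.1, 0.2):       # PGA thresholds, g
    p_exceed = 1.0 - norm.cdf((np.log(x) - mu) / sigma)
    lam = nu * np.sum(p_exceed * f_m(ms)) * dm   # integrate over magnitude
    print(f"lambda(PGA > {x:.2f} g) ≈ {lam:.2e} per year")
```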







Resilience of Critical Infrastructure Systems


Book Description

With rapid urbanization in developing countries and the emergence of smart systems and integrated intelligent devices, the new generation of infrastructure will be smarter and more efficient. However, due to natural and anthropogenic hazards, as well as the adverse impacts of climate change, civil infrastructure systems are increasingly vulnerable. Future-proofing and designing resilience into infrastructure is therefore one of the biggest challenges facing the industry and governments in developing and industrialized societies alike. This book provides a comprehensive overview of infrastructure resiliency, new developments in this emerging field and its scope (including ecology and sustainability), and the challenges involved in building more resilient civil infrastructure systems. It also introduces a strategic roadmap for the effective and efficient methods needed for modeling, designing, and assessing resiliency.

Features:
- Includes contributions from internationally recognized scholars in the emerging field of infrastructure resilience.
- Covers a broad range of topics in infrastructure resilience, such as disaster assessment, civil infrastructure and lifeline systems, natural hazard mitigation, and seismic protection.
- Includes practical global case studies and leading-edge research from several countries.
- Presents an interdisciplinary approach to addressing the challenges in the emerging field of infrastructure resilience.

Resilience of Critical Infrastructure Systems: Emerging Developments and Future Challenges serves as a valuable resource for practicing professionals, researchers, and advanced students seeking practical, forward-looking guidance.




Building Machine Learning and Deep Learning Models on Google Cloud Platform


Book Description

Take a systematic approach to understanding the fundamentals of machine learning and deep learning from the ground up and how they are applied in practice. You will use this comprehensive guide for building and deploying learning models to address complex use cases while leveraging the computational resources of Google Cloud Platform. Author Ekaba Bisong shows you how machine learning tools and techniques are used to predict or classify events based on a set of interactions between variables known as features or attributes in a particular dataset. He teaches you how deep learning extends the machine learning algorithm of neural networks to learn complex tasks that are difficult for computers to perform, such as recognizing faces and understanding languages. And you will know how to leverage cloud computing to accelerate data science and machine learning deployments. Building Machine Learning and Deep Learning Models on Google Cloud Platform is divided into eight parts that cover the fundamentals of machine learning and deep learning, the concept of data science and cloud services, programming for data science using the Python stack, Google Cloud Platform (GCP) infrastructure and products, advanced analytics on GCP, and deploying end-to-end machine learning solution pipelines on GCP.

What You’ll Learn:
- Understand the principles and fundamentals of machine learning and deep learning, the algorithms, how to use them, when to use them, and how to interpret your results
- Know the programming concepts relevant to machine and deep learning design and development using the Python stack
- Build and interpret machine and deep learning models
- Use Google Cloud Platform tools and services to develop and deploy large-scale machine learning and deep learning products
- Be aware of the different facets and design choices to consider when modeling a learning problem
- Productionalize machine learning models into software products

Who This Book Is For: Beginners to the practice of data science and applied machine learning, data scientists at all levels, machine learning engineers, Google Cloud Platform data engineers/architects, and software developers.
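As a small taste of the workflow described above (a generic scikit-learn sketch, not code from the book), the following trains and evaluates a classifier that predicts a label from a matrix of features:

```python
# Generic, minimal classification sketch (illustrative; not from the book).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic dataset: rows are events, columns are features/attributes.
X, y = make_classification(n_samples=500, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```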




Dynamic Mode Decomposition


Book Description

Data-driven dynamical systems is a burgeoning field that connects measurements of nonlinear dynamical and/or complex systems with well-established methods in dynamical systems theory. This is a critically important new direction because the governing equations of many problems under consideration by practitioners in various scientific fields are not typically known. Using data alone to derive, in an optimal sense, the best dynamical system representation of a given application thus allows for important new insights. The recently developed dynamic mode decomposition (DMD) is an innovative tool for integrating data with dynamical systems theory, with deep connections to traditional dynamical systems theory and to many recent innovations in compressed sensing and machine learning. Dynamic Mode Decomposition: Data-Driven Modeling of Complex Systems, the first book to address the DMD algorithm, presents a pedagogical and comprehensive approach to all aspects of DMD currently developed or under development; blends theoretical development, example codes, and applications to showcase the theory and its many innovations and uses; highlights the numerous innovations around the DMD algorithm and demonstrates its efficacy using example problems from engineering and the physical and biological sciences; and provides extensive MATLAB code, data for intuitive examples of key methods, and graphical presentations.
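The book's examples are provided in MATLAB; as an independent illustration (a Python sketch on synthetic data, not code from the book), exact DMD reduces to an SVD of the snapshot matrix, projection of the one-step map onto the leading singular vectors, and an eigendecomposition:

```python
# Exact DMD sketch on synthetic data built from two known oscillating modes.
import numpy as np

def dmd(X, Xp, r):
    """Exact DMD: fit Xp ≈ A X with a rank-r operator; return eigenpairs."""
    U, S, Vh = np.linalg.svd(X, full_matrices=False)
    Ur, Sr, Vr = U[:, :r], S[:r], Vh[:r].conj().T
    Atilde = Ur.conj().T @ Xp @ Vr / Sr          # projected low-rank operator
    eigvals, W = np.linalg.eig(Atilde)
    Phi = (Xp @ Vr / Sr) @ W                     # exact DMD modes
    return eigvals, Phi

# Snapshots of two spatial modes oscillating at frequencies 2.3 and 0.6.
x = np.linspace(-5.0, 5.0, 128)
t = np.linspace(0.0, 4.0 * np.pi, 200)
dt = t[1] - t[0]
f1 = np.outer(1.0 / np.cosh(x), np.exp(2.3j * t))
f2 = np.outer(np.tanh(x) / np.cosh(x), np.exp(0.6j * t))
data = f1 + 2.0 * f2

eigvals, Phi = dmd(data[:, :-1], data[:, 1:], r=2)
omegas = np.log(eigvals).imag / dt               # continuous-time frequencies
print("recovered frequencies:", np.sort(omegas)) # ≈ [0.6, 2.3]
```

The rank truncation r = 2 matches the number of modes planted in the data; in practice r is chosen from the singular value spectrum of the snapshot matrix.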