Classical Statistical Mechanics with Nested Sampling


Book Description

This thesis develops a nested sampling algorithm into a black-box tool for directly calculating the partition function, and thus the complete phase diagram of a material, from the interatomic potential energy function. It represents a significant step forward in our ability to accurately describe the finite-temperature properties of materials. In principle, the macroscopic phases of matter are related to the microscopic interactions of atoms by statistical mechanics and the partition function. In practice, direct calculation of the partition function has proved infeasible for realistic models of atomic interactions, even with modern atomistic simulation methods. The thesis also shows how the output of nested sampling calculations can be processed to calculate the complete PVT (pressure–volume–temperature) equation of state for a material, and applies the nested sampling algorithm to calculate the pressure–temperature phase diagrams of aluminium and a model binary alloy.
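
The description stays at a high level, so here is a minimal, hedged sketch of the underlying idea: a nested sampling run produces a decreasing sequence of energy levels together with estimated prior-volume shrinkage factors, and the partition function at any temperature is then a weighted sum over that single run. The toy double-well potential, the rejection-based constrained resampling, and all parameter values below are illustrative choices, not the thesis' production algorithm.

# Minimal nested sampling sketch: estimate a configurational partition
# function Z(beta) for a toy 1D potential from one run's energy levels.
import numpy as np

rng = np.random.default_rng(0)

def energy(x):
    """Toy double-well potential on the interval [-2, 2]."""
    return (x**2 - 1.0)**2

K = 200          # number of live points
n_iter = 1000    # nested sampling iterations
lo, hi = -2.0, 2.0

live_x = rng.uniform(lo, hi, size=K)
live_E = energy(live_x)

E_levels = []
for i in range(n_iter):
    worst = np.argmax(live_E)          # highest-energy live point
    E_limit = live_E[worst]
    E_levels.append(E_limit)
    # Replace it with a new point drawn uniformly, subject to E < E_limit
    # (plain rejection sampling is fine for this 1D toy problem).
    while True:
        x_new = rng.uniform(lo, hi)
        if energy(x_new) < E_limit:
            break
    live_x[worst], live_E[worst] = x_new, energy(x_new)

E_levels = np.array(E_levels)
# Prior (configuration-space) volume shrinks geometrically: X_i ~ (K/(K+1))^i.
X = (K / (K + 1.0)) ** np.arange(1, n_iter + 1)
w = -np.diff(np.concatenate(([1.0], X)))   # w_i = X_{i-1} - X_i

def partition_function(beta):
    """Configurational Z(beta) from the single run; the small contribution
    of the final live points is neglected for simplicity."""
    return np.sum(w * np.exp(-beta * E_levels)) * (hi - lo)

for beta in (0.5, 1.0, 5.0):
    print(f"beta = {beta:4.1f}   Z ~ {partition_function(beta):.4f}")

For this one-dimensional toy, the estimate can be checked directly against numerical quadrature of the integral of exp(-beta E(x)) over the box.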




Black Hole Formation and Growth


Book Description

The ultimate proofs that black holes exist have been obtained very recently, thanks to the detection of gravitational waves from their coalescence and to the imaging, by optical interferometry and X-ray reverberation mapping, of material orbiting within a few gravitational radii. This book provides three comprehensive and up-to-date reviews covering the gravitational-wave breakthrough, our understanding of accretion and feedback in supermassive black holes, and the relevance of black holes for the Universe since the Big Bang. Neil J. Cornish presents gravitational wave emission from black hole mergers and the physics of detection. Andrew King reviews the physics of accretion on to supermassive black holes and their feedback on host galaxies. Tiziana Di Matteo addresses our understanding of black hole formation at cosmic dawn, the emergence of the first quasars, black hole merging and structure formation. The topics covered by the 48th Saas-Fee Course provide a broad overview of the importance of black holes in modern astrophysics.




The Random-Cluster Model


Book Description

The random-cluster model has emerged as a key tool in the mathematical study of ferromagnetism. It may be viewed as an extension of percolation to include the Ising and Potts models, and its analysis is a mix of arguments from probability and geometry. The Random-Cluster Model contains accounts of the subcritical and supercritical phases, together with clear statements of important open problems. The book also includes a treatment of the first-order (discontinuous) phase transition.
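
For readers meeting the model for the first time, the standard definition (standard notation, not quoted from the book) makes the connection to percolation, Ising and Potts explicit. On a finite graph G = (V, E), with edge parameter p in [0, 1] and cluster weight q > 0, the random-cluster measure on edge configurations omega in {0, 1}^E is

\[
\phi_{p,q}(\omega) \;=\; \frac{1}{Z_{p,q}}
\Bigl( \prod_{e \in E} p^{\omega(e)} (1 - p)^{1 - \omega(e)} \Bigr)\, q^{k(\omega)},
\]

where k(omega) counts the connected components of the open subgraph and Z_{p,q} is the normalizing constant. Setting q = 1 recovers Bernoulli bond percolation, while integer q >= 2 corresponds to the q-state Potts model, with q = 2 giving the Ising model.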




Statistical Inference as Severe Testing


Book Description

Mounting failures of replication in social and biological sciences give a new urgency to critically appraising proposed reforms. This book pulls back the cover on disagreements between experts charged with restoring integrity to science. It denies two pervasive views of the role of probability in inference: to assign degrees of belief, and to control error rates in a long run. If statistical consumers are unaware of assumptions behind rival evidence reforms, they can't scrutinize the consequences that affect them (in personalized medicine, psychology, etc.). The book sets sail with a simple tool: if little has been done to rule out flaws in inferring a claim, then it has not passed a severe test. Many methods advocated by data experts do not stand up to severe scrutiny and are in tension with successful strategies for blocking or accounting for cherry picking and selective reporting. Through a series of excursions and exhibits, the philosophy and history of inductive inference come alive. Philosophical tools are put to work to solve problems about science and pseudoscience, induction and falsification.




Statistical Rethinking


Book Description

Statistical Rethinking: A Bayesian Course with Examples in R and Stan builds readers’ knowledge of and confidence in statistical modeling. Reflecting the need for even minor programming in today’s model-based statistics, the book pushes readers to perform step-by-step calculations that are usually automated. This unique computational approach ensures that readers understand enough of the details to make reasonable choices and interpretations in their own modeling work. The text presents generalized linear multilevel models from a Bayesian perspective, relying on a simple logical interpretation of Bayesian probability and maximum entropy. It covers everything from the basics of regression to multilevel models. The author also discusses measurement error, missing data, and Gaussian process models for spatial and network autocorrelation. By using complete R code examples throughout, this book provides a practical foundation for performing statistical inference. Designed for both PhD students and seasoned professionals in the natural and social sciences, it prepares them for more advanced or specialized statistical modeling. The book is accompanied by a web resource: an R package (rethinking), available on the author’s website and on GitHub. Its two core functions, map and map2stan, allow a variety of statistical models to be constructed from standard model formulas.
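
The map function performs a quadratic (Laplace) approximation of the posterior: find the posterior mode and approximate the posterior with a Gaussian whose covariance is the inverse Hessian of the negative log posterior at that mode; map2stan, by contrast, compiles the model formula to Stan and samples the posterior with MCMC. The following is a rough Python analogue of the quadratic approximation idea, not the R package's API; the model, priors, and data are invented purely for illustration.

# Quadratic (Laplace) approximation of a simple Bayesian model:
# a normal likelihood with normal priors on mu and log(sigma).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(1)
y = rng.normal(loc=170.0, scale=8.0, size=50)      # fake "height" data

def neg_log_posterior(theta):
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)                       # keep sigma positive
    log_lik = norm.logpdf(y, mu, sigma).sum()
    log_prior = norm.logpdf(mu, 178.0, 20.0) + norm.logpdf(log_sigma, 2.0, 1.0)
    return -(log_lik + log_prior)

fit = minimize(neg_log_posterior, x0=np.array([170.0, 2.0]), method="BFGS")
mode = fit.x                      # posterior mode (MAP estimate)
cov = fit.hess_inv                # covariance of the Gaussian approximation

print("posterior mode (mu, log_sigma):", mode)
print("approx. posterior sd:", np.sqrt(np.diag(cov)))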




Statistical Parametric Mapping: The Analysis of Functional Brain Images


Book Description

In an age where the amount of data collected from brain imaging is increasing constantly, it is of critical importance to analyse those data within an accepted framework to ensure proper integration and comparison of the information collected. This book describes the ideas and procedures that underlie the analysis of signals produced by the brain. The aim is to understand how the brain works, in terms of its functional architecture and dynamics. This book provides the background and methodology for the analysis of all types of brain imaging data, from functional magnetic resonance imaging to magnetoencephalography. Critically, Statistical Parametric Mapping provides a widely accepted conceptual framework which allows treatment of all these different modalities (a schematic sketch of the underlying voxel-wise model follows the feature list below). This rests on an understanding of the brain's functional anatomy and the way that measured signals are caused experimentally. The book takes the reader from the basic concepts underlying the analysis of neuroimaging data to cutting-edge approaches that would be difficult to find in any other source. Importantly, the material is presented in an incremental way so that the reader can understand the precedents for each new development.

This book will be particularly useful to neuroscientists engaged in any form of brain mapping who have to contend with the real-world problems of data analysis and understanding the techniques they are using. It is primarily a scientific treatment and a didactic introduction to the analysis of brain imaging data. It can be used both as a textbook for students and scientists starting to use the techniques and as a reference for practicing neuroscientists. The book also serves as a companion to the software packages that have been developed for brain imaging data analysis.

- An essential reference and companion for users of the SPM software
- Provides a complete description of the concepts and procedures entailed by the analysis of brain images
- Offers full didactic treatment of the basic mathematics behind the analysis of brain imaging data
- Stands as a compendium of all the advances in neuroimaging data analysis over the past decade
- Adopts an easy-to-understand, incremental approach that takes the reader from basic statistics to state-of-the-art approaches such as Variational Bayes
- Structured treatment of data analysis issues that links different modalities and models
- Includes a series of appendices and tutorial-style chapters that make even the most sophisticated approaches accessible
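
At its core, the conceptual framework referred to above is a mass-univariate general linear model: the same design matrix is fitted independently at every voxel and a statistic map is formed for a contrast of interest. The numpy sketch below is schematic only, with synthetic data and a made-up two-regressor design; it is not SPM code.

# Voxel-wise general linear model: fit Y = X b + e at every voxel and
# form a t-statistic map for the "task" contrast.
import numpy as np

rng = np.random.default_rng(0)
n_scans, n_voxels = 120, 5000

task = np.tile([0.0] * 10 + [1.0] * 10, 6)            # toy boxcar "task" regressor
X = np.column_stack([task, np.ones(n_scans)])          # design matrix: task + intercept
true_effect = np.zeros(n_voxels)
true_effect[:100] = 2.0                                 # 100 "active" voxels
Y = task[:, None] * true_effect[None, :] + rng.normal(size=(n_scans, n_voxels))

beta, res_ss, *_ = np.linalg.lstsq(X, Y, rcond=None)    # voxel-wise least squares
dof = n_scans - np.linalg.matrix_rank(X)
sigma2 = res_ss / dof                                   # residual variance per voxel

c = np.array([1.0, 0.0])                                # contrast: task effect
var_c = c @ np.linalg.inv(X.T @ X) @ c
t_map = (c @ beta) / np.sqrt(sigma2 * var_c)            # one t value per voxel

print("max |t| over 'active' voxels :", np.abs(t_map[:100]).max())
print("max |t| over 'null' voxels   :", np.abs(t_map[100:]).max())

A real analysis adds haemodynamic modelling, spatial smoothing and multiple-comparison corrections on top of this skeleton.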




Probability and Statistical Physics in Two and More Dimensions


Book Description

This volume is a collection of lecture notes for six of the ten courses given in Búzios, Brazil by prominent probabilists at the 2010 Clay Mathematics Institute Summer School, "Probability and Statistical Physics in Two and More Dimensions", and at the XIV Brazilian School of Probability. In the past ten to fifteen years, various areas of probability theory related to statistical physics, disordered systems and combinatorics have undergone intensive development. A number of these developments deal with two-dimensional random structures at their critical points, and provide new tools and ways of coping with at least some of the limitations of Conformal Field Theory that had been so successfully developed in the theoretical physics community to understand phase transitions of two-dimensional systems. Included in this selection are detailed accounts of all three foundational courses presented at the Clay school (Schramm-Loewner Evolution and other Conformally Invariant Objects, Noise Sensitivity and Percolation, and Scaling Limits of Random Trees and Planar Maps), together with contributions on Fractal and Multifractal Properties of SLE and Conformal Invariance of Lattice Models. Finally, the volume concludes with extended articles based on the courses on Random Polymers and Self-Avoiding Walks given at the Brazilian School of Probability during the final week of the school. Together, these notes provide a panoramic, state-of-the-art view of probability theory areas related to statistical physics, disordered systems and combinatorics. Like the lectures themselves, they are oriented towards advanced students and postdocs, but experts should also find much of interest.




Data Analysis


Book Description

"One of the strengths of this book is the author's ability to motivate the use of Bayesian methods through simple yet effective examples." (Katie St. Clair, MAA Reviews)




An Introduction to Statistical Signal Processing


Book Description

This book describes the essential tools and techniques of statistical signal processing. At every stage, theoretical ideas are linked to specific applications in communications and signal processing through a range of carefully chosen examples. The book begins with a development of basic probability, random objects, expectation, and second-order moment theory, followed by a wide variety of examples of the most popular random process models and their basic uses and properties. Specific applications to the analysis of random signals and systems (communication, estimation, detection, modulation, and other signal processing tasks) are interspersed throughout the book. Hundreds of homework problems are included, and the book is ideal for graduate students of electrical engineering and applied mathematics. It is also a useful reference for researchers in signal processing and communications.




Computational Physics


Book Description

This textbook presents basic and advanced computational physics in a highly didactic style. It contains clear, simple mathematical descriptions of many of the most important algorithms used in computational physics. The first part of the book discusses the basic numerical methods. The second part concentrates on the simulation of classical and quantum systems. Several classes of integration methods are discussed, including not only the standard Euler and Runge-Kutta methods but also multi-step methods and the class of Verlet methods, which is introduced by studying the motion in Liouville space. A general chapter on the numerical treatment of differential equations covers finite-difference, finite-volume, finite-element and boundary-element methods, together with spectral methods and weighted-residual methods. The book gives simple but nontrivial examples from a broad range of physical topics, aiming to give the reader insight not only into the numerical treatment but also into the problems being simulated. Different methods are compared with regard to their stability and efficiency. The exercises in the book are realised as computer experiments.
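
To make the flavour of the Verlet class concrete, here is a minimal velocity-Verlet integrator applied to a one-dimensional harmonic oscillator, so the result can be checked against the exact solution; the parameter values are illustrative and the code is not taken from the textbook.

# Velocity-Verlet integration of m x'' = force(x), checked on a harmonic oscillator.
import numpy as np

def velocity_verlet(force, x0, v0, dt, n_steps, mass=1.0):
    """Integrate m x'' = force(x) with the velocity-Verlet scheme."""
    x, v = x0, v0
    a = force(x) / mass
    traj = [(x, v)]
    for _ in range(n_steps):
        x = x + v * dt + 0.5 * a * dt**2       # position update
        a_new = force(x) / mass
        v = v + 0.5 * (a + a_new) * dt         # velocity update uses both accelerations
        a = a_new
        traj.append((x, v))
    return np.array(traj)

k = 1.0                                         # spring constant (unit frequency)
traj = velocity_verlet(lambda x: -k * x, x0=1.0, v0=0.0, dt=0.05, n_steps=2000)
energy = 0.5 * traj[:, 1]**2 + 0.5 * k * traj[:, 0]**2
print("relative energy drift:", abs(energy[-1] - energy[0]) / energy[0])

The near-constant total energy over long runs reflects the symplectic character that makes Verlet-type schemes the workhorse of molecular dynamics.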