Gravitational Wave Data Analysis


Book Description

The articles in this book represent the major contributions at the NATO Advanced Research Workshop held from 6 to 9 July 1987 in the magnificent setting of Dyffryn House and Gardens, in St. Nicholas, just outside Cardiff, Wales. The idea for such a meeting arose in discussions that I had in 1985 and 1986 with many of the principal members of the various groups building prototype laser-interferometric gravitational wave detectors. It became clear that the proposals these groups were planning to submit for large-scale detectors would have to address questions like the following:

• What computing hardware might be required to sift through data coming in at rates of several gigabytes per day for gravitational wave events that might last only a second or less and occur as rarely as once a month?

• What software would be required for this task, and how much effort would be required to write it?

• Given that every group accepted that a worldwide network of detectors operating in coincidence with one another was required, both to provide convincing evidence of detections of gravitational waves and to supply sufficient information to determine the amplitude and direction of the waves detected, what sort of problems would the necessary data exchanges raise?

Yet most of the effort in these groups had, quite naturally, been concentrated on the detector systems.







Analysis of Gravitational-Wave Data


Book Description

Introducing gravitational-wave data analysis, this book is an ideal starting point both for researchers entering the field and for those already analysing data. Detailed derivations of the basic formulae enable readers to apply general statistical concepts to the analysis of gravitational-wave signals. It also discusses new ideas for devising efficient algorithms.










De-noising of Gravitational-Wave Data


Book Description

Since the first experimental evidence for the existence of gravitational waves in 2015, the amount of data in this scientific area has increased enormously, and there has been a great deal of interest in gravitational waves across the scientific community. The interferometers used to capture these waves need to achieve a high level of instrumental sensitivity in order to detect and analyse the weak signals emitted both by distant sources of intrinsically high intensity and by nearby sources of much lower intensity. High sensitivity is often accompanied by high levels of noise that complicate data analysis: present-day interferometers record large amounts of data with a high percentage of noise, from which we attempt to extract the possible gravitational waves buried therein. In this dissertation we propose to use a denoising method based on the minimisation of the total variation of the time series that constitute the data. Known as the ROF (Rudin-Osher-Fatemi) method, it assumes that the largest contribution to the total variation of a function comes from noise, so that minimising this quantity should lead to a drastic reduction in the presence of noise. This denoising procedure helps to improve the detection and data quality of gravitational-wave analysis. We have implemented two ROF-based denoising algorithms in a commonly used gravitational-wave analysis software package, coherent WaveBurst (cWB), which uses the excess energy from the coherence between data from two or more interferometers to find gravitational waves. The denoising methods are the one-step regularised ROF (rROF) and the iterative rROF procedure (irROF). We have tested both methods using events from the gravitational-wave catalogue of the first three observing runs of the LIGO-Virgo-KAGRA scientific collaboration. These events, named GW150914, GW151226, GW170817 and GW190521, comprise different wave morphologies of compact binary systems injected at different noise levels.
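To illustrate the core idea behind ROF denoising (this is a generic sketch, not the rROF or irROF implementation used in cWB), the following Python snippet performs total-variation denoising of a one-dimensional time series using Chambolle's dual projection algorithm; the function name and parameter values are hypothetical choices for the example:

```python
import numpy as np

def rof_denoise_1d(f, lam=0.3, n_iter=300, tau=0.25):
    """1D ROF (total-variation) denoising via Chambolle's dual projection.

    Solves  min_u  0.5 * ||u - f||^2  +  lam * sum_i |u[i+1] - u[i]|.
    The regularisation weight `lam` trades fidelity against smoothness.
    """
    f = np.asarray(f, dtype=float)
    p = np.zeros(f.size - 1)              # dual variable, one per finite difference
    for _ in range(n_iter):
        divp = np.zeros_like(f)           # divergence (negative adjoint of np.diff)
        divp[:-1] += p
        divp[1:] -= p
        g = np.diff(divp - f / lam)       # gradient of the dual objective
        p = (p + tau * g) / (1.0 + tau * np.abs(g))   # projected ascent, keeps |p| <= 1
    divp = np.zeros_like(f)
    divp[:-1] += p
    divp[1:] -= p
    return f - lam * divp                 # primal solution from the dual variable

# Toy usage: a step signal buried in white noise, in the spirit of a weak
# transient hidden in detector noise.
rng = np.random.default_rng(0)
clean = np.concatenate([np.zeros(100), np.ones(100)])
noisy = clean + 0.3 * rng.standard_normal(200)
denoised = rof_denoise_1d(noisy, lam=0.3)
```

The step size tau = 0.25 guarantees convergence of the dual iteration in one dimension; the denoised output has a much smaller total variation than the noisy input, which is exactly the property the ROF model exploits to suppress noise while preserving sharp transients.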






