We analyze the signal processing required for the optimal detection of a stochastic background of gravitational radiation using laser interferometric detectors. Starting with basic assumptions about the statistical properties of a stochastic gravity-wave background, we derive expressions for the optimal filter function and signal-to-noise ratio for the cross-correlation of the outputs of two gravity-wave detectors. Sensitivity levels required for detection are then calculated. Issues related to (i) calculating the signal-to-noise ratio for arbitrarily large stochastic backgrounds, (ii) performing the data analysis in the presence of nonstationary detector noise, (iii) combining data from multiple detector pairs to increase the sensitivity of a stochastic background search, (iv) correlating the outputs of four or more detectors, and (v) allowing for the possibility of correlated noise in the outputs of two detectors are discussed. We briefly describe a computer simulation that was used to “experimentally” verify the theoretical calculations derived in the paper, and which mimics the generation and detection of a simulated stochastic gravity-wave signal in the presence of simulated detector noise. Numerous graphs and tables of numerical data for the five major interferometers (LIGO-WA, LIGO-LA, VIRGO, GEO-600, and TAMA-300) are also given. This information consists of graphs of the noise power spectra, overlap reduction functions, and optimal filter functions; also included are tables of the signal-to-noise ratios and sensitivity levels for cross-correlation measurements between different detector pairs. The treatment given in this paper should be accessible to both theorists involved in data analysis and experimentalists involved in detector design and data acquisition.
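The cross-correlation strategy described in the abstract can be illustrated with a minimal sketch. The code below is not from the paper; it is a simplified, assumption-laden illustration of the general idea: form a frequency-domain cross-correlation statistic between two detector output streams, weighted by a filter of the standard form Q(f) ∝ γ(f) Ω_gw(f) / [f³ P₁(f) P₂(f)] (the overlap reduction function γ, background spectrum Ω_gw, and noise power spectra P₁, P₂ would come from the detectors and models; here they are placeholder arrays, and normalization is ignored).

```python
import numpy as np

def cross_correlation_statistic(s1, s2, filt, dt):
    """Cross-correlation statistic Y ~ sum_f conj(s1~(f)) Q(f) s2~(f) df.

    s1, s2 : real time series from two detectors (same length, sample step dt)
    filt   : real filter evaluated on np.fft.rfftfreq(len(s1), dt)
    A correlated stochastic signal common to both streams pushes Y positive;
    independent noise averages toward zero.
    """
    S1 = np.fft.rfft(s1)
    S2 = np.fft.rfft(s2)
    df = 1.0 / (len(s1) * dt)           # frequency resolution
    return float(np.real(np.sum(np.conj(S1) * filt * S2)) * df)

def optimal_filter(freqs, gamma, omega_gw, p1, p2):
    """Unnormalized filter Q(f) = gamma(f) * Omega_gw(f) / (f^3 P1(f) P2(f)).

    The f = 0 bin divides by zero; it carries no information and is set to 0.
    """
    with np.errstate(divide="ignore", invalid="ignore"):
        q = gamma * omega_gw / (freqs**3 * p1 * p2)
    q[~np.isfinite(q)] = 0.0
    return q
```

As a sanity check, feeding both streams a common simulated signal plus independent noise yields a markedly larger statistic than two purely independent noise streams, which is the qualitative behavior the paper's simulation verifies quantitatively.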
Journal: Physical Review D - Particles, Fields, Gravitation and Cosmology
State: Published - Mar 31 1999