The data from gravitational wave detectors are non-stationary and plagued by non-Gaussian noise, which often contains spurious noise transients (glitches). The changing nature of these glitches as detector sensitivity evolves makes the task of detecting transient gravitational wave signals in noise-dominated data even more challenging. For a given dataset, a transient gravitational wave search produces a list of triggers, each a candidate gravitational wave signal. These triggers can often be glitches mimicking gravitational wave characteristics, particularly when the transient is short, which reduces the sensitivity of the search for systems such as short-duration bursts or signals from very massive compact binaries. To distinguish glitches from gravitational wave signals, search algorithms such as coherent WaveBurst apply thresholds on the trigger properties. Here, we present Gaussian Mixture Models, a supervised machine learning approach, as a means of modelling the multi-dimensional trigger attribute space. We test this algorithm on observation data with simulated short-duration gravitational wave transients.
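The classification idea described above can be sketched as follows, assuming scikit-learn. The feature dimensions, component counts, and synthetic data are purely illustrative stand-ins for the multi-dimensional trigger attribute space; the actual cWB attributes and model configuration are not specified in this abstract. One mixture is fitted per labelled class (simulated signals versus background glitches), and a log-likelihood ratio serves as a ranking statistic:

```python
# Minimal sketch: per-class Gaussian Mixture Models over trigger attributes,
# with a log-likelihood-ratio ranking statistic. Synthetic 3-D data stands in
# for real cWB trigger attributes (hypothetical choice for illustration).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Hypothetical labelled training sets: triggers from simulated signals
# cluster in one region of attribute space, background glitches in another.
signal_train = rng.normal(loc=[2.0, 3.0, 1.0], scale=0.5, size=(500, 3))
glitch_train = rng.normal(loc=[0.0, 0.0, 0.0], scale=1.0, size=(500, 3))

# Supervised step: fit one mixture model per class label.
gmm_signal = GaussianMixture(n_components=4, random_state=0).fit(signal_train)
gmm_glitch = GaussianMixture(n_components=4, random_state=0).fit(glitch_train)

def log_likelihood_ratio(triggers):
    """Ranking statistic: log p(x | signal) - log p(x | glitch)."""
    return gmm_signal.score_samples(triggers) - gmm_glitch.score_samples(triggers)

# A signal-like trigger should rank above a glitch-like one.
scores = log_likelihood_ratio(np.array([[2.0, 3.0, 1.0], [0.0, 0.0, 0.0]]))
print(scores)
```

Replacing a hard threshold on individual trigger properties with a ranking statistic over the joint attribute distribution lets the model exploit correlations between attributes that separate glitches from genuine signals.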
Workshop IV: Big Data in Multi-Messenger Astrophysics