Abstract
We present a maximum-likelihood (ML) algorithm that is fast enough to detect γ-ray transients in real time on low-performance processors often used for space applications. We validate the routine with simulations and find that, relative to algorithms based on excess counts, the ML method is nearly twice as sensitive, allowing detection of 240%–280% more short γ-ray bursts. We characterize a reference implementation of the code, estimating its computational complexity and benchmarking it on a range of processors. We exercise the reference implementation on archival data from the Fermi Gamma-ray Burst Monitor (GBM), verifying the sensitivity improvements. In particular, we show that the ML algorithm would have detected GRB 170817A even if it had been nearly 4 times fainter. We present an ad hoc but effective scheme for discriminating transients associated with background variations. We show that the onboard localizations generated by ML are accurate, but that refined off-line localizations require a detector response matrix with about 10 times finer resolution than is the current practice. Increasing the resolution of the GBM response matrix could substantially reduce the few-degree systematic uncertainty observed in the localizations of bright bursts.
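The abstract only summarizes the method, so as a rough illustration of the likelihood-versus-excess-counts comparison, the sketch below tests a single Poisson counts bin. This is an assumption-laden simplification, not the paper's algorithm: the function names are invented here, and the actual ML method fits a source amplitude jointly across GBM detectors and energy channels through the instrument response rather than in one bin.

```python
import numpy as np

def llr_statistic(n, b):
    """Single-bin Poisson log-likelihood-ratio statistic (illustrative).

    n: observed counts, b: expected background counts. Under the null
    hypothesis (no source) the statistic is ~chi^2 with 1 dof.
    """
    n = np.asarray(n, dtype=float)
    b = np.asarray(b, dtype=float)
    # The MLE of the source amplitude in one bin is max(n - b, 0);
    # substituting it into the Poisson likelihood gives the usual ratio.
    with np.errstate(divide="ignore", invalid="ignore"):
        ts = 2.0 * (n * np.log(n / b) - (n - b))
    return np.where(n > b, ts, 0.0)

def excess_counts_snr(n, b):
    """Classical excess-counts significance, (n - b) / sqrt(b)."""
    return (np.asarray(n, dtype=float) - b) / np.sqrt(b)

# Example: 130 counts observed against an expected background of 100.
print(float(llr_statistic(130, 100)))      # ~8.2, i.e. ~2.9 sigma
print(float(excess_counts_snr(130, 100)))  # 3.0 sigma
```

In a single bin the two statistics are nearly equivalent; the sensitivity gain reported in the abstract comes from the likelihood coherently weighting many detectors and energy channels by their expected source contribution, which a simple summed-counts excess cannot do.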
Publisher
American Astronomical Society
Subject
Space and Planetary Science, Astronomy and Astrophysics
Cited by
1 article.