Ulisses Barres de Almeida, Michael Daniel
In this paper we discuss a simple method for testing for the presence of energy-dependent dispersion in high-energy datasets. It uses the minimisation of the Kolmogorov distance between the cumulative distributions of two probability functions as the statistical metric to estimate the magnitude of any spectral dispersion within transient features in a light curve. We also show that the method performs well in the presence of the modest energy resolutions (~20%) typical of gamma-ray observations. After presenting the method in detail, we apply it to a parameterised simulated light curve based on the extreme VHE gamma-ray flare of PKS 2155-304 observed with H.E.S.S. in 2006, in order to illustrate its potential through the concrete example of setting constraints on quantum-gravity induced Lorentz invariance violation (LIV) effects. We obtain limits comparable to those of the most advanced techniques used in LIV searches applied to similar datasets, while the present method has the advantage of being particularly straightforward to use. Although the development of the method was motivated by LIV searches, it is also applicable to other astrophysical situations where energy-dependent dispersion is expected, such as spectral lags from the acceleration and cooling of particles in relativistic outflows.
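To make the idea concrete, here is a minimal Python sketch of the kind of dispersion scan the abstract describes. It assumes a simple linear lag of the form t → t − τE and a split of the photon list into a low- and a high-energy band; the function names, the energy split, the toy flare, and the injected lag of 20 s/TeV are illustrative choices of mine, not taken from the paper.

```python
import numpy as np

def kolmogorov_distance(sample_a, sample_b):
    """Maximum absolute difference between the empirical CDFs of two samples."""
    grid = np.sort(np.concatenate([sample_a, sample_b]))
    cdf_a = np.searchsorted(np.sort(sample_a), grid, side="right") / len(sample_a)
    cdf_b = np.searchsorted(np.sort(sample_b), grid, side="right") / len(sample_b)
    return np.max(np.abs(cdf_a - cdf_b))

def best_fit_dispersion(times, energies, tau_grid, e_split=1.0):
    """Scan trial dispersion coefficients tau (s/TeV).

    For each tau, undo the assumed linear lag t -> t - tau * E and compare the
    arrival-time CDFs of the low- and high-energy photon samples; the tau that
    minimises the Kolmogorov distance is taken as the dispersion estimate.
    """
    low, high = energies < e_split, energies >= e_split
    distances = np.array([
        kolmogorov_distance((times - tau * energies)[low],
                            (times - tau * energies)[high])
        for tau in tau_grid
    ])
    return tau_grid[np.argmin(distances)], distances

# Toy example: a Gaussian flare with an injected linear lag of 20 s/TeV.
rng = np.random.default_rng(0)
n = 2000
energies = rng.pareto(2.5, n) + 0.2                 # TeV, crude power-law-like spectrum
times = rng.normal(0.0, 60.0, n) + 20.0 * energies  # seconds, with injected dispersion
tau_grid = np.linspace(-100.0, 100.0, 401)
tau_hat, _ = best_fit_dispersion(times, energies, tau_grid)
print(f"recovered dispersion ~ {tau_hat:.1f} s/TeV")
```

In this toy setup the scan recovers a lag close to the injected value; in a real analysis the grid range, the energy split, and the treatment of energy resolution would of course follow the choices made in the paper.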
View original:
http://arxiv.org/abs/1204.2205