Monday, July 15, 2013

1307.3250 (Lorenzo Sironi et al.)

A Late-Time Flattening of Light Curves in Gamma-Ray Burst Afterglows

Lorenzo Sironi, Dimitrios Giannios
The afterglow emission from Gamma-Ray Bursts (GRBs) is usually interpreted as synchrotron radiation from relativistic electrons accelerated at the GRB external shock, that decelerates from ultra-relativistic to non-relativistic speeds as it sweeps up the surrounding medium. We investigate the temporal decay of the emission from GRB afterglows at late times, when the bulk of the shock-accelerated electrons are non-relativistic. For a uniform circumburst medium, we show that such "deep Newtonian phase" begins at t_{DN} ~ 3 epsilon_{e,-1}^{5/6} t_{ST}, where t_{ST} marks the transition of the blast wave to the non-relativistic spherically-symmetric Sedov-Taylor solution, and epsilon_e = 0.1 epsilon_{e,-1} quantifies the amount of shock energy transferred to the post-shock electrons. For typical parameters, the deep Newtonian stage starts ~ 0.5-several years after the GRB. The radio flux in this phase decays as F_nu ~ t^{-3(p+1)/10}, with decay slope between -0.9 and -1.2, for a power-law distribution of shock-accelerated electrons with index 2 < p < 3. This is shallower than the commonly assumed scaling F_nu ~ t^{-3(5p-7)/10} (with decay slope between -0.9 and -2.4) derived by Frail et al. (2000), which only applies if the GRB shock is non-relativistic, but the electron distribution still peaks at ultra-relativistic energies (a regime that is relevant for a narrow time interval, and only if t_{DN} > t_{ST}, or epsilon_e > 0.03). We discuss how the deep Newtonian phase can be reliably used for GRB calorimetry, and we comment on the good detection prospects of trans-relativistic blast waves at 0.1-10 GHz with EVLA and LOFAR.