R. Hascoet, A. M. Beloborodov, F. Daigne, R. Mochkovitch
The peak time of the optical afterglow may be used as a proxy to constrain the Lorentz factor Gamma of the gamma-ray burst (GRB) ejecta. We revisit this method by including bursts whose optical observations started when the afterglow flux was already decaying; these bursts can provide useful lower limits on Gamma. Combining all analyzed bursts in our sample, we find that the previously reported correlation between Gamma and the burst luminosity L_gamma does not hold. However, the data clearly show a lower bound Gamma_min that increases with L_gamma. We suggest an explanation for this feature: explosions with large jet luminosities and Gamma < Gamma_min suffer strong adiabatic cooling before their radiation is released at the photosphere; they produce weak bursts, barely detectable with present instruments. To test this explanation we examine the effect of adiabatic cooling on the GRB location in the L_gamma - Gamma plane using a Monte Carlo simulation of the GRB population. Our results predict detectable on-axis "orphan" afterglows. We also derive upper limits on the density of the ambient medium that decelerates the explosion ejecta. We find that in many cases the density is smaller than expected for stellar winds from normal Wolf-Rayet progenitors. The burst progenitors may be peculiar massive stars with weaker winds, or there may exist a mechanism that weakens the stellar wind a few years before the explosion.
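The peak-time method mentioned above rests on a standard afterglow estimate: the afterglow peaks roughly when the ejecta have swept up enough ambient material to decelerate, which ties the peak time to the initial Lorentz factor. A minimal sketch of this textbook estimate for a homogeneous ambient medium is below; all numerical values (energy, density, peak time, redshift) are illustrative fiducial assumptions, not values from the paper.

```python
import math

# Fiducial parameters (illustrative assumptions, not values from the paper):
E_iso = 1e53      # isotropic-equivalent kinetic energy [erg]
n = 1.0           # ambient number density [cm^-3], homogeneous medium
t_peak = 100.0    # observed optical afterglow peak time [s]
z = 1.0           # redshift

m_p = 1.6726e-24  # proton mass [g]
c = 2.9979e10     # speed of light [cm/s]

# Correct the observed peak time to the source frame:
t_src = t_peak / (1.0 + z)

# Lorentz factor at deceleration for a homogeneous medium,
# Gamma(t_dec) ~ [3 E_iso / (32 pi n m_p c^5 t_src^3)]^(1/8):
gamma_dec = (3.0 * E_iso / (32.0 * math.pi * n * m_p * c**5 * t_src**3)) ** 0.125

# The initial Lorentz factor is roughly twice the value at deceleration:
gamma_0 = 2.0 * gamma_dec
print(f"Gamma_0 ~ {gamma_0:.0f}")
```

For these fiducial numbers the estimate lands at a few hundred, the typical range inferred for GRB ejecta; an early peak time (small t_peak) pushes Gamma_0 up, which is why a late or already-decaying light curve yields only a lower limit on Gamma.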
View original:
http://arxiv.org/abs/1304.5813