L. Nava, L. Sironi, G. Ghisellini, A. Celotti, G. Ghirlanda
Forward shocks caused by the interaction between a relativistic blast wave and the circum-burst medium are thought to be responsible for the afterglow emission in Gamma-Ray Bursts (GRBs). We consider the hydrodynamics of a spherical relativistic blast wave expanding into the surrounding medium and generalize the standard theory to account for several effects that are usually ignored. In particular, we consider the role of adiabatic and radiative losses in the hydrodynamical evolution of the shock, under the assumption that the radiative cooling is fast. Our model describes adiabatic, fully radiative, and semi-radiative blast waves, and accounts for a time-varying radiative efficiency. The equations we present are valid for arbitrary density profiles and for a circum-burst medium enriched with electron-positron pairs. The presence of pairs enhances the fraction of the shock energy gained by the leptons, thus increasing the importance of radiative losses. Our model allows us to study whether the high-energy (>0.1 GeV) emission of GRBs may originate from afterglow radiation. In particular, it can be used to test whether the fast decay of the high-energy light curves observed in several Fermi LAT GRBs can be ascribed to an initial radiative phase, followed by the standard adiabatic evolution.
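To illustrate the qualitative behaviour the abstract describes (radiative losses speeding up the deceleration of the blast wave), here is a minimal numerical sketch. It is not the generalized model of the paper: it integrates a widely used semi-radiative dynamical equation in the style of Huang, Dai & Lu (1999), dGamma/dm = -(Gamma^2 - 1)/(M_ej + eps*m + 2*(1 - eps)*Gamma*m), for a uniform-density medium, with eps the radiative efficiency (eps = 0 adiabatic, eps = 1 fully radiative). All parameter values (E0, n0, Gamma0) are illustrative assumptions.

```python
# Minimal sketch: blast-wave deceleration with a constant radiative efficiency.
# Uses a Huang, Dai & Lu (1999)-style equation, NOT the generalized equations
# of Nava et al.; all parameter values below are assumed for illustration.

import numpy as np
from scipy.integrate import solve_ivp

m_p  = 1.67e-24          # proton mass [g]
n0   = 1.0               # ambient number density [cm^-3] (assumed)
E0   = 1e53              # isotropic-equivalent energy [erg] (assumed)
G0   = 300.0             # initial bulk Lorentz factor (assumed)
M_ej = E0 / (G0 * 9e20)  # ejecta mass from E0 = Gamma0 * M_ej * c^2 [g]

def swept_mass(R):
    """Swept-up mass in a uniform medium: m(R) = (4/3) pi R^3 n0 m_p."""
    return 4.0 / 3.0 * np.pi * R**3 * n0 * m_p

def dGamma_dR(R, Gamma, eps):
    """Chain rule: dGamma/dR = (dGamma/dm) * (dm/dR)."""
    m = swept_mass(R)
    dm_dR = 4.0 * np.pi * R**2 * n0 * m_p
    dG_dm = -(Gamma**2 - 1.0) / (M_ej + eps * m + 2.0 * (1.0 - eps) * Gamma * m)
    return dG_dm * dm_dR

R_eval = np.logspace(14, 19, 200)   # radius grid [cm]

for eps, label in [(0.0, "adiabatic"), (1.0, "fully radiative")]:
    sol = solve_ivp(dGamma_dR, (R_eval[0], R_eval[-1]), [G0], args=(eps,),
                    t_eval=R_eval, rtol=1e-8, method="LSODA")
    # In the deceleration phase Gamma falls roughly as R^-3/2 (adiabatic)
    # or R^-3 (fully radiative), so the radiative blast wave slows much faster.
    print(f"{label}: Gamma at R = 1e17 cm ~ "
          f"{np.interp(1e17, sol.t, sol.y[0]):.1f}")
```

Intermediate values 0 < eps < 1, or an eps that varies with radius, would mimic the semi-radiative and time-varying cases mentioned in the abstract, though the paper's own equations should be preferred for quantitative work.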
View original:
http://arxiv.org/abs/1211.2806