Katsuaki Asano, Peter Mészáros
The temporal--spectral evolution of the prompt emission of gamma-ray bursts (GRBs) is simulated numerically for both leptonic and hadronic models. For sufficiently weak magnetic fields, leptonic models can reproduce the few-second delay of the onset of GeV photon emission observed by {\it Fermi}-LAT, due to the slow growth of the target photon field for inverse Compton scattering. Even for stronger magnetic fields, however, hadronic models can explain the GeV delay, due to the long acceleration timescale of protons and the continuous photopion production after the end of particle injection. While the FWHMs of the MeV and GeV lightcurves are almost the same in one-zone leptonic models, the FWHMs of the 1--30 GeV lightcurves in hadronic models are significantly wider than those of the 0.1--1 MeV lightcurves. The magnitude of the GeV delay depends on the importance of the Klein--Nishina effect in both the leptonic and hadronic models. In our hadronic-model examples, the energy carried by escaping neutrons is comparable to the gamma-ray energy, although their contribution to the ultra-high-energy cosmic rays is still subdominant. The resulting neutrino spectra are hard enough to evade the flux-limit constraint from IceCube. The delay in the onset of neutrino emission is up to several times longer than the corresponding delay in the onset of GeV photon emission. The quantitative differences in the lightcurves for the various models may be further tested with future atmospheric Cherenkov telescopes such as CTA, whose effective area is larger than that of {\it Fermi}-LAT.
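As an illustrative aside (not part of the original abstract), the GeV delay attributed to proton acceleration in hadronic models can be understood from the standard Fermi-acceleration timescale, which scales with the proton gyro-time in the comoving frame:
\[
  t_{\rm acc} \sim \eta\,\frac{r_{\rm L}}{c} = \eta\,\frac{E_p}{e B c}, \qquad \eta \gtrsim 1,
\]
where $E_p$ is the proton energy, $B$ the comoving magnetic field, and $\eta$ a generic efficiency factor ($\eta \sim 1$ in the Bohm limit); these symbols are standard notation for the scaling argument, not quantities quoted from the paper. Because $t_{\rm acc}$ grows linearly with $E_p$, protons reach the energies required for photopion production only after the electrons are already radiating, which pushes the onset of the hadronic high-energy component to later times.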
View original:
http://arxiv.org/abs/1206.0347