P. L. Biermann, J. K. Becker, E.-S. Seo, M. Mandelartz
We show that the large-scale cosmic ray anisotropy at ~10 TeV can be explained by a modified Compton-Getting effect in the magnetized flow field of old supernova remnants. This approach suggests an optimum energy scale for detecting the anisotropy. Two key assumptions are that propagation is governed by turbulence following a Kolmogorov law and that cosmic ray interactions are dominated by transport through the stellar winds of the exploding stars. A prediction is that the amplitude is smaller at lower energies, due to incomplete sampling of the velocity field, and also smaller at higher energies, due to smearing.
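For orientation, the classical (unmodified) Compton-Getting dipole amplitude for a power-law cosmic ray spectrum with differential index \(\gamma\) and an observer moving at speed \(u\) relative to the scattering medium is the standard textbook expression below; the modified effect developed in the paper builds on this baseline, and the numerical values used here (\(\gamma \approx 2.7\), \(u \sim 30\ \mathrm{km\,s^{-1}}\)) are illustrative assumptions, not figures taken from the abstract.

\[
A_{\mathrm{CG}} = (\gamma + 2)\,\frac{u}{c}
\;\approx\; (2.7 + 2)\,\frac{30\ \mathrm{km\,s^{-1}}}{3\times 10^{5}\ \mathrm{km\,s^{-1}}}
\;\approx\; 5\times 10^{-4},
\]

which sets the scale of anisotropy that a flow of a few tens of km/s can imprint, and against which the energy dependence predicted in the paper can be compared.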
View original: http://arxiv.org/abs/1206.0828