Duncan K. Galloway, Nathanael Lampe
The radius of neutron stars can in principle be measured via the
normalisation of a blackbody fitted to the X-ray spectrum during thermonuclear
(type-I) X-ray bursts, although few previous studies have addressed the
reliability of such measurements. Here we examine the apparent radius in a
homogeneous sample of long, mixed H/He bursts from the low-mass X-ray binaries
GS 1826-24 and KS 1731-26. The measured blackbody normalisation (proportional
to the emitting area) in these bursts remained constant over a period of up to
60 s in the burst tail, even though the flux decreased by 60-75% (and the
blackbody temperature by 30-40%). The typical rms variation in the mean normalisation
from burst to burst was 3-5%, although a variation of 17% was found between
bursts observed from GS 1826-24 in two epochs. A comparison of the
time-resolved spectroscopic measurements during bursts from the two epochs
shows that the normalisation evolves consistently through the burst rise and
peak, but subsequently increases further in the earlier epoch bursts. The
elevated normalisation values may arise from a change in the anisotropy of the
burst emission, or alternatively variations in the spectral correction factor,
f_c, of order 10%. Since burst samples observed from systems other than GS
1826-24 are more heterogeneous, we expect that systematic uncertainties of at
least 10% are likely to apply generally to measurements of neutron-star radii,
unless the effects described here can be corrected for.
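The radius estimate behind these numbers follows from standard relations: a bbodyrad-style blackbody normalisation is K = (R_bb / d_10kpc)^2 with R_bb in km, and because the fitted colour temperature exceeds the effective temperature by the spectral correction factor f_c, the inferred radius scales as R = f_c^2 R_bb. A minimal sketch (the helper function and the specific numbers are illustrative, not from the paper):

```python
import math

def inferred_radius_km(norm, distance_kpc, f_c=1.4):
    """Convert a blackbody normalisation to a neutron-star radius estimate.

    Illustrative helper using the standard relations:
      norm         : K = (R_bb / d_10kpc)**2, R_bb in km
      distance_kpc : source distance in kpc
      f_c          : spectral (colour) correction factor; the true radius
                     scales as R = f_c**2 * R_bb, since the fitted blackbody
                     temperature is f_c times the effective temperature
    """
    d10 = distance_kpc / 10.0
    r_bb = math.sqrt(norm) * d10   # apparent blackbody radius, km
    return f_c**2 * r_bb

# Because R scales as f_c squared, a ~10% change in f_c (here 1.4 -> 1.54)
# shifts the inferred radius by ~21%, illustrating how f_c variations of
# the size discussed above dominate the systematic error budget.
r1 = inferred_radius_km(100.0, 6.0, f_c=1.4)
r2 = inferred_radius_km(100.0, 6.0, f_c=1.54)
```

Note that the distance and normalisation values here are placeholders; the point is the f_c^2 sensitivity of the radius.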
View original: http://arxiv.org/abs/1201.1680