Stanley P. Owocki, Jon O. Sundqvist, David H. Cohen, Kenneth G. Gayley
X-ray satellites since Einstein have empirically established that the X-ray luminosity from single O-stars scales linearly with bolometric luminosity, Lx ~ 10^{-7} Lbol. But straightforward forms of the most favored model, in which X-rays arise from instability-generated shocks embedded in the stellar wind, predict a steeper scaling: either with the mass-loss rate, Lx ~ Mdot ~ Lbol^{1.7}, if the shocks are radiative, or with Lx ~ Mdot^{2} ~ Lbol^{3.4} if they are adiabatic. This paper presents a generalized formalism that bridges these radiative vs. adiabatic limits in terms of the ratio of the shock cooling length to the local radius. Noting that the thin-shell instability of radiative shocks should lead to extensive mixing of hot and cool material, we propose that the associated softening and weakening of the X-ray emission can be parametrized as scaling with the cooling-length ratio raised to a power m, the "mixing exponent". For a physically reasonable value m ~= 0.4, this leads to an X-ray luminosity Lx ~ Mdot^{0.6} ~ Lbol that matches the empirical scaling. To fit observed X-ray line profiles, we find such radiative-shock-mixing models require the number of shocks to drop sharply above the initial shock onset radius. This in turn implies that the X-ray luminosity should saturate and even decrease for optically thick winds with very high mass-loss rates. In the opposite limit of adiabatic shocks in low-density winds (e.g., from B-stars), the X-ray luminosity should drop steeply, as Mdot^2. Future numerical simulation studies will be needed to test the general thin-shell mixing ansatz for X-ray emission.
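As a rough gloss on the scaling quoted above (our reading, not a derivation spelled out in the abstract), the near-linear Lx-Lbol relation follows if one assumes the cooling-length ratio \ell_c/r varies inversely with wind density, and hence with the mass-loss rate:

\[
  L_x \;\propto\; \dot{M}\left(\frac{\ell_c}{r}\right)^{m} \;\propto\; \dot{M}^{\,1-m},
  \qquad
  \dot{M} \propto L_{\rm bol}^{1.7}
  \;\Longrightarrow\;
  L_x \propto L_{\rm bol}^{1.7\,(1-m)} \approx L_{\rm bol}^{1.02}
  \quad (m \approx 0.4),
\]

which reproduces the Lx ~ Mdot^{0.6} ~ Lbol result stated in the abstract; the radiative (Lx ~ Mdot) and adiabatic (Lx ~ Mdot^2) limits bracket this mixing-moderated case.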
View original:
http://arxiv.org/abs/1212.4235