Abstract
Observations show an almost ubiquitous presence of extra mixing in low-mass upper giant branch stars. The most commonly invoked explanation for this is thermohaline mixing. One-dimensional stellar evolution models include various prescriptions for thermohaline mixing, but direct use of observational data to discriminate between these prescriptions has thus far been limited. Here, we propose a new framework to facilitate direct comparison: using carbon-to-nitrogen measurements from the Sloan Digital Sky Survey-IV APOGEE survey as a probe of mixing, and a fluid parameter known as the reduced density ratio from one-dimensional stellar evolution programs, we compare the observed amount of extra mixing on the upper giant branch to predicted trends from three-dimensional fluid dynamics simulations. Using this method, we are able to empirically constrain how mixing efficiency should vary with the reduced density ratio. We find that the observed amount of extra mixing is strongly correlated with the reduced density ratio, and that trends between the reduced density ratio and fundamental stellar parameters are robust across choices of modeling prescription. We show that stars with available mixing data tend to have relatively low density ratios, which should inform the regimes selected for future simulation efforts. Finally, we show that there is increased mixing at low reduced density ratios, which is consistent with current hydrodynamical models of thermohaline mixing. This framework sets a new standard for theoretical modeling efforts, as it enables validation not only of the amount of extra mixing, but also of the trends between the degree of extra mixing and fundamental stellar parameters.
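The abstract's central quantity, the reduced density ratio, is not defined above. As a reference point, the following is a minimal LaTeX sketch of its standard definition in the fingering-convection literature; the symbols (temperature gradient ∇, adiabatic gradient ∇_ad, composition gradient ∇_μ, thermodynamic coefficients φ and δ, and the compositional and thermal diffusivities κ_μ and κ_T) follow the usual stellar-interiors convention and are assumptions here, not notation taken from this paper.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Density ratio: thermal stratification relative to compositional
% stratification (standard convention; an assumption here, since the
% abstract does not define its symbols).
\begin{equation}
  R_0 \equiv \frac{\nabla - \nabla_{\mathrm{ad}}}{(\phi/\delta)\,\nabla_{\mu}}
\end{equation}
% Reduced density ratio: rescales the thermohaline-unstable interval
% 1 < R_0 < 1/\tau onto (0, 1), where \tau is the diffusivity ratio.
\begin{equation}
  r \equiv \frac{R_0 - 1}{\tau^{-1} - 1},
  \qquad
  \tau \equiv \frac{\kappa_{\mu}}{\kappa_{T}}
\end{equation}
% Under this rescaling, r -> 0 approaches ordinary convection (most
% vigorous mixing) and r -> 1 approaches marginal stability.
\end{document}

Read this way, the abstract's finding of increased mixing at low reduced density ratios corresponds to stars sitting closer to the convective (r → 0) end of the instability strip, which is the behavior hydrodynamical models predict.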
Publisher: American Astronomical Society
Subject: Space and Planetary Science; Astronomy and Astrophysics
Cited by: 7 articles.