Abstract
Line intensity ratios (LIRs) of helium (He) atoms are known to depend on the electron density, n_e, and temperature, T_e, and are therefore widely used to evaluate these parameters; this is known as the He I LIR method. In this conventional method, the measured LIRs are compared with theoretical values calculated using a collisional-radiative (CR) model to find the best-fit n_e and T_e. Basic CR models have been improved to take several effects into account. For instance, radiation trapping can occur to a significant degree in weakly ionized plasmas, leading to major alterations of the LIRs; this effect has been incorporated into CR models through optical escape factors.
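To make the conventional method concrete, the sketch below performs a chi-square grid search of measured LIRs against a precomputed CR-model lookup table. The grid ranges, the random stand-in for the CR-model output, and the function name are illustrative assumptions, not values or code from the review.

```python
# Minimal sketch of the conventional He I LIR method: compare measured
# line ratios with a CR-model table on an (n_e, T_e) grid and pick the
# best-fit grid point. All names, shapes, and values are hypothetical.
import numpy as np

ne_grid = np.logspace(17, 20, 100)       # electron density grid [m^-3]
te_grid = np.linspace(0.5, 10.0, 80)     # electron temperature grid [eV]
# Stand-in for CR-model output: 3 theoretical LIRs at each grid point.
lir_table = np.random.rand(100, 80, 3)

def fit_ne_te(measured_lirs, sigma):
    """Return the (n_e, T_e) grid point whose CR-model LIRs best match
    the measured ratios in the chi-square sense."""
    resid = (lir_table - measured_lirs) / sigma   # broadcast over grid
    chi2 = np.sum(resid**2, axis=-1)              # (100, 80) chi^2 map
    i, j = np.unravel_index(np.argmin(chi2), chi2.shape)
    return ne_grid[i], te_grid[j]

ne_best, te_best = fit_ne_te(np.array([0.4, 0.7, 1.2]),
                             sigma=np.array([0.05, 0.05, 0.1]))
```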
A new approach to the evaluation of n_e and T_e from He I LIRs has recently been explored using machine learning (ML). In the ML-aided LIR method, a predictive model is developed from training data consisting of an input (measured LIRs) and a desired/known output (n_e or T_e measured by other diagnostics). It has been demonstrated that this new method predicts n_e and T_e better than the conventional method coupled with a CR model, not only for He but also for other species. This review focuses mainly on low-temperature plasmas with T_e ⩽ 10 eV in linear plasma devices.
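The sketch below illustrates the ML-aided LIR method described above: a regressor is trained on paired data, with measured LIRs as inputs and n_e and T_e from an independent diagnostic (e.g. Thomson scattering) as targets. The model choice (MLPRegressor), input standardization, synthetic data, and the use of log10(n_e) as a target are assumptions for illustration, not the review's specific recipe.

```python
# Minimal sketch of the ML-aided LIR method: supervised regression from
# measured line ratios to (n_e, T_e). Data here is random placeholder.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X_train = rng.random((500, 10))   # 500 spectra x 10 line intensity ratios
y_train = rng.random((500, 2))    # targets: [log10(n_e), T_e] per spectrum

# Standardize the inputs, then fit a small feed-forward regressor.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                 random_state=0),
)
model.fit(X_train, y_train)

# Predict n_e and T_e for a newly measured set of line ratios.
log_ne, te = model.predict(rng.random((1, 10)))[0]
```

Once trained, the model maps new LIR measurements to n_e and T_e directly, without an explicit CR-model comparison at prediction time.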
Funder
U.S. Department of Energy Cooperative Agreement
Japan Society for the Promotion of Science