f-divergence regression models for compositional data
Published: 2022-12-05
Pages: 867-882
ISSN: 2220-5810
Container-title: Pakistan Journal of Statistics and Operation Research
Short-container-title: Pak.j.stat.oper.res.
Author: Alenazi, Abdulaziz Ahmed
Abstract
The paper considers the class of $f$-divergence regression models as alternatives to parametric regression models for compositional data. The special cases examined in this paper include the Jensen-Shannon, Kullback-Leibler, Hellinger, $\chi^2$ and total variation divergences. Strong advantages of the proposed regression models are a) the absence of parametric assumptions and b) the ability to treat zero values (which commonly occur in practice) naturally. Extensive Monte Carlo simulation studies comparatively assess the performance of the models in terms of bias, and an empirical evaluation using real data examines further aspects, such as predictive performance and computational cost. The results reveal that the Kullback-Leibler and Jensen-Shannon divergence regression models exhibit high-quality performance across multiple criteria. Finally, penalised versions of the Kullback-Leibler divergence regression are introduced and illustrated using real data, rendering this model the optimal one to utilise in practice.
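The divergences named in the abstract can be illustrated with a minimal sketch. The functions below are standard textbook definitions of the five divergences, not the authors' implementation; the zero-handling convention ($0 \log 0 = 0$) is what allows zero components to be treated naturally, as the abstract notes. The `eps` guard is an assumption added for numerical safety.

```python
import numpy as np

def kl(p, q, eps=1e-12):
    # Kullback-Leibler divergence KL(p || q); terms with p_i = 0
    # contribute nothing, by the convention 0 * log 0 = 0.
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / np.maximum(q[mask], eps))))

def js(p, q):
    # Jensen-Shannon divergence: symmetrised, bounded KL via the mixture m.
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def hellinger(p, q):
    # Squared Hellinger distance.
    return float(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def chi2(p, q, eps=1e-12):
    # Pearson chi-squared divergence.
    return float(np.sum((p - q) ** 2 / np.maximum(q, eps)))

def tv(p, q):
    # Total variation distance.
    return float(0.5 * np.sum(np.abs(p - q)))
```

In a divergence regression of this kind, one would fit coefficients by minimising the sum of such divergences between each observed composition and its fitted composition; the exact estimation procedure used in the paper is not reproduced here.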
Publisher
Pakistan Journal of Statistics and Operation Research
Subject
Management Science and Operations Research; Statistics, Probability and Uncertainty; Modeling and Simulation; Statistics and Probability
Cited by
1 article.