Abstract
Mixture of experts (MoE) models are widely applied to conditional probability density estimation problems. We demonstrate the richness of the class of MoE models by proving denseness results in Lebesgue spaces, when the input and output variables are both compactly supported. We further prove an almost uniform convergence result when the input is univariate. Auxiliary lemmas are proved regarding the richness of the soft-max gating function class and its relationship to the class of Gaussian gating functions.
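For context, a minimal sketch of the model class in question (our notation, not the paper's): a $K$-expert MoE conditional density with Gaussian experts and soft-max gates takes the form

$$f(y \mid x) = \sum_{k=1}^{K} g_k(x)\, \phi\bigl(y;\, \mu_k(x),\, \Sigma_k\bigr), \qquad g_k(x) = \frac{\exp\bigl(a_k + b_k^\top x\bigr)}{\sum_{l=1}^{K} \exp\bigl(a_l + b_l^\top x\bigr)},$$

where $\phi(\cdot\,; \mu, \Sigma)$ denotes the Gaussian density with mean $\mu$ and covariance $\Sigma$. The Gaussian gating class mentioned in the abstract instead sets

$$g_k(x) = \frac{\pi_k\, \phi(x;\, m_k, S_k)}{\sum_{l=1}^{K} \pi_l\, \phi(x;\, m_l, S_l)}.$$

When the gate covariances are shared ($S_k = S$ for all $k$), the quadratic terms in $x$ cancel in the ratio and the Gaussian gates reduce to soft-max gates with affine arguments, which illustrates the kind of relationship between the two gating classes that the auxiliary lemmas concern.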
Funder
Australian Research Council
Publisher
Springer Science and Business Media LLC
Subject
Statistics, Probability and Uncertainty; Computer Science Applications; Statistics and Probability
Cited by
8 articles.