Abstract
Multilabel learning goes beyond standard supervised learning by associating a sample with more than one class label. Among the many techniques developed in the last decade to handle multilabel learning, the best-performing approaches are those that harness the power of ensembles and deep learners. This work proposes merging both strategies by combining a set of gated recurrent units (GRUs), temporal convolutional neural networks (TCNs), and long short-term memory (LSTM) networks trained with variants of the Adam optimizer. We examine several Adam variants, each fundamentally based on the difference between present and past gradients, with the step size adjusted per parameter. We also incorporate the Incorporating Multiple Clustering Centers (IMCC) approach and a bootstrap-aggregated (bagged) decision-tree ensemble, which is shown to further boost classification performance. In addition, we provide an ablation study assessing the performance improvement contributed by each module of our ensemble. Multiple experiments on a large set of datasets representing a wide variety of multilabel tasks demonstrate the robustness of our best ensemble, which is shown to outperform the state of the art.
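The Adam variants described here follow the DiffGrad idea: the absolute difference between the current and previous gradient modulates the per-parameter step, damping updates where the gradient barely changes. Below is a minimal NumPy sketch of one such update; the function name `diffgrad_step` and the flat-array state layout are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def diffgrad_step(theta, grad, prev_grad, m, v, t,
                  lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One DiffGrad-style update: Adam moments scaled by a friction
    coefficient derived from the change between successive gradients.
    (Illustrative sketch; state is kept as flat NumPy arrays.)"""
    m = beta1 * m + (1.0 - beta1) * grad        # first moment estimate
    v = beta2 * v + (1.0 - beta2) * grad ** 2   # second moment estimate
    m_hat = m / (1.0 - beta1 ** t)              # bias correction
    v_hat = v / (1.0 - beta2 ** t)
    # Friction coefficient: sigmoid of the gradient change, close to 1
    # when the gradient varies strongly, close to 0.5 when it is flat,
    # so steps shrink in low-information regions of the loss surface.
    xi = 1.0 / (1.0 + np.exp(-np.abs(prev_grad - grad)))
    theta = theta - lr * xi * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```

The published variants differ mainly in how this friction term is defined; the plain-Adam update is recovered by fixing `xi = 1`.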