GEML: a graph-enhanced pre-trained language model framework for text classification via mutual learning
Author:
Funder
National Natural Science Foundation of China
Education Department of Jilin Province
Department of Science and Technology of Jilin Province
Publisher
Springer Science and Business Media LLC
Link
https://link.springer.com/content/pdf/10.1007/s10489-024-05831-1.pdf