Author
Ma Kai, Zheng Shuai, Tian Miao, Qiu Qinjun, Tan Yongjian, Hu Xinxin, Li HaiYan, Xie Zhong
Funder
National Key R&D Program of China
Natural Science Foundation of Hubei Province of China
Opening Fund of Key Laboratory of Geological Survey and Evaluation of Ministry of Education
Fundamental Research Funds for the Central Universities
China Postdoctoral Science Foundation
Open Fund of Key Laboratory of Urban Land Resources Monitoring and Simulation, Ministry of Natural Resources
Open Fund of Hubei Key Laboratory of Intelligent Vision Based Monitoring for Hydroelectric Engineering
Publisher
Springer Science and Business Media LLC
Subject
General Earth and Planetary Sciences
Cited by 2 articles.