Funders
Florida Department of Health
National Heart, Lung, and Blood Institute
National Cancer Institute
Patient-Centered Outcomes Research Institute (PCORI)
NVIDIA
National Institute of Allergy and Infectious Diseases
National Institute on Aging
Cancer Center, University of Florida Health
Clinical and Translational Science Institute, University of Florida
NVIDIA AI Technology Center, University of Florida
Cited by: 6 articles.