FinKENet: A Novel Financial Knowledge Enhanced Network for Financial Question Matching

Authors:

Guo Yu 1, Liang Ting 2, Chen Zhongpu 1, Yang Binchen 1, Wang Jun 1,3, Zhao Yu 1

Affiliations:

1. Financial Intelligence and Financial Engineering Key Laboratory of Sichuan Province, Fintech Innovation Center, Southwestern University of Finance and Economics, Chengdu 611130, China

2. School of Accounting, Southwestern University of Finance and Economics, Chengdu 611130, China

3. School of Management Science and Engineering, Southwestern University of Finance and Economics, Chengdu 611130, China

Abstract

Question matching is a fundamental task in retrieval-based dialogue systems: it assesses the similarity between a user query and candidate questions. Unfortunately, existing methods focus on improving the accuracy of text similarity in the general domain, without adaptation to the financial domain. Financial question matching raises two critical issues: (1) how to accurately model the contextual representation of a financial sentence, and (2) how to accurately represent the financial key phrases in an utterance. To address these issues, this paper proposes a novel Financial Knowledge Enhanced Network (FinKENet) that injects financial knowledge into the contextual text representation. Specifically, we propose a multi-level encoder that extracts both sentence-level features and financial phrase-level features, allowing sentences and financial phrases to be represented more accurately. Furthermore, we propose a financial co-attention adapter to combine the sentence features with the financial keyword features. Finally, we design a multi-level similarity decoder to compute the similarity between queries and questions, and a cross-entropy-based loss function is used for model optimization. Experimental results demonstrate the effectiveness of the proposed method on the Ant Financial question matching dataset; in particular, the Recall score improves from 73.21% to 74.90% (1.69% absolute).
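To make the pipeline described in the abstract concrete, below is a minimal, hypothetical PyTorch sketch of a matcher with the same three stages (multi-level encoder, financial co-attention adapter, multi-level similarity decoder) trained with cross-entropy. The embedding/GRU encoder, the CoAttentionAdapter and FinKENetSketch names, the feature dimensions, and the pooling and pairing choices are all illustrative assumptions, not the authors' actual implementation.

# Hypothetical sketch of the FinKENet-style pipeline described in the abstract.
# Module names, dimensions, and the GRU encoder are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CoAttentionAdapter(nn.Module):
    # Fuses sentence-level features with financial phrase-level features.
    def __init__(self, dim: int):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.proj = nn.Linear(2 * dim, dim)

    def forward(self, sent_feats, phrase_feats):
        # Sentence tokens attend over the financial key-phrase representations.
        attended, _ = self.attn(sent_feats, phrase_feats, phrase_feats)
        return self.proj(torch.cat([sent_feats, attended], dim=-1))

class FinKENetSketch(nn.Module):
    # Illustrative two-tower matcher: encode query and question, fuse each with
    # its financial-phrase features, then score the pair.
    def __init__(self, vocab_size: int = 30000, dim: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.sent_encoder = nn.GRU(dim, dim, batch_first=True)
        self.adapter = CoAttentionAdapter(dim)
        # Binary classifier over [q; c; |q - c|; q * c].
        self.decoder = nn.Linear(4 * dim, 2)

    def encode(self, token_ids, phrase_ids):
        sent, _ = self.sent_encoder(self.embed(token_ids))  # sentence-level features
        phrase = self.embed(phrase_ids)                      # phrase-level features
        fused = self.adapter(sent, phrase)
        return fused.mean(dim=1)                             # pooled representation

    def forward(self, q_ids, q_phrases, c_ids, c_phrases):
        q = self.encode(q_ids, q_phrases)
        c = self.encode(c_ids, c_phrases)
        pair = torch.cat([q, c, (q - c).abs(), q * c], dim=-1)
        return self.decoder(pair)                            # logits: match / no match

# Toy usage with random token ids; training uses the standard cross-entropy loss.
model = FinKENetSketch()
q_ids = torch.randint(0, 30000, (2, 16))
c_ids = torch.randint(0, 30000, (2, 16))
q_phr = torch.randint(0, 30000, (2, 4))
c_phr = torch.randint(0, 30000, (2, 4))
logits = model(q_ids, q_phr, c_ids, c_phr)
loss = F.cross_entropy(logits, torch.tensor([1, 0]))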

Funding

National Natural Science Foundation of China

Sichuan Science and Technology Program

Guanghua Talent Project of Southwestern University of Finance and Economics, and Financial Innovation Center, SWUFE

International Innovation Project

Fundamental Research Funds for the Central Universities

Publisher

MDPI AG

Subject

General Physics and Astronomy
