Authors: Grover, Khushnuma; Kaur, Katinder; Tiwari, Kartikey; Rupali; Kumar, Parteek
References: 23 articles.
1. Alammar, J.: The Illustrated Transformer. http://jalammar.github.io/illustrated-transformer/. Accessed 29 Oct 2020
2. Chan, Y.H., Fan, Y.C.: A recurrent BERT-based model for question generation. In: Proceedings of the 2nd Workshop on Machine Reading for Question Answering (2019). https://doi.org/10.18653/v1/d19-5821
3. Das, R., Ray, A., Mondal, S., Das, D.: A rule based question generation framework to deal with simple and complex sentences. In: 2016 International Conference on Advances in Computing, Communications and Informatics (ICACCI), pp. 542–548 (2016)
4. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186 (2019). https://doi.org/10.18653/v1/N19-1423. https://www.aclweb.org/anthology/N19-1423
5. Du, X., Shao, J., Cardie, C.: Learning to ask: neural question generation for reading comprehension. CoRR abs/1705.00106 (2017). http://arxiv.org/abs/1705.00106
Cited by: 9 articles.