Authors:
Ahmed Magooda, Diane Litman, Ahmed Ashraf, Muhsin Menekse
Publisher:
Springer International Publishing
References (13 articles):
1. Bommasani, R., Davis, K., Cardie, C.: Interpreting pretrained contextualized representations via reductions to static embeddings. In: Proceedings of ACL (2020)
2. Carpenter, D., Cloude, E., Rowe, J., Azevedo, R., Lester, J.: Investigating student reflection during game-based learning in middle grades science. In: LAK21: 11th International Learning Analytics and Knowledge Conference, pp. 280–291 (2021)
3. Carpenter, D.: In: Lecture Notes in Computer Science (Lecture Notes in Artificial Intelligence). Springer (2020)
4. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. arXiv:1810.04805 (2018)
5. Fan, X., Luo, W., Menekse, M., Litman, D., Wang, J.: Scaling reflection prompts in large classrooms via mobile interfaces and natural language processing. In: Proceedings of 22nd International Conference on Intelligent User Interfaces, pp. 363–374 (2017)
Cited by: 4 articles.