Authors:
Iida Tsumugi, Komatsu Takumi, Kaneda Kanta, Hirakawa Tsubasa, Yamashita Takayoshi, Fujiyoshi Hironobu, Sugiura Komei
Publisher:
Springer Nature Switzerland
References (37 articles):
1. Abnar, S., Zuidema, W.: Quantifying attention flow in transformers. arXiv preprint arXiv:2005.00928 (2020)
2. Adebayo, J., Gilmer, J., Muelly, M., Goodfellow, I., Hardt, M., Kim, B.: Sanity checks for saliency maps. In: NeurIPS, vol. 31 (2018)
3. Bello, I.: LambdaNetworks: modeling long-range interactions without attention. In: ICLR (2021)
4. Binder, A., Montavon, G., Lapuschkin, S., Müller, K.-R., Samek, W.: Layer-wise relevance propagation for neural networks with local renormalization layers. In: ICANN 2016. LNCS, vol. 9887. Springer (2016)
5. Chefer, H., Gur, S., Wolf, L.: Transformer interpretability beyond attention visualization. In: CVPR, pp. 782–791 (2021)