1. Brody, S., Alon, U., Yahav, E.: How attentive are graph attention networks? In: ICLR (2022)
2. Cheng, D., Wang, X., Zhang, Y., Zhang, L.: Graph neural network for fraud detection via spatial-temporal attention. TKDE, 3800–3813 (2020)
3. Chiang, W.L., Liu, X., Si, S., Li, Y., Bengio, S., Hsieh, C.J.: Cluster-GCN: an efficient algorithm for training deep and large graph convolutional networks. In: SIGKDD, pp. 257–266 (2019)
4. Deng, Z., Russakovsky, O.: Remember the past: Distilling datasets into addressable memories for neural networks. arXiv preprint arXiv:2206.02916 (2022)
5. Dong, T., Zhao, B., Lyu, L.: Privacy for free: How does dataset condensation help privacy? In: ICML, pp. 5378–5396 (2022)