1. TabEL: Entity Linking in Web Tables
2. Zhoujun Cheng, Haoyu Dong, Ran Jia, Pengfei Wu, Shi Han, Fan Cheng, and Dongmei Zhang. 2021. FORTAP: Using Formulas for Numerical-Reasoning-Aware Table Pretraining. arXiv preprint arXiv:2109.07323 (2021).
3. Kevin Clark, Minh-Thang Luong, Quoc V Le, and Christopher D Manning. 2020. Electra: Pre-training Text Encoders as Discriminators Rather Than Generators. arXiv preprint arXiv:2003.10555 (2020).
4. Web-scale table census and classification
5. Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. 2018. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. https://arxiv.org/abs/1810.04805