Abstract
The central tenet of molecular biology is that a protein's amino acid sequence determines its three-dimensional structure, and thus its function. However, proteins with similar sequences do not always fold into the same shape, and conversely, dissimilar sequences can adopt similar folds. In this work, we explore antibodies, a class of immune-system proteins whose local shapes are highly unpredictable even under small sequence variations. Inspired by the CLIP method [1], we propose contrastive sequence-structure pre-training (CSSP), a multimodal contrastive learning approach that aligns representations of antibody sequences and structures in a shared latent space. Integrating structural information leads both antibody and protein language models to show better correspondence with structural similarity and improves accuracy and data efficiency in downstream binding prediction tasks. We provide an optimised CSSP-trained model, AntiBERTa2-CSSP, for non-commercial use at https://huggingface.co/alchemab.
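The CLIP-style objective referenced in the abstract can be sketched as a symmetric contrastive (InfoNCE) loss over paired sequence and structure embeddings. This is a minimal illustration assuming the standard CLIP formulation; the function name, batch layout, and temperature value are illustrative and not taken from the paper.

```python
import numpy as np

def cssp_contrastive_loss(seq_emb, struct_emb, temperature=0.07):
    """Symmetric CLIP-style contrastive loss for a batch of paired
    sequence/structure embeddings; row i of each array is a matched pair."""
    # L2-normalise so dot products become cosine similarities.
    seq = seq_emb / np.linalg.norm(seq_emb, axis=1, keepdims=True)
    struct = struct_emb / np.linalg.norm(struct_emb, axis=1, keepdims=True)
    logits = seq @ struct.T / temperature      # (B, B) similarity matrix
    labels = np.arange(len(logits))            # matched pairs lie on the diagonal

    def xent(l):
        # Cross-entropy of each row against the diagonal target,
        # with max-subtraction for numerical stability.
        l = l - l.max(axis=1, keepdims=True)
        logp = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -logp[labels, labels].mean()

    # Average the sequence->structure and structure->sequence directions.
    return 0.5 * (xent(logits) + xent(logits.T))
```

Under this sketch, training pulls each antibody's sequence and structure embeddings together while pushing apart the embeddings of non-matching pairs in the batch.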
Publisher
Cold Spring Harbor Laboratory
References (31 articles)
1. A. Radford, J. W. Kim, C. Hallacy, A. Ramesh, G. Goh, S. Agarwal, G. Sastry, A. Askell, P. Mishkin, J. Clark, G. Krueger, and I. Sutskever, "Learning Transferable Visual Models From Natural Language Supervision," arXiv, Feb. 2021.
2. Advances in protein structure prediction and design
3. Evolutionary-scale prediction of atomic-level protein structure with a language model
4. "ProtTrans: Toward Understanding the Language of Life Through Self-Supervised Learning," IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022.
5. E. Nijkamp, J. Ruffolo, E. N. Weinstein, N. Naik, and A. Madani, "ProGen2: Exploring the Boundaries of Protein Language Models," arXiv, 2022.
Cited by 4 articles.