Abstract
Remarkable progress has been made in protein structure prediction in recent years. State-of-the-art methods such as AlphaFold2 and RoseTTAFold2 achieve prediction accuracy approaching that of experimental structure determination, but at the cost of heavy computation for model training. In this work, we propose a new protein structure prediction framework, Cerebra, to improve the computational efficiency of protein structure prediction. In this innovative network architecture, multiple sets of atomic coordinates are predicted in parallel, and their mutual complementarity is leveraged through a novel attention mechanism to rapidly improve the quality of the predicted structures. Consequently, Cerebra markedly reduces training cost, achieving at least a 7-fold training acceleration over OpenFold, the academic version of AlphaFold2. When evaluated on the CAMEO and CASP15 sets, a Cerebra model trained, albeit insufficiently, on a single GPU performs only slightly worse than the published OpenFold model.
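To make the core idea concrete, the following is a minimal toy sketch, not the Cerebra implementation: it only illustrates the general notion of combining several parallel coordinate predictions with softmax attention weights. All names (`fuse_coordinate_sets`, `scores`) and the per-atom weighting scheme are hypothetical assumptions for illustration.

```python
# Toy illustration (NOT the Cerebra architecture): fuse K parallel coordinate
# predictions into one structure using per-atom attention weights.
import numpy as np

def fuse_coordinate_sets(coord_sets: np.ndarray, scores: np.ndarray) -> np.ndarray:
    """Combine K predicted coordinate sets of shape (K, N_atoms, 3) into one
    (N_atoms, 3) set, weighting each prediction per atom by softmax-normalized
    attention scores of shape (K, N_atoms)."""
    # Numerically stable softmax over the K parallel predictions
    weights = np.exp(scores - scores.max(axis=0, keepdims=True))
    weights /= weights.sum(axis=0, keepdims=True)
    # Weighted average of the K coordinate sets, atom by atom
    return (weights[..., None] * coord_sets).sum(axis=0)

# Usage: 4 hypothetical parallel predictions of a 10-atom structure
rng = np.random.default_rng(0)
coords = rng.normal(size=(4, 10, 3))
scores = rng.normal(size=(4, 10))
fused = fuse_coordinate_sets(coords, scores)
print(fused.shape)  # (10, 3)
```

In the actual model, the attention weights would be learned and conditioned on the predictions themselves, so that the parallel coordinate sets correct one another across iterations rather than being averaged once.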
Publisher
Cold Spring Harbor Laboratory