Abstract
Accurate predictive modeling of human gene relationships would fundamentally transform our ability to uncover the molecular mechanisms that underpin key biological and disease processes. Recent studies have employed advanced AI techniques to model the complexities of gene networks using large gene expression datasets1–11. However, the extent and nature of the biological information these models can learn are not fully understood. Furthermore, the potential for improving model performance by using alternative data types, model architectures, and methodologies remains underexplored. Here, we developed GeneRAIN models by training on a large dataset of 410K human bulk RNA-seq samples, rather than the single-cell RNA-seq datasets used by most previous studies. We showed that although the models were trained only on gene expression data, they learned a wide range of biological information well beyond gene expression. We introduced GeneRAIN-vec, a state-of-the-art, multifaceted vectorized representation of genes. Further, we demonstrated the capabilities and broad applicability of this approach by making 4,797 biological attribute predictions for each of 13,030 long non-coding RNAs (62.5 million predictions in total). These achievements stem from several methodological innovations, including experimenting with multiple model architectures and a new ‘Binning-By-Gene’ normalization method. Comprehensive evaluation of our models clearly demonstrated that they significantly outperformed current state-of-the-art models3,12. This study improves our understanding of the capabilities of Transformer and self-supervised deep learning when applied to extensive expression data. Our methodological advancements offer crucial insights into refining these techniques. These innovations are set to significantly advance our understanding and exploration of biology.
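The abstract names a ‘Binning-By-Gene’ normalization step but does not describe its implementation. The sketch below is only an illustrative guess at what discretizing expression values per gene (rather than per sample) could look like before tokenizing them for a Transformer; the function name, quantile strategy, and bin count are assumptions and not the paper's published method.

import numpy as np

def bin_expression_by_gene(expr: np.ndarray, n_bins: int = 10) -> np.ndarray:
    """Discretize a (samples x genes) expression matrix into integer bins,
    computing bin edges per gene (column) rather than per sample.

    Hypothetical illustration only; not the GeneRAIN implementation.
    """
    binned = np.zeros_like(expr, dtype=np.int64)
    for g in range(expr.shape[1]):
        values = expr[:, g]
        # Quantile-based edges: each gene's own expression range defines its bins.
        edges = np.quantile(values, np.linspace(0.0, 1.0, n_bins + 1))
        edges = np.unique(edges)  # guard against ties producing duplicate edges
        # Map each value to a bin index in [0, n_bins - 1].
        binned[:, g] = np.clip(np.digitize(values, edges[1:-1]), 0, n_bins - 1)
    return binned

# Toy usage: bulk RNA-seq values binned gene-by-gene into discrete expression tokens.
toy = np.random.lognormal(size=(100, 5))
tokens = bin_expression_by_gene(toy, n_bins=10)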
Publisher
Cold Spring Harbor Laboratory