Abstract
In energy exploration, fluid identification from well logs plays a pivotal role in guiding exploration efforts. Knowing the composition of subsurface fluids allows exploration teams to accurately assess the size, potential reserves, and quality of oil and gas resources, which in turn informs exploration strategy and the efficient allocation of resources. We present a novel machine learning architecture, termed "PIFormer", for fluid identification that combines Persistence Initialization with a Transformer module: the persistent initialization feature representation is fed as input to the Transformer. Persistent initialization provides a stable starting point, enabling the Transformer to converge to effective feature representations more rapidly during training and mitigating the training instability, slow convergence, and local optima caused by random initialization in existing methods. By integrating the two components, prior knowledge and global information are exploited more effectively, improving the accuracy and robustness of fluid identification. Compared with existing models, PIFormer achieves higher accuracy and robustness across various types of fluid identification problems, while also substantially reducing training time and improving convergence speed. These results indicate that combining persistent initialization with a Transformer effectively addresses the limitations of existing models for fluid identification and opens new avenues for research and application in this field.
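To make the described coupling concrete, the sketch below shows one plausible way a persistent (learned, shared across samples) initial representation could be prepended to well-log features before a Transformer encoder. This is a minimal illustration under assumed names and dimensions (`n_features`, `d_model`, the initialization token, the classification head); the abstract does not specify the paper's actual implementation.

```python
import torch
import torch.nn as nn

class PIFormer(nn.Module):
    """Illustrative sketch: a persistent, learnable initial representation
    is prepended to embedded log-curve features and processed by a
    Transformer encoder. All design details here are assumptions."""

    def __init__(self, n_features=8, d_model=64, n_heads=4, n_layers=2, n_classes=3):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        # Persistent initialization: a fixed, learnable starting point shared
        # across samples, in place of a randomly initialized hidden state.
        self.persistent_init = nn.Parameter(torch.zeros(1, 1, d_model))
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):  # x: (batch, depth_steps, n_features)
        h = self.embed(x)
        init = self.persistent_init.expand(h.size(0), -1, -1)
        h = torch.cat([init, h], dim=1)  # stable starting token first
        h = self.encoder(h)
        return self.head(h[:, 0])  # classify fluid type from the init token

# Hypothetical usage: 4 wells, 16 depth steps, 8 log curves per step.
logits = PIFormer()(torch.randn(4, 16, 8))
```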