Affiliation:
1. Baker Hughes, United States
2. Saudi Aramco, Saudi Arabia
Abstract
The main objective of this study is to develop a robust AI model to estimate the level of bit wear during real-time field deployments. We aim to enhance drilling efficiency by accurately predicting bit wear using limited offset training data with minimal subsurface information. The proposed model has practical implications, enabling informed decisions, optimized maintenance schedules, and reduced downtime in drilling operations.
We collected drilling parameters (e.g., RPM, ROP) and subsurface data (e.g., gamma ray) from seven offset wells to train the AI model. Missing data and outliers were handled through interpolation. The data were then standardized to ensure consistent scales and segmented into windows. An LSTM-VAE model was designed and trained in an unsupervised manner. To predict bit wear, the trained model first projects each data window into a latent space; similarity scores across runs are then computed, and an XGBoost model is trained to predict bit wear from the relative positions of the input data in the latent space.
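The preprocessing steps above (standardization, windowing) and the cross-run similarity scoring can be sketched as follows. This is a minimal illustration only: the window length, stride, and the use of cosine similarity against a reference-run centroid are assumptions for demonstration, not details given in the abstract, and the LSTM-VAE encoder and XGBoost stages are omitted.

```python
import numpy as np

def standardize(X):
    # Zero-mean, unit-variance scaling per channel (e.g., RPM, ROP, gamma ray)
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    sigma[sigma == 0] = 1.0  # guard against constant channels
    return (X - mu) / sigma

def make_windows(X, window, stride):
    # Segment a (time_steps, channels) array into overlapping windows
    n = (X.shape[0] - window) // stride + 1
    return np.stack([X[i * stride : i * stride + window] for i in range(n)])

def similarity_scores(latents, reference):
    # Cosine similarity of each window's latent vector to a reference-run
    # centroid (an assumed form of the paper's cross-run similarity score)
    ref = reference / np.linalg.norm(reference)
    norms = np.linalg.norm(latents, axis=1, keepdims=True)
    return (latents / norms) @ ref

# Example: 500 time steps of 3 drilling/subsurface channels
rng = np.random.default_rng(0)
X = standardize(rng.normal(size=(500, 3)))
W = make_windows(X, window=64, stride=16)
print(W.shape)  # (28, 64, 3)

# Stand-in "latents": in the real pipeline these would come from the
# trained LSTM-VAE encoder, one vector per window
latents = W.reshape(W.shape[0], -1)
scores = similarity_scores(latents, latents.mean(axis=0))
```

In the described pipeline, the `scores` (and the latent coordinates themselves) would then serve as features for the XGBoost bit-wear regressor.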
In the real-time field study, ten bits from five different vendors were examined to assess the performance of the AI model. To test the model's robustness, only gamma ray logging was used for formation interpretation. The model correctly predicted the dull status in six of the eight qualified bit evaluations. Furthermore, it accurately predicted the dull grade for bits from different vendors, indicating that it is agnostic to specific bit types. These results validate the robustness and transferability of the developed AI bit wear model, particularly in challenging drilling applications.
The novelty of this paper lies in the development of an AI model for real-time bit wear estimation given limited inputs and offset training data. The use of an LSTM-VAE model, coupled with XGBoost for estimation, presents a unique approach to the challenges of accurate bit wear prediction. Additionally, this study introduces similarity scores across runs to further simplify the latent space, enhancing the model's robustness.