Affiliation:
1. University of Southern California
2. Aramco Americas
Abstract
Given sufficiently extensive data, deep-learning models can effectively predict the behavior of unconventional reservoirs. However, current approaches to building these models do not directly reveal the causal effects of flow behavior, the underlying physics, or well-specific correlations, especially when the models are trained on data from multiple wells across a large field. Field observations indicate that wells within a single reservoir do not exhibit uniform production behavior. This makes pre-filtering the data to build local models that capture region-specific correlations more pertinent than a single global model, which yields averaged-out predictions across different correlations.
In this work, we investigate an attention-based network architecture that expedites the clustering process through training of a global model. We apply transformer neural networks to the input data before mapping to the target variable, extracting attention scores that relate well properties to production performance. We leverage the interpretability of these attention-based models to improve the prediction performance of data-centric models derived from clustered datasets. The resulting local models are more accurate because they learn correlations that are more region- and data-specific. Specifically, the attention mechanism allows us to separate and curate data subsets for training local models, improving prediction performance by reducing the variability across the entire field.
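The workflow described above can be illustrated with a minimal sketch: compute attention scores between wells from a (here randomly initialized) query/key projection, then cluster wells by their attention rows so that each cluster can train its own local model. All data, dimensions, and weights below are hypothetical stand-ins; in the paper's workflow the projections would come from the trained global transformer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: 12 wells x 6 static properties
# (e.g. porosity, lateral length, proppant loading).
n_wells, n_feat, d_k = 12, 6, 4
X = rng.normal(size=(n_wells, n_feat))

# Random projections standing in for a trained transformer's
# query/key weight matrices.
Wq = rng.normal(size=(n_feat, d_k))
Wk = rng.normal(size=(n_feat, d_k))

def attention_scores(X, Wq, Wk):
    """Scaled dot-product attention scores between wells (rows sum to 1)."""
    Q, K = X @ Wq, X @ Wk
    logits = Q @ K.T / np.sqrt(K.shape[1])
    e = np.exp(logits - logits.max(axis=1, keepdims=True))  # stable softmax
    return e / e.sum(axis=1, keepdims=True)

def kmeans(A, k, iters=50, seed=0):
    """Minimal k-means used to group wells by their attention-score rows."""
    r = np.random.default_rng(seed)
    centers = A[r.choice(len(A), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((A[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = A[labels == j].mean(axis=0)
    return labels

A = attention_scores(X, Wq, Wk)       # (12, 12) well-to-well attention
labels = kmeans(A, k=3)               # cluster assignment per well
# Each cluster's wells would then train a separate local prediction model.
```

Wells with similar attention patterns attend to the same neighbors, so clustering on the attention rows is one plausible way to curate the region-specific data subsets described above.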
Cited by 2 articles.