Abstract
Recently, Ensemble Kalman Filtering (EnKF) has gained increasing attention for history matching and continuous reservoir model updating using data from permanent downhole sensors. It is a sequential Monte Carlo approach that works with an ensemble of reservoir models. Specifically, the method utilizes cross-covariances between measurements and model parameters estimated from the ensemble. For practical field applications, the ensemble size needs to be kept small for computational efficiency. However, this leads to poor approximations of the cross-covariance matrix, resulting in a loss of geologic realism. In particular, the updated parameter field tends to become scattered, with a loss of connectivity of extreme values such as high-permeability channels and low-permeability barriers, which are of special significance during reservoir characterization.
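As a quick numerical illustration of this sampling limitation (a synthetic toy example of our own, not taken from the paper), the cross-covariance between a measurement and a parameter that is truly uncorrelated with it can be far from zero when estimated from a small ensemble:

```python
import numpy as np

rng = np.random.default_rng(0)

# A model parameter and a measurement that are uncorrelated by construction,
# so the true cross-covariance is exactly zero.
for n_ens in (10, 50, 1000):
    m = rng.standard_normal(n_ens)   # parameter samples across the ensemble
    d = rng.standard_normal(n_ens)   # independent "measurement" samples
    c_md = np.cov(m, d)[0, 1]        # ensemble-estimated cross-covariance
    print(f"ensemble size {n_ens:4d}: sample cross-covariance = {c_md:+.3f}")

# For small ensembles the sampling noise (~ 1/sqrt(n_ens)) produces sizable
# spurious cross-covariances that the update step can misinterpret as real
# physical correlation, scattering the updated parameter field.
```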
We propose a novel approach to overcome this limitation of the EnKF through a ‘covariance localization’ method that utilizes sensitivities quantifying the influence of model parameters on the observed data. These sensitivities are used in the EnKF to modify the cross-covariance matrix in order to reduce unwanted influences of distant observation points on model parameter updates. In particular, streamline-based analytic sensitivities are easy to compute, require very little extra computational effort, and can be obtained using either a finite-difference or a streamline-based flow simulator.
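One way to picture the role of the sensitivities is as an element-wise taper on the ensemble cross-covariance. The sketch below is a simplified illustration under our own normalization choice, not the paper's exact formulation; the function and array names (`localize_cross_covariance`, `C_xd`, `S`, `rho`) are hypothetical:

```python
import numpy as np

def localize_cross_covariance(C_xd, S, eps=1e-12):
    """Damp the ensemble cross-covariance between model parameters and
    observed data wherever the data show negligible sensitivity to a
    parameter.

    C_xd : (n_params, n_obs) ensemble-estimated cross-covariance matrix.
    S    : (n_params, n_obs) sensitivity coefficients, e.g. streamline-based
           sensitivities of each observation to each parameter.
    """
    # Build a localization (taper) matrix from normalized sensitivity
    # magnitudes: near 1 where an observation is strongly influenced by a
    # parameter, near 0 where it is not.
    S_abs = np.abs(S)
    rho = S_abs / np.maximum(S_abs.max(axis=0, keepdims=True), eps)

    # Schur (element-wise) product suppresses spurious long-range
    # correlations while preserving flow-relevant ones.
    return rho * C_xd
```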
We show that the effect of the covariance localization is to increase the effective ensemble size. However, the key to the success of the sensitivity-based covariance localization is its close link to the underlying physics of flow, in contrast to the simple distance-dependent covariance functions used in the past. This flow-relevant conditioning leads to an efficient and robust approach for history matching and continuous reservoir model updating, avoiding many of the problems in the traditional EnKF associated with instabilities, parameter overshoots, and loss of geologic continuity. We illustrate the power and utility of our approach using both synthetic and field applications.
Introduction
In recent years, there has been a paradigm shift from attempting to ‘history match’ a single reservoir model to generating a suite of realizations consistent with all dynamic data and prior geologic information. Predicting future reservoir performance with these multiple realizations would provide a measure of uncertainty in model forecasts, leading to better reservoir development and management strategies. This effort has been aided by the development of robust and efficient algorithms for automatic and assisted history matching,1–3 and by the availability of greater computational power. The Ensemble Kalman Filter (EnKF) is one such promising technique for generating a suite of plausible reservoir models.4–10 The EnKF samples from multi-dimensional probability density functions (pdf) that are consistent with our prior knowledge of the model parameters. These samples, or realizations, help specify covariances between model parameters as well as cross-covariances that relate measurements and model parameters. Instead of computing gradients as in variational methods, these covariances and cross-covariances are used to update the models.
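The following minimal sketch shows how such an analysis step uses ensemble-estimated covariances in place of gradients. It is a generic, simplified implementation of our own, not tied to any particular EnKF code; the names `forward`, `obs_err_std`, and the perturbed-observation treatment are assumptions for illustration:

```python
import numpy as np

def enkf_analysis(X, d_obs, forward, obs_err_std, rng=None):
    """One EnKF analysis step: update an ensemble of model-parameter
    vectors X (n_params x n_ens) toward observations d_obs using
    covariances estimated from the ensemble itself, with no gradients."""
    rng = np.random.default_rng() if rng is None else rng
    n_ens = X.shape[1]
    n_obs = len(d_obs)

    # Predicted data for each ensemble member from the forward model.
    D = np.column_stack([forward(X[:, j]) for j in range(n_ens)])

    # Anomalies (deviations from the ensemble means).
    A = X - X.mean(axis=1, keepdims=True)
    Y = D - D.mean(axis=1, keepdims=True)

    # Ensemble estimates of the parameter-data cross-covariance and of the
    # data covariance (with measurement-error variance added).
    C_xd = A @ Y.T / (n_ens - 1)
    C_dd = Y @ Y.T / (n_ens - 1) + obs_err_std**2 * np.eye(n_obs)

    # Kalman gain and update against perturbed observations.
    K = C_xd @ np.linalg.inv(C_dd)
    D_pert = d_obs[:, None] + obs_err_std * rng.standard_normal((n_obs, n_ens))
    return X + K @ (D_pert - D)
```

In a localized variant, the cross-covariance `C_xd` would be tapered (for example by the sensitivity-based weighting sketched earlier) before forming the gain.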
The increased deployment of permanent downhole sensors and intelligent well systems that provide a continuous stream of information has made the EnKF an appealing method for sequential model updating.4–10 The capability to maintain ‘live models’, combined with the ability to assimilate diverse data types and the ease of implementation, has resulted in increased research effort and interest in the EnKF.