Abstract
Today an increasing number of oil and gas wells are equipped with downhole and topside gauges for measurements of pressure and rates. These gauges can provide data with a resolution of up to one sample per second, which for each sensor can accumulate to millions of data points every year. Often only a small fraction of these data is analyzed to retrieve the information it carries about well and reservoir conditions, largely because the analysis is time-consuming and engineering time is limited.
One analysis that can be performed on continuous data from permanent gauges is well test analysis after any well shut-in. In this paper, we show that this analysis can be automated to minimize the associated engineering effort. This provides asset teams with additional information about well and reservoir conditions, since today's engineering workload does not allow data from every well shut-in to be analyzed.
The analysis is performed on real-time data for each detected shut-in period, providing time series of the estimated parameters such as skin, permeability, reservoir pressure and drainage area. We show the benefit of linking the automated well test to an automated wavelet filter, whose purpose is outlier removal, denoising, compression and transient detection. This allows a system in which the data are first automatically filtered, compressed and made ready for analysis, and subsequently analyzed, with a notification (including a qualification of the success of the analysis) sent to the relevant engineers. Finally, hypothesis testing is performed to detect changes in the data; if no changes are detected, average parameter values can be calculated. We show examples of such a completely automated well-test analyzer.
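The change-detection step described above can be illustrated with a minimal sketch. Here we assume a two-sample Welch's t-statistic comparing a baseline window of parameter estimates (e.g. permeability from earlier shut-ins) against the most recent window; the function names and the critical value are ours, not from the paper, and the actual system may use a different test.

```python
import math

def welch_t(a, b):
    # Welch's t-statistic for two samples with possibly unequal variances.
    # Assumes each sample has at least two points and nonzero variance.
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

def change_detected(baseline, recent, t_crit=2.0):
    # Flag a change in well/reservoir condition when the recent
    # parameter estimates differ significantly from the baseline.
    return abs(welch_t(baseline, recent)) > t_crit

def pooled_average(baseline, recent):
    # If no change is detected, all estimates describe the same
    # condition and can be averaged together.
    combined = list(baseline) + list(recent)
    return sum(combined) / len(combined)
```

For example, permeability estimates drifting from around 100 md to around 120 md between windows would be flagged as a change, while two statistically similar windows would instead be pooled into a single average.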
Introduction
Improved use of real-time data from downhole gauges has been a major concern in the oil industry for some time. As an increasing number of oil and gas wells are equipped with downhole and topside gauges for measurements of pressure, temperature and rate, more information about wells and reservoirs can be retrieved more frequently and accurately than before. However, the large amount of data involved can make this a challenging task for the engineers. During the last decade, wavelet filtering and compression have become popular as a way to address this problem1–4. In several of the papers on wavelet filtering, much of the effort has gone into automating the analysis, or at least semi-automating it, in order to reduce the engineering effort associated with performing it. Athichanagorn et al.2 proposed a seven-step procedure to achieve this, including outlier removal, denoising, transient detection, compression, rate reconstruction, a window approach for parameter estimation, and removal of bad transients. As shown by Olsen et al.3,5, the wavelet techniques involved in outlier removal, denoising, compression and transient detection can be considerably improved by choosing the "best" wavelet techniques for the signals considered. The authors believe that, by utilizing the improved procedure of Olsen et al., a completely automated "universal best" wavelet analysis will be possible for the signals considered. This will provide continuously filtered and updated data for application in automated well testing.
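As a minimal illustration of the wavelet operations these procedures build on, the sketch below applies a single-level Haar transform to a pressure record: small detail coefficients are treated as noise and zeroed (denoising), while large detail coefficients mark abrupt pressure changes such as a shut-in (transient detection). The cited work selects considerably more sophisticated wavelets per signal; this Haar example and its function names are our simplification, not the published procedure.

```python
import math

def haar_step(x):
    # One level of the Haar discrete wavelet transform.
    # len(x) must be even; returns (approximation, detail) coefficients.
    s = math.sqrt(2.0)
    approx = [(x[2 * i] + x[2 * i + 1]) / s for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / s for i in range(len(x) // 2)]
    return approx, detail

def inverse_haar_step(approx, detail):
    # Exact inverse of haar_step.
    s = math.sqrt(2.0)
    x = []
    for a, d in zip(approx, detail):
        x.append((a + d) / s)
        x.append((a - d) / s)
    return x

def denoise(x, threshold):
    # Hard thresholding: detail coefficients below the threshold
    # are assumed to be gauge noise and are removed.
    approx, detail = haar_step(x)
    detail = [d if abs(d) > threshold else 0.0 for d in detail]
    return inverse_haar_step(approx, detail)

def transient_indices(x, threshold):
    # Detail coefficients above the threshold mark abrupt changes
    # in the signal, e.g. the start of a shut-in transient.
    _, detail = haar_step(x)
    return [2 * i for i, d in enumerate(detail) if abs(d) > threshold]
```

On a flat pressure record with small noise, thresholding leaves only the local trend, while a pressure build-up step of tens of units produces a detail coefficient far above any noise level and is reported as a transient.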