Abstract
Big data technology is applied to analyze the massive micro-seismic dataset, incorporating data that was previously overlooked. This, in turn, yields redundant, stage-by-stage fracture modelling and an accurate fracture propagation map in real time. Micro-seismic monitoring is pivotal to the success of hydraulic fracturing activity. However, with the advent of advanced geophones, the massive dataset demands an analytical point of view that is currently absent from conventional database processing and algorithms. Hadoop provides the tools necessary for better, more advanced real-time processing and analysis.
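As a minimal sketch of how Hadoop's processing tooling could be pointed at such a dataset, the hypothetical Hadoop Streaming pair below counts detected micro-seismic events per fracturing stage. The input layout (tab-separated stage_id, timestamp, magnitude) and the script names are assumptions made for illustration, not part of the original work.

```python
#!/usr/bin/env python3
# mapper.py -- hypothetical Hadoop Streaming mapper (assumed input layout:
# stage_id<TAB>timestamp<TAB>magnitude, one detected event per line).
import sys

for line in sys.stdin:
    fields = line.rstrip("\n").split("\t")
    if len(fields) < 3:
        continue  # skip malformed records instead of failing the job
    stage_id = fields[0]
    # Emit one count per detected micro-seismic event, keyed by stage.
    print(f"{stage_id}\t1")
```

```python
#!/usr/bin/env python3
# reducer.py -- hypothetical Hadoop Streaming reducer. Hadoop delivers the
# mapper output sorted by key, so records for the same stage are adjacent.
import sys

current_stage, count = None, 0
for line in sys.stdin:
    stage_id, value = line.rstrip("\n").split("\t")
    if stage_id != current_stage:
        if current_stage is not None:
            print(f"{current_stage}\t{count}")
        current_stage, count = stage_id, 0
    count += int(value)
if current_stage is not None:
    print(f"{current_stage}\t{count}")
```

Such a pair would typically be submitted through the standard Hadoop Streaming jar with `-mapper`, `-reducer`, `-input`, and `-output` options; the exact job configuration is not specified in the abstract.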
Various algorithms are developed to deliver a comprehensive analysis of the fracturing operation. Previous job failures are used to predict future anomalies, thereby enhancing the success ratio. The holistic dataset for the reservoir (exploration, drilling, and production) is considered to synchronize the reservoir information. For example, drilling data (ROP, drillability, WOB, etc.) is analyzed to predict the formation behaviour (brittle or ductile), the poroelastic constant, elasticity, and so on. A comprehensive analysis of fracture propagation considers all associated parameters, not just the conventional ones.
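The drilling-data-driven prediction described above could be sketched, under assumptions, as a supervised classification step. The feature set (ROP, WOB, torque), the synthetic labels, and the choice of scikit-learn's RandomForestClassifier are illustrative only and are not claimed to be the paper's actual method.

```python
# Hypothetical sketch: predicting brittle vs. ductile formation behaviour
# from drilling parameters (ROP, WOB, torque). Feature set, labels, and the
# model choice are assumptions for illustration, not the paper's workflow.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-in for per-interval drilling records: [ROP, WOB, torque]
X = rng.normal(size=(500, 3))
# Synthetic labels: 1 = brittle, 0 = ductile (placeholder rule, not physics)
y = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.3, size=500) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```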
The dataset is stored in Hadoop and retrieved whenever needed. This volume of data cannot be processed in conventional databases, but it can be integrated using Hadoop. The analytical results provided by Hadoop stand apart from those of conventional formula-based software. The visualization of results keeps the scope of error to a minimum, in contrast with the tools currently in use, in which many data trends and parameters are left out because they do not fit the formulae. With Big Data those patterns are shown visually and incorporated into the analysis, yielding better mapping of fractures. Beyond complementing current analysis, Big Data offers the scope for comprehensive analysis from start to end. When 3D seismic appeared, it was a radical change: it showed not only that 2D maps were of low resolution, but that they could be outright misleading. Hadoop analytics likewise provides a unique perspective, exposing mismatches that need to be taken seriously in future planning.
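A minimal sketch of how heterogeneous reservoir datasets stored on a Hadoop cluster might be integrated and queried together is given below, assuming Spark over HDFS. The HDFS paths, column names, and the join key (well_id, stage_id) are hypothetical placeholders, not the data layout of the original study.

```python
# Hypothetical PySpark sketch: joining drilling and micro-seismic records
# stored on HDFS so they can be analyzed together. Paths, schemas, and the
# join key are assumptions for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("frac-data-integration").getOrCreate()

drilling = spark.read.csv("hdfs:///data/drilling.csv",
                          header=True, inferSchema=True)
microseismic = spark.read.csv("hdfs:///data/microseismic.csv",
                              header=True, inferSchema=True)

# Integrate the two sources on shared identifiers and summarise per stage.
joined = microseismic.join(drilling, on=["well_id", "stage_id"], how="inner")
summary = (
    joined.groupBy("well_id", "stage_id")
          .agg(F.count("*").alias("event_count"),
               F.avg("magnitude").alias("avg_magnitude"),
               F.avg("rop").alias("avg_rop"))
)
summary.show()
```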
The resulting model does not use conventional formulae and is therefore not limited in the real-time data it can consider. Instead, the field data (noise included) is analyzed with algorithms that generate trends from the noise and deviations. Conventional software misses a massive amount of relevant data that simply cannot be incorporated into formulae; that shortcoming is addressed by Big Data analytics. Conventional database management is unable to handle data at this scale, which is taken care of by the Hadoop platform.
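As an illustration of extracting a trend from noisy field measurements and flagging deviations from it, the pandas-based sketch below uses a rolling median and a robust threshold on the residual. The signal, window length, and threshold are arbitrary assumptions, not parameters from the paper.

```python
# Hypothetical sketch: separating a trend from noisy field data and flagging
# deviations. Window size and threshold are illustrative assumptions.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Synthetic stand-in for a noisy field measurement (e.g. treating pressure).
t = np.arange(1000)
signal = 0.01 * t + rng.normal(scale=0.5, size=t.size)
signal[600:610] += 4.0  # injected anomaly for illustration

series = pd.Series(signal, index=t)

# Rolling median gives a noise-resistant trend estimate.
trend = series.rolling(window=51, center=True, min_periods=1).median()
residual = series - trend

# Flag samples whose residual exceeds 3 robust standard deviations (via MAD).
mad = residual.abs().median()
robust_sigma = 1.4826 * mad
anomalies = residual.abs() > 3 * robust_sigma

print("flagged samples:", int(anomalies.sum()))
```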