Reducing turnaround by automating velocity model building using a big data approach
- Publisher: European Association of Geoscientists & Engineers
- Source: Conference Proceedings, EAGE/AAPG Digital Subsurface for Asia Pacific Conference, Sep 2020, Volume 2020, p. 1-4
Abstract
Most steps in seismic processing require manual input, repetitive testing and significant quality control. Key steps such as velocity model building for depth imaging can be especially time-consuming because conventional workflows are ‘stop-go’ linear processes. If model building can be automated, project timelines could be significantly reduced, enabling better decision making during evaluation, planning and development. Probabilistic simulations such as Monte Carlo methods use random sampling to solve problems where the solution may be mixed-determined or under-determined. When using this type of ‘big data’ approach for velocity model building, we first need to understand how data quality impacts the model. We then create a population of models to solve with inverse theory. A simple statistics-driven reinforcement loop applied to each population of inverted models enables the process to be automated over numerous repeated cycles, continuing until the inversion converges on a global solution. The size of the population in each pass is data-dependent, but over the lifecycle of the model build the number of models evaluated will be several orders of magnitude greater than in conventional model building. This method transforms the model build from a linear, ‘stop-go’ process into an automated, continuous workflow.
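The abstract describes the loop only at a high level. As a minimal sketch, assuming a toy layered-earth forward model and a cross-entropy-style update (the function names, population size, elite fraction and convergence threshold below are illustrative assumptions, not taken from the paper), a statistics-driven Monte Carlo reinforcement loop could look like this:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def forward_model(velocities, depths):
    """Toy forward operator: vertical two-way traveltime per layer.

    A stand-in for the real depth-imaging forward problem, which the
    abstract does not specify.
    """
    thickness = np.diff(depths, prepend=0.0)
    return 2.0 * thickness / velocities

def misfit(velocities, depths, observed):
    """L2 data misfit between modelled and observed traveltimes."""
    return np.sum((forward_model(velocities, depths) - observed) ** 2)

# Hypothetical setup: five layers, with "observed" data generated
# from a hidden true model for demonstration purposes.
depths = np.array([500.0, 1200.0, 2000.0, 3000.0, 4200.0])
true_v = np.array([1500.0, 1800.0, 2200.0, 2700.0, 3200.0])
observed = forward_model(true_v, depths)

# Initial sampling distribution for the model population.
mean = np.full(5, 2000.0)    # prior velocity guess (m/s)
spread = np.full(5, 600.0)   # prior uncertainty (m/s)

for cycle in range(50):
    # Monte Carlo step: draw a population of candidate velocity models.
    population = rng.normal(mean, spread, size=(500, 5))
    population = np.clip(population, 1400.0, 6000.0)

    # Score every candidate against the observed data.
    scores = np.array([misfit(m, depths, observed) for m in population])

    # Statistics-driven reinforcement: refit the sampling distribution
    # to the best-scoring subset of the population.
    elite = population[np.argsort(scores)[:50]]
    mean, spread = elite.mean(axis=0), elite.std(axis=0)

    # Stop once the population has collapsed onto a single solution.
    if spread.max() < 1.0:
        break

print(f"converged after {cycle + 1} cycles, best model: {mean.round(1)}")
```

Refitting the sampling distribution to the elite subset is what makes the loop ‘statistics driven’: each cycle’s population statistics steer the next cycle’s random sampling, and the repetition ends when the population converges on a single model.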