First Break - Volume 36, Issue 12, 2018
AVO inversion on unconventional reservoirs: systematic estimation of uncertainty in the Vaca Muerta shale
Abstract
Amplitude versus offset (AVO) inversion is widely used in reservoir characterization, but its implementation on unconventional reservoirs represents a challenge owing to the subtle changes in the elastic parameters that demarcate profitable areas. For this reason, it is important to have an estimate of the uncertainty expected from seismic inversion to improve the interpretation of inverted results. We have implemented a systematic estimation of the uncertainty of relative elastic properties obtained by damped least-squares inversion of the Aki and Richards and Fatti et al. approximations of Zoeppritz's equations. Well log data from the Vaca Muerta unconventional shale in Argentina were used to generate synthetic angle gathers, varying the signal-to-noise ratio, the range of angle coverage and the number of partial angle stacks. The inversion results provide a systematic estimate of the uncertainty of the inverted coefficients. Results show that the signal-to-noise ratio and the range of angle coverage of the seismic data are key parameters in assessing the feasibility of the inversion. In the case of characterization based on acoustic impedance, the inverted parameters show good correlation with the model, with these correlation values being predicted by the values of the parameter resolution matrix.
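The damped least-squares inversion and parameter resolution matrix mentioned in the abstract can be sketched as follows; the angle range, damping factor, noise level and model coefficients below are hypothetical stand-ins, not values from the study.

```python
import numpy as np

# Aki-Richards three-term AVO forward operator: R(theta) = A + B sin^2(theta)
# + C sin^2(theta) tan^2(theta); all numbers here are illustrative only.
angles = np.deg2rad(np.arange(5.0, 41.0, 5.0))
s2 = np.sin(angles) ** 2
G = np.column_stack([np.ones_like(angles), s2, s2 * np.tan(angles) ** 2])

true_m = np.array([0.05, -0.12, 0.03])            # intercept, gradient, curvature
rng = np.random.default_rng(0)
d = G @ true_m + 0.005 * rng.standard_normal(len(angles))   # noisy amplitudes

# Damped least squares: m = (G^T G + eps I)^-1 G^T d
eps = 1e-3                                        # damping factor
GtG = G.T @ G
m_est = np.linalg.solve(GtG + eps * np.eye(3), G.T @ d)

# Parameter resolution matrix: diagonals near 1 indicate well-resolved terms
R = np.linalg.solve(GtG + eps * np.eye(3), GtG)
```

Repeating this over many noise realizations, angle ranges and numbers of partial stacks is one way to build up the kind of systematic uncertainty estimate the abstract describes.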
Seismic wavefield divergence at the free surface
Authors: Pascal Edme, Everhard Muyzert, Nicolas Goujon, Nihed El Allouche and Ed Kragh
Abstract
This paper explores the concept of pressure measurements in a land seismic acquisition setting. We first review the theory for pressure measurements near the surface of the Earth and show the significance of the S-to-P conversion, which results in the pressure being proportional to the sum of the slowness-scaled horizontal velocity fields. In the second part of the paper we test a land hydrophone prototype in a small-scale experiment to validate the pseudo-pressure measurement. Potential applications are briefly discussed. This study suggests that such a land hydrophone can potentially allow local and omni-directional noise attenuation, via adaptive subtraction using the pseudo-pressure data as a noise model, and can therefore allow sparser acquisition geometries with an associated reduction in field effort and cost.
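As a toy illustration of the relation quoted in the abstract (pressure proportional to the sum of the slowness-scaled horizontal velocity fields), one can combine two synthetic horizontal-geophone traces; the wavelet, slownesses and amplitudes below are all hypothetical, not from the paper.

```python
import numpy as np

# One synthetic plane-wave event on two horizontal geophone components.
t = np.linspace(0.0, 1.0, 501)
wavelet = np.exp(-((t - 0.5) ** 2) / (2.0 * 0.01 ** 2))   # Gaussian pulse

px, py = 2.0e-4, 1.0e-4        # horizontal slownesses, s/m (assumed)
vx = 0.8 * wavelet             # inline particle-velocity component
vy = 0.3 * wavelet             # crossline particle-velocity component

# Pseudo-pressure as described: sum of slowness-scaled horizontal velocities
pseudo_p = px * vx + py * vy
```

Such a pseudo-pressure trace is what the abstract proposes as a local noise model for adaptive subtraction.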
The melding of artificial and human intelligence in digital subsurface workflows: a historical perspective
Abstract
Artificial intelligence (AI) is not some elusive, mystical technology that humanity is chasing, especially with regard to its use in digital subsurface workflows. Artificial intelligence has been complementing human intelligence since the 1960s, and AI was deeply integrated into our personal and professional lives long before the technological revolutions of the 21st century. However, we tend not to realize how intrinsic AI already is to our lives. We are constantly moving the goalposts for defining AI as it solves more and more problems. This is known as the AI effect, whereby people tend to think of AI only as 'whatever hasn't been done yet' (Hofstadter, 1979). This article reviews the historical melding of human and artificial intelligence in digital subsurface workflows, with particular focus on the field of geophysics.
Analysis of gas production data via an intelligent model: application to natural gas production
Authors: Mohammad Ali Ahmadi and Zhangxin Chen
Abstract
Predicting future oil and gas production rates and evaluating oil/gas reserves are very challenging tasks. Many engineers have found decline curve analysis a useful approach (Ahmed, 2010; Arps, 1945; Ebrahimi, 2010; Fetkovich, 1980; Gentry, 1972; Li and Horne, 2005; Ling and He, 2012; Oghena, 2012; Shirman, 1999; Zheng and Fei, 2008). The production rate or cumulative production at a constant bottom-hole pressure declines with time (Ahmed, 2010). Provided the mechanisms affecting production remain constant throughout the lifetime of a reservoir, extrapolation of decline curves can be used to forecast the future production rate. To do so, the initial production rate, the decline curvature and its rate should be considered (Ahmed, 2010). Arps's equations are fundamental to most heuristic and conventional decline curve analysis models (Arps, 1945). Arps demonstrated that the hyperbolic family of equations can express mathematically the curvature of the production rate versus time curve. The Arps (1945) equations fall into three categories: exponential, hyperbolic and harmonic decline curve models. Fetkovich (1980) proposed type curves for analysing decline curves. Type curve matching amounts to visual matching on log-log paper that includes pre-plotted curves of production data. Each of the curves has characteristics that can be shown by plotting them on Cartesian, semi-log and log-log scales, as shown in Figure 1.
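The three Arps decline categories mentioned above follow directly from his hyperbolic family of equations; a minimal sketch, with a hypothetical well's initial rate and decline constant:

```python
import numpy as np

def arps_rate(t, qi, Di, b):
    """Arps (1945) decline-curve production rate.

    b = 0 gives exponential decline, 0 < b < 1 hyperbolic, b = 1 harmonic.
    qi is the initial rate and Di the initial decline rate (1/time).
    """
    t = np.asarray(t, dtype=float)
    if b == 0.0:
        return qi * np.exp(-Di * t)                 # exponential
    return qi / (1.0 + b * Di * t) ** (1.0 / b)     # hyperbolic / harmonic

# Hypothetical well: initial rate 1000 units/day, initial decline 0.5 per year
t = np.linspace(0.0, 10.0, 11)
q_exp = arps_rate(t, 1000.0, 0.5, 0.0)
q_hyp = arps_rate(t, 1000.0, 0.5, 0.5)
q_har = arps_rate(t, 1000.0, 0.5, 1.0)
```

Plotting these three curves on Cartesian, semi-log and log-log scales reproduces the characteristic shapes used for the type-curve matching described in the abstract: exponential decline is a straight line on a semi-log plot, harmonic on a log-log plot of rate versus cumulative production.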
Combined pre-stack and post-stack interpretation for velocity model building and hydrocarbon prospectivity: a learning case study from 3D seismic data offshore Gabon
Authors: Paolo Esestime, Milos Cvetkovic, Jonathan Rogers, Howard Nicholls and Karyna Rodriguez
Abstract
We present an integrated geological and geophysical study conducted during the acquisition and processing of extensive 3D multi-client seismic campaigns offshore Gabon. These campaigns resulted in two distinct surveys, the first of 11,500 km2 on the southern shelf and the second of 5500 km2 to the north, offshore Libreville (Figure 1). The acquisition parameters and survey design were planned with seismic illumination studies, long-offset streamers were utilised, and the data were processed with a modern broadband sequence (Esestime et al., 2017).
Increasing resolution in the North Sea
Authors: Phil Hayes, Luke Twigger, Krzysztof Ubik, Thomas Latter, Chris Purcell, Bingmu Xiao and Andrew Ratcliffe
Abstract
Recent step changes in seismic processing and imaging technology have delivered dramatic improvements in resolution, velocity model building and multiple attenuation. This article explores improvements in resolution demonstrated on two multi-client surveys in the North Sea: Cornerstone and Northern Viking Graben (NVG). Extending full-waveform inversion (FWI) to include absorption effects as well as velocity has delivered improved imaging, higher resolution and more reliable AVO products for the NVG survey, and this will now be applied to the reprocessed Cornerstone data. Each survey covers more than 35,000 km2, so applying high-resolution processing sequences to these two data sets will deliver advanced high-resolution data over two large areas of the North Sea.
Mode conversion noise attenuation, modelling and removal: case studies from Cyprus and Egypt
Authors: Jyoti Kumar, Marcus Bell, Mamdouh Salem, Tony Martin and Stuart Fairhead
Abstract
The wavefront from a source that strikes an acoustic impedance contrast separates into four parts: transmitted and reflected compressional waves (P-waves) and transmitted and reflected shear waves (S-waves). Converted waves are those whose mode changes at the interface; in a marine environment they can be recorded in the presence of large velocity contrasts. In high-contrast media the difference between the compressional and shear velocities across the boundary is comparatively small; shear waves are then refracted less than compressional waves and can be recorded as partially converted reflected energy.
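The geometry of the mode conversion described above is governed by Snell's law: the incident P-wave and the reflected converted S-wave share the same ray parameter. A back-of-the-envelope sketch, with velocities that are assumptions rather than values from the case studies:

```python
import math

def converted_s_angle(theta_p_deg, vp, vs):
    """Reflected converted S-wave angle from Snell's law:
    sin(theta_s) / vs = sin(theta_p) / vp (shared ray parameter)."""
    p = math.sin(math.radians(theta_p_deg)) / vp    # ray parameter, s/m
    return math.degrees(math.asin(p * vs))

# Hypothetical hard-seafloor-like velocities
theta_s = converted_s_angle(30.0, vp=2000.0, vs=900.0)  # ~13 degrees
```

Because vs < vp, the converted S-wave always travels closer to the vertical than the incident P-wave, which is the sense in which shear waves are "not as refracted" as compressional waves.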
Nonlinear beamforming for enhancing prestack seismic data with a challenging near surface or overburden
Abstract
Modern land seismic data acquisition is moving from sparse grids of large source/receiver arrays to denser grids of smaller arrays or point-source, point-receiver systems. Large arrays were designed to attenuate ground-roll and backscattered noise and to increase the overall signal-to-noise ratio (SNR). An example of a typical raw common-shot gather acquired using a legacy acquisition design, with 72 geophones in a group and five vibrators per sweep, is shown in Figure 1b. We can clearly see that the ground-roll noise with low apparent velocity was partially attenuated by the field arrays, and that reflection events with high apparent velocity are strong and can be reliably identified. Decreasing the size of field arrays during acquisition in arid environments leads to a dramatic decrease in data SNR. An example of a raw common-shot gather from a 2D line acquired using a single-sensor survey is shown in Figure 1a. In contrast to the legacy data, the single-sensor data are dominated by noise caused by severe multiple scattering in complex near-surface layers and show no apparent evidence of reflection signal. In this recent 2D single-sensor survey the sources and receivers were spaced at 10 m intervals, a much denser acquisition than conventional data acquired at intervals of 30 m or more. In theory, high-density seismic acquisition samples the entire wavefield (signal and noise) better and is expected to result in improved imaging. Achieving this in practice with huge amounts of low-SNR data proves very challenging. Conventional time-processing tools, such as surface-consistent scaling, deconvolution and static corrections, require reliable prestack signal in the data. Their application to modern seismic datasets acquired with small arrays often leads to unreliable results because the derived operators are based on noise rather than on the expected signal.
To extract the maximum value from dense high-channel acquisition, we need to enhance signal in the prestack data. Fortunately, densely sampled data gives us more flexibility than grouping geophones directly in the field.
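The principle behind such prestack enhancement is coherent summation: signal aligned along a traveltime surface adds in phase while noise averages down. A minimal delay-and-sum sketch follows; the real nonlinear beamforming estimates local traveltime surfaces from the data, whereas here the moveout, geometry and noise levels are all assumed.

```python
import numpy as np

# Delay-and-sum stack of a weak event buried in noise (hypothetical numbers).
n_traces, n_samples, dt = 21, 400, 0.004
offsets = (np.arange(n_traces) - n_traces // 2) * 10.0   # 10 m trace spacing
slowness = 5.0e-4                                        # event slowness, s/m
t0 = 200                                                 # event sample at zero offset

rng = np.random.default_rng(1)
gather = rng.standard_normal((n_traces, n_samples))      # noise, std = 1
for i, x in enumerate(offsets):
    shift = int(round(slowness * x / dt))
    gather[i, t0 + shift] += 5.0                         # aligned event

# Undo the assumed linear moveout and stack: coherent signal adds in phase,
# while random noise averages down roughly as 1/sqrt(n_traces)
aligned = np.stack([np.roll(tr, -int(round(slowness * x / dt)))
                    for tr, x in zip(gather, offsets)])
stack = aligned.mean(axis=0)
```

In the single trace the event is invisible against unit-variance noise; in the stack it stands out clearly, which is the flexibility that dense sampling buys over hard-wired field arrays.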
Is it worth the effort? — why state-of-the art reprocessing of old seismic data was an indispensable tool for a reservoir simulation study in the Murzuq Basin (Libya)
Authors: Christian Stotter, Alexey Burlakov, Dmitry Ablikov, Robert Rieger and Adel Zeglam
Abstract
Field development and reservoir simulation studies are usually performed relatively late in the life of an oil or gas field. At this stage, specific questions such as mitigating early water breakthrough, infill well locations and fracture behaviour need to be addressed, typically through reservoir simulation studies. Commonly, the available 3D seismic data were acquired early in the field's life for exploration or appraisal purposes. In many reservoir simulation studies the question arises as to whether it is worth the effort to reprocess vintage seismic data, or whether to rely only on the available fault and horizon interpretation from the original or previous processing when building the static model and simulation grid. Unfortunately, the latter approach is often followed, since seismic reprocessing is time-consuming and often perceived as awkward. Seismic processing has undeniably made significant progress in recent years; however, its application to relatively old data is often regarded with suspicion, and the applicability of methods developed for high-density, wide-azimuth/long-offset surveys to poorly sampled data is questioned.
Is Machine Learning taking productivity in petroleum geoscience on a Moore’s Law trajectory?
Authors: Eirik Larsen, S.J. Purves, D. Economou and B. Alaei
Abstract
During the last three decades, Wolf and Pelissier-Combescure (1982), Delfiner et al. (1987), Baldwin et al. (1990), Wong et al. (1995), Helle et al. (2001), Bhatt and Helle (2002a,b), Li and Anderson-Sprecher (2006), Dubois et al. (2007), and Zhang and Zhan (2017) have shown that neural networks such as multi-layer perceptrons (MLP) can be trained to infer lithology, sedimentary facies, porosity and fluid saturation as functions of wireline logs. Machine Learning (ML) has been used to classify the seismic waveform (Anderson and Boyd, 2004), solve AVO problems (Russell et al., 2002), and segment seismic facies in 3D volumes (Meldahl et al., 2001; Zhao et al., 2015; Qi et al., 2016). Now the next generation of ML techniques is transforming the subsurface workflow beyond these applications. This transformation is being enabled by multiple developments from outside the geoscience domain, namely:
- Algorithmic development, driven by AI researchers and tech companies, has given us: i) convolutional neural networks (CNN) (LeCun et al., 1990; Krizhevsky et al., 2012), which have transformed the quality of image classification and segmentation tasks; ii) recurrent neural networks (RNN, LSTM) (Hochreiter and Schmidhuber, 1997a,b; Graves et al., 2006; Graves, 2013; Sutskever et al., 2014), which have dramatically improved sequence-to-sequence learning; and iii) generative adversarial networks (GAN) (Goodfellow et al., 2014; Zhu et al., 2017), which enable the generation of realistic synthetic data and provide a powerful new class of architecture applicable to a wide range of problems.
- Open source libraries such as scipy, tensorflow, pytorch, sklearn, as well as open source geoscience specific libraries such as gempy (de la Varga et al., 2018), and devito (Luporini et al., 2018) are emerging and facilitating application of ML in geoscience.
- Increasing availability and democratization of sub-surface data in national data repositories (NDR) and other sources is enabling the geoscience community to experiment with novel data-analytics techniques, building data science into their problem-solving repertoire.
- GPU enabled high-performance computing, and cloud computing and storage have given a wider audience access to the supercomputing needed to drive the often memory- and compute-hungry algorithms.
- The emergence of data analytics platforms makes the application of ML methods more practical for the generalist geoscientist, who wants to focus on solving geoscience problems rather than writing bespoke code for each use case. Such platforms integrate data analytics with structured databases and enable users and organizations to apply ML at large scale while maintaining order, data management and provenance, so that workflows are reproducible.
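The MLP-on-wireline-logs idea cited at the start of the abstract can be sketched with the open-source libraries the abstract itself lists (here sklearn and numpy); the "logs", the facies rule and the network size are synthetic stand-ins, not real data or any author's published workflow.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for two wireline logs and a binary facies label:
# "shale" is defined here by a simple (hypothetical) gamma-ray cutoff.
rng = np.random.default_rng(0)
n = 600
gr = rng.normal(60.0, 20.0, n)       # gamma ray, API units
rhob = rng.normal(2.45, 0.15, n)     # bulk density, g/cc
labels = (gr > 70.0).astype(int)     # 1 = "shale", 0 = "non-shale"

X = np.column_stack([gr, rhob])
X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)

# Small MLP with input standardization, in the spirit of the cited studies
clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0))
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

Real applications replace the synthetic logs with measured curves and core- or cuttings-derived labels; the pipeline structure stays the same, which is why such workflows port so readily onto the analytics platforms described above.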