74th EAGE Conference and Exhibition - Workshops
- Conference date: 04 Jun 2012 - 07 Jun 2012
- Location: Copenhagen, Denmark
- ISBN: 978-90-73834-28-6
- Published: 04 July 2012
81–100 of 156 results
Analysis of a Broadband Processing Technology Applicable to Conventional Streamer Data
Authors: Zhengzheng Zhou, Milos Cvetkovic, Bing Xu and Philip Fontana
We recorded 2D lines parallel to and in close proximity to one another, with streamers towed at different depths. We applied WiBand, GXT’s broadband processing method, to a deep-tow line and recovered data free of receiver ghost notches. We find a good phase match between the WiBand result and a shallow-tow line. The match validates the phase fidelity of the WiBand process.
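For context on the ghost notches mentioned above (a standard result, assuming vertical incidence and a perfectly reflecting sea surface, not taken from the abstract): a receiver at depth d records the up-going wave plus a polarity-reversed surface reflection, giving the ghost response

G(f) = 1 - e^{-i 4\pi f d / c}, \qquad |G(f_n)| = 0 \ \text{at} \ f_n = \frac{nc}{2d}, \quad n = 0, 1, 2, \ldots

For a deep tow at d = 25 m and water velocity c ≈ 1500 m/s, the first non-zero notch falls at 30 Hz, squarely inside the signal band, which is what a deghosting method such as the one described must repair.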
Preparing Data for Full Waveform Inversion: A Workflow for Free-surface Multiple Attenuation
Authors: Jyoti Kumar, Adriana C. Ramírez and Suhail Butt
Waveform inversion estimates a quantitative model of the subsurface by minimizing the differences (residuals) between observed and calculated seismic data. The success of waveform inversion depends on the complexity of the misfit function. If the starting model is not in the neighbourhood of the global minimum, the inversion can fail and converge to a local minimum (Sirgue et al., 2011). Since low-frequency data are more linear with respect to the model misfit than high-frequency data, most waveform inversion implementations adopt a strategy that proceeds sequentially from low to high frequencies. Therefore, data preconditioning for waveform inversion must preserve as much low-frequency signal as possible. Traditionally, the bubble pulse generated by the source in marine acquisitions has been removed from the data. The bubble can generate undesired results in, e.g., data-driven multiple prediction algorithms such as SRME (Verschuur et al., 1991), where the auto-convolution of the bubble can generate long-period artefacts and requires long filters in the adaptive subtraction step. When a long filter is used, it is difficult to constrain the adaptive subtraction so that primaries are left untouched. However, it has also been recognized that the bubble pulse contains valuable low-frequency signal that can benefit the quality of the velocity model estimated by waveform inversion. We propose a workflow for waveform inversion data preconditioning that preserves the bubble and the low-frequency signal while effectively attenuating the free-surface multiples.
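To illustrate the adaptive subtraction step discussed above, here is a minimal single-trace sketch of least-squares (Wiener) matching-filter subtraction; it is not the authors' implementation, and nfilt/eps are illustrative parameters:

```python
import numpy as np

def adaptive_subtract(data, multiple, nfilt=11, eps=1e-3):
    """Least-squares (Wiener) matching filter: shape the predicted
    multiple to the recorded trace, then subtract. A short filter
    constrains the subtraction and helps protect primaries."""
    n = len(data)
    half = nfilt // 2
    # Convolution matrix built from shifted copies of the prediction
    M = np.zeros((n, nfilt))
    for j in range(nfilt):
        shift = j - half
        col = np.roll(multiple, shift)
        if shift > 0:
            col[:shift] = 0.0        # zero samples wrapped from the end
        elif shift < 0:
            col[shift:] = 0.0        # zero samples wrapped from the start
        M[:, j] = col
    # Damped normal equations; eps stabilises the matrix inverse
    f = np.linalg.solve(M.T @ M + eps * np.eye(nfilt), M.T @ data)
    return data - M @ f
```

Keeping nfilt short constrains the filter, which is exactly the trade-off the abstract raises: long filters absorb bubble-related misfit but risk damaging primaries.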
Delivering Technical Limit Seismic Data: Nature Vs. Nurture
Authors: Linda Hodgson, Daniel Davies, Thomas Hance and Mike Smith
The quality of the seismic product depends on three interacting elements: the fixed physical constraints of the location, the acquisition methodology, and the processing sequence. Recent developments in marine technology have enabled a step change in acquisition, but how much difference will this make to the final product? We compare examples of the new ‘broadband’ methods to ‘conventional’ data, to explore how much extra signal may be expected at different parts of the frequency spectrum. In the right circumstances, substantial gains are possible at both low and high frequencies. In other, more challenging settings, matters such as improved processing or better azimuthal coverage may have more impact.
Assessing Frequency Bandwidth and Resolution Enhancement of Seismic Data: A Broadband Perspective
Authors: Didier Rappin, Christian Deplante and Thierry Cadoret
Improved resolution and enlarged bandwidth are the key direct expectations of broadband data, expectations that stem from the definition of a broadband signal, which will be recalled. How should both be assessed and preserved or improved across seismic acquisition, processing and reservoir characterization, possibly including seismic inversion? The examples aim to stimulate thought on how our habits should evolve to adapt to modern high-bandwidth data. In the first part we focus on how the concept of seismic resolution could be revisited for broadband data: event separability and detection issues, which are mainly determined by bandwidth and signal-to-noise ratios, will be discussed. Examples of both non-broadband and broadband signals will be used to study how these measurements should be assessed. Handling time-variant signals requires some mathematical care to preserve the relations between variables such as time or space and frequency or wavenumber. Especially when the bandwidth becomes much larger than the carrier frequency, these relations must be correctly taken into account throughout the design and use of signal processing tools. Here, wavelet estimation will receive particular attention, given its importance for the reliability of reservoir characterization. Across various examples, some guidance for best practice will be proposed for discussion.
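A common reference point for the event-separability discussion (the classical Rayleigh criterion, not anything specific to this paper):

\lambda_{\mathrm{dom}} = \frac{v}{f_{\mathrm{dom}}}, \qquad \Delta z_{\min} \approx \frac{\lambda_{\mathrm{dom}}}{4}

For v = 2500 m/s and f_dom = 30 Hz this gives roughly 21 m. Broadening the bandwidth both raises the usable dominant frequency and suppresses wavelet side lobes, which is why separability and detection improve together on broadband data.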
Increasing the Reservoir Characterization Potential with Multi-component Streamer Data
Increasing the bandwidth, both horizontally and vertically, plays an important role in seismic inversion and reservoir characterization. This talk presents an overview of a novel multi-component (MC) marine seismic acquisition system, combined with advanced data processing techniques that use the pressure recording and its associated vertical and cross-line spatial gradients to estimate the scattered subsurface wavefield with unprecedented spatial wavenumber and temporal bandwidth content. Results from an experimental test with a mini-3D array of prototype MC streamers will be presented, with the discussion focused on the consequences of the overall temporal and wavenumber bandwidth enhancements for migration and inversion processing and applications. Preliminary well-tie and inversion results will be discussed.
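The value of the measured cross-line gradients can be sketched with the usual multichannel-sampling argument (an illustration, not the authors' derivation): the wavefield can be expanded around each cable position y as

P(y + \Delta y) \approx P(y) + \frac{\partial P}{\partial y}\,\Delta y

so recording both P and its cross-line gradient at each streamer supports reconstruction of the wavefield at roughly twice the wavenumber that pressure-only sampling at the same cable spacing would allow.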
High Frequency Losses – Stripping Various Causes
Authors: Didier Rappin, Thierry Castex, Christophe Barnes and Kevin Samyn
The quantitative use of seismic amplitude information during interpretation is a key point for many prospect evaluations and almost all reservoir characterization studies. The local amplitude information of interest is always affected by a series of signal attenuators along the propagation paths of the incident and reflected wavefields, which depend strongly on the geological context, structural shape, lithologies and fluids. These amplitude effects may or may not be frequency dependent and may also carry phase or dispersion signatures. The many causes of attenuation are often treated pragmatically by a combination of a few well-known tools: spherical divergence compensation and surface-consistent or volumetric time- and frequency-dependent compensations. Cases where severe amplitude attenuation cannot be treated with the usual approaches are a serious issue, in particular for seismic characterization of reservoirs. In this paper, we present some ways to study the impact and relevance of specific attenuation processes. Alternative mechanisms and tools for a quantitative assessment of these processes are proposed. Results are shown from field case studies including VSP data and 3D surface acquisitions, both standard and broadband. We also attempt to point out the impact of bandwidth on the identification and quantification of possible attenuation causes.
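For reference, the constant-Q attenuation model underlying most of the compensation tools listed above (a standard result, not specific to this paper):

A(f, t) = A_0(f)\, e^{-\pi f t / Q}

so high frequencies decay fastest; for example, at traveltime t = 2 s through a medium with Q = 100, a 60 Hz component is reduced to e^{-\pi \cdot 60 \cdot 2 / 100} ≈ 2% of its initial amplitude, while a 10 Hz component retains about 53%.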
Broadband Versus Conventional Marine Seismic: The Importance of Compensating for the Earth Filter
Authors: Anthony Hardwick, Nick Woodburn and James Whittaker
Broadband seismic moves us closer to the ideal goal of imaging: providing the true response of the earth. The potential of broadband seismic can only be fully realised if the effects of the various earth filters are accurately compensated for, which often requires an attenuation estimate such as effective Q. This is particularly true in sub-salt and sub-basalt settings, where locally strong attenuation of the seismic signal occurs. Here we evaluate the differences in spectral content between broadband and conventional marine seismic through simple synthetic earth models in an attenuating medium. We demonstrate that, without compensation for effective Q, the top and base of a thin layer at depth may not be resolvable even in the broadband case. We then describe the application of a pre-stack Q inversion method to derive a spatially varying, interval-consistent effective Q field from data in the Faroe-Shetland Basin. Application of this field demonstrates a substantial uplift in resolution within the prospective intra-basalt Flett Formation.
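A minimal sketch of the amplitude side of inverse-Q compensation, in the spirit of the compensation step described (illustrative only; the paper's pre-stack Q inversion, which estimates the Q field itself, is a separate and more involved procedure, and all parameters here are assumptions):

```python
import numpy as np

def inverse_q_gain(trace, dt, t0, q, max_gain_db=30.0):
    """Undo constant-Q amplitude decay exp(-pi*f*t/Q) for a window
    centred at traveltime t0 (stationary approximation). The gain is
    capped so high-frequency ambient noise is not boosted unboundedly."""
    n = len(trace)
    freqs = np.fft.rfftfreq(n, d=dt)          # Hz
    spec = np.fft.rfft(trace)
    gain = np.exp(np.pi * freqs * t0 / q)     # inverse of the Q decay
    cap = 10.0 ** (max_gain_db / 20.0)        # dB -> linear amplitude
    spec *= np.minimum(gain, cap)
    return np.fft.irfft(spec, n)
```

Production inverse-Q filters operate time-variantly (e.g. in short windows) and also correct dispersion-induced phase; the gain cap here is the usual guard against amplifying ambient noise at high frequencies.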
Spectral Fusion – A Tool to Combine Low and High Frequency Datasets
“Spectral fusion” is a new multi-trace filter designed to merge two seismic traces or datasets whose amplitude spectra partially overlap. Pillet et al. (2007) presented a specific pre-stack inversion application, using a low-frequency dataset in a first inversion to bridge the 7-16 Hz frequency gap in a second, target inversion of an HR seismic survey. In parallel to this workflow, which was carried forward into operational use, the geophysical team at Total’s Geosciences Research Center in Aberdeen took a different route and restated the problem directly on the seismic side. The idea was to find a generic way to combine two surveys that differ mainly in their respective bandwidths. When doing so, the overlapping bandwidth must be handled appropriately. Spectral fusion directly bridges the low-frequency gap of the high-resolution survey using the conventional, lower-bandwidth streamer data. The method also improves the seismic well-tie. Spectral fusion was originally filed for patent in the UK on 9th May 2008 under GB 0808418.8.
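The abstract does not disclose the patented algorithm, but the generic problem it states — merging two traces with overlapping bandwidth — can be sketched with a simple frequency-domain crossover (an obvious baseline only, with hypothetical crossover frequencies f1 and f2):

```python
import numpy as np

def spectral_merge(low_trace, high_trace, dt, f1, f2):
    """Merge co-located traces of equal length and sampling: keep the
    low-bandwidth trace below f1, the high-resolution trace above f2,
    and blend linearly across the overlap band [f1, f2] (Hz)."""
    n = len(low_trace)
    freqs = np.fft.rfftfreq(n, d=dt)
    lo = np.fft.rfft(low_trace)
    hi = np.fft.rfft(high_trace)
    w = np.clip((freqs - f1) / (f2 - f1), 0.0, 1.0)  # 0 -> 1 ramp
    return np.fft.irfft((1.0 - w) * lo + w * hi, n)
```

Real use would first require amplitude and phase matching of the two surveys within the overlap; the 7-16 Hz gap cited from Pillet et al. (2007) suggests crossover frequencies of that order.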
Value of the Broadband Seismic for Interpreter and Reservoir Geophysics
Authors: Cyrille Reiser, Folke Engelmark, Euan Anderson and Tim Bird
Ideally, geoscientists would like seismic to provide clear, objective information about the subsurface: identification of the main geological features and stratigraphic sequences, structural elements, elastic/rock properties, potential prospects, and the lithology-fluid content of potential reservoirs. 3D seismic has offered the greatest benefits to seismic interpreters and reservoir geoscientists in the last few decades, but historically, seismic images have stopped short of delivering on these requirements, as the seismic bandwidth was limited by conventional streamer design and acquisition methods. Over the last few years, starting in 2007 with the introduction of dual-sensor towed streamer technology (Tenghamn et al. 2007), new acquisition methods and technologies have become available with the aim of providing broader seismic bandwidth without compromising data quality or acquisition efficiency. On the receiver side, the combination of two sensors in the streamer cable enables effective removal of the sea-surface ghost by wavefield separation, allowing us to capture the full bandwidth of the up-going wavefield. More recently, a time- and depth-distributed source enables removal of the sea-surface ghost on the source side (Parkes, 2011), further expanding the frequency bandwidth. Interpreters and reservoir geophysicists can thus now have ghost-free seismic, enabling a significant broadening of the seismic frequency bandwidth at both the low and high ends of the spectrum. Some results of this latest development will be presented from an end-user perspective.
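The dual-sensor wavefield separation mentioned above is conventionally written, at vertical incidence and with the particle-velocity recording Z scaled to pressure units (sign conventions vary between authors):

U = \tfrac{1}{2}(P + Z), \qquad D = \tfrac{1}{2}(P - Z)

The receiver ghost has opposite polarity on the pressure and velocity sensors, so the sum cancels the down-going (ghost) field and retains the ghost-free up-going wavefield U.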
Variable Depth Streamer – Benefits for Rock Property Inversion
Authors: Yves Lafet, L. Michel, R. Sablon, D. Russier and R. Hanumantha
The quality of an inversion depends on the seismic frequency content, the signal-to-noise ratio, the wavelet, and the low-frequency model used to incorporate information outside the seismic bandwidth. In order to quantify the benefits of broadband seismic data for inversion, we compare pre-stack inversion results from conventional streamer and variable-depth streamer data from NW Australia. The inversion results are combined with a Bayesian fluid classification scheme to map three rock facies and quantify the associated uncertainty.
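The Bayesian fluid/facies classification referred to is generically of the form (a schematic statement, not necessarily the authors' exact scheme), with inverted impedances I_p, I_s as input:

P(F_k \mid I_p, I_s) \propto P(I_p, I_s \mid F_k)\, P(F_k)

where the per-facies likelihoods are calibrated from well data; mapping the most probable facies F_k together with its posterior probability yields both the facies map and the associated uncertainty mentioned in the abstract.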
4D Processing Between Variable Depth and Conventional Streamer Data
Processing data from variable-depth streamer acquisition has recently become possible through a new algorithm called joint deconvolution (Soubaras, 2010). In this acquisition, the receiver depth increases non-linearly with offset, allowing a wide diversity of receiver ghosts to be recorded. This acquisition and its associated processing dramatically increase the achievable frequency bandwidth, on both the low- and high-frequency sides, from 2.5 Hz up to the source notch. This broadband technique will be referred to as BroadSeis in this paper. While most future acquisitions will certainly be realized with broadband techniques, the question of 4D matching between conventional and BroadSeis data must be addressed now, during an intermediate period when the baseline data come from a conventional acquisition. This paper considers this challenge and demonstrates that a good 4D response can be obtained.

Compared to conventional flat streamer data, processing variable-depth streamer data implies a major change: the receiver ghosts are rigorously taken into account, whereas they cannot be removed from the wavelet in conventional flat-streamer processing. The variable receiver depths of BroadSeis give asymmetrical ray paths, which are handled by the imaging process and by the proper summation of the up-going and down-going wavefields in the joint deconvolution. Typical cross-equalization in a 4D process aims to solve issues related to differences between the vintage acquisitions (acquisition-related time and amplitude differences) and positioning (4D binning). The BroadSeis-versus-conventional case has one more problem to solve: the difference in cable profiles. This can be handled by joint de-ghosting and re-ghosting processes. In the following sections, we discuss all these topics: wavelet processing, time de-striping, 4D binning, regularization, imaging and final matching. The dataset used for this comparison is a dual acquisition recorded by Shell in a highly structured deep offshore play.

Data Overview
While shooting a conventional 3D survey offshore Gabon, Shell acquired an additional 430 sq km swath of BroadSeis data to evaluate the uplift brought by the broadband image. The first comparison on PSTM data was generated at the end of 2011 and is shown in Figure 1. It shows the overall improvement typically achieved by BroadSeis in terms of enhanced spectral bandwidth. The acquisition geometry consists of 10 cables, 8,000 m long. The conventional streamers were towed at 11 m depth with a source depth of 9 m, while the BroadSeis configuration was towed between 11 m and 50 m with a source depth of 7 m. No specific repeatability of source positions was requested when the vessel acquired these swaths, as shown by the azimuth maps of the two acquisitions (Figure 2). With these limitations in mind, both volumes were processed in a 4D sense to assess any impact of the variable streamer depth on the 4D signature, which should ideally be zero in the common bandwidth.
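The re-ghosting step mentioned for matching cable profiles can be illustrated at vertical incidence by re-applying a synthetic receiver ghost to de-ghosted data (a sketch only; the actual joint de-ghosting/re-ghosting operates pre-stack with angle dependence):

```python
import numpy as np

def apply_receiver_ghost(trace, dt, depth, v_water=1500.0, r=-1.0):
    """Re-ghost a de-ghosted trace by adding the free-surface
    reflection, delayed by the two-way time from receiver depth to
    the sea surface. Vertical incidence only; depth in m, dt in s."""
    delay = 2.0 * depth / v_water             # two-way delay (s)
    n = len(trace)
    freqs = np.fft.rfftfreq(n, d=dt)
    ghost = 1.0 + r * np.exp(-2j * np.pi * freqs * delay)
    return np.fft.irfft(np.fft.rfft(trace) * ghost, n)
```

For example, matching the 11 m conventional baseline would use apply_receiver_ghost(deghosted_trace, dt, depth=11.0), so that both vintages carry the same receiver ghost inside the common bandwidth before differencing.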
The Latest Sleipner CO2 Injection Monitoring Using Dual Sensor Streamer Technology
Authors: Ivar Andreas Sand and Anne-Kari Furre
At Sleipner, CO2 is injected into the Utsira Fm, a shallow aquifer at 800-1100 m depth. Since 1996, more than 13 Mt have been injected (by 2012). To monitor the distribution of the CO2 in the subsurface, a total of seven seismic monitor surveys have been acquired. Together with a base survey from 1994 (prior to injection start), these form a unique dataset (Chadwick et al. 2004, Arts et al. 2008). In addition, three gravity monitoring surveys provide complementary information (Alnes et al. 2011). In this paper we focus on the most recent seismic dataset, acquired using PGS’ GeoStreamer dual-sensor technology. These data can be redatumed from a deep tow depth to a shallower tow depth for comparison with previous monitor surveys, and are in addition expected to have broader frequency content than previous data (which have non-optimal tow depths), enabling interpretation of finer details.
Inverse Methods to Combine Geology, Geostatistics and Multiphysics Data
By Miguel Bosch
Inverse methods are used to infer model parameters from observed data that are related via random or deterministic functions. Their application in the Earth Sciences has more recently expanded to encompass the problem of data and information integration: combining multiple geophysical surveys, information on multiple properties distributed in space, the relationships among them, embedded object structure, and scale issues. Modelling the complexity of the multiple parameter subspaces and the functions across them pays off in the coherency of the estimated results with the available information and in the simplification of the posterior information through mode and uncertainty reduction. The formulation of this problem is based on modelling the posterior probability density that combines the various components of the available information and data. The posterior information can be appraised parameter-wise, by calculating probability distributions describing the posterior uncertainty, or globally, via full model configurations corresponding to maximum posterior probability or realizations drawn from the posterior probability. We present examples of applications of these methods to various problems in the Earth Sciences, ranging from the description of the lithospheric structure of interacting plate boundaries to the characterization of hydrocarbon reservoirs.
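The posterior density referred to is conventionally factored as (the standard probabilistic-inversion formulation, stated here for orientation):

\sigma(\mathbf{m}) = k\, \rho(\mathbf{m})\, L(\mathbf{m})

where \rho(\mathbf{m}) carries the prior (geological/geostatistical) information, L(\mathbf{m}) is the likelihood measuring the fit between data predicted from model \mathbf{m} and the observations, and k is a normalisation constant.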
Monte Carlo Based Tomographic Full Waveform Inversion with Multiple-point a Priori Information
Authors: Knud S. Cordua, Thomas M. Hansen and Klaus Mosegaard
In a probabilistic formulation, the solution to an inverse problem can be expressed as an a posteriori probability density function (pdf) that combines the independent states of information provided by the data and the a priori information. Here, we formulate the a posteriori pdf for a tomographic full waveform inverse problem, which provides a means of obtaining an uncertainty estimate. Since no explicit formulation of the solution to this problem exists, the a posteriori pdf has to be sampled. The full waveform inverse problem is known to be computationally very hard and has traditionally been considered out of reach for Monte Carlo sampling strategies. We show that, by means of informative a priori information, the problem nevertheless becomes tractable for a sampling strategy. We outline the theoretical framework for a full waveform inversion strategy that integrates the extended Metropolis algorithm with sequential Gibbs sampling, which allows arbitrarily complex a priori information to be incorporated. At the same time, we show how temporally correlated data uncertainties can be taken into account during the inversion. The suggested inversion strategy is tested on synthetic tomographic crosshole ground-penetrating radar full waveform data using multiple-point-based a priori information. This is, to our knowledge, the first example of obtaining a posteriori realizations of a full waveform inverse problem.
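A skeletal version of the extended Metropolis sampler described above (illustrative: prior_perturb stands in for the sequential Gibbs resimulation step and loglike for the waveform-data misfit, both of which are problem-specific):

```python
import numpy as np

def extended_metropolis(m0, prior_perturb, loglike, n_iter, rng=None):
    """Extended Metropolis: proposals are drawn by perturbing the
    current model consistently with the prior (here the sequential
    Gibbs resimulation step), so acceptance needs only the
    likelihood ratio. Returns a list of posterior realizations."""
    rng = rng or np.random.default_rng()
    m, ll = m0, loglike(m0)
    samples = []
    for _ in range(n_iter):
        m_new = prior_perturb(m, rng)             # prior-consistent move
        ll_new = loglike(m_new)
        if np.log(rng.uniform()) < ll_new - ll:   # accept w.p. min(1, L'/L)
            m, ll = m_new, ll_new
        samples.append(m)
    return samples
```

Because proposals are drawn from the prior itself, the acceptance rule needs only the likelihood ratio, which is what lets arbitrarily complex (e.g. multiple-point) a priori information enter without an explicit prior density.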
Integration of Information from Diverse Sources
From a probabilistic point of view, solving inverse problems can be seen as a way of combining states of information in the form of probability density functions. Typically, the states of information are provided by a set of observed data and some a priori information obtained independently of the data. The solution to the inverse problem is then the combined state of information quantified by the a posteriori probability density function. Within this probabilistic framework we discuss methods for combining information from diverse sources. Specifically, we discuss methods for combining information from pre-stack seismic waveform data, a priori geological structural information, and information about the relations between rock physics parameters (such as permeability and oil saturation). One approach is to solve such an inverse problem sequentially: an elastic inversion of the seismic data is performed first, followed by a transformation of elastic properties to rock physics parameters. Another approach is to solve the inverse problem directly, parametrised with rock physics model parameters. We discuss the benefits and challenges of combining these sources of information sequentially and directly using the probabilistic formulation of inverse problems.
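The two approaches contrasted above can be written schematically, with d the seismic data, e the elastic properties and r the rock physics parameters (a gloss on the abstract, not the authors' notation):

\text{sequential:}\quad p(r \mid d) \approx \int p(r \mid e)\, p(e \mid d)\, de \qquad\qquad \text{direct:}\quad p(r \mid d) \propto p(d \mid r)\, p(r)

The sequential factorisation is exact only if the elastic properties carry all the information the data hold about the rock physics parameters; otherwise uncertainty can be mis-propagated, which is one of the trade-offs such a comparison must address.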
Subsurface Analytics: Operationalizing the Original “Big Data”
Authors: Duncan Irving and Jamie Cruise
“Big Data” has become a convenient shorthand for the exponential growth of data volumes across many industry sectors. This is nothing new in the subsurface domain, but E&P data management practitioners can learn from “new” industries how best to deal with complexity and timeliness in their analytical ecosystems. We present an architecture that brings to bear the twin paradigms of massive knowledge discovery using Map-Reduce and “operationalized” decision support using a relational database management system. We describe how this single data instance drives rigorous geological, geophysical and engineering insight for right-time integrated operations, and allows data, and the insight derived from it, to drive business decisions across the enterprise.
An Interdisciplinary Study of the Physico-chemical Structure of Earth's Mantle: Combining Geophysical Data Analysis with Mineralogy, Petrology and Geochemistry
By Amir Khan
We jointly invert local fundamental-mode and higher-order surface-wave phase velocities for radial models of the thermo-chemical and anisotropic physical structure of the Earth’s mantle to 1000 km depth beneath the North American continent. Inversion for thermo-chemical state relies on a self-consistent thermodynamic method whereby phase equilibria and physical properties (P-wave velocity, S-wave velocity and density) are computed as functions of composition (in the Na2O-CaO-FeO-MgO-Al2O3-SiO2 model system), pressure and temperature. We employ a sampling-based strategy to solve the non-linear inverse problem, relying on a Markov chain Monte Carlo method to sample the posterior distribution in the model space. A range of models fitting the observations within uncertainties is obtained, from which any statistic can be estimated. To further refine the sampled models, we compute geoid anomalies for a collection of them and compare these with observations, exemplifying a posteriori filtering through the use of additional data. Our thermo-chemical maps reveal the tectonically stable, older eastern parts of North America to be chemically depleted (high Mg#) and colder (>200°C) relative to the active younger regions (western margin and oceans). In the transition zone, the thermo-chemical structure decouples from that of the upper mantle, with a relatively hot thermal anomaly appearing beneath the cratonic area that likely extends into the lower mantle. In the lower mantle, no consistent large-scale thermo-chemical heterogeneities are observed, although our results do suggest distinct upper and lower mantle compositions. Concerning anisotropy, we find evidence for a number of distinct anisotropic layers pervading the mantle, including the transition zone and the uppermost lower mantle.
History Matching Under Uncertain Geological Scenario
Authors: Hyucksoo Park and Jef Caers
The main interest lies in obtaining multiple history-matched models under an uncertain geological scenario.
Geothermal Energy in Denmark – Potential, Policy and Progress
A new assessment of the geothermal resources in Denmark published by GEUS concludes that the Danish subsurface contains huge geothermal resources (Mathiesen et al. 2009). To rationalise administration, the Danish Energy Agency (DEA) has established a new, simple application procedure with a standard license period and work program. These initiatives and rising prices of fossil fuels have, together with public concerns about climate change and increasing CO2 emissions to the atmosphere, raised awareness of the large potential of geothermal resources, which may contribute to a safe, sustainable, price-stable and reliable supply of energy. It is thus expected that geothermal energy may play an important role in Denmark’s future energy strategy (Fenham et al. 2010; Nielsen et al. 2011).
Shallow Geothermal Energy in Denmark – Current Status and Trends
Authors: Claus Ditlefsen and Thomas Vangkilde-Pedersen
Shallow geothermal energy is a renewable energy source with a large potential for reducing CO2 emissions. Its application in Denmark, however, is limited compared to, e.g., Sweden and Germany. Preliminary estimates indicate that the energy extraction from a 100 m closed-loop borehole may be up to 40% lower in an unfavourable geological situation than in a favourable one. More know-how and experience under Danish geological conditions are needed, and the GeoEnergy project aims to pave the way for wider use by providing knowledge, tools and best practice.