74th EAGE Conference and Exhibition - Workshops
- Conference date: 04 Jun 2012 - 07 Jun 2012
- Location: Copenhagen, Denmark
- ISBN: 978-90-73834-28-6
- Published: 04 July 2012
Wavelet Estimation and Multiple Modeling in Full-Waveform Inversion
Authors: Ivan Chikichev, Ke Wang and Spyros Lazaratos
Full-waveform inversion (FWI) has the potential to extract information not only from primary reflections but also from multiples. We show that accurate modeling of multiples provides strong constraints on the amplitude, frequency spectrum and phase of the seismic wavelet. The method presented here therefore leads to a very robust estimate of the wavelet without relying on well control. As a consequence, it is applicable to, and could be particularly beneficial in, the early stages of exploration.

Resolution in Seismic Inversion - Spectral Gap or Spectral Overlap: Which is Harder to Handle?
By Dave Nichols
Early methods that combined kinematic inversion with amplitude inversion left a gap in the resolved wavenumbers between those resolved by the amplitudes of the scattered field and those resolved by the kinematics. This gap required us to add extra constraints to the problem to recover a model that spans the full wavenumber spectrum. Three advances have brought us to a situation where that gap is often closed. 1) Modern seismic acquisition techniques, with lower-frequency sources and wider offsets, have broadened the spectrum of wavenumbers resolved by single-scattering inversions. 2) Full-waveform inversion using the two-way wave equation provides an accurate treatment of multiply scattered data and can accurately handle the low-frequency part of the spectrum. 3) Improved tomography techniques now provide kinematic inversions that resolve more detail of the smooth background. This closing of the gap presents a new challenge: we have multiple measurements contributing to resolution of the same parts of the model, and we must be careful to combine them in a way that honours the accuracy of each measurement.

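The wavenumber-coverage argument above can be sketched with a back-of-the-envelope calculation. This is an illustrative aid, not from the talk: the formula k_z = (2f/v)·cos(θ/2) for the vertical wavenumber resolved by single scattering is standard, but all numbers (frequencies, angles, velocity) are hypothetical.

```python
import numpy as np

# The band of vertical wavenumbers resolved by single-scattering inversion is
# roughly k_z = (2 f / v) * cos(theta / 2), for frequencies f in
# [f_min, f_max] and scattering (opening) angles up to theta_max.
# Lower source frequencies and wider offsets (larger theta_max) push the low
# edge of this band down towards the wavenumbers resolved by tomography,
# narrowing the spectral gap.

def kz_band(f_min, f_max, theta_max_deg, v):
    """Resolved vertical-wavenumber band [1/m] for a constant velocity v [m/s]."""
    theta = np.radians(theta_max_deg)
    k_low = 2.0 * f_min * np.cos(theta / 2.0) / v   # lowest f, widest angle
    k_high = 2.0 * f_max / v                        # highest f, zero opening angle
    return k_low, k_high

# Hypothetical "legacy" vs "modern" acquisition, v = 2000 m/s:
legacy = kz_band(f_min=8.0, f_max=60.0, theta_max_deg=60.0, v=2000.0)
modern = kz_band(f_min=2.5, f_max=60.0, theta_max_deg=140.0, v=2000.0)
# The low band edge drops markedly with a lower f_min and wider angles,
# while the high edge (set by f_max) is unchanged.
```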
Waveform Inversion Using Blocky Parameterization in the Laplace Domain
Authors: Changsoo Shin and Hong Lee
Laplace-domain waveform inversion yields realistic smooth velocity models. Exploiting the inversion's merits, we developed a cost-effective algorithm by reducing the number of inversion parameters, adopting a blocky parameterization method. We then applied the Gauss-Newton method combined with the CGLS method to accelerate and stabilize convergence. Through numerical tests, we confirmed that simultaneous inversion of both velocity and interfaces is feasible, and the smoothed inversion results are comparable to those of conventional Laplace-domain inversion. The resolution of the inverted velocity models depends on the block size, so the block size should be chosen carefully when tessellating rough subsurface structure. In our tests on synthetic and field data, the total number of inversion parameters was reduced to less than one-hundredth of that of conventional Laplace-domain inversion. The proposed algorithm maintains the robustness of Laplace-domain inversion, and its results are acceptable as initial velocity models for subsequent inversions, such as frequency-domain waveform inversion.

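The Gauss-Newton/CGLS combination the abstract names can be sketched on a toy problem. This is a minimal illustration, not the authors' Laplace-domain code: the forward model here is a hypothetical two-parameter exponential fit standing in for the (smoothly decaying) Laplace-domain wavefield.

```python
import numpy as np

def cgls(A, b, n_iter=20, tol=1e-12):
    """Solve min ||A x - b||_2 by conjugate gradients on the normal equations."""
    x = np.zeros(A.shape[1])
    r = b.astype(float).copy()
    s = A.T @ r
    p = s.copy()
    gamma = s @ s
    gamma0 = gamma
    for _ in range(n_iter):
        if gamma <= tol ** 2 * gamma0:   # residual gradient small: converged
            break
        q = A @ p
        alpha = gamma / (q @ q)
        x += alpha * p
        r -= alpha * q
        s = A.T @ r
        gamma_new = s @ s
        p = s + (gamma_new / gamma) * p
        gamma = gamma_new
    return x

def forward(m, x, c=(1.0, 2.0)):
    # hypothetical model: d(x) = c1 exp(-m1 x) + c2 exp(-m2 x)
    return c[0] * np.exp(-m[0] * x) + c[1] * np.exp(-m[1] * x)

def jacobian(m, x, c=(1.0, 2.0)):
    # analytic Jacobian of the forward model above
    return np.column_stack([-c[0] * x * np.exp(-m[0] * x),
                            -c[1] * x * np.exp(-m[1] * x)])

x = np.linspace(0.1, 2.0, 20)
m_true = np.array([1.0, 2.0])
d_obs = forward(m_true, x)

m = np.array([0.8, 1.8])                  # initial guess
for _ in range(10):                        # Gauss-Newton outer loop
    r = d_obs - forward(m, x)
    m = m + cgls(jacobian(m, x), r)       # CGLS inner solve for the update
```

With exact data and a nearby starting model, the outer loop converges in a handful of iterations; blocky parameterization would correspond to shrinking the number of columns of the Jacobian.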
Challenges in the Full Waveform Inversion Regarding Data, Model and Optimisation
Authors: Jean Virieux, Romain Brossier, Ludovic Métivier, Vincent Etienne and Stéphane Operto
Full waveform inversion was proposed in the early eighties, and we now find various illustrations of this high-resolution seismic imaging technique on both synthetic and real data. We investigate the issues one may address regarding the three elements of the technique. The optimisation formulation should move towards more complete Newton-like methods. The hierarchical data sampling strategy should prevent the local optimisation approach from being trapped in a local minimum. The model description should keep the number of degrees of freedom as low as possible, while prior information, when available, should be integrated into a regularisation term, moving full waveform inversion from a purely data-driven approach towards a more balanced combination of data and model contributions.

Full Waveform Inversion by Iterative Depth Migration and Impedance Estimation Using Well Control
Authors: Gary F. Margrave, Robert J. Ferguson and Chad M. Hogan
We relate full waveform inversion (FWI) to processes familiar to practicing geophysicists. A key theoretical result behind FWI is that a linear update to a migration velocity model is proportional to a prestack reverse-time migration of the data residual (the difference between the actual data and the data predicted by the model), where the proportionality factor must be estimated. We argue that in most real-world cases this factor will be frequency dependent or, in the time domain, a convolutional wavelet. The estimation of the velocity update from the migrated section is analogous to the common process of impedance inversion, and we view FWI as a practical cycle of data modeling, migration of the data residual, and "calibration" of this migration to deduce the velocity update. The calibration step can be accomplished like a conventional impedance inversion, where the migrated data residual is tied to the velocity residual (the difference between the actual velocity and the migration velocity) at a well. As there are a great many established algorithms for impedance inversion, there are a plethora of possibilities for calibration. We present an extended example using the Marmousi model in which we use wave-equation migration (e.g. depth stepping) of the data residual and a simple least-squares amplitude scaling and constant phase rotation, determined at a simulated well, to calibrate the migration. We find that our approach produces a much improved velocity model in only a few iterations.

Understanding Uncertainty and Managing Risk with Geophysics
Uncertainty in geophysics starts with uncertainties in the measurements. It continues with uncertainties in the models, explicit or implicit, that are used for processing. Modelling, calibration, optimisation and interpretation are combined to produce numerical models of the subsurface that are the basis for decision making. All of these aspects are illustrated with several examples.

Incorporating Fault Uncertainties Into the Reservoir Model and Evaluating their Impact on the Fluid Flow
Authors: Cecilie Otterlei, Oddvar Lia, Judith van Hagen, Paul Gillespie and Signe Ottesen
Seismic imaging can be very challenging for some reservoirs, such as sub-salt and deep structures, and for structurally complex fields there can be significant uncertainty associated with the seismic interpretation. A workflow that incorporates fault uncertainties into the reservoir model has been created, and their impact on fluid flow is evaluated. Uncertainties in compartmentalization, fault location, fault displacement, sub-seismic faults and fault sealing have all been considered. The workflow is fully automatic and contains the complete chain from structural modelling to flow simulations. It can be extended to also take uncertainties in horizons and properties into account. The field case application is a sub-salt structure with poor-quality seismic, where pressure and PVT data indicate that the reservoir is compartmentalized. The economics are marginal, but large in-place volumes give the field high potential. The uncertainties related to the faults are believed to be among the most important contributors to the total uncertainty in the in-place volumes and the recovery factor, and the uncertainty workflow allows a robust field development plan to be created that reduces the investment risk.

Quantifying Petrophysical Uncertainty to Help Reduce Risk
The goal of formation evaluation is to identify the nature and volume of fluids contained in a given formation. Traditionally, single values for these parameters are presented, though in reality each is subject to uncertainty. If we quantify the uncertainty associated with our analysis, our evaluations become more useful in the process of deciding whether or not a field is an economic prospect. Here we consider the nature of petrophysical uncertainties and discuss approaches that can be used to quantify, understand and reduce them.

Decision Making in the Presence of Uncertainty
By Peter King
Uncertainty is intrinsic to all aspects of modelling reservoirs and their performance. It arises from several sources: i) many key properties have to be inferred from data using incomplete or imperfect physical models; ii) data are taken only at very sparse intervals, between which we must interpolate; iii) the intrinsic non-linearity of the flow makes forward prediction inherently unstable and therefore non-deterministic. It is a widely held view in the industry that gathering more data, such as production history, will reduce uncertainty. However, history matching (inferring reservoir properties from production history) is an inverse problem which is inherently unstable, so good history matches do not necessarily produce good forecasts. Moreover, reservoir modelling is carried out to support reservoir management decisions, and decision making in the presence of high levels of uncertainty can be complicated. The aim of this talk is to highlight the sources of uncertainty inherent in reservoir modelling and to indicate some modern ways by which optimal reservoir management decisions can be made in the presence of such uncertainty.

Integration of Stimulation into Field Development Planning
By Kevin Mauth
Field development planning has occasionally included stimulation treatments as a contingency rather than as a key component of the plan, if stimulation was included in the planning process at all. Except in tight reservoirs, stimulation has been considered a means of last resort to establish or maintain production from wells that produce less than expected. By leaving stimulation as a contingency, operators have missed opportunities to collect the data needed to enable successful stimulation. Often, the completed well architecture forces compromises in stimulation practices, which makes optimization impossible. As the quality and accessibility of new reservoirs available for development continue to decline, the importance of incorporating stimulation, and the associated data collection requirements, into field development planning has never been greater. This presentation will discuss some common pitfalls associated with treating stimulation as a contingency, as well as the integration of new disciplines into the planning process that increase the probability of success for stimulation treatments. Field examples will also be provided to emphasize the benefits of an expanded field development plan (FDP) workflow for different types of reservoirs.

Tight Chalk Reservoir Stimulation as a Field Development Tool
Authors: Franz Marketz, Mary van Domelen, Sara Kofoed and Simon Wherity
For the development of very tight chalk reservoirs in the Danish sector of the North Sea, stimulation is a key value driver. Stimulation and completion techniques have therefore been screened as early as the "Assess" and "Select" phases of the hydrocarbon maturation process. In this way a well concept tailored to the reservoir has been developed and tested before executing full field development.

Cemented, Multi-stage Ball Drop Completion Field Trial in the North Sea
Stimulation is a necessity for optimum production in many fields, and is becoming increasingly important in making certain projects economically feasible. Many stimulation techniques and technologies are available; the challenge is to find the optimal stimulation solution that matches the drilling and completion design (casing size, zonal isolation type, etc.) without drastically extending the stimulation time. One success story comes from the experience of a North Sea operator that has been through an evolution of stimulation techniques on its long-producing chalk field. The initial stimulation technique was traditional perforation in clusters throughout the entire reservoir section, bullheading with 28% HCl. Post-production analysis showed that this resulted in poor acid distribution and uneven production, and liner deformation around the clusters also resulted in costly operations.

Practicalities of Stimulation in Tight Gas Reservoir
The remaining development opportunities in the Southern Gas Basin of the North Sea are mostly confined to lower-permeability sandstones and pose significant technical execution challenges and cost hurdles. Field operators have increasingly had to rely upon multiple-fractured horizontal wells to achieve commercial rates, deploying completion practices that historically evolved in chalk oil fields in Denmark and Norway, and materials honed in the recent technology-enabled opening up of shale plays across North America. The presentation will summarise the activities of the most active operator in the Southern Gas Basin in recent years.

Shale Developments: Use of Modern Data Mining Methods to Interpret Similarities and Differences Between Gas and Oil Completion/Stimulation Strategies
Authors: Randy La Follette and William D. Holcomb
The presentation and discussion will focus on the results of statistical analysis of Barnett Shale gas and Bakken Shale oil production drivers. Large data sets including key reservoir parameter proxies, well architecture information, completion variables and stimulation data were compiled, quality controlled and analyzed to identify key influences on production. The analysis and interpretation took into account both controllable influences on well productivity, e.g. well length, azimuth, completion type, stimulation size and materials, and uncontrollable influences, e.g. the presence or absence of fracture-bounding beds and fracturing into unknown geohazards. The most obvious production driver is well location, a fair proxy for reservoir quality in the "shale" plays. Well architecture, including azimuth, length and drift angle, may or may not be a major determinant of productivity. It is apparent from both the Barnett and Bakken data sets that longer wells do not produce proportionately more hydrocarbons. Specific completion and stimulation parameters, e.g. the use of coarse-mesh proppants, are also important productivity drivers in certain circumstances.

Monitoring and Modelling Hydraulic Fracture Stimulation: Future Directions
Authors: Quentin J. Fisher, J-M. Kendall, J. P. Verdon, A. Baird and M. Hudson
Multiple hydraulic fracturing along horizontal wells has proved to be a game changer that has led to the economic recovery of a vast amount of natural gas from shale resource plays in the USA. Optimization of hydraulic fracture stimulations has generally been achieved using a trial-and-error approach, although microseismic monitoring of event locations has over the last decade proved to be a key enabling technology. Reductions in gas price, combined with the push to exploit resource plays in highly populated areas without a well-developed supply chain, mean that there is increasing pressure to optimize hydraulic fracture stimulations. The use and integration of advanced microseismic monitoring and geomechanical modelling offer the potential to make a step change in this optimization. In particular, interpretation of microseismic attributes such as the magnitude and frequency dependence of shear-wave splitting can be used to track temporal and spatial changes in fracture density, compliance and potentially size. Geomechanical modelling of the stress distributions prior to and following fracture stimulation can help optimize the spacing and sequencing of individual stages of a fracture treatment, as well as identify the optimal time to conduct workovers (i.e. refracturing).

Interpretation and Application of Microseismic Images
Microseismic monitoring (MSM) of hydraulic fracture treatments is routine in North America and has added significantly to our understanding of fracture growth. The interpretation of microseismic images is advancing steadily, extracting more information from event patterns, temporal evolution and acoustic waveforms. The increasing amount of information from MSM provides significant opportunities to improve stimulation designs, completion strategies and field development. However, the applications of microseismic interpretations are often ill-defined, overlooked or applied improperly. The integration of microseismic images, fracture modeling and reservoir simulation is required to determine the effective stimulated volume. One of the most common misapplications of microseismic interpretations is the assumption that a larger stimulated volume (SV) will automatically result in increased well productivity. Characterizing propped and un-propped regions of the hydraulic fracture is critical when evaluating well performance and estimating drainage area and hydrocarbon recovery. This abstract highlights the interpretation and application of microseismic images using excerpts from SPE 152165 (Cipolla et al., 2011a).

New Deterministic Calculation Regime for the Estimation and Characterization of the Stimulated Reservoir Volume (SRV)
A new deterministic calculation regime for the estimation and characterization of the Stimulated Reservoir Volume (SRV) is presented. It is based upon a combination of ultra-fine-scale measurements of induced surface deformation morphology (on the micrometer scale) and a new, two-pass geomechanical inversion technique. The approach overcomes instabilities and questions of uniqueness in inverted solutions of reservoir strain distributions. The technique was deployed concurrently with more conventional high-node-count, high-frequency, downhole offset microseismic mapping in an exploratory horizontal completion located in an Eagle Ford horizon where low deviatoric stresses made dual- and tri-modal complex fracture networks likely. Passive microseismic and microdeformation techniques respond to fundamentally different mechanical processes associated with hydraulic fracturing. This pilot application merged the results from these two mapping diagnostics to explore the potential for integrated geophysical/geomechanical information to facilitate a more accurate and comprehensive understanding of treatment performance.

Advanced Drilling and Completion Solutions for Unconventional Shale Gas
Natural gas and oil production from shale reservoirs has reshaped the petroleum industry in North America and is poised to have a major impact in other locations around the world. Traditionally, shale has been seen within the petroleum industry as a source rock, a trap or a drilling hazard. Today, however, many of these source rocks have proven to yield commercial production of hydrocarbons when the correct technologies are applied to understand the reservoir potential of a given formation and proper drilling and completion techniques are used. This presentation will examine some of the basic geological requirements that need to be assessed to determine whether a particular shale reservoir has good production potential. It will then look at drilling and completion solutions that have proven successful in different shale reservoirs. Guidelines to optimize the completion design based on specific reservoir properties will be discussed. New validation technologies, including microseismic fracture mapping, will be discussed, showing their significance in maximizing the productive capacity of a given well and improving our understanding of the production mechanisms. New technology in the area of stimulation fluids and equipment development will also be discussed.

Completion-based Stimulation Technology: When Fracturing Just Doesn't Fit
Authors: Thomas Jørgensen and Rune Freyer
Reservoir stimulation is required for efficient production in many fields, but traditional fracturing requires much preparation and design. Rock mechanics in small fields requires significant data capture, and it is not possible to develop as good an understanding by trial and error as in the large North American plays, where hundreds or even thousands of wells are stimulated before "cracking the code". Challenges in small fields can include understanding stress fields, water or gas intervals, formation damage or other field-specific issues.

Surveillance Field Trial to Identify Thief Zones in MFF-09B, a well with a Controlled Acid Jetting (CAJ) Liner
Authors: Hans van Dongen, John Davies, Kerem Yuksel and Edo Boekholtz
The Controlled Acid Jetting (CAJ) well design was developed by Maersk Oil for the development of relatively thin but areally very extensive oil accumulations in low-permeability chalk formations in the North Sea. These relatively low-cost, long horizontal wells (up to 30,000 ft total depth) enabled the development of the Dan West Flank and Halfdan oil fields, which would otherwise have been uneconomic. Building on the favourable experience from the North Sea, Maersk Oil has also applied CAJ wells for the cost-effective development of the Al Shaheen oil field in Qatar. The initial development decision for CAJ wells in these waterflooded oil fields effectively resulted in a zero well intervention policy (below coiled-tubing reach) in terms of inflow/outflow profile surveillance and subsequent treatment of any thief zones and/or high-skin intervals encountered. The initial field developments have met expectations with respect to development costs and initial oil production rates. However, a significant number of waterflood patterns now show faster-than-forecast watercut development, which indicates the presence of non-conformances (i.e. natural and/or induced fractures acting as thief zones). When thief zones are present in waterfloods, ultimate oil recovery is lower than forecast, which has triggered the need to develop new CAJ well intervention technologies for inflow/outflow profile surveillance and subsequent treatment of thief zones. A dedicated Long Wells Conformance Control (LWCC) team was set up in 2010 to accelerate the development and implementation of such technologies for the North Sea and Qatar oil fields.

Temperature Dependence of Ultrasonic Velocities in Shales – Can We Use It For Interpreting Time-lapse Seismic?
Authors: Andreas Bauer, Rune Holt, Audun Bakk, Erling Fjær and Jørn Stenebråten
Ultrasonic velocities in shales show significantly larger temperature sensitivities than predicted by the Gassmann fluid-substitution model, which can be attributed to temperature-dependent velocity dispersion. We therefore expect the temperature dependence of velocities to be frequency dependent, resulting in different temperature sensitivities at seismic, sonic and ultrasonic frequencies.

A Study of Geomechanical Effects on Time-lapse Seismics
Authors: Giorgio Cassiani, A. Brovelli, G. Vignoli and B. Plischke
Time-lapse seismics is known to be a very effective monitoring technique for subsurface fluid movement and saturation changes, as well as for geomechanical phenomena [Snieder et al., 2007]. The integration of seismic and reservoir engineering is now becoming state-of-the-art [Boutte, 2007], while the number of applications is steadily increasing [Staples et al., 2006]. Among the future challenges to the use of time-lapse seismics is the integration with geomechanics [Landrø, 2006]. The improvement of time-lapse seismic technology [e.g. Tang et al., 2007; Aarre et al., 2007] allows for better and more accurate data acquisition, which in turn allows us to "see" effects previously difficult to detect. The effects of geomechanics on time-lapse seismic data have been described in detail in a number of publications [Hatchell and Bourne, 2005; Sayers and Schutjens, 2007; Cox and Hatchell, 2008; Kristiansen and Plischke, 2010]. The overall impact of reservoir exploitation on the changes in seismic response includes the following aspects. (1) Fluid saturation effects, based upon (a) the dependence of density on fluid saturation and (b) the dependence of bulk moduli on fluid saturation (Gassmann, 1951). This is the key effect sought in time-lapse seismics, as it allows remote monitoring of fluid migration in the reservoir. Two signatures of these saturation changes are mainly sought in the data: time shifts, i.e. changes in reflector location in time as a consequence of changes in velocity, and, above all, impedance (reflectivity) changes, as impedance is the product of velocity and density, both of which change with fluid content. (2) The pressure (effective stress) effect: this is the first, well-known geomechanical effect, often referred to in the literature as a pressure effect, although it is actually a dependence on effective stress. It is generally observed that the velocity decrease is very strong in the presence of an effective stress decrease (expansion), while the velocity increase is relatively mild under stress increase (compaction) [e.g. Hatchell and Bourne, 2005]. This asymmetric behaviour is often explained in terms of crack opening under stress-release conditions.

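The Gassmann (1951) relation underlying saturation effect (1b) can be sketched numerically. The Gassmann equation itself is standard; the rock and fluid values below are hypothetical, broadly sandstone/brine/gas-like numbers in GPa, chosen only to illustrate the direction of the effect.

```python
import numpy as np

def gassmann_ksat(k_dry, k_min, k_fl, phi):
    """Saturated bulk modulus via the Gassmann equation.

    k_dry: dry-frame bulk modulus, k_min: mineral bulk modulus,
    k_fl: pore-fluid bulk modulus, phi: porosity (all moduli in GPa).
    """
    num = (1.0 - k_dry / k_min) ** 2
    den = phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min ** 2
    return k_dry + num / den

def vp(k, mu, rho):
    """P-wave velocity [m/s] from bulk modulus k and shear modulus mu [Pa], density rho [kg/m^3]."""
    return np.sqrt((k + 4.0 * mu / 3.0) / rho)

# Hypothetical sandstone: dry frame 10 GPa, quartz mineral 36.6 GPa,
# 20% porosity, shear modulus 9 GPa (unchanged by fluid in Gassmann theory).
k_dry, k_min, phi, mu = 10.0, 36.6, 0.2, 9.0
k_brine = gassmann_ksat(k_dry, k_min, k_fl=2.25, phi=phi)   # brine fill
k_gas = gassmann_ksat(k_dry, k_min, k_fl=0.04, phi=phi)     # gas fill

# Stiffer pore fluid -> larger K_sat -> larger velocity and impedance.
v_brine = vp(k_brine * 1e9, mu * 1e9, rho=2200.0)
v_gas = vp(k_gas * 1e9, mu * 1e9, rho=2150.0)
```

This is exactly the mechanism that makes replacing brine with gas visible as a time-lapse impedance change: both the bulk modulus and the density drop.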
Drilling in Depleted Fields - From Surprises to Surprises?
As a result of depletion, the reservoir rock generally compacts, which leads to stress changes both within the reservoir and in its overburden. In turn, these changes affect drilling operations. The presentation will describe those changes and introduce a series of field cases illustrating their impact on drilling. Most of these field cases are based upon the analysis of tens of wells, and all of them showed behaviours which surprised the authors at the time of their study. A brief overview of each case is given below. A few years ago, Geomec analysed the stress changes due to depletion, i.e. the reservoir stress path, for a series of over thirty fields as part of a large Joint Industry Project (Figure 1). The presentation will give a brief overview of the project's results, focusing on those that were not expected at its onset.

Workflow for Coupled Geomechanical and Reservoir Problems – Recent Experiences
Development of unconventional resources requires solving difficult reservoir engineering problems, many of which have a geomechanical component. Geomechanics is coupled with the reservoir problem to varying degrees, ranging from problems that can be solved sequentially to those requiring full coupling. In an overall workflow for solving such problems, the selection of the solution method, and the degree of coupling it demands, are among the most important decision points. The use of coupled flow and geomechanical models has become more commonplace in recent years, but they are still somewhat specialized, and the best (and most efficient) workflow for coupled simulation depends critically on the type of problem being investigated. In this presentation, we will give a survey of this part of the workflow and discuss several examples that demonstrate the differences between the possible approaches.

Solutions for 3D Coupled Geomechanical and HF Modelling in UG Reservoirs
With growing worldwide activity in the exploration and development of ultra-low-permeability unconventional reservoirs, the oil and gas industry has become increasingly dependent on efficient and effective horizontal well drilling, as well as hydraulic fracture completions, to increase surface area and promote gas migration. An integrated geomechanical workflow encompasses rigorous 3D stress modelling provided by the VISAGE* system, hydraulic fracture modelling with a P3D model, near-wellbore analyses and drilling optimization techniques. Solutions span from data screening, through data integration and analysis, to well design support, covering various scales. They rely on a more accurate 3D stress field characterization, reflecting the structure, heterogeneity/anisotropy, pressure and temperature effects from well to reservoir scale.

An Acquisition System Using Complementary Components to Achieve Robust Broadband Seismic
Authors: Stian Hegna and Gregg Parkes
In a conventional marine acquisition system there are several components that have limitations in terms of bandwidth. However, it is possible to re-design or re-arrange several of these components to produce complementary responses, providing broadband seismic. The basic components of an acquisition system are sources and receivers. On the receiver side, the main limitations are related to the sea-surface reflection (the receiver 'ghost'). On the source side, the limitations are related to the sea-surface reflection and the responses of the airgun arrays. The induced responses of these effects are all known or can be measured, so why can't they simply be removed? The problem arises because most of these responses contain deep notches, which make their removal very unstable in any practical sense. All of these effects can be related to specific components of the acquisition system, so it is possible to re-design the parameters of those components to produce complementary responses that negate the effect of the notches. Once these components are in place, the seismic data can be corrected in a very robust way to produce optimal broadband seismic.

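The ghost notches discussed above follow from a simple delay-and-subtract model: the sea surface adds a delayed, polarity-reversed copy of the up-going wavefield. A sketch with illustrative tow depths (the 8 m / 25 m values and the vertical-incidence assumption are ours, not from the abstract):

```python
import numpy as np

# Receiver-ghost amplitude response |1 + r * exp(-2i*pi*f*tau)| with
# tau = 2*d/c at vertical incidence and free-surface reflectivity r = -1.
# Notches fall at f_n = n * c / (2 * d), so receivers at complementary
# depths never share a notch except at 0 Hz.

def ghost_amplitude(f, depth, c=1500.0, r=-1.0):
    """Ghost amplitude response at frequency f [Hz] for a receiver at depth [m]."""
    tau = 2.0 * depth / c
    return np.abs(1.0 + r * np.exp(-2j * np.pi * f * tau))

def notch_freqs(depth, c=1500.0, f_max=125.0):
    """Ghost-notch frequencies up to f_max for a receiver at `depth` metres."""
    df = c / (2.0 * depth)
    return np.arange(0.0, f_max + 1e-9, df)

f = np.linspace(0.0, 125.0, 1000)
shallow = ghost_amplitude(f, depth=8.0)    # first non-zero notch near 93.75 Hz
deep = ghost_amplitude(f, depth=25.0)      # notches at 30, 60, 90, 120 Hz
```

Designing the two responses so that the peaks of one fall near the notches of the other is the "complementary components" idea: the combined system has no deep notch in the usable band.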
Deep Interpolated Streamer Coverage - Broadband Seismic Data Offshore South Africa
Deep interpolated streamer coverage is a seismic acquisition technique based on 3D over/under towed-streamer acquisition (Kragh et al., 2009). The technique is designed to provide broadband seismic data and deploys two receiver spreads: a shallow spread primarily for high-frequency information and a deep spread for low-frequency information. The cable separation for the over cables is chosen for high temporal and spatial resolution and is typically 50-100 m. The cable separation for the deep spread is designed to record the low-frequency component and is much coarser, typically 300 m. In the processing phase, data from the deep (under) cables are interpolated to match the crossline sampling of the shallow (over) cables. Wavefield separation is then used to combine the high-frequency response of the upper cables and the low-frequency response of the interpolated lower-cable data set to give a broadband seismic data set suitable for detailed structural interpretation and amplitude inversion. In addition to the technical benefit of improved low-frequency response, there is also an operational advantage: the deep cables that provide most of the low-frequency component are towed in a seismically quieter environment than the shallow cables and are hence less susceptible to swell noise. This is particularly important in areas such as offshore southern South Africa, which has a short season for seismic acquisition and is notorious for high levels of swell in both good and bad weather.

What to Expect from Variable-Depth Streamer Data
Authors: Dechun Lin, Yves Lafet and Ronan Sablon
Variable-depth streamer acquisition is emerging as a key technique for providing wide-bandwidth seismic data. The technique allows us to obtain a usable bandwidth from 2.5 Hz up to the source notch. It has consistently produced high-quality images in terms of seismic resolution, layer stratigraphy and low-frequency penetration, and seismic interpretation and inversion become easier and more robust.

Increasing Spatial and Temporal Bandwidth with Multi-component Streamer Data
Increasing bandwidth is not only about temporal frequencies but also about spatial wavenumbers, in particular those that are poorly sampled in the cross-line direction with streamer separations of 16 to 24 times the inline sampling interval. In this talk, we present results from a test with a mini-3D array of prototype 4C marine streamers in which we use, in addition to the pressure, the vertical and crossline gradients of the pressure wavefield to reconstruct and 3D deghost the wavefield at arbitrary points within the aperture. From the experimental 3D survey, we show examples of spatial and temporal enhancement of wavefields reconstructed using a generalised matching pursuit algorithm, comparing pressure-only and multi-component reconstructions. We find that multi-component reconstruction is able to de-alias high-wavenumber diffractions that are completely missed by a pressure-only matching pursuit algorithm with priors, and to generate broadband unmigrated timeslices with excellent resolution.
-
-
-
Increased Temporal Bandwidth Using Hydrophone Only Recording and Conventional Airgun Arrays – Why Not?
Usable bandwidth is determined by the signal-to-noise ratio rather than by signal alone. Modern streamers have superior noise performance compared to older versions, and this reduction in noise seems to have been overlooked in the search for greater temporal bandwidth. Above approximately 2 Hz the noise floor is determined by environmental factors such as swell noise and cable jerk rather than by noise inherent to the equipment. Tests show that a usable temporal bandwidth of at least 3-90 Hz can be obtained using a conventional airgun array and modern hydrophone-only recording when a) the sea surface is not a perfect mirror and b) the streamer is towed in a deep, quiet environment.
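The point that bandwidth is set by signal-to-noise ratio rather than by signal alone can be made concrete by defining the usable band as the range of frequencies where the SNR exceeds a threshold. A minimal sketch; the 6 dB threshold and the function name are assumptions for the example.

```python
import numpy as np

def usable_band(freqs, signal_amp, noise_amp, snr_db=6.0):
    """Return (f_min, f_max) of the band where the amplitude
    signal-to-noise ratio exceeds snr_db, or None if nowhere."""
    snr = 20.0 * np.log10(signal_amp / np.maximum(noise_amp, 1e-12))
    ok = np.where(snr > snr_db)[0]
    if ok.size == 0:
        return None
    return float(freqs[ok[0]]), float(freqs[ok[-1]])
```

Lowering the noise floor widens the band at both ends even when the emitted source spectrum is unchanged, which is the operational argument made above for towing in a deep, quiet environment.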
-
-
-
Analysis of a Broadband Processing Technology Applicable to Conventional Streamer Data
Authors Zhengzheng Zhou, Milos Cvetkovic, Bing Xu and Philip Fontana
We recorded 2D lines parallel to and in close proximity to one another, with streamers towed at different depths. We applied WiBand, GXT’s broadband processing method, to a deep-tow line and recovered data free of receiver ghost notches. We find a good phase match between the WiBand result and a shallow-tow line. The match validates the phase fidelity of the WiBand process.
-
-
-
Preparing Data for Full Waveform Inversion: A Workflow for Free-surface Multiple Attenuation
Authors Jyoti Kumar, Adriana C. Ramírez and Suhail Butt
Waveform inversion estimates a quantitative model of the subsurface by minimizing the differences (residuals) between observed and calculated seismic data. The success of waveform inversion depends on the complexity of the misfit function. If the starting model is not in the neighbourhood of the global minimum, the inversion can fail and converge to a local minimum (Sirgue et al., 2011). Since low-frequency data are more linear with respect to the model misfit than high-frequency data, most waveform inversion implementations adopt a strategy that proceeds sequentially from low to high frequencies. Therefore, data preconditioning for waveform inversion must preserve as much low-frequency signal as possible. Traditionally, the bubble pulse generated by the source in marine acquisition has been removed from the data. The bubble can generate undesired results in data-driven multiple prediction algorithms such as SRME (Verschuur et al., 1991), where the auto-convolution of the bubble can generate long-period artefacts and requires long filters in the adaptive subtraction step. It is difficult to constrain the adaptive subtraction to leave the primaries untouched when a long filter is used. However, it has also been recognized that the bubble pulse contains valuable low-frequency signal that can benefit the quality of the velocity model estimated by waveform inversion. We propose a workflow for waveform inversion data preconditioning that preserves the bubble and its low-frequency signal while effectively attenuating the free-surface multiples.
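The claim that low-frequency data make the misfit "more linear" can be illustrated with a toy objective: as a trial time shift is scanned, a misfit built from a 2 Hz trace has a single basin of attraction, while one built from a 30 Hz trace has many local minima (cycle skipping). The frequencies, shift range and trace length below are illustrative assumptions, not from the paper.

```python
import numpy as np

def misfit_minima(freq_hz, n=1000, dt=0.001, max_shift=200):
    """Count local minima of the L2 misfit between a monochromatic trace
    and circularly shifted copies of itself as the trial shift varies.
    A toy picture of the FWI objective: the shift plays the role of the
    model error, and every local minimum is a chance to cycle-skip."""
    t = np.arange(n) * dt
    s = np.sin(2 * np.pi * freq_hz * t)  # exactly periodic for integer freq_hz
    m = np.array([np.sum((s - np.roll(s, k)) ** 2)
                  for k in range(-max_shift, max_shift + 1)])
    interior = (m[1:-1] < m[:-2]) & (m[1:-1] < m[2:])
    return int(np.sum(interior))
```

Here `misfit_minima(2.0)` finds a single minimum over the whole ±0.2 s scan, while `misfit_minima(30.0)` finds roughly one per signal period, which is why waveform inversion starts from the lowest usable frequencies and why preserving the bubble's low-frequency signal matters.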
-
-
-
Delivering Technical Limit Seismic Data: Nature Vs. Nurture
Authors Linda Hodgson, Daniel Davies, Thomas Hance and Mike Smith
The quality of the seismic product depends on three interacting elements: the fixed physical constraints of the location, the acquisition methodology, and the processing sequence. Recent developments in marine technology have enabled a step change in acquisition, but how much difference will this make to the final product? We compare examples of the new ‘broadband’ methods to ‘conventional’ data, to explore how much extra signal may be expected at different parts of the frequency spectrum. In the right circumstances, substantial gains are possible at both low and high frequencies. In other, more challenging settings, matters such as improved processing or better azimuthal coverage may have more impact.
-
-
-
Assessing Frequency Bandwidth and Resolution Enhancement of Seismic Data: A Broadband Perspective
Authors Didier Rappin, Christian Deplante and Thierry Cadoret
Improved resolution and enlarged bandwidth are key direct expectations of broadband data, which stem from the definition of a broadband signal – which will be recalled. How should both be assessed and preserved or improved across seismic acquisition, processing and reservoir characterization, possibly including seismic inversion? Examples aim at stimulating thought on how our habits should evolve on these topics to adapt to modern high-bandwidth data. In the first part we will focus on how the concept of seismic resolution could be revisited when considering broadband data: event separability and detection issues, which are mainly determined by bandwidth and signal-to-noise ratios, will be discussed. Examples of both non-broadband and broadband signals will be used to study how these measurements should be assessed. Handling time-variant signals involves some mathematical care to preserve the relations between variables such as time or space and frequency or wavenumber. It is especially when the bandwidth becomes much larger than the carrier frequency that these relations must be correctly taken into account throughout the design and use of signal-processing tools. At this level, the topic of wavelet estimation will receive particular attention, given its importance for the reliability of reservoir characterization. Across various examples, some guidance for best practice will be proposed for discussion.
-
-
-
Increasing the Reservoir Characterization Potential with Multi-component Streamer Data
Increasing the bandwidth, both horizontally and vertically, plays an important role in seismic inversion and reservoir characterization. This talk will present an overview of a novel multi-component (MC) marine seismic acquisition system combined with advanced data processing techniques that use the pressure recordings and their associated vertical and cross-line spatial gradients to estimate the scattered subsurface wavefield with unprecedented spatial wavenumber and temporal bandwidth content. Results from an experimental test with a mini-3D array of prototype MC streamers will be presented, and the discussion will focus on the consequences of the overall temporal and wavenumber bandwidth enhancements for migration and inversion processing and applications. Preliminary well-tie and inversion results will be discussed.
-
-
-
High Frequency Losses – Stripping Various Causes
Authors Didier Rappin, Thierry Castex, Christophe Barnes and Kevin Samyn
The quantitative use of seismic amplitude information during interpretation is a key point for many prospect evaluations and almost all reservoir characterization studies. The local amplitude information of interest is always affected by a series of signal attenuators along the propagation of the incident and reflected wavefields, which are highly dependent upon the geological context, structural shape, lithologies and fluids. Incident amplitude effects may or may not have frequency dependence as well as phase or dispersion characteristics. These many causes of attenuation are often pragmatically treated by a combination of a few well-known tools: spherical divergence compensation, and surface-consistent or volumetric time- and frequency-dependent compensations. Cases where severe amplitude attenuation effects cannot be treated using the usual approaches are a serious issue, in particular for seismic characterization of reservoirs. In this paper, we present some ways to study the impact and relevance of specific attenuation processes. Alternative mechanisms and tools for a quantitative assessment of these processes are proposed. Results are shown from field case studies including VSP data and 3D surface acquisition, both standard and broadband. We also attempt to point out the impact of bandwidth on the identification and quantification of possible attenuation causes.
-
-
-
Broadband Versus Conventional Marine Seismic: The Importance of Compensating for the Earth Filter
Authors Anthony Hardwick, Nick Woodburn and James Whittaker
Broadband seismic moves us closer to the ideal goal of imaging, which is to provide the true response of the earth. The potential of broadband seismic can only be fully realised if the effects of various earth filters are accurately compensated for, which often requires some attenuation estimation such as effective Q. This is particularly true in sub-salt and sub-basalt settings where locally strong attenuation of the seismic signal occurs. Here we evaluate the differences in spectral content between broadband and conventional marine seismic through simple synthetic earth models in an attenuating medium. We demonstrate that, without compensating for effective Q, the top and base of a thin layer at depth may not be resolvable even in the broadband case. We then describe the application of a pre-stack Q inversion method to derive a spatially varying, interval-consistent effective Q field from data in the Faroe-Shetland Basin. Application of this field demonstrates a substantial uplift in resolution within the prospective intra-basalt Flett formation.
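A minimal amplitude-only inverse-Q correction, of the kind alluded to above, gains each frequency by exp(π f t₀ / Q) for a window centred at traveltime t₀. This is a sketch under a stationary, single-Q assumption; the function name and the gain cap are assumptions, and a production pre-stack Q inversion also estimates Q spatially and compensates phase dispersion, which is ignored here.

```python
import numpy as np

def inverse_q_gain(trace, dt, t0, q, max_gain=100.0):
    """Frequency-domain, amplitude-only inverse-Q correction for a trace
    window centred at traveltime t0, assuming a single effective Q.
    Each frequency is boosted by exp(pi * f * t0 / Q); the gain is capped
    so the correction does not blow up ambient noise at high frequencies."""
    n = len(trace)
    freqs = np.fft.rfftfreq(n, d=dt)
    gain = np.minimum(np.exp(np.pi * freqs * t0 / q), max_gain)
    return np.fft.irfft(gain * np.fft.rfft(trace), n=n)
```

The cap is where the broadband-versus-conventional discussion bites: beyond some frequency the earth filter has pushed the signal under the noise floor, and no amount of gain recovers it.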
-
-
-
Spectral Fusion – A Tool to Combine Low and High Frequency Datasets
“Spectral fusion” is a new multi-trace filter for merging two seismic traces or datasets whose amplitude spectra partially overlap. Pillet et al. (2007) presented a specific application of pre-stack inversion, aimed at using a low-frequency dataset in a first inversion in order to bridge the 7-16 Hz frequency gap in a second, target inversion of a HR seismic survey. In parallel to this workflow, which was carried forward into operational use, the geophysical team at Total’s Geosciences Research Center in Aberdeen took a different route and restated the problem directly from the seismic side. The idea was to find a generic way to combine two surveys that mainly differ in their respective bandwidths. When doing so, it is necessary to deal with the overlapping bandwidth appropriately. Spectral fusion directly bridges the low-frequency gap of the high-resolution survey using the conventional, lower-bandwidth streamer data. The method also improves the seismic well tie. Spectral fusion was originally filed for patent in the UK on 9th May 2008 under GB 0808418.8.
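The core of such a merge can be sketched as a frequency-domain crossfade across the overlap band — here the 7-16 Hz overlap quoted above. A minimal sketch only; the patented method also has to balance amplitude and phase between the two surveys in the overlap, which a plain crossfade does not attempt.

```python
import numpy as np

def spectral_fusion(low_trace, high_trace, dt, f1=7.0, f2=16.0):
    """Merge a lower-bandwidth trace and a high-resolution trace:
    below f1 take the conventional (low) survey, above f2 the HR survey,
    with a linear crossfade across the [f1, f2] overlap band."""
    n = len(low_trace)
    freqs = np.fft.rfftfreq(n, d=dt)
    w_high = np.clip((freqs - f1) / (f2 - f1), 0.0, 1.0)
    spec = (1.0 - w_high) * np.fft.rfft(low_trace) + w_high * np.fft.rfft(high_trace)
    return np.fft.irfft(spec, n=n)
```

The result inherits the conventional survey's low end and the HR survey's high end, which is exactly the gap-bridging role described in the abstract.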
-
-
-
Value of the Broadband Seismic for Interpreter and Reservoir Geophysics
Authors Cyrille Reiser, Folke Engelmark, Euan Anderson and Tim Bird
Ideally, geoscientists would like seismic to provide clear, objective information about the subsurface in terms of: identification of the main geological features and stratigraphic sequences, structural elements, elastic/rock properties, potential prospects and the lithology-fluid content of potential reservoirs. 3D seismic has offered the greatest benefits to seismic interpreters and reservoir geoscientists in the last few decades, but historically, seismic images have stopped short of delivering on these requirements, as the seismic bandwidth was limited by the conventional streamer design and acquisition method. Over the last few years, starting in 2007 with the introduction of the dual-sensor towed streamer technology (Tenghamn et al. 2007), new acquisition methods and technologies have been made available with the aim of providing broader seismic bandwidth without any compromise in data quality or trade-offs in acquisition efficiency. On the receiver side, the combination of two sensors in the streamer cable enables an effective removal of the sea-surface ghost by wavefield separation, allowing us to capture the full bandwidth of the upcoming wavefield. More recently, a time- and depth-distributed source enables the removal of the sea-surface ghost on the source side (Parkes, 2011), expanding the frequency bandwidth further. Thus, interpreters and reservoir geophysicists can now have ghost-free seismic, enabling a significant broadening of the seismic frequency bandwidth on both the low and high sides of the spectrum. Some results of this latest development will be presented with an end-user perspective.
-
-
-
Variable Depth Streamer – Benefits for Rock Property Inversion
Authors Yves Lafet, L. Michel, R. Sablon, D. Russier and R. Hanumantha
The quality of an inversion depends on the seismic frequency content, the signal-to-noise ratio, the wavelet, and the low-frequency model used to incorporate information outside the seismic bandwidth. In order to quantify the benefits of broadband seismic data for inversion, we compare pre-stack inversion results from conventional streamer and variable-depth streamer data from NW Australia. The inversion results are combined with a Bayesian fluid classification scheme to map three rock facies and quantify the associated uncertainty.
-
-
-
4D Processing Between Variable Depth and Conventional Streamer Data
Processing data from variable-depth streamer acquisition has recently become possible through a new advanced algorithm called joint deconvolution (Soubaras, 2010). In this particular acquisition, the receiver depth increases non-linearly with offset, which allows a wide diversity of receiver ghosts to be recorded. This acquisition and its associated processing dramatically increase the possible frequency bandwidth, on both the low- and high-frequency sides, from 2.5 Hz to the source notch. This particular broadband technique will be referred to as BroadSeis in this paper. While most acquisitions in the future will certainly be realized with broadband techniques, the question of 4D matching between conventional and BroadSeis data must now be addressed during an intermediate period when the baseline data is a conventional acquisition. This paper considers this challenge and demonstrates that a good 4D response can be obtained. Compared to conventional flat-streamer data, processing variable-depth streamer data implies a major change: the receiver ghosts are rigorously taken into account, whereas they cannot be removed from the wavelet in conventional flat-streamer processing. The variable receiver depths of BroadSeis give asymmetrical ray paths, which are taken into account by the imaging process and by the proper summation of the up-going and down-going wavefields in the joint deconvolution. Typical cross-equalization in a 4D process aims at solving issues related to differences in the vintages’ acquisition (acquisition-related time and amplitude differences) and positioning (4D binning). The case between BroadSeis and conventional data has one more problem to solve: the difference in cable profiles. This problem can be handled by joint de-ghosting and re-ghosting processes. In the following sections, we will discuss all these topics: wavelet processing, time de-striping, 4D binning, regularization, imaging and final matching.
The dataset used for this comparison is a dual recording acquired by Shell in a highly structured deep offshore play. While shooting a conventional 3D survey offshore Gabon, Shell acquired an additional 430 sq km swath of BroadSeis data to evaluate the uplift brought by the broadband image. The first comparison on PSTM data was generated at the end of 2011 and is shown in Figure 1. It shows the overall improvement typically achieved by BroadSeis in terms of enhanced spectral bandwidth. The acquisition geometry consists of 10 cables, 8000 m long. The conventional streamers were towed at 11 m depth with a source depth of 9 m, while the BroadSeis configuration was towed between 11 m and 50 m with a source depth of 7 m. No specific repeatability of source positions was requested when the vessel acquired these swaths, as shown by the azimuth maps of the two acquisitions (Figure 2). With these limitations in mind, both volumes were processed in a 4D sense in order to assess any impact of the variable streamer depth on the 4D signature, which should ideally be zero in the common bandwidth.
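The cross-equalization step mentioned above — shaping the monitor survey to the base so that residual 4D differences reflect geology rather than acquisition — reduces, in its simplest form, to a least-squares matching filter. A sketch under strong assumptions: the function name and filter length are illustrative, and a real 4D flow designs such filters in space-time windows, after binning, regularization and the de-ghosting/re-ghosting discussed above.

```python
import numpy as np

def matching_filter(base, monitor, nf=21):
    """Design a causal least-squares (Wiener) filter that shapes the
    monitor trace to the base trace, and return the matched monitor."""
    n = len(base)
    # Columns of A are the monitor trace delayed by 0 .. nf-1 samples.
    a = np.zeros((n, nf))
    for k in range(nf):
        a[k:, k] = monitor[:n - k]
    f, *_ = np.linalg.lstsq(a, base, rcond=None)
    return np.convolve(monitor, f)[:n]
```

If the two vintages differ only by a short convolutional wavelet difference, the filter absorbs it and the matched trace reproduces the base; the residual is then a meaningful 4D signature.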
-
-
-
The Latest Sleipner CO2 Injection Monitoring Using Dual Sensor Streamer Technology
Authors Ivar Andreas Sand and Anne-Kari Furre
At Sleipner, CO2 is injected into the Utsira Fm, a shallow aquifer at 800-1100 m depth. Since 1996, more than 13 Mt has been injected (as of 2012). In order to monitor the distribution of the CO2 in the subsurface, a total of seven seismic monitor surveys have been acquired. Together with a base survey from 1994 (prior to injection start), these form a unique dataset (Chadwick et al. 2004, Arts et al. 2008). In addition, three gravity monitoring surveys give complementary information (Alnes et al. 2011). In this paper we focus on the most recent seismic dataset, acquired using PGS’ GeoStreamer dual-sensor technology. These data can be redatumed from a deep tow depth to a shallower tow depth for comparison with previous monitor surveys, and are in addition expected to have broader frequency content than previous data (which have non-optimal tow depths), enabling interpretation of finer details.
-
-
-
Inverse Methods to Combine Geology, Geostatistics and Multiphysic Data
By Miguel Bosch
Inverse methods are used to infer model parameters from observed data that are related via random or deterministic functions. Their application in the Earth sciences has expanded recently to encompass the problem of data and information integration, considering the combination of multiple geophysical data surveys, information on multiple properties distributed in space, their relationships, embedded object structure and scale issues. Modelling the complexity associated with the multiple parameter subspaces and the functions across them is rewarded by the coherency of the estimated results with the available information and by the simplification of the posterior information due to mode and uncertainty reduction. The formulation of this problem is based on modelling the posterior probability density, which combines the various components of the available information and data. Appraisal of the posterior information can be obtained parameter-wise, with the calculation of probability distributions describing the posterior uncertainty, or globally, via full model configurations corresponding to maximum posterior probability configurations or realizations from the posterior probability. We present examples of the application of these methods to various problems in the Earth sciences, ranging from the description of the lithospheric structure of interacting plate boundaries to the characterization of hydrocarbon reservoirs.
-
-
-
Monte Carlo Based Tomographic Full Waveform Inversion with Multiple-point a Priori Information
Authors Knud S. Cordua, Thomas M. Hansen and Klaus Mosegaard
In a probabilistic formulation, the solution to an inverse problem can be expressed as an a posteriori probability density function (pdf) that combines the independent states of information provided by data and a priori information. Here, we formulate the a posteriori pdf that defines the solution to a tomographic full waveform inverse problem, which provides a means of obtaining an uncertainty estimate. Unfortunately, no explicit formulation of the solution to this problem exists. Therefore, the a posteriori probability density function has to be sampled. The full waveform inverse problem is known to be computationally very hard and is traditionally considered out of reach for Monte Carlo sampling strategies. We show that, by means of informative a priori information, this problem nevertheless becomes tractable for a sampling strategy. We outline the theoretical framework for a full waveform inversion strategy that integrates the extended Metropolis algorithm with sequential Gibbs sampling, which allows arbitrarily complex a priori information to be incorporated. At the same time, we show how temporally correlated data uncertainties can be taken into account during the inversion. The suggested inversion strategy is tested on synthetic tomographic crosshole ground-penetrating radar full waveform data using multiple-point-based a priori information. This is, to our knowledge, the first example of obtaining a posteriori realizations of a full waveform inverse problem.
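The accept/reject loop at the heart of the extended Metropolis algorithm compares only likelihoods, because proposals are drawn as (conditional) realizations of the a priori model — in the paper via sequential Gibbs resimulation of a multiple-point prior. The toy sampler below shows the loop with a simple symmetric random-walk proposal standing in for that resimulation; the target, step size and iteration count are illustrative assumptions.

```python
import numpy as np

def metropolis(log_like, prior_sample, perturb, n_iter=5000, seed=0):
    """Toy Metropolis sampler. With a symmetric proposal (here: a small
    perturbation of the current model), the acceptance test reduces to
    the likelihood ratio, as in the extended Metropolis algorithm where
    the proposal itself samples the a priori distribution."""
    rng = np.random.default_rng(seed)
    m = prior_sample(rng)          # start from an a priori realization
    ll = log_like(m)
    samples = []
    for _ in range(n_iter):
        m_new = perturb(m, rng)    # propose a perturbed model
        ll_new = log_like(m_new)
        if np.log(rng.random()) < ll_new - ll:   # accept on likelihood ratio
            m, ll = m_new, ll_new
        samples.append(m)
    return np.array(samples)
```

The retained chain (after burn-in) is a set of a posteriori realizations, which is exactly the uncertainty estimate the abstract refers to.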
-
-
-
Integration of Information from Diverse Sources
From a probabilistic point of view, solving inverse problems can be seen as a way of combining states of information in the form of probability density functions. Typically, the states of information are provided by a set of observed data and some a priori information obtained independently of the data. The solution to the inverse problem is then the combined state of information quantified by the a posteriori probability density function. Within this probabilistic framework we will discuss methods for combining information from diverse sources. Specifically, we will discuss methods for combining information from pre-stack seismic waveform data, a priori geological structural information and information about the relation between rock physics parameters (such as permeability and oil saturation). One approach is to solve such an inverse problem sequentially: initially an elastic inversion of the seismic data is performed, followed by a transformation of elastic properties to rock physics parameters. Another approach is to solve the inverse problem directly, parametrised with rock physics model parameters. We will discuss the benefits and challenges of combining these sources of information sequentially and directly using the probabilistic formulation of inverse problems.
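The "combination of states of information" has a compact numerical form: on a discretized parameter axis, the a posteriori pdf is the normalised product of the a priori pdf and the likelihood. The Gaussian shapes and numbers below are illustrative, not from the talk.

```python
import numpy as np

# A priori information and data likelihood as pdfs on a 1-D parameter grid.
x = np.linspace(0.0, 10.0, 1001)
prior = np.exp(-0.5 * ((x - 4.0) / 2.0) ** 2)       # broad a priori state
likelihood = np.exp(-0.5 * ((x - 6.0) / 1.0) ** 2)  # narrower data state

# A posteriori state: normalised product (conjunction) of the two.
posterior = prior * likelihood
posterior /= posterior.sum()                        # normalise on the uniform grid
post_mean = (x * posterior).sum()
# For two Gaussians this is the precision-weighted mean:
# (4/2**2 + 6/1**2) / (1/2**2 + 1/1**2) = 5.6
```

The same conjunction underlies both the sequential and the direct approach discussed above; they differ in which parameterisation the product is formed in, and in how the rock-physics relation enters the likelihood.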
-
-
-
Subsurface Analytics: Operationalizing the Original “Big Data”
Authors Duncan Irving and Jamie Cruise
“Big Data” has become a convenient shorthand for the exponential growth of data volumes across many industry sectors. This is nothing new in the subsurface domain, but E&P data management practitioners can learn from “new” industries how best to deal with complexity and timeliness in their analytical ecosystems. We present an architecture that brings to bear the twin paradigms of massive knowledge discovery using Map-Reduce and “operationalized” decision support using a relational database management system. We describe how this single data instance drives rigorous geological, geophysical and engineering insight into right-time integrated operations, and allows data, and the insight derived from it, to drive business decisions across the enterprise.
-
-
-
An Interdisciplinary Study of the Physico-chemical Structure of Earth's Mantle: Combining Geophysical Data Analysis with Mineralogy, Petrology and Geochemistry
By Amir Khan
We jointly invert local fundamental-mode and higher-order surface-wave phase velocities for radial models of the thermo-chemical and anisotropic physical structure of the Earth’s mantle to 1000 km depth beneath the North American continent. Inversion for thermo-chemical state relies on a self-consistent thermodynamic method whereby phase equilibria and physical properties (P- and S-wave velocity and density) are computed as functions of composition (in the Na2O-CaO-FeO-MgO-Al2O3-SiO2 model system), pressure and temperature. We employ a sampling-based strategy to solve the non-linear inverse problem, relying on a Markov chain Monte Carlo method to sample the posterior distribution in the model space. A range of models fitting the observations within uncertainties is obtained, from which any statistic can be estimated. To further refine the sampled models, we compute geoid anomalies for a collection of them and compare these with observations, exemplifying a posteriori filtering through the use of additional data. Our thermo-chemical maps reveal the tectonically stable older eastern parts of North America to be chemically depleted (high Mg#) and colder (>200°C) relative to the active younger regions (the western margin and oceans). In the transition zone the thermo-chemical structure decouples from that of the upper mantle, with a relatively hot thermal anomaly appearing beneath the cratonic area that likely extends into the lower mantle. In the lower mantle no consistent large-scale thermo-chemical heterogeneities are observed, although our results do suggest distinct upper and lower mantle compositions. Concerning anisotropy structure, we find evidence for a number of distinct anisotropic layers pervading the mantle, including the transition zone and the uppermost lower mantle.
-
-
-
History Matching Under Uncertain Geological Scenario
Authors Hyucksoo Park and Jef Caers
The main interest lies in obtaining multiple history-matched models under an uncertain geological scenario.
-
-
-
Geothermal Energy in Denmark – Potential, Policy and Progress
A new assessment of the geothermal resources in Denmark published by GEUS concludes that the Danish subsurface contains huge geothermal resources (Mathiesen et al. 2009). To rationalise administration, the Danish Energy Agency (DEA) has established a new, simple application procedure with a standard license period and work program. These initiatives and rising prices of fossil fuels, together with public concern about climate change and the increasing emission of CO2 to the atmosphere, have triggered awareness of the large potential of the geothermal resources, which may contribute to a safe, sustainable, price-stable and reliable supply of energy. It is thus expected that geothermal energy may play an important role in Denmark's future energy strategy (Fenham et al. 2010; Nielsen et al. 2011).
-
-
-
Shallow Geothermal Energy in Denmark – Current Status and Trends
Authors Claus Ditlefsen and Thomas Vangkilde-Pedersen
Shallow geothermal energy is a renewable energy source with large potential for reducing CO2 emissions. Its application in Denmark, however, is limited compared to, e.g., Sweden and Germany. Preliminary estimates indicate that the energy extraction from a 100 m closed-loop borehole may be up to 40% lower in an unfavourable geological situation than in a favourable one. More know-how and experience under Danish geological conditions are needed, and the GeoEnergy project aims at paving the way for wider use by providing knowledge, tools and best practice.
-