First Break - Volume 22, Issue 8, 2004
The case for applying wave equation depth migration in the North Sea
Richard Leggott, John Cowley and R. Gareth Williams of Veritas DGC explain how the company has found that wave equation depth migration - contrary to some expectations - can be applied to North Sea data with successful imaging results. Pre-stack depth migration (PSDM) is now an established tool for imaging seismic data in areas with significant lateral velocity variations. PSDM has proved to be of benefit to interpreters in various parts of the North Sea, including areas with salt features as well as areas with, for example, gas clouds and shallow channels. To date, pre-stack depth migration has been performed using a Kirchhoff imaging algorithm. Although this provides a marked improvement over time migration algorithms, it is still an approximation to the full solution; in its normal implementation it can image only a single arrival from a shot to any given subsurface location. Newer methods, collectively called wave equation migration, which overcome some of the assumptions associated with Kirchhoff methods, are now becoming available. These new methods are computationally intensive, but they can provide better results in some circumstances. Wave equation migration is being used on large volumes of seismic data in the deep-water Gulf of Mexico, where rugose, tabular salt features cause complex wavefronts to reach the exploration targets. These can be imaged only partially with normal Kirchhoff methods, whereas the wave equation algorithm allows much better imaging and amplitude control. So far, these methods have not been used in the North Sea, where the geology is often thought of as structurally simpler than the sub-salt plays of the Gulf of Mexico. Nevertheless, we find that wave equation migration (WEM) does give better results than those obtained with Kirchhoff methods in the North Sea, especially in areas of the central and southern North Sea characterized by Mesozoic and Tertiary salt tectonics.
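The core of a wave equation migration is the downward continuation of the recorded wavefield, most simply by phase shift in the frequency-wavenumber domain when velocity is laterally constant. The Python/NumPy sketch below illustrates one such extrapolation step under that constant-velocity assumption; it is a textbook illustration (after Gazdag, 1978), not the Veritas DGC implementation, which must handle lateral velocity variation.

```python
import numpy as np

def phase_shift_step(wavefield_fk, kx, freqs, v, dz):
    """One downward-continuation step by phase shift.

    wavefield_fk : complex array (n_freq, n_kx), data in the f-kx domain
    kx           : horizontal wavenumbers (rad/m)
    freqs        : temporal frequencies (Hz)
    v            : constant velocity of this depth slab (m/s)
    dz           : depth step (m)
    """
    w = 2.0 * np.pi * freqs[:, None]          # angular frequency, column vector
    kz_sq = (w / v) ** 2 - kx[None, :] ** 2   # vertical wavenumber squared
    propagating = kz_sq > 0                   # evanescent energy is discarded
    kz = np.sqrt(np.where(propagating, kz_sq, 0.0))
    return wavefield_fk * np.exp(-1j * kz * dz) * propagating
```

Repeating this step slab by slab, with an imaging condition applied at each depth, yields the migrated section; accommodating lateral velocity variation requires extensions such as phase-shift-plus-interpolation or finite-difference extrapolators.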
Attenuation of multiple diffractions using a cascaded noise removal sequence
John Brittan and Andy Wrench, PGS Geophysical, UK, describe a seismic processing sequence to suppress diffracted multiple energy. Multiple diffractions are a significant noise problem on seismic data from many of the world's most important hydrocarbon-rich provinces. They are most prevalent where deep water overlies a complex near-surface reflection sequence. In such regimes, multiples of energy diffracted by the near-surface geology are often coincident in time with primary reflections from within the subsurface. The energy composing the primary reflections has a considerably reduced amplitude and frequency content owing to its long travel path through the Earth. By contrast, the multiples from the near-surface reflectors have travelled most of their propagation path through the very weakly absorbing sea layer. Thus the deeper parts of the seismic section are dominated, particularly at high frequencies, by incoherent, high-amplitude diffracted multiple arrivals. Due to their aliased, non-hyperbolic nature, these arrivals are difficult to suppress using standard demultiple methods, e.g. 2D SRME (surface-related multiple elimination) or parabolic Radon demultiple. It has been shown that the use of high-fold, multi-azimuth data acquisition can considerably increase the effectiveness of CMP stacking in removing diffracted multiple energy (Widmaier et al., 2002). However, for standard marine acquisition, a rigorous noise attenuation methodology must be adopted. Indeed, while techniques such as 3D SRME hold great promise for multiple diffraction attenuation (van Borselen et al., 2004), we discuss in this paper how, for typical field data, a cascaded sequence of (up to) four different processes can be used to attenuate the problem events (Figure 1).
High-resolution processing for time-lapse seismic
D. Lecerf and C. Reiser, CGG London, review some of the issues in the processing of 4D seismic data. Time-lapse seismic technology has become an effective reservoir management tool for monitoring fluid flow and detecting undrained reservoir compartments. Geophysicists have taken up the challenge of merging the information available from time-lapse seismic with the output of reservoir simulation models and the available reservoir knowledge. Optimizing this information across the disciplines runs into a major problem of scales and uncertainties. Improving the vertical resolution of the seismic data is a key factor in this overall reconciliation. Improved resolution relies on the ability to recover the high-frequency content of seismic data. CGG has developed a processing methodology that provides a 4D seismic signature with an extended frequency bandwidth. Preconditioning of the seismic data is the key to removal of spatially correlated noise, such as the acquisition imprint, as well as random noise, which can contaminate the high-frequency content of the seismic. In addition, working with a larger bandwidth in the time-lapse context forces a review of the conventional frequency matching procedure.
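The conventional frequency matching the abstract alludes to typically shapes the amplitude spectrum of the monitor survey to that of the base survey, so that residual 4D differences reflect the reservoir rather than acquisition or processing differences. The Python/NumPy fragment below is a minimal zero-phase, single-trace illustration of such a shaping filter; it is a hypothetical sketch, not CGG's methodology.

```python
import numpy as np

def match_spectrum(base, monitor, eps=1e-3):
    """Shape the monitor trace's amplitude spectrum to the base trace's.

    Zero-phase, amplitude-only correction: the monitor's phase is kept,
    and each frequency bin is rescaled toward the base amplitude.
    eps stabilizes the ratio where the monitor spectrum is near zero.
    """
    B = np.fft.rfft(base)
    M = np.fft.rfft(monitor)
    gain = np.abs(B) / (np.abs(M) + eps)   # per-bin amplitude correction
    return np.fft.irfft(M * gain, n=len(monitor))
```

In practice the spectral ratio would be estimated over many traces and smoothed, and phase differences handled separately; extending the bandwidth, as the article proposes, forces this matching to remain stable at the high frequencies where signal-to-noise is poorest.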
Well-driven seismic: 3D data processing solutions from wireline logs and borehole seismic data
Steve Morice, product champion, well-driven seismic, WesternGeco, Jean-Claude Puech, borehole seismic coordinator, Schlumberger, Europe, CIS and Africa, and Scott Leaney, geophysics advisor, Schlumberger, describe how new integrated seismic data processing techniques, combined with a rich borehole geophysics dataset, offer opportunities to significantly enhance 3D seismic preconditioning, prestack imaging and inversion to elastic properties. Meeting the demands for higher resolution, signal-to-noise ratio, positional accuracy and amplitude fidelity from 3D seismic data requires enhanced algorithms and techniques throughout the entire data processing sequence. Well data, in the form of wireline logs and vertical seismic profiles (VSPs), provide unique constraints on key seismic data processing parameters as well as a calibration of the 3D processing sequence in terms of well-to-seismic ties. New methods for the combined analysis of wireline logs, VSPs and prestack seismic data provide invaluable information about seismic velocities (P and S), anelastic attenuation (Q) factors, velocity anisotropy (of various symmetry axes) and multiples. The logs, VSPs and seismic data should be processed and analysed together - reconciling differences due to the basic geophysics of the various measurements, the range of resolution scales, and the different sources of errors and uncertainties - for a unified and self-consistent borehole and surface-seismic dataset. The derivation of seismic properties from well logs and VSPs is covered extensively in the geophysical literature (see Further Reading below). The purpose of this article is to describe and illustrate how these properties can be applied to address some of the major challenges in conventional 3D seismic data processing.
Seismic processing: past, present and future
Murray Roth, executive vice president of marketing & systems, Landmark Graphics, explains how seismic data processing will have to adapt to the looming scenario of a seriously shrinking pool of expertise facing an explosion in the growth of data volumes. Since the dawn of digital seismic processing more than 40 years ago, the basic processing workflow has changed very little, despite huge strides in the evolution of computing hardware and algorithms. Seismic processors still follow time-honoured steps to take field data through statics corrections, noise attenuation and parameter selection, ultimately ending up with a migrated, stacked image of the subsurface. Unfortunately, industry resources are shrinking while data volumes are exploding. Revolutionizing the way seismic processors do their daily work is not only desirable but essential. This article reviews where we've come from, where we are today, and where we can reasonably expect to go in the near future based on emerging innovations in information technology.
Origins of seismic processing
The first real computer application in the oil patch was seismic data processing. During the 1950s, analogue seismic surveys were acquired, processed and interpreted in the field by a single individual known as a 'human computer'. He laid out the survey lines and supervised the shoot by day. At night, he 'processed' the analogue shot records, marked subsurface reflectors of interest, hand-timed and plotted each trace on paper, and drew a rough cross-section, looking for structural highs to recommend for drilling. By the late 1950s, analogue seismic records were being converted into digital form using big number-crunching computers (Figure 1). To do this, of course, petroleum companies had to pull the seismic processing step out of the field and move it into a centralized computer facility.
Now field crews recorded data on magnetic tape and sent it to a processing centre, where raw data were turned into clean paper sections, which were forwarded to an office somewhere else, where seismic interpreters focused on mapping structures and identifying prospects. The original, unified prospect generation 'value chain' - from seismic acquisition through processing to interpretation - although enhanced by new technology, was nevertheless fragmented into increasingly isolated specialties. Only in recent years have these begun to reunite, through even more advanced information technologies. From the 1960s through the late 1980s, batch seismic processing sequences were executed overnight on mainframe computers. In the 1990s, certain parts of the processing workflow moved onto a range of powerful new computers, from interactive workstations to massively parallel supercomputers. During that decade, interpreters also adopted increasingly sophisticated computer systems for the analysis and visualization of processed seismic data.
Marine controlled-source electromagnetic imaging for hydrocarbon exploration: interpreting subsurface electrical properties
Michael J. Tompkins, senior research geophysicist, Offshore Hydrocarbons Mapping, Aberdeen, UK, contributes to the growing body of knowledge on interpretation issues for marine controlled-source electromagnetic imaging, now an established commercial technique for detecting hydrocarbon reservoirs.
A new approach to enhancement of frequency bandwidth of surface seismic data
S. Chopra and V. Alexeev examine the frequency-dependent absorption of seismic energy and how it can be detected in seismic data. It is a common observation that seismic waves propagating through the earth are attenuated. As these elastic waves travel deeper they lose energy through absorption, a mechanism distinct from spherical spreading, in which energy is spread over an ever wider wavefront, and from reflection and transmission at interfaces, which merely redistribute energy in the upward or downward directions. This loss is frequency dependent: higher frequencies are absorbed more rapidly than lower frequencies, such that the highest frequency usually recovered on most seismic data is about 80 Hz. Moreover, absorption appears to vary with the lithology of the medium: the unconsolidated near-surface absorbs more energy than the underlying compact rocks, and in the extreme case most of the energy may be absorbed in the first few hundred metres of the subsurface. It is therefore important to study absorption and to determine ways in which it can be detected in seismic data.
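The frequency-dependent loss described above is commonly modelled with a constant-Q absorption law, in which a plane wave of frequency f travelling for time t is scaled by exp(-πft/Q). The short Python sketch below illustrates why high frequencies vanish first; the Q value and traveltime are illustrative choices, not figures from the article.

```python
import numpy as np

def q_attenuation(f_hz, traveltime_s, q):
    """Constant-Q amplitude decay factor: exp(-pi * f * t / Q)."""
    return np.exp(-np.pi * f_hz * traveltime_s / q)

# After 2 s of propagation with Q = 100, an 80 Hz component retains far
# less amplitude than a 20 Hz one - hence the loss of high frequencies
# on deep reflections.
print(q_attenuation(20.0, 2.0, 100.0))   # ~0.28
print(q_attenuation(80.0, 2.0, 100.0))   # ~0.0065
```

Measuring this decay between two depth levels (for example, from spectral ratios of reflections or of VSP arrivals) is one standard route to estimating Q, and hence to designing the inverse filters that attempt to restore the lost bandwidth.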