Geophysical Prospecting, Volume 60, Issue 4 (2012): Simultaneous Source Methods for Seismic Data
A 3D simultaneous source field test processed using alternating projections: a new active separation method
Authors: Craig J. Beasley, Bill Dragoset and Antoun Salama

Abstract: In 2008, a wide azimuth simultaneous source (SimSrc) 3D data set was acquired over an area coinciding with an existing conventional wide azimuth survey. Using two (nearly) simultaneous sources provided twice the number of shots, an increased shot density and twice the fold for essentially the same acquisition time. Although the improved sampling for the SimSrc data was predicted to yield improvements in processes such as 3D multiple attenuation and migration, cross‐talk between the sources was a concern. Cross‐talk can be suppressed by known techniques such as separation algorithms that estimate the individual shots and thereby separate SimSrc records into their constituent shots. However, separation algorithms available at the time suffered from aliasing and were not effective for severely shot‐aliased data such as those encountered in the experimental data set, which had a 75‐m in‐line shot spacing, half that typical for wide azimuth geometries.
In this paper we present a new separation technique called the Alternating Projection Method (APM), which is demonstrated to be robust in the presence of aliasing. Through comparisons with conventional wide azimuth data we show that 3D multiple removal and migration are both aided by the increase in shots. Moreover, the high quality of the APM separated data indicates that prestack products such as migration angle gathers will have better quality than was achieved a few years ago using a passive separation approach. We found that by carefully selecting and tailoring the method's projection operators, excellent separation results can be achieved efficiently.
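The paper's tailored projection operators are not reproduced here. Purely as an illustration of the alternating projection idea the method builds on, the following sketch (an assumed toy problem, not the authors' algorithm) projects back and forth between two convex constraint sets until the iterate lies in their intersection:

```python
import numpy as np

def project_hyperplane(x):
    # Project onto the set {x : sum(x) = 1}.
    return x - (x.sum() - 1.0) / x.size

def project_nonnegative(x):
    # Project onto the set {x : x >= 0}.
    return np.maximum(x, 0.0)

def alternating_projections(x, n_iter=200):
    # Von Neumann-style alternating projections: repeatedly project onto
    # each constraint set in turn; for convex sets the iterates converge
    # to a point in the intersection (here, the probability simplex).
    for _ in range(n_iter):
        x = project_nonnegative(project_hyperplane(x))
    return x

x = alternating_projections(np.array([3.0, -2.0, 0.5]))
```

In a separation setting, the two sets would instead encode the constraints on each source's estimated record; the convergence mechanism is the same.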
How technology drives high‐productivity vibroseis: a historical perspective
Authors: Denis Mougenot and Daniel Boucard

Abstract: The development of high‐productivity seismic recording using the vibroseis method over the last 30 years is treated from a Sercel perspective. The simultaneous use of vibrators has been highly dependent on real‐time field computing capabilities such as those delivered by the correlator‐stackers in the early 1980s. A second step forward, in the mid‐1980s to early 1990s, was related to digital vibrator electronics, which provided efficient management and quality control of fleets of vibrators. However, it was only in the late 1990s, with the advent of real‐time satellite positioning (GPS) of these fleets, that alternate or simultaneous sweeping was commonly used in production. During the last ten years, thanks to GPS timing, continuous data recording and innovative simultaneous sourcing methodologies, vibroseis has been able to reach unexpected levels of productivity. As a result, the cost per seismic trace has dropped, enabling denser spatial sampling and associated seismic imaging improvements.
Unconstrained simultaneous source land data processing

Authors: Shoudong Huo, Constantine Tsingas, Panos G. Kelamis, Peter I. Pecholcs and Hai Xu

Abstract: In the past few years, significant effort has been put into developing land acquisition field techniques that can improve production performance, reduce acquisition cost and increase seismic data quality. In 2010, Saudi Aramco conducted a high‐productivity, unconstrained, simultaneous source acquisition field test. A fixed, active super‐spread continuously recorded 18 single vibrators operating simultaneously in 18 separate quadrants. The test employed high source density, unique up‐sweep lengths and spatial source separation to maximize the seismic data quality and minimize the effects of crosstalk. The test achieved an optimum productivity rate of 45,000 vibrator points (VP) per 24 hours with real‐time QC. In this paper, we demonstrate that conventional processing techniques produced acceptable images despite the interference noise, or crosstalk. In addition, we illustrate suitable deblending processing workflows and algorithms for attenuating source interference noise. The objective of the deblending processing is to attenuate random uncompressed crosstalk and to improve the resolution of the prestack migrated time image. Finally, comparisons with legacy 3D seismic data acquired over the same area are shown. This paper reviews how this unique integration of data acquisition and deblending processing yields a superior seismic image.
The acquisition and processing of dithered slip‐sweep vibroseis data
Authors: Claudio Bagaini, Mark Daly and Ian Moore

Abstract: We introduce the distance‐separated dithered slip‐sweep vibroseis acquisition technique, with the objective of maximizing acquisition productivity for land data whilst maintaining the prestack amplitude fidelity required for detailed amplitude analysis. High productivity is achieved through the use of simultaneous sources, and the technique is enabled by the large channel count, continuous recording and rule‐based acquisition technologies inherent in modern land acquisition systems. The technique is very flexible and allows surveys to be designed such that the source‐interference effects, after source separation, are limited to an acceptable level.
Two examples illustrate the effectiveness of the technique. Firstly, we simulate a 2D, dithered data set by summing pairs of shots acquired conventionally. After separation using a sparse inversion technique, the interference noise due to simultaneous shooting is substantially eliminated and the prestack and post‐stack data are comparable with those obtainable with sequential shooting using the same number of shot locations.
The second example is based on a 3D field test, in which up to 3 fleets of 2 vibrators were swept simultaneously and up to 4 fleets swept with some time overlap. The simultaneous‐source data set is coincident with existing data sets acquired with flip‐flop and slip‐sweep methods in the same area, each requiring significantly more acquisition effort. Migrated images for all three data sets are equivalent, even when no specific attempt is made to remove the interference in the simultaneous‐source data set. Separation using sparse inversion is effective in attenuating this interference prestack and is necessary for detailed amplitude analysis. It is envisaged that separation by sparse inversion will become more important as the number of simultaneous sources is increased, or the minimum distance between fleets that sweep simultaneously is decreased.
An inversion approach to separating sources in marine simultaneous shooting acquisition – application to a Gulf of Mexico data set
Authors: Roald van Borselen, Rolf Baardman, Tony Martin, Bedanta Goswami and Eivind Fromyr

Abstract: The simultaneous firing of marine sources can provide a significant uplift in terms of acquisition efficiency and data quality enhancement. However, the seismic interference resulting from one or more ‘other’ sources needs to be well understood, and the appropriate processing strategies need to be developed, for the method to fulfil its promise.

In this paper, a modified inversion approach is presented for the effective separation of sources in marine simultaneous shooting acquisition. The method aims to distribute all energy in the simultaneous shot records by reconstructing the individual shot records at their respective locations. The method is applied to a wide azimuth data set acquired in the Gulf of Mexico, where two sources out of four in total were fired simultaneously. Results demonstrate that the individual sources can be separated satisfactorily, both at the prestack and post‐stack level.
Randomized marine acquisition with compressive sampling matrices
Authors: Hassan Mansour, Haneet Wason, Tim T.Y. Lin and Felix J. Herrmann

Abstract: Seismic data acquisition in marine environments is a costly process that calls for the adoption of simultaneous‐source or randomized acquisition, an emerging technology that is stimulating both geophysical research and commercial efforts. Simultaneous marine acquisition calls for the development of a new set of design principles and post‐processing tools. In this paper, we discuss the properties of a specific class of randomized simultaneous acquisition matrices and demonstrate that sparsity‐promoting recovery improves the quality of reconstructed seismic data volumes. We propose a practical randomized marine acquisition scheme where the sequential sources fire airguns only at randomly time‐dithered instances. We demonstrate that the recovery using sparse approximation from random time‐dithering with a single source approaches the recovery from simultaneous‐source acquisition with multiple sources. Established findings from the field of compressive sensing indicate that the choice of a sparsifying transform that is incoherent with the compressive sampling matrix can significantly impact the reconstruction quality. Leveraging these findings, we then demonstrate that the compressive sampling matrix resulting from our proposed sampling scheme is incoherent with the curvelet transform. The combined measurement matrix exhibits better isometry properties than other transform bases such as a non‐localized multidimensional Fourier transform. We illustrate our results with simulations of ‘ideal’ simultaneous‐source marine acquisition, which dithers both in time and space, compared with periodic and randomized time‐dithering.
Multisource least‐squares migration of marine streamer and land data with frequency‐division encoding
Authors: Yunsong Huang and Gerard T. Schuster

Abstract: Multisource migration of phase‐encoded supergathers has shown great promise in reducing the computational cost of conventional migration. The accompanying crosstalk noise, in addition to the migration footprint, can be reduced by least‐squares inversion. But the application of this approach to marine streamer data is hampered by the mismatch between the limited number of live traces per shot recorded in the field and the pervasive number of traces generated by the finite‐difference modelling method. This leads to a strong mismatch in the misfit function and results in strong artefacts (crosstalk) in the multisource least‐squares migration image. To eliminate this noise, we present a frequency‐division multiplexing (FDM) strategy with iterative least‐squares migration (ILSM) of supergathers. The key idea is, at each ILSM iteration, to assign a unique frequency band to each shot gather. In this case there is no overlap between the crosstalk spectra of the migrated shot gathers, so the spectral crosstalk product m(x, ωi) m(x, ωj) vanishes unless i = j. Our results in applying this method to 2D marine data for a SEG/EAGE salt model show better resolved images than standard migration, computed at about 1/10th of the cost. Similar results are achieved after applying this method to synthetic data for a 3D SEG/EAGE salt model, except that the acquisition geometry is similar to that of a marine OBS survey. Here, the speedup of this method over conventional migration is more than tenfold. We conclude that multisource migration for a marine geometry can be successfully achieved by a frequency‐division encoding strategy, as long as crosstalk‐prone sources are segregated in their spectral content. This is both the strength and the potential limitation of this method.
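The frequency-division idea can be sketched directly: if each shot gather is restricted to its own disjoint band, the crosstalk product of any two spectra vanishes. The band limits and random traces below are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

n = 256
freqs = np.fft.rfftfreq(n, d=0.004)        # 4 ms sampling -> 0..125 Hz

def bandpass(trace, f_lo, f_hi):
    # Keep only the FFT coefficients inside [f_lo, f_hi), the frequency
    # band uniquely assigned to this shot gather for one iteration.
    spec = np.fft.rfft(trace)
    spec[(freqs < f_lo) | (freqs >= f_hi)] = 0.0
    return np.fft.irfft(spec, n)

rng = np.random.default_rng(1)
shot_a = bandpass(rng.standard_normal(n), 5.0, 30.0)
shot_b = bandpass(rng.standard_normal(n), 30.0, 55.0)

# Disjoint bands: the spectral crosstalk product is (numerically) zero,
# so the two shots do not interfere in the migrated image.
crosstalk = np.fft.rfft(shot_a) * np.fft.rfft(shot_b)
```

Cycling the band assignment across ILSM iterations, as the abstract describes, lets every shot eventually contribute its full bandwidth.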
Multi‐source least‐squares reverse time migration
Authors: Wei Dai, Paul Fowler and Gerard T. Schuster

Abstract: Least‐squares migration has been shown to improve image quality compared to the conventional migration method, but its computational cost is often too high to be practical. In this paper, we develop two numerical schemes to implement least‐squares migration with the reverse time migration method and the blended source processing technique to increase computational efficiency. By iterative migration of supergathers, which consist of sums of many phase‐encoded shots, the image quality is enhanced and the crosstalk noise associated with the encoded shots is reduced. Numerical tests on 2D HESS VTI data show that the multisource least‐squares reverse time migration (LSRTM) algorithm suppresses migration artefacts, balances the amplitudes, improves image resolution and reduces crosstalk noise associated with the blended shot gathers. For this example, multisource LSRTM is about three times faster than the conventional RTM method. For the 3D example of the SEG/EAGE salt model, with a comparable computational cost, multisource LSRTM produces images with more accurate amplitudes, better spatial resolution and fewer migration artefacts than conventional RTM. The empirical results suggest that multisource LSRTM can produce more accurate reflectivity images than conventional RTM at a similar or lower computational cost. The caveat is that the LSRTM image is sensitive to large errors in the migration velocity model.
Efficient least‐squares imaging with sparsity promotion and compressive sensing
Authors: Felix J. Herrmann and Xiang Li

Abstract: Seismic imaging is a linearized inversion problem relying on the minimization of a least‐squares misfit functional as a function of the medium perturbation. The success of this procedure hinges on our ability to handle large systems of equations – whose size grows exponentially with the demand for higher resolution images in more and more complicated areas – and our ability to invert these systems given a limited amount of computational resources. To overcome this ‘curse of dimensionality’ in problem size and computational complexity, we propose a combination of randomized dimensionality‐reduction and divide‐and‐conquer techniques. This approach allows us to take advantage of sophisticated sparsity‐promoting solvers that work on a series of smaller subproblems, each involving a small randomized subset of data. These subsets correspond to artificial simultaneous‐source experiments made of random superpositions of sequential‐source experiments. By changing these subsets after each subproblem is solved, we are able to attain an inversion quality that is competitive while requiring fewer computational and, possibly, fewer acquisition resources.
Application of this concept to a controlled series of experiments shows the validity of our approach and the relationship between its efficiency – by reducing the number of sources and hence the number of wave‐equation solves – and the image quality. Application of our dimensionality‐reduction methodology with sparsity promotion to a complicated synthetic with a well‐log constrained structure also yields excellent results underlining the importance of sparsity promotion.
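The subset-cycling idea can be illustrated on a toy linear system. The sketch below is an assumed analogue (random row subsets with plain minimum-norm updates, a block-Kaczmarz flavour, in place of the paper's sparsity-promoting curvelet-domain solves): each pass solves a small randomized subproblem, and the subset is redrawn after every solve:

```python
import numpy as np

rng = np.random.default_rng(2)

# Overdetermined toy "imaging" system: many sources (rows), few unknowns.
n_src, n_par = 400, 30
A = rng.standard_normal((n_src, n_par))
x_true = rng.standard_normal(n_par)
b = A @ x_true

x = np.zeros(n_par)
for _ in range(60):
    # Each subproblem sees only a small random subset of sources (an
    # artificial simultaneous-source experiment); the subset is redrawn
    # after every solve, as in the paper's strategy.
    rows = rng.choice(n_src, size=15, replace=False)
    A_s, b_s = A[rows], b[rows]
    # Minimum-norm update that solves the reduced subproblem exactly;
    # the paper uses sparsity-promoting solves at this step instead.
    dx, *_ = np.linalg.lstsq(A_s, b_s - A_s @ x, rcond=None)
    x = x + dx
```

Although each subproblem is underdetermined on its own, cycling through fresh random subsets drives the estimate to the full-data solution at a fraction of the per-solve cost.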
Illumination properties and imaging promises of blended, multiple‐scattering seismic data: a tutorial
Authors: A.J. Berkhout, D.J. Verschuur and G. Blacquière

Abstract: In traditional seismic surveys the firing time between shots is such that the shot records do not interfere in time. In the concept of blended acquisition, however, shot records do overlap, allowing denser source sampling and wider azimuths in an economic way. Denser shot sampling and wider azimuths mean that each subsurface gridpoint is illuminated from a larger number of angles, which improves the image quality in terms of signal‐to‐noise ratio and spatial resolution. In this tutorial, we show that, even with very simple blending parameters like time delays, the incident wavefield at a specific subsurface gridpoint represents a dispersed time series with a complex code. For shot record migration purposes, this time series should be designed such that it facilitates a stable inversion process. In the next step, we explain how the illumination property of the incident wavefield can be further improved by utilizing the surface‐related multiples. This means that these multiples can be exploited to improve the incident wavefield by filling angle gaps in the illumination and/or by extending the range of angles (‘natural blending’). In this way the energy contained in the multiples contributes to the image, rather than degrading its quality. One remarkable consequence of this property is that the benefits to be obtained from the improved illumination depend on the detector locations in the acquisition geometry as well. We explain how to quantify the contribution of the blended surface multiples to the incident wavefield for a blended source configuration. In addition, we explain how blended measurements can be directly used in an angle‐dependent migration process with a bifocal, least‐squares imaging condition. The result is a densely sampled reflectivity function in the ray parameter domain, which also improves the capability of velocity estimation. We show examples to illustrate the theory.
Full waveform inversion and distance separated simultaneous sweeping: a study with a land seismic data set
Authors: René‐Édouard Plessix, Guido Baeten, Jan Willem de Maag, Fons ten Kroode and Zhang Rujie

Abstract: Dense, wide‐aperture and broad frequency band acquisition improves seismic imaging and potentially allows the use of full waveform inversion for velocity model building. The cost of dense acquisition, however, limits its applicability. Blended or simultaneous shooting could offer a good compromise between cost and dense acquisition, although the cross‐talk between simultaneous sweeps may reduce imaging capabilities. Onshore, a compromise is achieved with distance separated simultaneous sweeping acquisition, because the shots are easily separated when the processing focuses on pre‐critical reflected events. Full waveform inversion for velocity model building, however, relies on post‐critical reflected, refracted and diving events. These events can interfere in a distance separated simultaneous sweeping acquisition. Using a single‐vibrator, single‐receiver data set recorded in Inner Mongolia, China, a distance separated simultaneous sweeping data set is created to study the robustness of full waveform inversion in this acquisition context. This data set is well suited for full waveform inversion since it contains frequencies down to 1.5 Hz and offsets up to 25 km. Full waveform inversion after a crude deblending of the distance separated simultaneous sweeping data set leads to a result very similar to the one obtained from the single‐vibrator, single‐receiver data set. The inversion of the blended data set gives a slightly poorer result because of the cross‐talk, but it is still quite satisfactory.
Application of multi‐source waveform inversion to marine streamer data using the global correlation norm
Authors: Yunseok Choi and Tariq Alkhalifah

Abstract: Conventional multi‐source waveform inversion using an objective function based on the least‐squares misfit cannot be applied to marine streamer acquisition data because of inconsistent acquisition geometries between observed and modelled data. To apply multi‐source waveform inversion to marine streamer data, we use the global correlation between observed and modelled data as an alternative objective function. The new residual seismogram derived from the global correlation norm attenuates modelled data not supported by the configuration of the observed data and thus can be applied to multi‐source waveform inversion of marine streamer data. We also show that the global correlation norm is theoretically equivalent to the least‐squares norm of the normalized wavefield. To calculate the gradient efficiently, our method employs a back‐propagation algorithm similar to reverse‐time migration, based on the adjoint state of the wave equation. In numerical examples, multi‐source waveform inversion using the global correlation norm yields better results for marine streamer acquisition data than the conventional approach.
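The stated equivalence between the global correlation norm and the least-squares norm of the normalized wavefield can be checked numerically: for unit-energy traces, 0.5*||d_hat - u_hat||^2 = 1 - d_hat.u_hat, so minimizing one maximizes the other. The traces below are assumed toy data:

```python
import numpy as np

rng = np.random.default_rng(3)
d = rng.standard_normal(200)   # observed trace (toy)
u = rng.standard_normal(200)   # modelled trace (toy)

# Normalize each wavefield to unit energy.
d_hat = d / np.linalg.norm(d)
u_hat = u / np.linalg.norm(u)

# Global correlation objective (maximized, or its negative minimized).
corr = d_hat @ u_hat

# Least-squares misfit of the normalized wavefields.
ls_norm = 0.5 * np.sum((d_hat - u_hat) ** 2)

# Identity: 0.5*||d_hat - u_hat||^2 = 0.5*(1 + 1 - 2*corr) = 1 - corr.
```

Because the normalization removes absolute amplitude, the objective is insensitive to modelled energy that the streamer geometry never records, which is the property the abstract exploits.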
Attenuating crosstalk noise with simultaneous source full waveform inversion

Authors: Antoine Guitton and Esteban Díaz

Abstract: For some acquisition geometries, the cost of Full Waveform Inversion (FWI) can be considerably reduced by inverting simultaneously encoded shots. Encoded‐shot strategies have the undesirable effect of leaving crosstalk noise in the final result. For FWI, changing the coding sequence periodically mitigates this effect. Another alternative is to use preconditioning, whereby the gradient is smoothed at every iteration along predefined directions. Preconditioning steers the solution towards accurate models while attenuating crosstalk artefacts. It also increases convergence speed and robustness to noise present in the data.
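As an illustration of the preconditioning idea (an assumed toy, not the authors' operator): smoothing a gradient along a predefined direction suppresses incoherent, crosstalk-like noise while preserving the smooth component of the update:

```python
import numpy as np

def smooth_along_axis(grad, length, axis):
    # Moving-average smoothing along one predefined direction, a simple
    # stand-in for the directional preconditioner described in the paper.
    kernel = np.ones(length) / length
    return np.apply_along_axis(
        lambda v: np.convolve(v, kernel, mode="same"), axis, grad)

rng = np.random.default_rng(4)
nz, nx = 50, 80
# Smooth "true" gradient (varies with depth only) plus incoherent noise
# standing in for crosstalk between encoded shots.
trend = np.linspace(0.0, 1.0, nz)[:, None] * np.ones((1, nx))
noise = 0.5 * rng.standard_normal((nz, nx))
grad = trend + noise

grad_pc = smooth_along_axis(grad, length=9, axis=1)  # smooth horizontally
```

Applied at every iteration, this kind of operator steers the model update towards laterally coherent structure, which is why it also improves convergence in the presence of noise.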
Convergence analysis of a coherency‐constrained inversion for the separation of blended data
Authors: Panagiotis Doulgeris, Kenneth Bube, Gary Hampson and Gerrit Blacquière

Abstract: Blended or simultaneous sources have been the focus of a good deal of interest recently. In conventional acquisition the time intervals between successive sources are large enough to avoid interference in time. In blended acquisition, temporal overlap between source responses is allowed. The procedure of retrieving data as if they were acquired in the conventional way is called deblending. This is an essential step if standard processing flows are to be applied.
Several inversion techniques have been proposed for solving this ill‐posed problem. We study the properties of an iterative estimation and subtraction algorithm that integrates a coherency‐pass filter in a dedicated iteration. We begin by stating the problem we wish to solve and develop a new, more general, interpretation of the method. We then apply an algebraic analysis of the iteration to establish the convergence characteristics of the algorithm. In order to facilitate this analysis, the notion of leakage subspace is introduced, i.e., a subspace where energy that cannot be uniquely assigned to one of the sources resides. We find that a unique solution exists, if, and only if, there is no leakage subspace. If a unique solution does not exist, then the iteration converges to a least‐norm solution contaminated by the projection of the initial guess onto the leakage subspace. The insights gained by this analysis lead us to the development of a simple tool that can provide valuable information during the design of a blended survey. Finally we present results from the application of this method to real blended marine data and then draw our conclusions.
Iterative method for the separation of blended seismic data: discussion on the algorithmic aspects
Authors: Araz Mahdad, Panagiotis Doulgeris and Gerrit Blacquière

Abstract: Recently much attention has been given to the possibility of shooting in an overlapping fashion, so‐called blended or simultaneous acquisition. In conventional acquisition the time intervals between successive sources are large enough to avoid interference in time. In blending, a temporal overlap between source responses is allowed. This additional degree of freedom in survey design has the potential to significantly reduce seismic acquisition costs while maintaining or improving data quality. Deblending is the procedure of retrieving data as if they were acquired in the conventional, unblended way. This is an essential step where standard processing flows are applied. Several methods have been proposed to perform the data separation, with the majority falling into two categories. The first category consists of methods that filter out the blending noise by arranging the seismic data in some domain, whereas inversion techniques fall into the second category. We recently introduced an iterative estimation and subtraction algorithm that integrates elements of both categories. This method exploits the fact that the character of the blending noise differs between domains, e.g., it is coherent in the common source domain but incoherent in the common receiver, common‐offset or common midpoint domains. Until now, our method has relied on interaction with a human operator. The automation of the thresholding process is addressed in this paper, leading to a hands‐off algorithm for the separation of blended data, optimized for both efficiency and effectiveness. We found that one of the major limiting factors is the edge artefacts generated by the coherence‐pass filter. The effectiveness of the coherence‐pass filter has a considerable influence on the convergence of our deblending algorithm. This is shown by testing different coherence‐pass filters on marine data as well as by introducing errors into the coherence‐pass process. We show that these data estimation errors can be handled properly and that the best result is obtained using a τ‐p filter as the coherence‐pass filter. Furthermore, very promising results are obtained on numerically‐blended land data.
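A minimal sketch of the iterative estimation and subtraction loop, under strong simplifying assumptions (flat events, random per-trace firing dithers, and a median across traces standing in for the coherence-pass filter, which the paper implements as a τ‐p filter):

```python
import numpy as np

rng = np.random.default_rng(5)
nt, ntr = 200, 24                      # time samples, traces
taus = rng.integers(0, 60, size=ntr)   # random firing-time dither per trace

# Two "shot gathers" with trace-consistent (flat) events.
s1 = np.zeros((nt, ntr)); s1[[40, 90, 140], :] = np.array([[1.0], [0.7], [0.5]])
s2 = np.zeros((nt, ntr)); s2[[30, 80, 120], :] = np.array([[0.9], [0.6], [0.4]])

def delay(x, t):
    y = np.zeros(nt); y[t:] = x[:nt - t]; return y

def undelay(x, t):
    y = np.zeros(nt); y[:nt - t] = x[t:]; return y

# Blend: shot 2 fires with a different random dither on every trace, so
# its energy is incoherent when the data are sorted into shot 1's domain.
d = np.stack([s1[:, j] + delay(s2[:, j], taus[j]) for j in range(ntr)], axis=1)

def coherence_pass(gather):
    # Keep only trace-consistent energy; the median across traces is a
    # crude stand-in for the coherence-pass filter.
    med = np.median(gather, axis=1)
    return np.tile(med[:, None], (1, ntr))

# Iterative estimation and subtraction.
e1 = np.zeros((nt, ntr)); e2 = np.zeros((nt, ntr))
for _ in range(3):
    r1 = d - np.stack([delay(e2[:, j], taus[j]) for j in range(ntr)], axis=1)
    e1 = coherence_pass(r1)            # estimate of shot 1
    r2 = np.stack([undelay(d[:, j] - e1[:, j], taus[j]) for j in range(ntr)],
                  axis=1)
    e2 = coherence_pass(r2)            # estimate of shot 2
```

Because the interference is incoherent in each shot's own domain, the coherence filter passes the signal and rejects the blending noise, and the subtraction refines both estimates at every pass.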
On the relation between seismic interferometry and the simultaneous‐source method
Authors: Kees Wapenaar, Joost van der Neut and Jan Thorbecke

Abstract: In seismic interferometry the response to a virtual source is created from responses to sequential transient sources or simultaneous noise sources. Most methods use crosscorrelation, but recently seismic interferometry by multidimensional deconvolution (MDD) has been proposed as well. In the simultaneous‐source method (also known as blended acquisition), overlapping responses to sources with small time delays are recorded. The crosstalk that occurs in imaging of simultaneous‐source data can be reduced by using phase‐encoded sources or simultaneous noise sources, by randomizing the time intervals between shots, or by inverting the blending operator. Seismic interferometry and the simultaneous‐source method are related. In this paper we make this relation explicit by deriving deblending as a form of seismic interferometry by MDD. Moreover, we discuss a deblending algorithm for blended data acquired at the surface.
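The idea of inverting the blending operator can be made concrete with an explicit matrix. The sketch below is an assumed toy (simple shift operators and a plain least-squares solve, not the paper's MDD formulation): a single blended record leaves a null space, because overlapping energy cannot be uniquely assigned to a source, while a second record of the same shots with a different delay removes it:

```python
import numpy as np

n = 48                   # samples per shot record
tau_a, tau_b = 7, 19     # two different firing-time delays

def blending_matrix(tau):
    # Blending operator for one record: shot 1 at zero delay plus shot 2
    # delayed by tau samples, summed into one continuous trace.
    m = n + tau
    G = np.zeros((m, 2 * n))
    for i in range(n):
        G[i, i] = 1.0               # shot 1 sample i
        G[i + tau, n + i] = 1.0     # shot 2 sample i, delayed
    return G

# One blended record alone is rank-deficient (its null space holds the
# energy that cannot be assigned uniquely); stacking a second record
# with a different delay makes the combined operator full column rank.
G = np.vstack([blending_matrix(tau_a), blending_matrix(tau_b)])

rng = np.random.default_rng(6)
shots = rng.standard_normal(2 * n)   # the two true shot records, stacked
blended = G @ shots                  # the blended observations
deblended, *_ = np.linalg.lstsq(G, blended, rcond=None)
```

The redundancy supplied here by a second delay plays the role that multiple receivers play in the paper's interferometric (MDD) derivation of deblending.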