Geophysical Prospecting - Volume 66, Issue 5, 2018
Seismic attenuation, normal moveout stretch, and low‐frequency shadows underlying bottom simulating reflector events
Authors: José M. Carcione, Ayman N. Qadrouh, Hervé Perroud, Davide Gei, Jing Ba and Stefano Picotti

Abstract: In many cases, the seismic response of bottom-simulating reflectors is characterised by anomalously low frequencies, the so-called "low-frequency shadow". This phenomenon is generally interpreted as attenuation due to partial saturation with free gas. In reality, the frequency loss may have multiple causes, with normal moveout stretch a possible candidate. To analyse the phenomenon, we compute synthetic seismograms assuming a lossy bottom-simulating layer, with varying quality factor and thickness, bounded by an upper hydrate-brine/gas-brine interface and a lower gas-brine/brine interface. First, we estimate the shift of the centroid frequency of the power spectrum as a function of the distance travelled by the seismic pulse. Next, we perform one-dimensional numerical experiments to quantify the frequency loss of the seismic event below the bottom-simulating reflector as a function of the quality factor of the bottom-simulating layer and its thickness (due to wave interference). Finally, we compute shot gathers to obtain stacked sections, with and without the normal moveout stretch correction and with and without wave attenuation in the bottom-simulating layer. The results indicate that the low-frequency shadow due to normal moveout stretch is stronger than that due to attenuation and may constitute a false indicator of the presence of gas. In fact, the low-frequency shadow often overlies events with higher frequencies, in contradiction with the physics of wave propagation. This is particularly evident when the low-frequency shadow is so extensive that the presence of high frequencies below it cannot be justified by the acquisition geometry.
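The centroid-frequency measure used in the first step can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code; it assumes a Ricker source spectrum with a 40 Hz peak and a constant-Q attenuation factor exp(-pi*f*t/Q):

```python
import numpy as np

# Centroid-frequency shift of a pulse propagating through a lossy layer.
# Assumed parameters: 40 Hz Ricker spectrum, quality factor Q, travel time t.

def ricker_amplitude(f, f_peak):
    """Amplitude spectrum (up to a constant) of a Ricker wavelet."""
    return (f / f_peak) ** 2 * np.exp(-((f / f_peak) ** 2))

def centroid_frequency(f, power):
    """First moment (centroid) of a power spectrum."""
    return np.sum(f * power) / np.sum(power)

f = np.linspace(0.1, 200.0, 2000)            # frequency axis (Hz)
source_power = ricker_amplitude(f, 40.0) ** 2
fc0 = centroid_frequency(f, source_power)

Q, t = 30.0, 0.5                             # quality factor, travel time (s)
attenuated_power = source_power * np.exp(-2.0 * np.pi * f * t / Q)
fc1 = centroid_frequency(f, attenuated_power)

print(fc0, fc1)  # the centroid moves down after propagation through the lossy layer
```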
An application of waveform denoising for microseismic data using polarization–linearity and time–frequency thresholding
Author: Jubran Akram

Abstract: Noise suppression, or signal-to-noise ratio enhancement, is often desired for better processing results from a microseismic dataset. In this paper, an approach based on polarization linearity and time-frequency thresholding is used for denoising waveforms. A polarization-linearity filter is first applied to preserve the signal intervals and suppress the noise amplitudes. This is followed by time-frequency thresholding in the S-transform domain for further signal-to-noise ratio enhancement. The parameterisation of both the polarization filter and the time-frequency thresholding is also discussed. Finally, real microseismic data examples demonstrate the improvements in processing results when denoised waveforms are used in the workflow. The results indicate that the denoising approach effectively suppresses the background noise while preserving the vector fidelity of the signal waveforms. Consequently, the quality of event detection, arrival-time picking, and hypocenter location improves.
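A standard polarization-linearity measure (one common choice, not necessarily the paper's exact formulation) compares the eigenvalues of the three-component covariance matrix inside a sliding window:

```python
import numpy as np

# Polarization linearity = 1 - (lam2 + lam3) / (2 * lam1), where lam1 >= lam2 >= lam3
# are eigenvalues of the 3x3 covariance of a three-component window.
# The window length and threshold would be tuning parameters in practice.

def linearity(window_3c):
    """window_3c: (n_samples, 3) array of Z, N, E amplitudes."""
    cov = np.cov(window_3c.T)
    lam = np.sort(np.linalg.eigvalsh(cov))[::-1]     # descending eigenvalues
    return 1.0 - (lam[1] + lam[2]) / (2.0 * lam[0])

rng = np.random.default_rng(0)
n = 200
# Linearly polarized "signal": one waveform projected onto a fixed direction
wave = np.sin(2 * np.pi * 5 * np.linspace(0, 1, n))
direction = np.array([0.8, 0.5, 0.33])
signal = np.outer(wave, direction) + 0.01 * rng.standard_normal((n, 3))
noise = rng.standard_normal((n, 3))                  # isotropic background noise

lin_signal = linearity(signal)
lin_noise = linearity(noise)
print(lin_signal, lin_noise)  # near 1 for the polarized arrival, much lower for noise
```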
Noise suppression for microseismic data by non‐subsampled shearlet transform based on singular value decomposition
Authors: Xiaoqiang Liang, Yue Li and Chao Zhang

Abstract: The presence of strong random noise in surface microseismic data may decrease the utility of these data. The non-subsampled shearlet transform can effectively suppress noise by applying a suitable threshold to its coefficients. However, when the signal-to-noise ratio of the data is low, the coefficients related to noise are so close to those associated with signals in the non-subsampled shearlet transform domain that noise coefficients are retained and treated as signal. The overlap between the two sets of coefficients therefore needs to be minimised before thresholding. In this paper, a singular value decomposition algorithm is applied to the non-subsampled shearlet transform coefficients, and each coefficient matrix is reconstructed by a low-rank approximation in the singular value decomposition domain. The coefficients of signals have larger singular values than those of the random noise, which implies that they can be well estimated by keeping only the few largest singular values. This property of the singular value decomposition significantly helps to reduce the overlap of noise and signal coefficients in the non-subsampled shearlet transform domain. Finally, the denoised microseismic data are obtained by applying a simple threshold to the reconstructed coefficient matrix. The performance of the proposed method is evaluated on both synthetic and field microseismic data. The experimental results illustrate that the proposed method eliminates random noise and preserves the signals of interest more effectively.
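The low-rank reconstruction step can be illustrated on its own. The sketch below keeps only the largest singular values of a noisy matrix, a stand-in for one non-subsampled shearlet transform coefficient matrix; the rank and noise level are illustrative:

```python
import numpy as np

# Truncated-SVD low-rank approximation: keep the k largest singular values,
# which carry the coherent signal, and discard the rest.

def lowrank(matrix, k):
    u, s, vt = np.linalg.svd(matrix, full_matrices=False)
    return (u[:, :k] * s[:k]) @ vt[:k, :]

rng = np.random.default_rng(1)
# Rank-2 "signal" matrix plus random noise (hypothetical coefficient matrix)
signal = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 40))
noisy = signal + 0.1 * rng.standard_normal((50, 40))

denoised = lowrank(noisy, 2)
err_before = np.linalg.norm(noisy - signal)
err_after = np.linalg.norm(denoised - signal)
print(err_before, err_after)  # the rank-2 reconstruction lies closer to the signal
```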
Joint facies and rock properties Bayesian amplitude‐versus‐offset inversion using Markov random fields
Authors: James Gunning and Mark Sams

Abstract: Seismic reflection pre-stack angle gathers can be simultaneously inverted within a joint facies and elastic inversion framework using a hierarchical Bayesian model of elastic properties and categorical classes of rock and fluid properties. The Bayesian prior implicitly supplies low-frequency information via a set of multivariate compaction trends for each rock and fluid type, combined with a Markov random field model of lithotypes that carries abundance and continuity preferences. For the likelihood, we use a simultaneous, multi-angle, convolutional model, which quantifies the data misfit probability using wavelets and noise levels inferred from well ties. Under Gaussian likelihood and facies-conditional prior models, the posterior has a simple analytic form, and the maximum a posteriori inversion problem reduces to a joint categorical/continuous non-convex optimisation problem. To solve it, a set of alternative, increasingly comprehensive optimisation strategies is described: (i) an expectation-maximisation algorithm using belief propagation, (ii) a globalisation of method (i) using homotopy, and (iii) a discrete-space approach using simulated annealing. We find that good-quality inversion results depend on sensible, elastically separable facies definitions, modest resolution ambitions, reasonably firm abundance and continuity parameters in the Markov random field, and a suitable choice of algorithm. We suggest using two to three, perhaps four, unknown facies per sample, and the more expensive methods (homotopy or annealing) when the rock types are not strongly distinguished in acoustic impedance. Demonstrations of the technique on pre-stack depth-migrated field data from the Exmouth basin show promising agreement with lithological well data, including prediction accuracy improvements of 24% in and twofold in density, in comparison to a standard simultaneous inversion. Much clearer and more extensive recovery of the thin Pyxis gas field was evident when using stronger coupling in the Markov random field model and the homotopy or annealing algorithms.
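Strategy (iii) can be caricatured in a few lines: a Gaussian per-sample likelihood plus a Potts-style Markov coupling that rewards lateral continuity, minimised by simulated annealing over a one-dimensional facies chain. Every number below is illustrative, not the paper's model:

```python
import numpy as np

# Simulated-annealing MAP sketch for a 1D two-facies chain.
# energy = Gaussian data misfit + beta * (number of facies changes).

rng = np.random.default_rng(2)
means = np.array([1.0, 3.0])                 # facies-conditional property means
true_facies = np.repeat([0, 1, 0], [10, 10, 10])
data = means[true_facies] + 0.8 * rng.standard_normal(30)

def energy(facies, beta=1.0, sigma=0.8):
    misfit = np.sum((data - means[facies]) ** 2) / (2 * sigma ** 2)
    coupling = beta * np.sum(facies[:-1] != facies[1:])   # Potts penalty
    return misfit + coupling

facies = rng.integers(0, 2, 30)              # random start
temp = 2.0
for _ in range(5000):
    i = rng.integers(30)
    trial = facies.copy()
    trial[i] = 1 - trial[i]                  # flip one sample's facies
    d = energy(trial) - energy(facies)
    if d < 0 or rng.random() < np.exp(-d / temp):
        facies = trial
    temp *= 0.999                            # cooling schedule

accuracy = np.mean(facies == true_facies)
print(accuracy)
```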
Marine, seabed, and land seismic equipment for broadband acquisition: a review
Abstract: The broadband capabilities of marine, seabed, and land seismic equipment are reviewed with respect to both the source and the receiver sides. In marine acquisition, the main issue at both ends of the spectrum relates to ghosts created at the sea surface. Broadband deghosting requires towing at variable depth to introduce notch diversity, or using new equipment such as multi-component and/or low-noise streamers. As a result, a doubling of the bandwidth from about three to six octaves (2.5–200 Hz) has been achieved. Such improvement has not yet been observed for seabed surveys, even though deghosting is a standard process on the receiver side; one issue may be the coupling of the particle-motion sensor, particularly at high frequencies.
For land acquisition, progress has come from the vibrators. New shakers and control electronics using broadband sweeps have made it possible to add two more octaves at the low-frequency end (from 8 down to 2 Hz). Whereas conventional 10 Hz geophones are still able to record such low frequencies, 5 Hz high-gain geophones or digital accelerometers enhance them to keep the signal above the noise floor. At the high end of the bandwidth, progress is not limited by equipment specifications; rather, the issue is a low signal-to-noise ratio due to the strong absorption that occurs during signal propagation. To succeed in enlarging the bandwidth, this improved equipment and these sweeps must be complemented by denser spatial sampling of the wavefield through point-source and point-receiver acquisition.
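The ghost notches and the bandwidth figure quoted above follow from two textbook relations: notch frequencies at f_n = n*v/(2d) for tow depth d, and bandwidth in octaves as log2(f_max/f_min). A quick check:

```python
import numpy as np

# Ghost notch frequencies for a receiver towed at depth d (water velocity ~1500 m/s)
def ghost_notches(depth_m, v=1500.0, n_max=3):
    return [n * v / (2.0 * depth_m) for n in range(1, n_max + 1)]

print(ghost_notches(7.5))    # shallow tow: first notch at 100 Hz
print(ghost_notches(25.0))   # deep tow: first notch at 30 Hz

# Bandwidth in octaves for the 2.5-200 Hz range quoted in the review
octaves = np.log2(200.0 / 2.5)
print(round(octaves, 1))     # about 6.3 octaves
```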
Measuring accelerometer dynamic range from seismic data using Allan deviation
Authors: Matt McDonald and Patrik Stigborg

Abstract: We propose a new approach for calculating the dynamic range of an accelerometer, based on an Allan deviation analysis of production seismic data. The test is intended as a field audit technique and does not require an unconditioned dataset from a low-noise environment. We first show that Allan deviation can measure white-noise levels using two commercial accelerometers; the analysis accurately reproduces the manufacturers' noise density specifications and the known relationships between white noise, preamplifier gain, and group forming. We then show that a production seismic dataset is suitable for an Allan deviation analysis because the results are not critically affected by a recording filter. Finally, we illustrate the proposed technique by calculating the dynamic range of an accelerometer channel in a seismic streamer using a production dataset.
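The Allan deviation itself is simple to compute. The sketch below uses a basic non-overlapping estimator (one of several variants; the paper does not prescribe this exact one) and shows the tau^(-1/2) decay that identifies white noise:

```python
import numpy as np

# Non-overlapping Allan deviation for cluster size m (in samples):
# sigma(m) = sqrt(0.5 * mean(diff(cluster_means)^2)).
# For white noise, sigma(m) falls as m^(-1/2).

def allan_deviation(x, m):
    n = len(x) // m
    means = x[: n * m].reshape(n, m).mean(axis=1)
    return np.sqrt(0.5 * np.mean(np.diff(means) ** 2))

rng = np.random.default_rng(3)
white = rng.standard_normal(100_000)

adev_1 = allan_deviation(white, 1)
adev_100 = allan_deviation(white, 100)
ratio = adev_1 / adev_100
print(ratio)  # close to sqrt(100) = 10, the white-noise signature
```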
Broadband imaging via direct inversion of blended dispersed source array data
Authors: Matteo Caporal, Gerrit Blacquière and Mikhail Davydenko

Abstract: Although seismic sources typically consist of identical broadband units, no physical constraint dictates the use of only one kind of device. We propose an acquisition method that exploits multiple types of sources simultaneously during seismic surveys. Traditional broadband sources are replaced (or supported) by several devices that individually transmit distinct, reduced frequency bands and together cover the entire temporal and spatial bandwidth of interest. Together, these devices form a so-called dispersed source array.
As a consequence, the use of simpler sources becomes a practical proposition for seismic acquisition. The devices dedicated to generating the higher frequencies may be smaller and less powerful than conventional sources, giving the acquisition system increased operational flexibility and a smaller environmental footprint. Offshore, one can think of more manageable vessels carrying air guns of different volumes, or marine vibrators generating sweeps with different frequency ranges. On land, vibrator trucks of different sizes, specifically designed for the emission of particular frequency bands, are preferred. From a manufacturing point of view, such source units provide more efficient acoustic energy transmission than today's complex broadband alternatives, relaxing the low- versus high-frequency compromise. Furthermore, shot densities can be chosen that are optimal for each device according to its emitted bandwidth: since the sampling requirements depend on the maximum transmitted frequency, the number of sources dedicated to the lower frequencies can be relatively small, provided the signal-to-noise ratio requirements are met. The method also allows the ghost problem in marine seismic acquisition to be rethought, permitting different sources to be towed at different depths based on their individual central frequencies; the destructive interference of the ghost notches, including the one at 0 Hz, is thereby largely mitigated. Finally, blended acquisition (also known as simultaneous source acquisition) is part of the dispersed source array concept, improving operational flexibility, cost efficiency, and signal-to-noise ratio.
The advantages of this approach and its feasibility are demonstrated with theoretical considerations and numerical data examples.
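The sampling argument above rests on the standard anti-alias condition dx <= v_min / (2 * f_max): shot spacing can grow as the maximum emitted frequency shrinks. A quick illustration (velocity value assumed, not from the paper):

```python
# Maximum unaliased shot spacing versus maximum emitted frequency.
# Low-frequency units of a dispersed source array can be deployed far more sparsely.

def max_shot_spacing(f_max_hz, v_min=1500.0):
    return v_min / (2.0 * f_max_hz)

for f_max in (10.0, 40.0, 160.0):
    print(f_max, "Hz ->", max_shot_spacing(f_max), "m")
```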
Robust dual‐sensor data sea‐surface ghost attenuation for quality control purposes
Authors: Philippe Caprioli and Ralf Ferber

Abstract: Three-dimensional receiver ghost attenuation (deghosting) of dual-sensor towed-streamer data is straightforward in principle. In its simplest form, it requires applying a three-dimensional frequency-wavenumber filter to the vertical component of the particle-motion data, to correct for the amplitude reduction of non-normal-incidence plane waves on that component, before combining it with the pressure data. More elaborate techniques apply three-dimensional filters to both components before summation, for example for ghost-wavelet dephasing and for mitigating noise of different strengths on the individual components in optimum deghosting. The problem with all these techniques is, of course, that it is usually impossible to transform the data into the crossline wavenumber domain because of aliasing; hence, a two-dimensional version of deghosting is usually applied in the frequency-inline-wavenumber domain. We investigate going down the "dimensionality ladder" one more step, to a one-dimensional weighted summation of the records of the collocated sensors, to create an approximate deghosting procedure. We specifically consider amplitude-balancing weights computed via a standard automatic gain control before summation, reminiscent of a diversity stack of the dual-sensor recordings. This technique is independent of the actual streamer depth and insensitive to variations in the sea-surface reflection coefficient. The automatic gain control weights serve two purposes: (i) to approximately correct for the geometric amplitude loss of the Z data and (ii) to mitigate noise-strength variations on the two components. Here, Z denotes the vertical component of the particle-motion velocity scaled by the seismic impedance of the near-sensor water volume. The weights are time-varying and can also be made frequency-band dependent, adapting better to frequency variations of the noise.
The investigated process is a very robust, almost fully hands-off, approximate three-dimensional deghosting step for dual-sensor data, requiring no spatial filtering and no explicit estimates of noise power. We argue that the technique performs well in terms of ghost attenuation (albeit not exact ghost removal) and in balancing the signal-to-noise ratio in the output data. Where full three-dimensional receiver deghosting is the final product, the proposed technique is appropriate for efficient quality control of the acquired data and for aiding the parameterisation of the subsequent deghosting processing.
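The core idea of dual-sensor summation can be shown with an idealised, noise-free synthetic: the receiver ghost arrives with opposite polarity on the pressure (P) record and with the same polarity on the scaled vertical-velocity (Z) record, so a weighted sum cancels it. On field data the fixed 0.5 weights below would be replaced by the time-varying AGC-derived weights the paper describes:

```python
import numpy as np

# 1D P+Z summation sketch. Sample positions and amplitudes are illustrative.
n = 400
primary = np.zeros(n); primary[100] = 1.0
ghost = np.zeros(n); ghost[120] = 1.0

p = primary - ghost      # pressure: ghost with reversed polarity
z = primary + ghost      # scaled vertical particle velocity: same-polarity ghost

summed = 0.5 * (p + z)   # ghost cancels, primary is preserved
print(summed[100], summed[120])
```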
A probabilistic model for ghost delay time estimation based on recording geometry
Authors: James Rickett and Inês Cecílio

Abstract: In order to deconvolve the ghost response from marine seismic data, an estimate of the ghost operator is required. Typically, this estimate is made using a model of in-plane propagation, i.e., the ray path at the receiver is assumed to fall in the vertical plane defined by the source and receiver locations. Unfortunately, this model breaks down when the source is in a crossline position relative to the receiver spread; in this situation, in-plane signals can exist only in a small region of the signal cone.
In this paper, we use Bayes' theorem to model the posterior probability distribution of the vertical component of the ray vector, given the known source-receiver azimuth and the measured inline component of the ray vector. This provides a model for the ghost delay time based on the acquisition geometry and the dip of the wave in the plane of the streamer. The model is fairly robust with respect to the prior assumptions and is controlled by a single parameter related to the likelihood of in-plane propagation.
The expected values of the resulting distributions are consistent with the deterministic in-plane model when the in-plane likelihood is high, but remain valid everywhere in the signal cone. Relaxing the in-plane likelihood to a reasonable degree radically simplifies the shape of the expected-value surface, lending itself to use in deghosting algorithms. The model can also be extended to other plane-wave processing problems, such as interpolation.
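The deterministic in-plane ghost delay that the probabilistic model generalises is tau = 2*d*q, with q the vertical slowness; the in-plane assumption sets the crossline slowness p_y to zero. A minimal sketch with assumed depth and velocity values:

```python
import numpy as np

# Ghost delay tau = 2 * d * sqrt(1/v^2 - p_x^2 - p_y^2).
# p_y = 0 is the in-plane assumption that fails for crossline shots.

def ghost_delay(p_x, p_y=0.0, depth=10.0, v=1500.0):
    q = np.sqrt(1.0 / v**2 - p_x**2 - p_y**2)   # vertical slowness
    return 2.0 * depth * q

tau_vertical = ghost_delay(0.0)      # normal incidence: 2*d/v
tau_dipping = ghost_delay(4e-4)      # non-zero inline slowness: shorter delay
print(tau_vertical, tau_dipping)
```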
Evolution of deghosting process for single‐sensor streamer data from 2D to 3D
Authors: Zhigang Zhang, Hassan Masoomzadeh and Bin Wang

Abstract: In marine acquisition, reflections of sound energy from the water-air interface produce ghosts in the seismic data, on both the source side and the receiver side. Ghosts limit the bandwidth of the useful signal and blur the final image. The process of separating ghost and primary signals, called deghosting, can fill the ghost notch, broaden the frequency band, and help achieve high-resolution images. A low signal-to-noise ratio near the notch frequencies and 3D effects are two challenges that deghosting has to face. In this paper, starting from an introduction to the deghosting process, we present and compare two strategies to address the latter. The first is an adaptive mechanism that adjusts the deghosting operator to compensate for 3D effects or for errors in the source/receiver depth measurement; this method does not explicitly include the crossline slowness component and is not affected by the sparse sampling in that direction. The second is an inversion-type approach that does include the crossline slowness component in the algorithm and handles the 3D effects explicitly. Both synthetic and field data examples in wide-azimuth acquisition settings are shown to compare the two strategies, and both methods provide satisfactory results.
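The basic 1D deghosting operation these strategies build on can be sketched as a stabilised spectral division by the ghost operator G(f) = 1 - r*exp(-2j*pi*f*tau); the stabilisation constant eps keeps the notch frequencies, where |G| approaches zero, from blowing up. All parameter values below are illustrative:

```python
import numpy as np

def deghost(trace, dt, tau, r=1.0, eps=0.1):
    n = len(trace)
    f = np.fft.rfftfreq(n, dt)
    g = 1.0 - r * np.exp(-2j * np.pi * f * tau)      # ghost operator
    spec = np.fft.rfft(trace)
    return np.fft.irfft(spec * np.conj(g) / (np.abs(g) ** 2 + eps), n)

dt = 0.002
delay = 6                         # ghost delay in samples
tau = delay * dt                  # 12 ms two-way delay to the sea surface
n = 512
trace = np.zeros(n)
trace[100] = 1.0                  # primary
trace[100 + delay] = -0.9         # polarity-flipped ghost arrival

out = deghost(trace, dt, tau, r=0.9)
print(out[100], out[100 + delay])  # primary largely restored, ghost suppressed
```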
Three‐dimensional receiver deghosting of seismic streamer data using L1 inversion and a redundant extended Radon dictionary
Authors: Yimin Sun and Eric Verschuur

Abstract: In this paper, we propose a novel three-dimensional receiver deghosting algorithm capable of deghosting both horizontal and slanted streamer data in a theoretically consistent manner. Our algorithm honours wave-propagation phenomena in a true three-dimensional sense and frames three-dimensional receiver deghosting as a Lasso problem: the goal is to minimise the mismatch between the actual measurements and the simulated wavefield, with an L1 constraint applied in the extended Radon space to handle the underdetermined nature of the problem. We successfully demonstrate our algorithm on a modified three-dimensional EAGE/SEG Overthrust model and on a Red Sea marine dataset.
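The Lasso machinery can be illustrated independently of the Radon dictionary. The sketch below solves min_x 0.5*||Ax - b||^2 + lam*||x||_1 with plain ISTA; in the paper A would be the redundant extended Radon dictionary, whereas here it is simply a random matrix and x a sparse model, purely to show the L1 mechanics:

```python
import numpy as np

# ISTA: gradient step on the misfit followed by soft thresholding.

def ista(A, b, lam, n_iter):
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - b) / L        # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

rng = np.random.default_rng(4)
A = rng.standard_normal((60, 120))           # underdetermined: 60 data, 120 unknowns
x_true = np.zeros(120)
x_true[[5, 40, 77]] = [2.0, -1.5, 1.0]       # sparse model
b = A @ x_true

x_hat = ista(A, b, lam=0.05, n_iter=2000)
print(np.abs(x_hat - x_true).max())          # small: the sparse model is recovered
```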
Sparsity‐enhanced wavelet deconvolution
Authors: Ralf Ferber and Ekeabino Momoh

Abstract: We propose a three-step bandwidth-enhancing wavelet deconvolution process, combining linear inverse filtering and non-linear reflectivity construction based on a sparseness assumption. The first step is conventional Wiener deconvolution. The second step provides further spectral whitening outside the spectral bandwidth of the residual wavelet, i.e., the wavelet resulting from applying the Wiener deconvolution filter to the original wavelet, which is usually not a perfect spike because of the band limitations of the original wavelet. Specifically, we propose a zero-phase-filtered sparse-spike deconvolution as the second step, to recover the reflectivity dominantly outside the bandwidth of the residual wavelet. The filter applied to the sparse-spike deconvolution result is proportional to the deviation of the amplitude spectrum of the residual wavelet from unity: its amplitude is high where the amplitude spectrum of the residual wavelet is close to zero, and very low where that spectrum is close to unity. The third step is a summation of the outputs of the first two steps, gradually adding the contribution of the sparse-spike deconvolution at those frequencies where the residual wavelet has small amplitudes. We propose to call this technique "sparsity-enhanced wavelet deconvolution". We demonstrate it on real data with the deconvolution of the (normal-incidence) source-side sea-surface ghost of marine towed-streamer data, and we also present an extension to time-varying wavelet deconvolution.
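The first step, Wiener deconvolution of a known wavelet, can be sketched in the frequency domain. The wavelet, reflectivity, and stabilisation value below are illustrative, not taken from the paper:

```python
import numpy as np

# Frequency-domain Wiener deconvolution with white-noise stabilisation eps.
def wiener_deconv(trace, wavelet, eps=1e-2):
    n = len(trace)
    W = np.fft.rfft(wavelet, n)
    inv = np.conj(W) / (np.abs(W) ** 2 + eps * np.max(np.abs(W) ** 2))
    return np.fft.irfft(np.fft.rfft(trace) * inv, n)

# Band-limited Ricker wavelet convolved with a two-spike reflectivity
n = 256
t = np.arange(64) * 0.004
arg = (np.pi * 25.0 * (t - 0.048)) ** 2
wavelet = (1.0 - 2.0 * arg) * np.exp(-arg)

refl = np.zeros(n)
refl[50], refl[90] = 1.0, -0.6
trace = np.convolve(refl, wavelet)[:n]

out = wiener_deconv(trace, wavelet)
print(np.argmax(out), np.argmin(out))  # spikes re-emerge near the reflector positions
```

The output is the reflectivity seen through the (zero-phase) residual wavelet, which is exactly what the second and third steps of the proposed scheme sharpen further.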
Monitoring temporal variations in instrument responses in regional broadband seismic network using ambient seismic noise
Authors: Fang Ye, Jun Lin, Zhaomin Shi and Shixue Lyu

Abstract: High-quality broadband data are required to advance seismological research. Instrument response errors that affect data quality are often difficult to detect from visual waveform inspection alone. Here, we propose a method that uses ambient noise data in the period range of 5−25 s to monitor instrument performance and check data quality in situ. Amplitude information from coda waves and surface-wave travel times extracted from cross-correlations of ambient noise are used to assess temporal variations in the sensitivity and the poles and zeros of the instrument responses. The method is based on the analysis of amplitude and phase index parameters calculated from pairwise cross-correlations of three stations, which provide multiple references for reliable error estimates. Index parameters calculated daily over a two-year observation period are evaluated to identify stations with instrument response errors in near real time. During data processing, initial instrument responses are used in place of the currently available instrument responses to simulate instrument response errors, which are then used to verify our results. The coda waves of the noise cross-correlations help mitigate the effects of a non-isotropic noise field and make the amplitude measurements quite stable. In addition, instrument response errors involving pole-zero variations introduce apparent velocity perturbations, when monitoring temporal variations in crustal properties, that are statistically significant and larger than the standard deviation. Monitoring seismic instrument performance helps eliminate polluted data before analysis begins.
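The measurement underlying the method, extracting an inter-station lag from cross-correlations of a common random wavefield, can be shown with a purely synthetic example (the paper works with surface-wave and coda windows of real noise correlations; this sketch only illustrates the lag retrieval):

```python
import numpy as np

# Cross-correlating two records of the same random field recovers the
# propagation lag; drifts in such measurements flag instrument-response changes.

rng = np.random.default_rng(5)
n = 20_000
noise = rng.standard_normal(n)

lag_true = 30                        # propagation delay in samples
sta1 = noise
sta2 = np.roll(noise, lag_true)      # second station records the field later

# Circular cross-correlation via the frequency domain
spec = np.conj(np.fft.rfft(sta1)) * np.fft.rfft(sta2)
xcorr = np.fft.irfft(spec, n)
lag_measured = int(np.argmax(xcorr))
print(lag_measured)  # 30
```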