73rd EAGE Conference and Exhibition - Workshops 2011
- Conference date: 23 May 2011 - 27 May 2011
- Location: Vienna, Austria
- ISBN: 978-90-73834-13-2
- Published: 27 May 2011
SimSrc, Sparseness and Compressive Sensing: a Confluence of Ideas
Recently, seemingly unrelated topics such as high-productivity source acquisition, sparse seismic data representations and computational simultaneous sources have been understood to be intimately related. In acquisition, SimSrc (originally for simultaneous sources), a technique which allows many sources to be active at the same time, has proven beneficial in land acquisition and is being studied for OBC, marine, VSP and other types of data acquisition. Since its introduction in the late 1990s, shots have typically been encoded with randomized relative timing to aid discrimination between the interfering shots. In the last decade, similar techniques have been used to computational advantage by handling more than one shot at the same time in the same processing domain. Similarly, the inherent sparseness of seismic data has motivated compressive data representations that have been used extensively in seismic data manipulation for some time. Most recently, outside the geophysics domain, techniques known as Compressive Sensing have emerged in mathematics and engineering that bear similarities to these geophysical applications. In this workshop, we explore these methods to arrive at a better understanding of the relationships.
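The randomized relative timing described in this abstract can be illustrated with a toy numerical sketch. The function and parameter names below are illustrative, not from any production acquisition system: each shot record is superposed into one continuous blended record at a randomly dithered firing time.

```python
import numpy as np

def blend_shots(shots, dt, max_dither, rng):
    """Superpose individual shot records into one continuous blended
    record, firing each source at a randomly dithered time so that
    the interfering shots can later be discriminated."""
    n_shots, n_samples = shots.shape
    dithers = rng.uniform(0.0, max_dither, size=n_shots)   # seconds
    starts = np.round(dithers / dt).astype(int)            # samples
    blended = np.zeros(starts.max() + n_samples)
    for start, record in zip(starts, shots):
        blended[start:start + n_samples] += record
    return blended, dithers

rng = np.random.default_rng(0)
shots = rng.standard_normal((4, 100))          # 4 toy shot records
blended, dithers = blend_shots(shots, dt=0.004, max_dither=0.1, rng=rng)
```

In a real survey the dithers are recorded so that later processing can "comb" each shot back out of the continuous record.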
From Wiggins’ MED to Compressive Sensing: Sparsity in Seismic Data Processing
The field of compressive sensing has revitalized interest in the development of methods that use sparse representation models for seismic data processing. However, it should be noted that sparsity-driven methods for seismic signal processing are not new to geophysicists. I would like to use this opportunity to discuss early ideas of sparse signal representation for blind deconvolution (MED), velocity analysis, high-resolution Radon transforms, and Fourier synthesis for wavefield reconstruction. I will also discuss their connection to new developments arising under the field of compressive sensing, paying particular attention to the conceptual simplicity of these methods, their assumptions and proven industry-strength results rather than to algorithmic considerations.
Seislet Transform and Seislet Frame: Tools for Compressive Representation of Seismic Data
Authors: Sergey Fomel and Yang Liu (Jilin University)
The digital wavelet transform (DWT) is a well-known tool for characterizing piecewise-smooth signals and is based on predicting smooth signals. Seismic signals are not smooth; however, they are predictable. The seislet transform is a digital wavelet-like transform tailored specifically for representing seismic data. Its construction is based on the notion of signal prediction. The seislet transform uses predictions of sinusoids (applicable to seismic data in the F-X domain), plane waves (applicable to 2-D or 3-D seismic data in the T-X domain), or reflection events (applicable to prestack seismic data). We combine the statistical or physical predictability of seismic signals with the lifting scheme of the DWT to define the seislet transform. One can view the seislet transform as a decomposition into multiscale orthogonal basis functions aligned with seismic events. When multiple interfering events are present in the data, it is also possible to follow all of them simultaneously by turning the seislet basis into an overcomplete representation (a tight frame). Even though the seislet frame is overcomplete, it can be constrained to have only a small number of significant coefficients and, therefore, to provide an optimally sparse representation. The sparsity is easily demonstrated by comparing the seislet transform and frame with the classic transforms, such as Fourier and DWT. The classic DWT is equivalent to the seislet transform with a zero frequency (in 1-D) or zero slope (in 2-D). The sparsity of the transform domain provides not only an effective seismic data compression tool but also a way of designing efficient data analysis algorithms. Traditional geophysical data analysis tasks, such as signal-noise separation and data regularization, are conveniently formulated in the transform domain, where the signal is sparse.
When applied in the offset direction on prestack data, the seislet transform finds an additional application in optimal stacking of seismic records.
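The lifting construction on which the seislet transform is built can be sketched in a few lines. This is a schematic toy: with an identity predictor it reduces to a Haar-like DWT, consistent with the zero-frequency/zero-slope equivalence noted above, whereas the actual seislet transform would replace `predict` with plane-wave or sinusoid prediction along local event slopes.

```python
import numpy as np

def lifting_forward(x, predict):
    """One level of the lifting scheme: split into even/odd samples,
    predict the odd samples from the even ones, keep the residual as
    detail, then update the even samples to preserve the running mean."""
    even, odd = x[0::2], x[1::2]
    detail = odd - predict(even)          # prediction residual
    coarse = even + detail / 2.0          # update step (Haar-like)
    return coarse, detail

def lifting_inverse(coarse, detail, predict):
    """Exact inverse: undo the update, then the prediction."""
    even = coarse - detail / 2.0
    odd = detail + predict(even)
    x = np.empty(even.size + odd.size)
    x[0::2], x[1::2] = even, odd
    return x

identity = lambda e: e                    # identity predictor = Haar-like DWT
x = np.array([1.0, 1.0, 2.0, 2.0, 5.0, 5.0, 3.0, 3.0])
coarse, detail = lifting_forward(x, identity)
x_rec = lifting_inverse(coarse, detail, identity)
```

Because the toy signal is perfectly predicted pairwise, all detail coefficients vanish: a miniature version of the sparsity that a good predictor (here, one aligned with seismic events) buys.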
Acquisition and Processing of Multiple-simultaneous-source Data
Authors: Ian Moore, Claudio Bagaini and Mark Daly
Land acquisition has some significant advantages over its marine counterpart with respect to simultaneous source data. In particular, it is practical to employ relatively large numbers of sources, and these sources do not have the constraints implicit in the continuous motion that is an inherent feature of towed-marine acquisition. The large channel counts available with modern land recording systems allow the sources to be spread over a significant area, and modern control systems allow complex rules to be used to determine optimum firing patterns dynamically, in order to maximize efficiency whilst meeting whatever constraints are necessary to ensure good data quality. We consider an orthogonal land geometry consisting of a fixed receiver spread and a number of source lines. The available sources operate under rules that dictate when sources can fire simultaneously (with dithers), taking into account factors such as the distance between the sources and the distribution of the sources which are "on station". The resultant dataset consists of a collection of traces, each of which may, in principle, contain contributions from any number of sources (up to the total available). Whilst the shot spacing on each source line is specified, the traces corresponding to those shot points will not in general be contiguous within the dataset, and the resulting complexity is significantly higher than that of a typical marine simultaneous-source dataset. It is not, for example, possible to obtain subsets of the dataset that involve contributions from only two (or a few) source lines. Until we have algorithms capable of processing simultaneous-source data directly, the simplest approach to processing these data is to separate the sources early in the sequence. Consequently, we compare results using passive separation, separation based on random noise attenuation, and separation based on sparse inversion.
For the last case, we consider each source line to constitute a separate logical source, leading to a separation problem that involves as many sources as source lines. This method is shown to be practical, and to give superior results to the other methods.
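The separation idea can be caricatured with a tiny estimation-and-subtraction loop. In this sketch, single spikes stand in for coherent events and "keep the largest samples" stands in for a proper sparsifying transform; all names and numbers are illustrative, not the authors' algorithm:

```python
import numpy as np

def deblend(blended, starts, n, n_iter=5, keep=1):
    """Toy iterative deblending: comb each source's window out of the
    continuous record, apply a crude sparsity constraint (keep only the
    strongest samples), and subtract the current estimates of the other
    sources before re-estimating.  Sparse-inversion separation follows
    this pattern with proper transforms and thresholding schedules."""
    estimates = [np.zeros(n) for _ in starts]
    for _ in range(n_iter):
        for i, s_i in enumerate(starts):
            residual = blended.copy()
            for j, s_j in enumerate(starts):
                if j != i:                  # remove the others' energy
                    residual[s_j:s_j + n] -= estimates[j]
            window = residual[s_i:s_i + n]
            estimate = np.zeros(n)
            strongest = np.argsort(np.abs(window))[-keep:]
            estimate[strongest] = window[strongest]
            estimates[i] = estimate
    return estimates

# two single-spike "shots" with overlapping listening windows
blended = np.zeros(45)
blended[10] += 3.0                          # source A, fired at sample 0
blended[30] += 2.0                          # source B, fired at sample 5
est_a, est_b = deblend(blended, starts=[0, 5], n=40)
```

Each source's event ends up in its own estimate (source B's spike at absolute sample 30 maps to local sample 25), while the interference from the other source is attributed correctly and removed.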
Maximizing Acquisition Efficiency through Simultaneous Source Technology
Authors: Ray Abma and Mark Foster
Simultaneous source acquisition can significantly improve the efficiency of seismic data acquisition (Howe et al., 2008; Howe et al., 2009; Bouska et al., 2008). BP has developed seismic acquisition technologies such as DS3, ISS™ and ISSN™ which can efficiently deliver high-quality data in desert terrains, such that land 3D can be acquired for exploration and development purposes at costs approaching those of conventional 3D marine towed-streamer surveys. In addition to the very high productivity rates, simultaneous source technology enables the economic acquisition of well-sampled, high-fold, wide-azimuth surveys, which delivers the observed improvement in data quality; together with new noise separation techniques, such as BP's SSI technique (Abma et al., 2010), these methods are being used for development-quality 3D surveys where high-quality pre-stack data and wide-azimuth attributes are required. We will show examples of simultaneous source acquisition from BP's surveys to date, and review the experience gained as well as future opportunities and challenges. To summarize our experience, it appears that we can acquire seismic surveys with simultaneous sources in a manner that is faster, cheaper, and better than conventional surveys.
Tackling the “Data Deluge”: a Dimensionality-reduction Approach
Current-day imaging and inversion technology increasingly relies on faithful samplings and simulations of seismic wavefields. This reliance on full sampling and high-fidelity wavefield simulations strains our acquisition and processing systems, and overcoming this impediment is becoming one of the main challenges faced by our industry. By using randomized dimensionality-reduction techniques, we propose a new strategy where acquisition and computational costs are no longer dictated by the sampling grid but by transform-domain compressibility of the image. To arrive at this result, we combine recent findings from machine learning / stochastic optimization—where (nonlinear) inversions are carried out on random subsets of data—and compressive sensing—where data that permit compressible representations are deliberately subsampled. The key idea of the stochastic approximation is to reduce computational costs by computing each gradient update on a different randomly selected subset of data. In seismic exploration, this corresponds to carrying out migrations with one or a few incoherent supershots made of superpositions of random source-encoded experiments. While this approach introduces source cross talk, it has been applied successfully to reduce the cost of least-squares migration and full-waveform inversion because it reduces the number of wave-equation solves. However, the method is sensitive to noise and relies on relatively large numbers of supershots and wave simulations to get reasonable results.
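The supershot idea can be sketched on a linear toy problem: draw random +/-1 source weights, superpose the experiments, and take a gradient step. In expectation the encoded gradient equals the full one, with cross-talk as the price. The random matrices below are stand-ins for wave-equation modelling operators; all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
n_src, n_model = 8, 20
# one modelling operator and one noise-free data vector per source
F_list = [rng.standard_normal((30, n_model)) for _ in range(n_src)]
m_true = rng.standard_normal(n_model)
d_list = [F @ m_true for F in F_list]

m = np.zeros(n_model)
for _ in range(1000):
    # one random supershot: +/-1 source encoding, then superposition
    w = rng.choice([-1.0, 1.0], size=n_src)
    F_super = sum(wi * Fi for wi, Fi in zip(w, F_list))
    d_super = sum(wi * di for wi, di in zip(w, d_list))
    # E[grad] equals the gradient of the full (all-sources) misfit
    grad = F_super.T @ (F_super @ m - d_super)
    m -= grad / np.linalg.norm(F_super, 2) ** 2   # normalized, stable step
```

Each iteration costs one "supershot" instead of eight separate experiments; for noise-free, consistent data every supershot system shares the same minimizer, so the iteration converges to it, which mirrors the cost reduction (and the noise sensitivity caveat) described above.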
Blended Acquisition: Increased Information Density per Shot Record
Authors: Gerrit Blacquiere and Eric Verschuur (TU Delft)
In traditional seismic surveys the firing time between shots is such that the shot records do not interfere in time. In the concept of blended acquisition, however, the records do overlap, allowing denser source sampling and wider azimuths in an economic way. Denser shot sampling and wider azimuths mean that each subsurface gridpoint is illuminated from a larger number of angles, which improves the image quality in terms of signal-to-noise ratio and spatial resolution. We show that – even with very simple blending parameters like time delays – the incident wavefield at a specific subsurface gridpoint represents a dispersed time series with a `complex code'. In deblending, this time series is decomposed into the individual source contributions. In blended shot record migration, however, this time series is not decomposed; instead, the complete series is inverted for in a wavefield deconvolution process. We also show that the information density of shot records can be further increased by considering surface-related multiples as signal, using the double illumination concept. This means that these multiples can be exploited, leading to improvements in the angle range of the incident wavefield at each gridpoint. In this way the energy contained in the multiples contributes to the image, rather than decreasing its quality. This important property is illustrated by examples.
Seismic Simultaneous Sources in Land: Challenges and Opportunities
Authors: Panos G. Kelamis, Peter I. Pecholcs, Constantine Tsingas and Shoudong Huo
Processing of this optimally designed seismic blended dataset without applying any deblending algorithm produces satisfactory results. To improve prestack analysis such as first-break picking, noise removal and velocity analysis, deblending methodologies and workflows were also applied. In this paper, we present processing results related to land simultaneous-source acquisition. Novel deblending schemes will be shown along with their effectiveness in a production environment. Statics, surface consistency, noise attenuation and velocity estimation are seen as the main challenges for the processing of land blended data. Current processing schemes rely mostly on deblending so that conventional workflows can subsequently be employed for data analysis. Full blended-data processing on land is still an open issue. Current practice dictates that we have near-surface and velocity macromodels well in advance in order to proceed; migration will do the rest. The real question therefore is: a) do we develop tools to fully process blended data, or b) do we develop effective deblending algorithms and proceed in a conventional manner? We all know the answer… it can be found in the middle! Of course, everything starts with acquisition: do we play it safe and acquire optimally distance-separated data, or go wild and acquire data in a random fashion? Processing then becomes the key. In short, seismic acquisition and processing must be considered simultaneously!
Role of Simultaneous Source Technology in Seismic Industry
The simultaneous-source method has been investigated in the past in the context of efficient seismic data acquisition; recently, however, momentum in this field has increased due to its computational efficiency in imaging, simulation and inversion. At ExxonMobil, simultaneous-source technology has been an active research topic for many years in several areas, ranging from acquisition and imaging to, most recently, full-wavefield inversion (FWI).
An Overview of Airborne Electromagnetics for 3D High-resolution Mapping
By Esben Auken
Airborne electromagnetics (AEM) is an efficient tool for mapping the subsurface. AEM delivers very high data coverage compared to its cost, and the direct outputs are high-resolution resistivity images of the subsurface. In particular, the time-domain methods (TEM) are well suited for mapping the salt-fresh water boundary in coastal zones, aquifers, paleo-channels and mineralisations, and for general structural geological mapping. The development of AEM systems and data processing systems has been intensive during the past 10-15 years because of better electronics and faster, more flexible computers. While most AEM systems in the 1980s were limited to the detection of mineralisations, modern system generations measure full spectra and yield absolutely calibrated data and accurate descriptions of the system transfer function. This, in combination with much enhanced inversion and forward algorithms, makes AEM a suitable candidate for mapping targets which have so far been out of reach.
Integrating Gravity and Electromagnetics with Seismic for Near Surface Characterization in Saudi Arabia
Near-surface conditions in Saudi Arabia represent the major challenge for the acquisition of reliable and meaningful land seismic data. At Saudi Aramco, a major effort is underway to investigate the benefits of integrating gravity and electromagnetic data with seismic data to better estimate near-surface velocities for processing large 3D seismic volumes. In 2010, a gravity and electromagnetic acquisition program was carried out in three areas characterized by different near-surface geologic conditions. The methodologies being employed consist of dedicated high-end electromagnetic and gravity acquisition specifications, geophysical data integration via simultaneous joint inversion, and seismic processing with advanced imaging workflows such as pre-stack redatuming (time) and pre-stack depth migration. Well-log analysis in shallow boreholes provides the local petrophysical relationships among velocity, resistivity and density to be used in a simultaneous joint-inversion scheme. Given the shallow targets, the resolution offered by the electromagnetic and gravity data is typically within the wavelength of the velocity anomalies affecting seismic imaging. Therefore, near-surface non-seismic data act as an ideal complementary dataset to seismic. Results obtained to date reveal density and resistivity anomalies correlated with regions of poor seismic data quality. EM and gravity data analysis and inversion are being carried out in a single-domain approach as well as by applying quantitative simultaneous joint-inversion schemes with seismic travel-time data. The generated near-surface multi-parameter models are used to correct the seismic data, with successive reprocessing in the time and depth domains. Encouraging results are being observed from the reprocessing, indicating that the multi-physics data and the quantitative integration schemes are succeeding in addressing the near-surface velocity estimation problem.
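A schematic of how a petrophysical link couples the domains in a simultaneous joint inversion: one velocity model is mapped to density through a well-log-derived relation, so travel-time and gravity residuals are minimized together. The Gardner-type coefficients and operator shapes below are illustrative placeholders, not the actual parameterization used in the study.

```python
import numpy as np

def joint_misfit(v, G_seis, t_obs, G_grav, g_obs, a=0.31, b=0.25, w=1.0):
    """Simultaneous joint-inversion objective: a single velocity model v
    is mapped to density via a petrophysical relation (here a toy
    Gardner-type rho = a * v**b, coefficients nominally from well logs),
    so seismic travel-time and gravity residuals are minimized together
    rather than independently."""
    rho = a * v ** b                        # petrophysical link
    r_seis = G_seis @ (1.0 / v) - t_obs     # travel times from slowness
    r_grav = G_grav @ rho - g_obs
    return r_seis @ r_seis + w * (r_grav @ r_grav)

rng = np.random.default_rng(3)
v_true = rng.uniform(1500.0, 3500.0, size=10)     # m/s, toy model cells
G_seis = rng.uniform(0.0, 100.0, size=(15, 10))   # ray-path lengths (m)
G_grav = rng.uniform(0.0, 1.0, size=(12, 10))     # gravity sensitivities
t_obs = G_seis @ (1.0 / v_true)
g_obs = G_grav @ (0.31 * v_true ** 0.25)
```

The coupling means a velocity update that fits the travel times but violates the gravity data is penalized, which is what makes the non-seismic data an effective complement.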
Near Surface Challenges for Processing 3D Seismic Surveys
Around the world, low-relief structures and stratigraphic traps are becoming more important, since most of the large structural traps have been drilled. These play types require accurate near-surface velocity models for depth conversion. Reservoir characterization is also a key goal. The use of horizontal wells for field development continues to grow at an accelerating pace. Seismic attribute maps can be extremely valuable for placing horizontal wells, but these attributes will be of little value if their quality is degraded by near-surface effects. Energy penetration, scattering, source-generated noise, surface-generated multiples, statics, and source and receiver coupling are some of the long-standing near-surface issues that continue to present challenges for land seismic imaging. In arid environments, the near surface can be up to 700 m thick, with challenges such as sand dunes, topography, karsted carbonates, dry river beds, outcropping refractors, velocity reversals, anhydrites, and layered basalts. Karsts can be cavernous, but are usually collapsed and filled; air-filled karsts above the water table are particularly problematic. A wide range of technologies is now being pursued to better characterize the near surface, including traveltime tomography, early-arrival waveform inversion, joint inversion of seismic data with gravity and electromagnetic data, and surface-wave inversion. For complex near surfaces that require these technologies to define the velocity model, the conventional statics approach will not be valid; imaging technologies such as redatuming will be required. Advances in seismic acquisition technology, such as ultra-high channel counts, high-density single-sensor recording, and low-frequency vibrators, will play a key role in developing solutions for near-surface characterization.
Surface Consistent Surface-Wave Inversion
A tomographic method (SWIPER) is used to invert surface-seismic data to estimate variable surface-wave properties. Surface consistency of multi-source, multi-receiver data is exploited to decompose the data in the frequency domain into frequency-dependent propagation (i.e., velocity and attenuation) effects and variable source- and receiver-coupling effects. The inversion can be performed for single modes (linear optimization) or simultaneously for multiple modes (nonlinear optimization). Including source- and receiver-coupling variations improves the ability to estimate velocity and attenuation. Further improvements in the estimation are made by constraining the parameters to be a smooth function of frequency. The estimated model parameters, such as velocity dispersion relations, can be used to predict the multi-mode ground roll and subtract it from the data with little damage to reflections, or to invert for a near-surface depth model. The properties of the near surface vary rapidly in both vertical and horizontal directions. Consequently, the behavior of ground roll also varies both with frequency and with position along the surface. Traditional methods of surface-wave analysis, such as MASW, have limited resolution because they effectively average properties over the maximum source-to-receiver distance in the gather. For example, because slant stack sums over the traces in the gather, variability within the gather is averaged. Additional limitations arise from the difficulty of picking the maximum amplitudes in the transform domain because of limited velocity resolution, noise, and interference of multiple modes. Fully exploiting the multi-shot and multi-receiver aspect of 3-D surface seismic data allows the determination of variable surface-wave parameters within a fine grid of surface cells using direct-ray tomography. At the higher frequencies, however, multi-mode behavior and modal interference must be included. Amplitude and phase effects are coupled and must be determined by a nonlinear optimization method. We show the ability to determine multi-mode velocity and attenuation parameters on a receiver-interval-sized grid, which, after horizontal smoothing, match those determined with a beam-forming method.
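The surface-consistent decomposition at the heart of this approach can be sketched as a linear least-squares problem at a single frequency: each trace's log-amplitude is modelled as a sum of source-coupling, receiver-coupling and propagation (offset) terms. This is a toy version under those assumptions, not the SWIPER code; phase terms would be handled analogously.

```python
import numpy as np

def surface_consistent_decompose(log_amp, src, rcv, off, n_src, n_rcv, n_off):
    """Surface-consistent decomposition at one frequency: model each
    trace's log-amplitude as source + receiver + propagation (offset)
    terms and solve the resulting linear system in a least-squares
    sense."""
    n_tr = log_amp.size
    A = np.zeros((n_tr, n_src + n_rcv + n_off))
    rows = np.arange(n_tr)
    A[rows, src] = 1.0                      # source-coupling term
    A[rows, n_src + rcv] = 1.0              # receiver-coupling term
    A[rows, n_src + n_rcv + off] = 1.0      # propagation term
    sol, *_ = np.linalg.lstsq(A, log_amp, rcond=None)
    return sol[:n_src], sol[n_src:n_src + n_rcv], sol[n_src + n_rcv:]

# toy survey: every source-receiver pair, offset class = |src - rcv|
n_src, n_rcv = 3, 4
src, rcv = np.meshgrid(np.arange(n_src), np.arange(n_rcv), indexing="ij")
src, rcv = src.ravel(), rcv.ravel()
off = np.abs(src - rcv)                     # offset classes 0..3
s_true = np.array([0.2, -0.1, 0.4])
r_true = np.array([0.0, 0.3, -0.2, 0.1])
p_true = np.array([-1.0, -1.4, -1.7, -1.9])
log_amp = s_true[src] + r_true[rcv] + p_true[off]
s_est, r_est, p_est = surface_consistent_decompose(
    log_amp, src, rcv, off, n_src, n_rcv, n_off=4)
```

The individual terms are determined only up to constant shifts between groups, so the natural check is that their sum reproduces the data; in practice the propagation term is what feeds the velocity/attenuation tomography.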
Advances in Petroleum Industry Seismic Acquisition Technology
By Dave Monk
When it comes to the acquisition of seismic data onshore, there are really only two basic attributes that can be addressed which impact how data are acquired. The first is the acquisition geometry: the relationship between sources and receivers. This of course includes the issue of arrays at both the source and receiver ends of the raypath. Recent developments in acquisition technology have led to far more flexibility in receiver geometries, and the indication is that we will have the capability of recording up to 1 million traces in each shot record within 5 years. Additionally, we have seen the development of simultaneous-source methods, which have allowed far greater flexibility in source geometry as well. The second fundamental issue governed by acquisition technology is the bandwidth of the data acquired. This is limited by both source and receiver, and recent developments have aimed at extending the bandwidth of both as we move towards more sophisticated processing, including full waveform inversion. Recently there has been considerable work on extending the low-frequency components of the bandwidth. In this presentation, a road map of how acquisition may evolve is presented, highlighting both the geometry and bandwidth changes we are likely to see in the future through new technology.
Surface Wave Analysis in Laterally Varying Media
Authors: Laura Valentina Socco, Paolo Bergamo and Daniele Boiero
Surface-wave analysis applied to near-surface characterisation has undergone tremendous development in the last decade. From individual inversions of single dispersion curves extracted from purpose-acquired, small-scale datasets, the technique has evolved to the analysis of wide datasets of dispersion curves extracted from large-scale seismic data acquired for body-wave exploration (Socco et al., 2010). The evolution of the method poses new challenges and provides new opportunities for retrieving near-surface velocity models in complex geological environments. In particular, innovative approaches for the processing and inversion of surface-wave data have been developed to improve the reliability and spatial resolution of the velocity models. If data are not acquired specifically for surface-wave analysis, careful data evaluation is required before processing to assess whether the acquisition parameters and equipment are suitable for retrieving good-quality dispersion curves. After these preliminary evaluations, the data should be processed to extract dispersion curves, paying careful attention to the effects of lateral variations and to the presence of higher modes and other guided waves that could be included in the inversion. Before inverting the curves, a priori information should be considered to build consistent initial models, optimising the parameterisation according to reliable investigation depth and vertical and lateral resolution. Inversion should be performed considering the set of dispersion curves as a single dataset, to provide internally consistent pseudo-2D/3D velocity models, and including any available a priori information. Experimental uncertainties should also be used to account for data quality. Joint inversion with other geophysical data can significantly improve the reliability of the final models and the amount of retrieved information. Sensitivity analysis may provide an indication of the reliability of the estimated model parameters.
High-Resolution Characterization of the Near Surface: GPR and Seismic Reflection Methods
Ground-penetrating radar (GPR) and shallow seismic reflection (SSR) methods offer high-resolution imaging of the subsurface. Recent technological advances in instrumentation as well as in methods (data acquisition, processing and analysis) have allowed sub-meter characterization of near-surface heterogeneous properties in three dimensions (3D). As hydrocarbon exploration pursues challenging imaging objectives, such as low-relief structures and fracture zones, accurate near-surface characterization has become increasingly important, both for reservoir analogues and as a way to better understand how complex near-surface conditions affect images from deeper layers. GPR and SSR methods developed for engineering and groundwater applications offer the opportunity for technology exchange between near-surface geophysics and exploration geophysics. We present an overview of research conducted at The University of Kansas on: i) GPR high-resolution imaging of reservoir analogues, mapping the distribution of heterogeneous carbonate lithofacies, imaging fractures and time-lapse monitoring of fluid flow, and ii) developments in automated 3D shallow seismic reflection data acquisition that allow sub-meter-resolution imaging of the weathered zone, subsurface stratigraphy and depth to bedrock.
Time Domain Early Arrival Waveform Inversion
By Jie Zhang
Seismic early arrivals recorded at the surface contain substantial information about the velocity structure of the near-surface area. Unlike later arrivals, they are less contaminated by elastic effects such as surface waves or converted waves; thus, the early arrivals can be better treated as an acoustic wave-propagation problem. Time-domain early-arrival waveform inversion, which applies an acoustic finite-difference method to simulate a small window of the wavefield and employs a conjugate-gradient method to invert for the shallow near-surface velocity structure, is then more computationally efficient and practical than full waveform inversion, which attempts to invert the entire subsurface velocity model. It can help to resolve complex near-surface velocity structures that cannot be inverted from traveltimes alone. However, time-domain early-arrival waveform inversion also faces a number of serious challenges, including: 1) the numerical simulation accuracy needed to handle surface-topography effects in the forward finite-difference modeling; 2) very low velocities in the near-surface area, requiring substantially smaller grids for stable finite-difference simulation; and 3) large amplitude variations along the surface due to near-surface attenuation and various complex structural effects. To address these issues, we developed a variable-grid finite-difference approach for forward modeling and a stable shot-based amplitude-balancing algorithm to process both data and synthetics in the time domain. Numerical and real-data examples demonstrate that the waveform-inversion method can handle low-velocity zones, hidden layers, and strong lateral velocity variations.
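The "small window of the wavefield" can be illustrated by the muting that defines the objective function. This is a schematic, not the author's code; the window length, picks and data are placeholders.

```python
import numpy as np

def early_arrival_misfit(obs, syn, picks, dt, window):
    """Windowed waveform misfit for early-arrival inversion: mute each
    trace outside a short window starting at its first-break pick, so
    the objective sees only the early arrivals that acoustic modelling
    can explain, not surface waves or later elastic energy.  Returns
    the misfit value and the muted residual (the adjoint source)."""
    n_traces, n_t = obs.shape
    t = np.arange(n_t) * dt
    mask = np.array([(t >= p) & (t <= p + window) for p in picks])
    residual = (syn - obs) * mask
    return 0.5 * np.sum(residual ** 2), residual

dt, window = 0.004, 0.2                      # s; illustrative values
picks = np.array([0.1, 0.3])                 # first-break picks per trace
rng = np.random.default_rng(4)
obs = rng.standard_normal((2, 200))
syn = obs.copy()
syn[:, 150:] += 1.0                          # late "elastic" energy only
misfit, adj = early_arrival_misfit(obs, syn, picks, dt, window)
```

Because the synthetic differs from the data only after each trace's window, the misfit is zero: late-arriving energy that the acoustic approximation cannot model simply never enters the objective, which is the point of the windowing.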
Tectonic Evolution and Burial History of the Makó trough, Hungary: Implications for the Exploration of Juvenile Unconventional Petroleum Systems in the Pannonian Basin
Authors: Gábor Bada, Orsolya Sztanó and Roderick J. Wallis
The Makó trough in Hungary is a sedimentary depression formed in the extensional Pannonian basin during the last 15-20 Ma. It represents a young HT/HP system with a >6 km thick basin fill and has been recognized as the location of major unconventional hydrocarbon resources. Such accumulations are regarded as "unconventional" when economic production is possible only by means of some sort of stimulation technique, usually hydraulic fracturing. Hydrocarbons in this setting do not accumulate conventionally in structural or stratigraphic traps, but in pervasive cells. Due to the geological setting of the Makó trough, the hydrocarbon cell forms a relatively continuous zone marked by considerable internal lithological and petrophysical variability. Owing to its novelty and complexity, the exploration of this unconventional resource demands the concurrent application of a wide range of geological and geophysical methods. Evaluation of such petroleum systems relies strongly on the reconstruction of the tectonosedimentary evolution of the host basin, an understanding of the subsidence, burial, thermal and maturation history, and the timing and mechanism of hydrocarbon generation and related abnormal-pressure development. In this contribution, the latest models for basin evolution and petroleum-system development are presented.
A Charge History of Natural Gases from the Rotliegend Deposits in the Western Poland as an Effect of Hydrocarbon Generation and Expulsion in the Carboniferous Source Rocks: Previous and New Models
By D. Botor
The Rotliegend basin (Lower Permian), extending from the UK to Poland (Gast et al., 2010), has significant hydrocarbon potential, with hydrocarbon reserves most probably still to be found, particularly in Poland (e.g., Burzewski et al., 2009; Karnkowski, 2007; Pletsch et al., 2010). This giant basin is called the Southern Permian Basin, and its Polish part is distinguished as the Polish Permian Basin (Figure 1). It began to develop in the latest Carboniferous, but its development continued through the entire Mesozoic up to Cretaceous/Paleogene time, when basin inversion occurred (Karnkowski, 1999). The geological evolution of the study area was controlled mainly by the Teisseyre-Tornquist Zone (TTZ), which forms the border between the East European Craton and the Palaeozoic foldbelts and terranes of western and central Europe (Mazur et al., 2005; 2006). The Rotliegend basin is superimposed on the Carboniferous Variscan zone and its foredeep (Karnkowski, 1999; 2007; Mazur et al., 2005). Carboniferous strata occur in almost the entire substratum of the Rotliegend basin in Poland (Figure 1). The regional seal in the Polish part of the Rotliegend basin is formed by the Zechstein evaporites (Figure 1).
Remaining Hydrocarbon Potential and Exploration Strategy in SW Part of Pannonian Basin, Croatia
Authors: Lilit Cota and Ivica Vulama
The Croatian part of the Pannonian Basin (PB) has reached a moderately high to high level of exploration maturity in some areas. As a consequence, it is expected that most upcoming discoveries will be made within the small to medium size range (10 MMboe avg.). The Sava, Drava and East Slavonija depressions and the northern part of the Northwestern Croatia depression can be regarded as mature exploration areas, while the southern part of Northwestern Croatia, Southwest Sava and the southeastern part of Sava are still under-explored. According to several studies and to statistical and basin-volumetric analyses, approximately 46 % of the hydrocarbon volumes in the PB remain to be discovered. Statistics from the past 60 years of exploration also show that approximately 85 % of exploratory wells outside producing fields investigated the section between 0 and 3500 m; only 11 % of the wells drilled down to 5500 m. In most cases, the shallow portion of the sediments, from 0 m to 1500 m, was not addressed properly for shallow gas targets. The majority of the wells were drilled on structural traps; however, recent positive INA drilling results indicate that significant stratigraphic-trap potential is yet to be revealed. The deeper, less-explored section between 3500 m and 5000 m is today becoming more interesting for unconventional gas-reservoir exploration, and also for some conventional reservoirs.