73rd EAGE Conference and Exhibition - Workshops 2011
- Conference date: 23 May 2011 - 27 May 2011
- Location: Vienna, Austria
- ISBN: 978-90-73834-13-2
- Published: 27 May 2011
SimSrc, Sparseness and Compressive Sensing: a Confluence of Ideas
Recently, seemingly unrelated topics such as high-productivity source acquisition, sparse seismic data representations and computational simultaneous sources have been understood to be intimately related. In acquisition, SimSrc (originally "simultaneous sources"), a technique that allows many sources to be active at the same time, has proven beneficial on land and is being studied for OBC, marine, VSP and other types of data acquisition. Since the technique's introduction in the late 1990s, shots have typically been encoded with randomized relative timing to aid discrimination between the interfering shots. In the last decade, similar techniques have been used to computational advantage by handling more than one shot at the same time in the same processing domain. Similarly, the inherent sparseness of seismic data has motivated compressive data representations that have long been used extensively in seismic data manipulation. Most recently, outside the geophysics domain, techniques known as compressive sensing have emerged in mathematics and engineering that bear similarities to these geophysical applications. In this workshop, we explore these methods to arrive at a better understanding of the relationships.
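As a toy illustration (not any specific vendor's field implementation), the dithered-timing idea behind SimSrc can be sketched in a few lines of Python: shot records superpose in one continuous recording, and aligning that recording to one source's firing time leaves the other source's energy at a pseudo-random lag, which is what downstream processing exploits to discriminate between interfering shots. All names and numbers below are illustrative.

```python
def blend(records, fire_times, length):
    """Superpose shot records into one continuous trace,
    each shifted by its (randomly dithered) firing time."""
    out = [0.0] * length
    for rec, t0 in zip(records, fire_times):
        for i, s in enumerate(rec):
            if t0 + i < length:
                out[t0 + i] += s
    return out

def align(trace, t0, n):
    """Extract the n-sample window starting at one source's firing time."""
    return trace[t0:t0 + n]

wavelet = [0.0, 1.0, -0.5, 0.25, 0.0]   # toy source signature
fire_times = [10, 13]                   # firing times incl. a 3-sample dither
blended = blend([wavelet, wavelet], fire_times, 30)

# Aligned to source 1, its wavelet sits at the window start while
# source 2's energy appears at the dithered lag of 3 samples.
print(align(blended, 10, 8))  # -> [0.0, 1.0, -0.5, 0.25, 1.0, -0.5, 0.25, 0.0]
```

With many shots and larger random dithers, the cross-source energy becomes incoherent in any single-shot alignment, which is the property separation algorithms rely on.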
From Wiggins’ MED to Compressive Sensing: Sparsity in Seismic Data Processing
The field of compressive sensing has revitalized interest in the development of methods that use sparse representation models for seismic data processing. However, sparsity-driven methods for seismic signal processing are not new to geophysicists. I would like to use this opportunity to discuss early ideas of sparse signal representation for blind deconvolution (MED), velocity analysis, high-resolution Radon transforms, and Fourier synthesis for wavefield reconstruction. I will also discuss their connection to new developments arising in the field of compressive sensing, paying particular attention to the conceptual simplicity of these methods, their assumptions and proven industry-strength results rather than to algorithmic considerations.
Seislet Transform and Seislet Frame: Tools for Compressive Representation of Seismic Data
Authors Sergey Fomel, Yang Liu and Jilin University
The digital wavelet transform (DWT) is a well-known tool for characterizing piecewise-smooth signals and is based on predicting smooth signals. Seismic signals are not smooth; however, they are predictable. The seislet transform is a digital wavelet-like transform tailored specifically for representing seismic data. Its construction is based on the notion of signal prediction. The seislet transform uses predictions of sinusoids (applicable to seismic data in the F-X domain), plane waves (applicable to 2-D or 3-D seismic data in the T-X domain), or reflection events (applicable to prestack seismic data). We combine statistical or physical predictability of seismic signals with the lifting scheme of the DWT to define the seislet transform. One can view the seislet transform as a decomposition into multiscale orthogonal basis functions aligned with seismic events. When multiple interfering events are present in the data, it is also possible to follow all of them simultaneously by turning the seislet basis into an overcomplete representation (a tight frame). Even though the seislet frame is overcomplete, it can be constrained to have only a small number of significant coefficients and, therefore, to provide an optimally sparse representation. The sparsity is easily demonstrated by comparing the seislet transform and frame with classic transforms such as Fourier and the DWT. The classic DWT is equivalent to the seislet transform with zero frequency (in 1-D) or zero slope (in 2-D). The sparsity of the transform domain provides not only an effective seismic data compression tool but also a way of designing efficient data analysis algorithms. Traditional geophysical data analysis tasks, such as signal-noise separation and data regularization, are conveniently formulated in the transform domain, where the signal is sparse. When applied in the offset direction on prestack data, the seislet transform finds an additional application in optimal stacking of seismic records.
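The lifting construction the authors build on can be illustrated with a deliberately simple sketch: a Haar-like predict/update pair in pure Python, where each odd sample is predicted from its even neighbor. The seislet transform keeps exactly this structure but replaces the trivial neighbor prediction with a prediction along seismic events (sinusoids, plane waves, reflections); the functions below are hypothetical illustrations, not the authors' code.

```python
def lifting_forward(x):
    """One level of a lifting-scheme wavelet transform (even length input).
    Predict each odd sample from its even neighbor; the residual is the
    detail coefficient. Where the prediction is exact, details vanish,
    which is the source of the sparsity discussed in the abstract."""
    even, odd = x[::2], x[1::2]
    detail = [o - e for o, e in zip(odd, even)]          # predict step
    coarse = [e + d / 2 for e, d in zip(even, detail)]   # update step
    return coarse, detail

def lifting_inverse(coarse, detail):
    """Undo the update and predict steps; lifting is exactly invertible."""
    even = [c - d / 2 for c, d in zip(coarse, detail)]
    odd = [e + d for e, d in zip(even, detail)]
    out = []
    for e, o in zip(even, odd):
        out += [e, o]
    return out

sig = [1.0, 1.0, 2.0, 2.0, 3.0, 3.0, 4.0, 4.0]
c, d = lifting_forward(sig)
assert lifting_inverse(c, d) == sig  # perfect reconstruction
print(d)  # -> [0.0, 0.0, 0.0, 0.0]: details vanish, the signal is predictable
```

Replacing the neighbor prediction with a shift along a local event slope is, in spirit, what turns this generic transform into a seislet transform.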
Acquisition and Processing of Multiple-simultaneous-source Data
Authors Ian Moore, Claudio Bagaini and Mark Daly
Land acquisition has some significant advantages over its marine counterpart with respect to simultaneous-source data. In particular, it is practical to employ relatively large numbers of sources, and these sources do not have the constraints implicit in the continuous motion that is an inherent feature of towed-marine acquisition. The large channel counts available with modern land recording systems allow the sources to be spread over a significant area, and modern control systems allow complex rules to be used to determine optimum firing patterns dynamically, in order to maximize efficiency whilst meeting whatever constraints are necessary to ensure good data quality. We consider an orthogonal land geometry consisting of a fixed receiver spread and a number of source lines. The available sources operate under rules that dictate when sources can fire simultaneously (with dithers), taking into account factors such as the distance between the sources and the distribution of the sources that are "on station". The resultant dataset consists of a collection of traces, each of which may, in principle, contain contributions from any number of sources (up to the total available). Whilst the shot spacing on each source line is specified, the traces corresponding to those shot points will not in general be contiguous within the dataset, and the resulting complexity is significantly higher than that of a typical marine simultaneous-source dataset. It is not, for example, possible to obtain subsets of the dataset that involve contributions from only two (or a few) source lines. Until we have algorithms capable of processing simultaneous-source data directly, the simplest approach to processing these data is to separate the sources early in the sequence. Consequently, we compare results using passive separation, separation based on random noise attenuation, and separation based on sparse inversion. For the last case, we consider each source line to constitute a separate logical source, leading to a separation problem that involves as many sources as source lines. This method is shown to be practical, and to give superior results to the other methods.
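Sparse inversion in this context means solving an underdetermined separation problem under an l1 penalty. As a hedged sketch (not the authors' algorithm), a minimal iterative soft-thresholding (ISTA) loop in pure Python shows how a sparsity constraint recovers overlapping contributions from fewer blended measurements than unknowns; the matrix and signal here are toy stand-ins.

```python
def matvec(A, x):
    return [sum(a * v for a, v in zip(row, x)) for row in A]

def rmatvec(A, y):
    return [sum(A[i][j] * y[i] for i in range(len(A))) for j in range(len(A[0]))]

def soft(v, t):
    """Soft-thresholding: shrink toward zero, promoting sparsity."""
    return [max(abs(s) - t, 0.0) * (1.0 if s >= 0 else -1.0) for s in v]

def ista(A, b, lam=0.05, step=0.15, iters=500):
    """min_x 0.5*||Ax - b||^2 + lam*||x||_1 by iterative soft thresholding."""
    x = [0.0] * len(A[0])
    for _ in range(iters):
        r = [yi - bi for yi, bi in zip(matvec(A, x), b)]  # data residual
        g = rmatvec(A, r)                                 # gradient step
        x = soft([xi - step * gi for xi, gi in zip(x, g)], step * lam)
    return x

# Toy blending matrix: 3 blended "measurements" of a 5-sample model.
A = [[1, 1, 0, 0, 0],
     [0, 1, 1, 0, 1],
     [1, 0, 0, 1, 1]]
x_true = [0.0, 1.0, 0.0, 0.0, 0.5]   # sparse: only two active contributions
b = matvec(A, x_true)
x_hat = ista(A, b)
```

Even with only 3 measurements of 5 unknowns, the l1 penalty steers the solution toward the sparse model; in the paper's setting the "columns" correspond to logical sources (source lines) rather than this toy matrix.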
Maximizing Acquisition Efficiency through Simultaneous Source Technology
Authors Ray Abma and Mark Foster
Simultaneous-source acquisition can significantly improve the efficiency of seismic data acquisition (Howe et al., 2008; Howe et al., 2009; Bouska et al., 2008). BP has developed seismic acquisition technologies such as DS3, ISS™ and ISSN™ which can efficiently deliver high-quality data in desert terrains, such that land 3D can be acquired for exploration and development purposes at costs approaching those of conventional 3D marine towed-streamer surveys. In addition to the very high productivity rates, simultaneous-source technology enables the economic acquisition of well-sampled, high-fold, wide-azimuth surveys, which delivers the observed improvement in data quality. Together with new noise-separation techniques, such as BP's SSI technique (Abma et al., 2010), these methods are being used for development-quality 3D surveys where high-quality pre-stack data and wide-azimuth attributes are required. We will show examples of simultaneous-source acquisition from BP's surveys to date, review the experience gained, and look at future opportunities and challenges. To summarize our experience, it appears that we can acquire seismic surveys with simultaneous sources in a manner that is faster, cheaper and better than conventional surveys.
Tackling the “Data Deluge”: a Dimensionality-reduction Approach
Current-day imaging and inversion technology increasingly relies on faithful samplings and simulations of seismic wavefields. This reliance on full sampling and high-fidelity wavefield simulations strains our acquisition and processing systems, and overcoming this impediment is becoming one of the main challenges faced by our industry. By using randomized dimensionality-reduction techniques, we propose a new strategy in which acquisition and computational costs are no longer dictated by the sampling grid but by the transform-domain compressibility of the image. To arrive at this result, we combine recent findings from machine learning and stochastic optimization, where (nonlinear) inversions are carried out on random subsets of data, with compressive sensing, where data that permit compressible representations are deliberately subsampled. The key idea of the stochastic approximation is to reduce computational costs by computing each gradient update on a different randomly selected subset of data. In seismic exploration, this corresponds to carrying out migrations with one or a few incoherent supershots made of superpositions of random source-encoded experiments. While this approach introduces source crosstalk, it has been applied successfully to reduce the cost of least-squares migration and full-waveform inversion because it reduces the number of wave-equation solves. However, the method is sensitive to noise and relies on relatively large numbers of supershots and wave simulations to get reasonable results.
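The stochastic-approximation idea described above, where each gradient update is computed on a random subset of the data, can be sketched with an ordinary least-squares fit in pure Python: each iteration touches only a small random batch of "experiments", the analogue of migrating with a few random supershots per iteration. Everything here is an illustrative toy, not the seismic workflow itself.

```python
import random

def sgd_subset(data, n_params, batch=2, step=0.1, iters=2000, seed=1):
    """Least-squares fit where each gradient update uses only a random
    subset of the data, mimicking updates from a few random supershots."""
    rng = random.Random(seed)
    x = [0.0] * n_params
    for _ in range(iters):
        for a, b in rng.sample(data, batch):   # random subset of experiments
            pred = sum(ai * xi for ai, xi in zip(a, x))
            g = pred - b                       # per-sample residual
            x = [xi - step * g * ai for xi, ai in zip(x, a)]
    return x

# Toy "experiments": rows a with observed value b = a . x_true (noise-free)
x_true = [2.0, -1.0]
rows = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [1.0, -1.0], [2.0, 1.0]]
data = [(a, sum(ai * xi for ai, xi in zip(a, x_true))) for a in rows]

x_hat = sgd_subset(data, 2)
print(x_hat)  # converges toward x_true = [2.0, -1.0]
```

The cost per update is independent of the full dataset size, which is exactly the saving the abstract claims for wave-equation solves; the sensitivity to noise mentioned at the end shows up here too if the observations b are perturbed.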
Blended Acquisition: Increased Information Density per Shot Record
Authors Gerrit Blacquiere, Eric Verschuur and TU Delft
In traditional seismic surveys the firing time between shots is chosen such that the shot records do not interfere in time. In blended acquisition, however, the records do overlap, allowing denser source sampling and wider azimuths in an economic way. Denser shot sampling and wider azimuths mean that each subsurface gridpoint is illuminated from a larger number of angles, which improves the image quality in terms of signal-to-noise ratio and spatial resolution. We show that, even with very simple blending parameters such as time delays, the incident wavefield at a specific subsurface gridpoint represents a dispersed time series with a 'complex code'. In deblending, this time series is decomposed into the individual source contributions. In blended shot-record migration, however, the time series is not decomposed; instead, the complete series is inverted for in a wavefield deconvolution process. We also show that the information density of shot records can be further increased by considering surface-related multiples as signal, using the double-illumination concept. These multiples can thus be exploited, leading to improvements in the angle range of the incident wavefield at each gridpoint. In this way the energy contained in the multiples contributes to the image rather than degrading its quality. This important property will be illustrated by examples.
Seismic Simultaneous Sources in Land: Challenges and Opportunities
Authors Panos G. Kelamis, Peter I. Pecholcs, Constantine Tsingas and Shoudong Huo
Processing of this optimally designed seismic blended dataset without applying any deblending algorithm produces satisfactory results. To improve prestack analysis steps such as first-break picking, noise removal and velocity analysis, deblending methodologies and workflows were also applied. In this paper, we present processing results related to land simultaneous-source acquisition. Novel deblending schemes will be shown, along with their effectiveness in a production environment. Statics, surface consistency, noise attenuation and velocity estimation are seen as the main challenges for the processing of land blended data. Current processing schemes rely mostly on deblending so that conventional workflows can subsequently be employed for data analysis. Full blended-data processing on land is still an open issue. Current practice dictates that near-surface and velocity macromodels must be available well in advance; migration will do the rest. The real question therefore is: a) do we develop tools to fully process blended data, or b) do we develop effective deblending algorithms and proceed in a conventional manner? The answer probably lies somewhere in the middle. Of course, everything starts with acquisition: do we play it safe and acquire optimally distance-separated data, or acquire data in a random fashion? Processing then becomes the key. In short, seismic acquisition and processing must be considered simultaneously.
The Role of Simultaneous Source Technology in the Seismic Industry
The simultaneous-source method has been investigated in the past in the context of efficient seismic data acquisition; recently, however, momentum in this field has increased due to its computational efficiency in imaging, simulation and inversion. At ExxonMobil, simultaneous-source technology has been an active research topic for many years, in areas ranging from acquisition and imaging to, most recently, full-wavefield inversion (FWI).
An Overview of Airborne Electromagnetics for 3D High-resolution Mapping
By Esben Auken
Airborne electromagnetics (AEM) is an efficient tool for mapping the subsurface. AEM delivers very high data coverage relative to its cost, and the direct outputs are high-resolution resistivity images of the subsurface. The time-domain methods (TEM) in particular are well suited for mapping the salt-fresh water boundary in coastal zones, aquifers, paleo-channels and mineralisations, and for general structural geological mapping. The development of AEM systems and data processing has been intensive during the past 10-15 years because of better electronics and faster, more flexible computers. While most AEM systems in the 1980s were limited to the detection of mineralisations, modern generations of systems measure full spectra, yield absolutely calibrated data and provide accurate descriptions of the system transfer function. This, in combination with much-enhanced inversion and forward algorithms, makes AEM a suitable candidate for mapping targets that until now have not been possible to map.
Integrating Gravity and Electromagnetics with Seismic for Near Surface Characterization in Saudi Arabia
Near-surface conditions in Saudi Arabia represent the major challenge for the acquisition of reliable and meaningful land seismic data. In Saudi Aramco, a major effort is underway to investigate the benefits of integrating gravity and electromagnetic data with seismic data to better estimate near-surface velocities for processing large 3D seismic volumes. In 2010 a gravity and electromagnetic acquisition program was carried out in three areas characterized by different near-surface geologic conditions. The methodologies being employed consist of dedicated high-end electromagnetic and gravity acquisition specifications, geophysical data integration via simultaneous joint inversion, and seismic processing with advanced imaging workflows such as pre-stack redatuming (time) and pre-stack depth migration. Well-log analysis in shallow boreholes provides the local petrophysical relationships among velocity, resistivity and density to be used in a simultaneous joint inversion scheme. Given the shallow targets, the resolution offered by the electromagnetic and gravity data is typically within the wavelength of the velocity anomalies affecting seismic imaging. Therefore, near-surface non-seismic data act as an ideal complementary dataset to seismic. Results obtained to date reveal density and resistivity anomalies correlated with regions of poor seismic data quality. EM and gravity data analysis and inversion are being carried out in a single-domain approach as well as by applying quantitative simultaneous joint inversion schemes with seismic travel-time data. The generated near-surface multi-parameter models are used to correct the seismic data, with successive reprocessing in the time and depth domains. Encouraging results are being observed from the reprocessing, indicating that the multi-physics data and the quantitative integration schemes are succeeding in addressing the near-surface velocity estimation problem.
Near Surface Challenges for Processing 3D Seismic Surveys
Around the world, low-relief structures and stratigraphic traps are becoming more important as most of the large structural traps have been drilled. These play types require accurate near-surface velocity models for depth conversion. Reservoir characterization is also a key goal. The use of horizontal wells for field development continues to grow at an accelerating pace, and seismic attribute maps can be extremely valuable for placing horizontal wells. But these attributes will be of little value if their quality is degraded by near-surface effects. Energy penetration, scattering, source-generated noise, surface-generated multiples, statics, and source and receiver coupling are some of the long-standing near-surface issues that continue to present challenges for land seismic imaging. In arid environments, the near surface can be up to 700 m thick, with challenges such as sand dunes, topography, karsted carbonates, dry river beds, outcropping refractors, velocity reversals, anhydrites and layered basalts. Karsts can be cavernous, but are usually collapsed and filled; air-filled karsts above the water table are particularly problematic. A wide range of technologies is now being pursued to better characterize the near surface, including traveltime tomography, early-arrival waveform inversion, joint inversion of seismic data with gravity and electromagnetic data, and surface-wave inversion. For complex near surfaces that require these technologies to define the velocity model, the conventional statics approach will not be valid; imaging technologies such as redatuming will be required. Advances in seismic acquisition technology, such as ultra-high channel counts, high-density single-sensor recording and low-frequency vibrators, will play a key role in developing solutions for near-surface characterization.
Surface Consistent Surface-Wave Inversion
A tomographic method (SWIPER) is used to invert surface-seismic data to estimate variable surface-wave properties. Surface consistency of multi-source, multi-receiver data is exploited to decompose the data in the frequency domain into frequency-dependent propagation effects (i.e., velocity and attenuation) and variable source- and receiver-coupling effects. The inversion can be performed for single modes (linear optimization) or simultaneously for multiple modes (nonlinear optimization). Including source- and receiver-coupling variations improves the ability to estimate velocity and attenuation. Further improvements are made by constraining the parameters to be smooth functions of frequency. The estimated model parameters, such as velocity dispersion relations, can be used to predict the multi-mode ground roll and subtract it from the data with little damage to reflections, or to invert for a near-surface depth model. The properties of the near surface vary rapidly in both the vertical and horizontal directions. Consequently, the behavior of ground roll also varies both with frequency and with position along the surface. Traditional methods of surface-wave analysis, such as MASW, have limited resolution because they effectively average properties over the maximum source-to-receiver distance in the gather. For example, because the slant stack sums over the traces in the gather, variability within the gather is averaged. Additional limitations arise from the difficulty of picking the maximum amplitudes in the transform domain because of limited velocity resolution, noise and the interference of multiple modes. Fully exploiting the multi-shot and multi-receiver aspect of 3-D surface seismic data allows the determination of variable surface-wave parameters within a fine grid of surface cells using direct-ray tomography. At the higher frequencies, however, multi-mode behavior and mode interference must be included. Amplitude and phase effects are coupled and must be determined by a nonlinear optimization method. We show the ability to determine multi-mode velocity and attenuation parameters on a receiver-interval-sized grid, which, after horizontal smoothing, match those determined with a beam-forming method.
Advances in Petroleum Industry Seismic Acquisition Technology
By Dave Monk
When it comes to the acquisition of seismic data onshore, there are really only two basic attributes that can be addressed which impact how data are acquired. The first is the acquisition geometry: the relationship between sources and receivers. This of course includes the issue of arrays at both the source and receiver ends of the raypath. Recent developments in acquisition technology have led to far more flexibility in receiver geometries, and the indication is that we will have the capability of recording up to 1 million traces in each shot record within 5 years. Additionally, we have seen the development of simultaneous-source methods which have allowed far greater flexibility in source geometry as well. The second fundamental issue governed by acquisition technology is the bandwidth of the data acquired. This is limited by both source and receiver, and recent developments have aimed at extending the bandwidth of both as we move towards more sophisticated processing, including full-waveform inversion. Recently there has been considerable work on extending the low-frequency components of the bandwidth. In this presentation a road map of how acquisition may evolve is presented, highlighting both the geometry and bandwidth changes we are likely to see in the future through new technology.
Surface Wave Analysis in Laterally Varying Media
Authors Laura Valentina Socco, Paolo Bergamo and Daniele Boiero
Surface-wave analysis applied to near-surface characterisation has undergone tremendous development in the last decade. From individual inversions of single dispersion curves extracted from purpose-acquired small-scale datasets, the technique has evolved to the analysis of wide datasets of dispersion curves extracted from large-scale seismic data acquired for body-wave exploration (Socco et al., 2010). The evolution of the method poses new challenges and provides new opportunities for retrieving near-surface velocity models in complex geological environments. In particular, innovative approaches for the processing and inversion of surface-wave data have been developed to improve the reliability and the spatial resolution of the velocity models. If data are not acquired specifically for surface-wave analysis, careful data evaluation is required before processing to assess whether the acquisition parameters and equipment are suitable for retrieving good-quality dispersion curves. After these preliminary evaluations, the data should be processed to extract dispersion curves, taking great care over the effects of lateral variations and the presence of higher modes and other guided waves that could be included in the inversion. Before inverting the curves, a priori information should be considered to build consistent initial models, optimising the parameterisation according to reliable investigation depth and vertical and lateral resolution. Inversion should be performed considering the set of dispersion curves as a single dataset, to provide internally consistent pseudo-2D/3D velocity models, and should include any available a priori information. Experimental uncertainties should also be used to account for data quality. Joint inversion with other geophysical data can significantly improve the reliability of the final models and the amount of retrieved information. Sensitivity analysis may provide an indication of the reliability of the estimated model parameters.
High-Resolution Characterization of the Near Surface: GPR and Seismic Reflection Methods
Ground-penetrating radar (GPR) and shallow seismic reflection (SSR) methods offer high-resolution imaging of the subsurface. Recent technological advances in instrumentation as well as in methods (data acquisition, processing and analysis) have allowed sub-meter characterization of near-surface heterogeneous properties in three dimensions (3D). As hydrocarbon exploration pursues challenging imaging objectives, such as low-relief structures and fracture zones, accurate near-surface characterization has become increasingly important, both for reservoir analogues and as a way to better understand how complex near-surface conditions affect images of deeper layers. GPR and SSR methods developed for engineering and groundwater applications offer the opportunity for technology exchange between near-surface geophysics and exploration geophysics. We present an overview of research conducted at The University of Kansas on: i) GPR high-resolution imaging of reservoir analogues, mapping the distribution of heterogeneous carbonate lithofacies, imaging fractures and time-lapse monitoring of fluid flow, and ii) developments in automated 3D shallow seismic reflection data acquisition that allow sub-meter-resolution imaging of the weathered zone, subsurface stratigraphy and depth to bedrock.
Time Domain Early Arrival Waveform Inversion
By Jie Zhang
Seismic early arrivals recorded at the surface contain substantial information about the velocity structure of the near-surface area. Unlike later arrivals, they are less contaminated by elastic effects such as surface waves or converted waves; thus, the early arrivals can be well approximated as an acoustic wave propagation problem. Time-domain early-arrival waveform inversion, which applies an acoustic finite-difference method to simulate a small window of the wavefield and employs a conjugate-gradient method to invert for the shallow near-surface velocity structure, is then more computationally efficient and practical than full waveform inversion, which attempts to invert the entire subsurface velocity model. It can help resolve complex near-surface velocity structures that cannot be inverted from traveltimes alone. However, time-domain early-arrival waveform inversion also faces a number of serious challenges, including: 1) the numerical simulation accuracy needed to handle surface topography in the forward finite-difference modeling; 2) very low velocities in the near-surface area, requiring substantially smaller grids for stable finite-difference simulation; and 3) large amplitude variations along the surface due to near-surface attenuation and various complex structural effects. To address these issues, we developed a variable-grid finite-difference approach for forward modeling and a stable shot-based amplitude-balancing algorithm to process both data and synthetics in the time domain. Numerical and real data examples demonstrate that the waveform inversion method can handle low-velocity zones, hidden layers and strong lateral velocity variations.
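The forward engine such an inversion calls repeatedly can be caricatured in one dimension. The sketch below is a minimal second-order, constant-density acoustic finite-difference propagator in pure Python with a Ricker source and an early-time window at one receiver; the grid sizes, frequency and window length are illustrative assumptions, and the actual method described operates on 2-D/3-D variable grids.

```python
import math

def fd_1d(vel, dx, dt, nt, src_idx, freq=25.0):
    """Second-order 1-D acoustic finite differences (rigid boundaries).
    Returns the pressure field at every grid point and time step."""
    n = len(vel)
    prev, cur = [0.0] * n, [0.0] * n
    movie = []
    for it in range(nt):
        nxt = [0.0] * n
        for i in range(1, n - 1):
            lap = (cur[i - 1] - 2 * cur[i] + cur[i + 1]) / dx**2
            nxt[i] = 2 * cur[i] - prev[i] + (vel[i] * dt)**2 * lap
        # Ricker-wavelet source injection, delayed so it starts near zero
        t = it * dt
        arg = (math.pi * freq * (t - 1.2 / freq))**2
        nxt[src_idx] += (1 - 2 * arg) * math.exp(-arg) * dt**2
        prev, cur = cur, nxt
        movie.append(cur[:])
    return movie

# Stability (CFL): vel_max * dt / dx must stay below 1 in 1-D.
vel = [1500.0] * 101                 # m/s, homogeneous near-surface model
dx, dt, nt = 5.0, 0.002, 120        # CFL = 1500 * 0.002 / 5 = 0.6
movie = fd_1d(vel, dx, dt, nt, src_idx=50)

# "Early-arrival window": keep only the first samples at a receiver 150 m
# from the source; this windowed trace is what the inversion would fit.
receiver = [snap[80] for snap in movie]
early = receiver[:90]
```

Challenge 2) in the abstract is visible directly here: halving the velocities while keeping dx and dt would drop the wavelength toward the grid spacing, forcing a finer (and costlier) grid, which is what motivates the variable-grid approach.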
Tectonic Evolution and Burial History of the Makó trough, Hungary: Implications for the Exploration of Juvenile Unconventional Petroleum Systems in the Pannonian Basin
Authors Gábor Bada, Orsolya Sztanó and Roderick J. Wallis
The Makó trough in Hungary is a sedimentary depression formed in the extensional Pannonian basin during the last 15-20 Ma. It represents a young HT/HP system with a basin fill more than 6 km thick and has been recognized as the location of major unconventional hydrocarbon resources. Such accumulations are regarded as "unconventional" when economic production is only possible by means of some sort of stimulation technique, usually hydraulic fracturing. Hydrocarbons in this setting do not accumulate conventionally in structural or stratigraphic traps, but in pervasive cells. Owing to the geological setting of the Makó trough, the hydrocarbon cell forms a relatively continuous zone marked by considerable internal lithological and petrophysical variability. Because of its novelty and complexity, the exploration of this unconventional resource demands the concurrent application of a wide range of geological and geophysical methods. The evaluation of such petroleum systems relies strongly on the reconstruction of the tectono-sedimentary evolution of the host basin, an understanding of the subsidence, burial, thermal and maturation history, and the timing and mechanism of hydrocarbon generation and the related development of abnormal pressure. In this contribution, the latest models for basin evolution and petroleum system development are presented.
A Charge History of Natural Gases from the Rotliegend Deposits in Western Poland as an Effect of Hydrocarbon Generation and Expulsion in the Carboniferous Source Rocks: Previous and New Models
By D. Botor
The Rotliegend basin (Lower Permian), extending from the UK to Poland (Gast et al., 2010), has significant hydrocarbon potential, with reserves most probably still undiscovered, particularly in Poland (e.g., Burzewski et al., 2009; Karnkowski, 2007; Pletsch et al., 2010). This giant basin is called the Southern Permian Basin, and its Polish part is distinguished as the Polish Permian Basin (Figure 1). It began to develop in the latest Carboniferous, but its development continued throughout the Mesozoic up to Cretaceous/Paleogene time, when basin inversion occurred (Karnkowski, 1999). The geological evolution of the study area was controlled mainly by the Teisseyre-Tornquist Zone (TTZ), which forms the border between the East European Craton and the Palaeozoic foldbelts and terranes of western and central Europe (Mazur et al., 2005; 2006). The Rotliegend basin is superimposed on the Carboniferous Variscan zone and its foredeep (Karnkowski, 1999; 2007; Mazur et al., 2005). Carboniferous strata occur in almost the entire substratum of the Rotliegend basin in Poland (Figure 1). The regional seal in the Polish part of the Rotliegend basin is formed by the Zechstein evaporites (Figure 1).
Remaining Hydrocarbon Potential and Exploration Strategy in SW Part of Pannonian Basin, Croatia
Authors Lilit Cota and Ivica Vulama
The Croatian part of the Pannonian Basin (PB) has in places reached a moderately high to high level of exploration maturity. As a consequence, most upcoming discoveries are expected to fall within the small to medium size range (10 MMboe on average). The Sava, Drava and East Slavonija depressions and the northern part of the Northwestern Croatia depression can be regarded as mature exploration areas, while the southern part of Northwestern Croatia, the Southwest Sava and the southeastern part of the Sava depression are still under-explored. According to several studies and statistical and basin volumetric analyses, approximately 46% of the hydrocarbon volumes in the PB remain to be discovered. Statistics from the past 60 years of exploration also show that approximately 85% of exploratory wells outside producing fields investigated the section between 0 and 3500 m; only 11% of the wells drilled down to 5500 m. In most cases the shallow section, from 0 to 1500 m, was not properly addressed for shallow gas targets. The majority of the wells were drilled on structural traps; however, the latest positive INA drilling records indicate that significant stratigraphic-trap potential is yet to be revealed. The deeper, less explored section between 3500 and 5000 m is today becoming more interesting for unconventional gas reservoir exploration, but also for some conventional reservoirs.
Experiences of ADX-Energy in the Southeast Pannonian Basin, Romania - Stratigraphic Opportunities in a Mature Area
Authors: Paul Fink and Szilamér Kovács
The general consensus for the Neogene Pannonian Basin is that it is in a late mature hydrocarbon exploration stage and that almost all valid structural traps have been drilled. Any remaining potential of economic interest was thought to lie in deep pre-Tertiary traps, unconventional traps such as basin-centred and shale gas, and stratigraphic traps (e.g., Tari et al., 2006; Clayton et al., 1994). Whilst the deliberate search for stratigraphic traps with modern exploration tools is well underway, with documented success in the Hungarian and Austrian parts of the basin (Arzmueller et al., 2006), this is probably not universally true for the Serbian and Romanian parts of the Pannonian Basin. (The only published modern 3D-seismic-based sequence stratigraphic interpretation on the Romanian side known to the authors is Velescu et al., 2005.) Given that several stratigraphic discoveries have nevertheless been drilled by chance and proven to work in Romania as well, there is a case to be made for remaining exploration potential in stratigraphic traps in an area which has seen hardly any 3D seismic acquisition and little high-resolution modern 2D seismic. (This of course does not exclude overlooked potential in classic structural traps and unconventional plays.) ADX-Energy (ADX) has therefore participated in the 2010 Romanian licensing round and won an exploration block (Parta, 1,221 sq km) in the Southeast Pannonian Basin of western Romania (Figure 1).
-
-
-
Oldest Hydrocarbon-producing Fields in the Vienna Basin (Austria)
Oil exploration in the Austrian part of the Vienna Basin started in the 1930s. The first oil discovery, however, was made in 1914 near Egbel, prompted by gas seeping to the surface (Janoschek, 1942). Geologists concluded that the Sarmatian layers of the neighbouring country continue to the south. The successful well in Slovakia led to drilling campaigns in the central part of the Vienna Basin in Austria. The first productive well (Gösting I) was drilled in 1931-1932 (Friedl, 1936). During this time many oil fields, owned by various companies, were developed in the Vienna Basin. Since 1995, OMV has held nearly all the licenses in the Austrian part of the Vienna Basin. Only three small licenses remained in the ownership of RAG-Austria, and two of them (Gaiselberg and RAG-Field; Figure 1) are still producing.
-
-
-
New Insights Into The Hydrocarbon System Of The Getic Depression, Romania: Implications For Exploration
The Getic Depression represents the foothills of the Southern Carpathians, north of the plains of Moesia, and is typically described as the foreland of the Southern Carpathians (Fig. 1). The eastern part of the depression is interpreted as the continuation, albeit on a smaller scale, of the Eastern Carpathians. Conversely, the western part corresponds to a Paleogene to Early Miocene strike-slip basin (i.e., the Getic Basin) developed on the contact zone between the Carpathians and Moesia and thrust over Moesia during the Mid Miocene. The Getic Depression is a mature petroleum province, with thousands of wells drilled and several fields discovered since exploration started more than 100 years ago (Fig. 1). The shallow structural plays have been intensively drilled in the past. By contrast, only a small number of wells have targeted deep objectives, typically located at more than 4 km depth. These deep wells had only limited success, but they indicated the presence of a working petroleum system.
-
-
-
Exploration in and Below the Thrust Front of the Eastern Alps
The foreland basin of the Eastern Alps (Fig. 1) has been intensively explored since the mid-1950s. RAG's second exploration well was drilled in 1955 in the eastern imbricates on a proposed anticlinal structure. The structure was defined by surface geology, shallow pilot wells and single-fold 2D seismic lines, but the well missed it. Since 1955 more than 1000 wells have been drilled in the Upper Austrian and Bavarian parts of the Molasse Basin, the vast majority in the undeformed part of the basin. Only a few wells were drilled in the Molasse imbricate zone. Failures in the imbricated Molasse were caused by seismically poorly defined structures in the complexly deformed imbricates, but also by the poorly understood depositional history and the difficulty of predicting reservoir rocks. The structural and sedimentological features of the Molasse imbricates were illuminated by the introduction of volume-based interpretation of the large 3D seismic surveys acquired during the last decade.
-
-
-
The Modern Exploration Methods Application Experiences in Vienna Basin and East Slovakian Basin
Authors: Branislav Šály, Ivan Hlavatý and Vladimír Jureňa
The crucial question is: "Are there any reasons to run exploration in mature, highly exploited basins?" There are a few: many analogies from known fields are available; infrastructure is in place; and small reserves or low-capacity fields can be put to alternative use (e.g., co-generation units). The Vienna Basin and the East Slovakian Basin are the traditional exploration areas of the Slovak Republic. The first exploration well in the Vienna Basin, and in the whole of Slovakia, was drilled in 1913 in the vicinity of Gbely. In the East Slovakian Basin, exploration for hydrocarbons started in the middle of the last century, and exploration and production have been ongoing since. The main progress in modern exploration methods started in the last decade of the 20th century, accompanied by 3D seismic acquisition, seismic attribute analysis, modern methods of seismic interpretation and the implementation of sequence stratigraphy. Risk analysis was also incorporated into our prospect evaluation. The identification of new tectonic structures, or even the development of a new tectonic pattern (in the East Slovakian Basin), new play concepts and, consequently, the discovery of new hydrocarbon traps are the main results of applying modern exploration methods.
-
-
-
Wedge Top Exploration In Romanian East Carpathians - Seismic Image Of Triangle Zone
Authors: A.L. Stan, Serghie Mihalache and Rodica Mihaela Solga
Romania has extensive experience in oil and gas exploration. Almost all geological units with oil and gas potential are covered by 2D and 3D seismic information, and thousands of wells have been drilled. Rompetrol owns five exploration licenses (Figure 1) on the blocks EP I-5 Gresu, EP I Nereju, E III Focsani, E IV-5 Satu Mare and E IV-3 Zegujani. For the Gresu and Nereju blocks, Concession Agreements were signed with the National Agency for Mineral Resources (NAMR) by Anschutz Romania Corporation in 1997, a company bought by Forest Romania Corporation in 1998. In 2005, both licenses were transferred from Forest Oil Corporation to Rompetrol SA, which is the 100% title holder in the Gresu and Nereju blocks.
-
-
-
Exploration in the Vienna Basin (Austria) – Tools and Methods
Authors: Philipp Strauss, Klaus Pelz and Wolfgang Siedl
Petroleum companies have been exploring the Vienna Basin for oil and gas for nearly 100 years. OMV has owned two Austrian concession areas since the early 1960s: the Waschberg Zone (with the Flysch and Molasse) and the Austrian part of the Vienna Basin. The exploration areas are vertically divided into three levels: level 1 with Neogene strata (Miocene sediments), level 2 holding units formed during the Alpine orogeny (Calcareous Alps and Flysch), and level 3 representing the autochthonous units of the European basement. Levels 2 and 3 are together referred to as the Pre-Neogene. The geological and tectonic setting of each level is distinct in age, style of deformation, and lithology; hence the exploration methodology and strategy are adapted to each level. For level 1, the Miocene strata, 3D seismic interpretation, seismic attributes and sequence stratigraphy are standard exploration tools. In contrast, for the Pre-Neogene (levels 2 and 3), geological and structural modelling are more important, and seismic interpretation is applicable for structural mapping only, owing to the medium to poor quality of the seismic data.
-
-
-
Slope-toe Turbidite Systems Related to Aggradational - Progradational Sequences: Potential Stratigraphic Traps in the Makó Trough, Pannonian Basin, Hungary
Authors: Orsolya Sztanó, Péter Szafián, Gábor Bada, Daniel W. Hughes and Roderick J. Wallis
Though the Makó Trough is best known as the location of major unconventional gas accumulations, its thick Neogene to Quaternary sedimentary successions may contain conventional hydrocarbon resources as well. Structurally controlled traps are widespread on the neighbouring basement highs but are not likely to occur in the basin interior. However, stratigraphic traps, untested so far, have been identified in relation to the basin-filling progradational slope system. The style and pace of slope advance are key to understanding sand delivery to, and the formation of, potential reservoirs in the deep parts of the basin.
-
-
-
Pre-Tertiary Play types of the NW Pannonian Basin, Hungary
Authors: G. Tari, Mike Peffer and Gabor Varga
Compared to the rest of the Pannonian Basin system, the Hungarian part of the Danube Basin (Fig. 1) appears to be underexplored. Most hydrocarbon exploration efforts have concentrated on the Tertiary basin fill, and the well penetrations into the pre-Tertiary "basement" were typically only on the order of a few tens of meters. Whereas the presence of a pre-Tertiary Alpine thrust-fold belt beneath the Miocene to Recent basin fill has been suggested by several authors based on the interpretation of regional 2D reflection seismic data sets, the supposed overthrusts have unfortunately never been proven by drilling. A deliberate search for complex structural traps within the Alpine nappe system has, however, yielded several recent discoveries beneath the Tertiary basin fill of the nearby Vienna Basin (Fig. 1).
-
-
-
Western Getic Depression, Romania: New Architecture and Hydrocarbon Potential
Authors: M. Tilita and D. Tambrea (Danubian Energy Consulting), A. Boscaneanu (Rompetrol S.A.)
The Getic Depression/Basin stands out as an old and prolific hydrocarbon area in Central and Eastern Europe. Our case study concerns its western part, also known in petroleum parlance as the Zegujani area. The basin evolved during the Tertiary, overlapping the contact between the South Carpathians and the Moesian Platform, being in fact the South Carpathian orogenic foredeep. Deposition started with Upper Cretaceous cycles and was continuous, except for a few hiatuses, up to the Pliocene. Despite the overall compressional tectonic regime, much of the tectonic evolution of the western Getic area was conditioned by the evolution of the Timok transtensional system. Most exploration work focused on the central-eastern part of the Getic Depression, motivated by the discovery of several large oil fields (e.g., the Ticleni field), while the western part was somewhat neglected owing to the lack of commercial discoveries.
-
-
-
Exploration of a Palaeogene Syn- to early Post Orogenic Deep-Marine Basin Play, Kamchia Depression (Eastern Onshore Bulgaria)
The Kamchia Depression is the easternmost onshore part of the Balkan Foredeep in Bulgaria (Figure 1). It has a long history of petroleum exploration and production, with numerous wells drilled, often before seismic acquisition. There has been limited gas production to date, but numerous shows of gas and gas condensate from a range of late Mesozoic to Oligocene reservoirs, in some cases with high recorded flow rates. Exploration plays exist in (i) the allochthonous and para-autochthonous Balkan Thrust Belt; (ii) autochthonous Mesozoic and early Palaeogene normal fault-block highs in the foreland (analogous to the offshore Galata Field; Currie et al., 2010); and (iii) the syn- to early post-orogenic fore-Balkan trough (the Kamchia Basin). The focus of this study is the thick (up to 1100 m) Palaeogene turbidites deposited prior to, during and after the emplacement of the Balkan thrust sheets. Legacy well data show numerous thick and porous sandstone units, many of which flowed gas but were not developed, having been drilled prior to seismic acquisition and not in optimal positions. The distribution of reservoir sands has been enigmatic, with rapid changes in facies and thickness presenting a major risk in ongoing exploration. The key exploration challenges in this region have been understanding and predicting (i) the rapid changes in facies, sand presence and reservoir quality, and (ii) charge and seal effectiveness for individual sand intervals.
-
-
-
How Reliable is Statistical Wavelet Estimation?
Authors: Jonathan E. Edgar and Mirko van der Baan
Wavelet phase mismatches frequently occur between final processed seismic data and synthetics created from well logs. During processing, deterministic zero-phase wavelet-shaping corrections are often favoured over statistical approaches. The remaining phase mismatches are eliminated through additional phase corrections using well logs as ground truth; a phase match between the data and the synthetics is thus forced. Irrespective of the validity of this phase-correction method, well logs are not always available and can predict different phase corrections at nearby locations. There is therefore a need for a wavelet estimation method that can reliably predict phase from the seismic data alone, without reliance on well-log control. Such a method could be used for phase extrapolation away from wells, serve as a quality-control tool, or even act as a standalone wavelet estimation technique. We test three current statistical wavelet estimation methods against the deterministic method of seismic-to-well ties. Specifically, we explore the extent to which the choice of method influences the estimated wavelet phase, with the aim of finding a statistical method which consistently predicts a phase in agreement with that obtained from well logs (Figure 1). We question whether well logs are always the optimum source of wavelet phase information and advocate the use of statistical methods as a complementary tool or a reliable alternative.
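One family of statistical methods of the kind compared here estimates a constant wavelet phase by rotating the trace through candidate angles and maximizing kurtosis, exploiting the assumption of sparse (spiky) reflectivity. The sketch below is illustrative only, not the authors' implementation; the function name and parameters are our own.

```python
import numpy as np
from scipy.signal import hilbert

def constant_phase_estimate(trace, n_angles=181):
    """Estimate a constant wavelet phase by rotating the trace through
    candidate angles and maximizing kurtosis; relies on the sparse-
    reflectivity assumption (a zero-phase wavelet on spiky reflectivity
    yields the 'spikiest', highest-kurtosis trace)."""
    analytic = hilbert(trace)
    best_phi, best_kurt = 0.0, -np.inf
    for phi in np.linspace(-90.0, 90.0, n_angles):
        # phase rotation by phi degrees via the analytic signal
        rotated = np.real(analytic * np.exp(1j * np.radians(phi)))
        kurt = np.mean(rotated ** 4) / np.mean(rotated ** 2) ** 2
        if kurt > best_kurt:
            best_phi, best_kurt = phi, kurt
    # the rotation that maximizes kurtosis undoes the data phase
    return -best_phi
```

The 180-degree polarity ambiguity inherent in kurtosis is sidestepped here by restricting the search to (-90, 90) degrees.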
-
-
-
Robust Surface-Consistent Deconvolution: Creating Inversion Ready Land Data
We define "inversion-ready seismic data" to mean data in which the embedded wavelet is everywhere the same, has a known phase (preferably zero phase), and is convolved with the reflectivity series of the earth. The effects of factors such as variable noise, multiples, near-surface absorption and non-white reflectivity leave deconvolved land data far from this ideal. Surface-consistent deconvolution is one important step in land data processing towards preparing the data for inversion and AVO. Over the years, various methods of surface-consistent deconvolution have been used in order to mitigate the effect of noise on the deconvolution operators. I limit the discussion in this paper to spiking deconvolution for operator generation, because predictive or so-called gap deconvolution operators offer little or no possibility of increasing the phase stability of land data.
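As background, conventional Wiener spiking deconvolution designs an operator from the trace autocorrelation by solving a Toeplitz system of normal equations. The following is a minimal single-trace sketch, not the surface-consistent formulation discussed in the paper; function names and parameter values are our own.

```python
import numpy as np
from scipy.linalg import solve_toeplitz
from scipy.signal import lfilter

def spiking_decon(trace, n_op=30, prewhiten=0.01):
    """Single-trace Wiener spiking deconvolution: design an operator
    from the trace autocorrelation so the embedded wavelet is
    compressed towards a zero-lag spike."""
    n = len(trace)
    # first n_op lags of the autocorrelation
    full = np.correlate(trace, trace, mode="full")
    ac = full[n - 1:n - 1 + n_op].astype(float)
    ac[0] *= 1.0 + prewhiten          # prewhitening stabilizes the solve
    rhs = np.zeros(n_op)
    rhs[0] = 1.0                      # desired output: spike at zero lag
    op = solve_toeplitz(ac, rhs)      # Toeplitz normal equations
    return lfilter(op, [1.0], trace)  # apply the operator to the trace
```

In surface-consistent practice the autocorrelation (or log spectrum) would instead be decomposed into source, receiver, offset and midpoint terms before operator design.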
-
-
-
Near-surface Variations and Onshore Time-lapse Seismic
In contrast to offshore time-lapse seismic, convincing onshore successes have been reported mainly for shallow reservoirs with large changes in acoustic impedance. This report describes three onshore 4D seismic trials. It is shown that near-surface variations hamper the measurement of small reservoir changes. Some additional measures may have to be taken in acquisition, as current processing technology is not yet able to solve for the rapid spatial and temporal changes in the near surface. These measures include an increase in multiplicity, direct calibration of dynamite shots, and acquisition with multiple, adequately sampled, buried receivers, which are required to reconstruct the seasonally changing ghosts. Processing offers scope to improve repeatability as well, for example through improved tools to derive detailed near-surface models.
-
-
-
Anelastic AVO: Open Issues in Quantitative Inversion of Dispersive Reflectivity Data
Authors: Kristopher A. Innanen and Chris Bird
A practitioner of AVO analysis might plausibly take an interest in the anelastic reflection coefficients that are the subject of this paper for one of two reasons. One might wish to protect an elastic AVO procedure from errors due to unaccounted-for anelastic influences, or one might seek to use the variations in such coefficients to infer additional properties of the target. The following discussion is largely insensitive to which of these two attitudes is taken: direct anelastic inverse theory (e.g., Innanen, 2011) leads both to formulas for the estimation of fully anelastic parameters (e.g., QP and QS) and, simultaneously, to formulas for the estimation of more standard elastic parameters, the latter being "protected" from that particular source of error. The focus of this paper is on anelastic AVO/AVF in a theoretical and experimental context, ongoing implementation efforts, and open issues whose resolution will be necessary for anelastic AVF/AVO/AVA inverse theory to be considered in any sense complete.
-
-
-
Another Look at AVO Killers
Authors: B. Milkereit, E. Bongajum and J. Huang
For plane waves, systematic amplitude variation with offset/angle (AVO/AVA) depends on changes in P-wave velocity, S-wave velocity, density and Poisson's ratio at a plane interface (Young and Braile, 1976). AVO trends and variations are used in hydrocarbon exploration as fluid/gas indicators (Castagna et al., 1998; Shuey, 1985). The simple two-layer plane-wave approximation may lead to potential pitfalls in the interpretation and inversion of AVO trends (Allen and Peddy, 1993). In land seismics, a number of geological situations introduce a high level of uncertainty into AVO analysis. Here we investigate (1) the heterogeneity scale at the reservoir level and (2) significant attenuation (scattering or intrinsic) above the reservoir. Because AVO measurements are obtained from prestack seismic data, noise levels are often high. In addition, source and receiver characteristics, near-source effects and geometrical spreading must be accounted for in true-amplitude processing (Haase and Stewart, 2010). The effects of heterogeneities on seismic wave propagation can be described in terms of different propagation regimes: quasi-homogeneous for heterogeneities too small to be "seen" by seismic waves, Rayleigh scattering, Mie scattering and small-angle scattering. These scattering regimes cause characteristic amplitude, phase and traveltime fluctuations that are important for the analysis of AVO trends. Wave propagation through heterogeneous media depends on the distribution of physical rock properties (matrix, pore space, fluid/gas composition).
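For reference, the two-layer plane-wave AVO trends cited above (Shuey, 1985) are commonly parameterized with the three-term Shuey approximation to the Zoeppritz P-wave reflection coefficient. A minimal sketch follows; the function name and test values are our own, not from the paper.

```python
import numpy as np

def shuey_rc(vp1, vs1, rho1, vp2, vs2, rho2, theta_deg):
    """Three-term Shuey (1985) approximation to the Zoeppritz P-wave
    reflection coefficient for a two-layer plane interface."""
    theta = np.radians(theta_deg)
    # interface averages and contrasts
    vp = 0.5 * (vp1 + vp2)
    vs = 0.5 * (vs1 + vs2)
    rho = 0.5 * (rho1 + rho2)
    dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
    r0 = 0.5 * (dvp / vp + drho / rho)     # normal-incidence intercept
    grad = 0.5 * dvp / vp - 2.0 * (vs / vp) ** 2 * (drho / rho + 2.0 * dvs / vs)
    curv = 0.5 * dvp / vp                  # far-angle curvature term
    return r0 + grad * np.sin(theta) ** 2 + curv * (np.tan(theta) ** 2 - np.sin(theta) ** 2)
```

At normal incidence the expression collapses to the acoustic-impedance contrast term, which is a convenient sanity check on any implementation.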
-
-
-
3D Symmetric Sampling on Land of Sparse Acquisition Geometries
For successful AVO analysis, the quality and reliability of the prestack data have to be ensured by a combination of high-quality acquisition and powerful processing. It is therefore appropriate to consider some ingredients of data acquisition that are necessary to achieve reliable reflection amplitudes. Based on two earlier papers (Vermeer, 2010a; 2010b), this paper discusses various ways to improve prestack data quality. I start with a review of the symmetric sampling technique, which is in my view the best way of selecting acquisition parameters, followed by a discussion of various aspects of parameter choice that influence final prestack land data quality.
-
-
-
Vibroseis Wavelet Estimation
Authors: Zhouhong Wei and F. Phillips (INOVA Geophysical Equipment Limited)
Over the years, the Vibroseis method has become the principal data acquisition method in land seismic exploration, and for half a century this technology has achieved great success. However, the method seems to be reaching its limits as the search for energy resources continues, and many practical issues arising from field operations have remained theoretically unexplained, for example inaccurate wavelet estimation. This paper focuses on a new vibrator-ground model that simulates the filtering effects produced by the coupling system of the baseplate and the ground, as well as by the coupling system between the captured ground mass near the vibrator baseplate and the surrounding earth. This new model is referred to here as the Vibrator-Coupled Ground Model. Experimental results show that the weighted-sum ground force, when filtered by the Vibrator-Coupled Ground Model, is proportional to the far-field particle velocity, whereas the unfiltered signals are not.
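The weighted-sum ground force referred to above is conventionally estimated from the reaction-mass and baseplate accelerations, each weighted by the corresponding mass. A minimal sketch of that conventional estimate (illustrative names and values of our own; this is not the paper's Vibrator-Coupled Ground Model):

```python
import numpy as np

def weighted_sum_ground_force(acc_reaction, acc_baseplate, m_reaction, m_baseplate):
    """Conventional weighted-sum ground force estimate:
    F(t) = M_r * a_r(t) + M_b * a_b(t), i.e. the reaction-mass and
    baseplate accelerations weighted by their respective masses."""
    return (m_reaction * np.asarray(acc_reaction, dtype=float)
            + m_baseplate * np.asarray(acc_baseplate, dtype=float))
```

The paper's point is that this raw estimate still has to be filtered through the vibrator-ground coupling response before it becomes proportional to the far-field particle velocity.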
-
-
-
Geophysics Models
The title of this workshop, "Effective models leading to business solutions", gives a good description of the purpose of modelling. Modelling should address the important uncertainties we want to reduce. It should integrate all the separate pieces of knowledge we have into a response that cannot be predicted from the individual pieces on their own.
-
-
-
Geological Models
A smart and accurate representation of the geology is a key stage in obtaining, from the modelling process, reliable results on which costly investments can be decided. First of all, the predictive value of the model is strongly dependent on the quality of its general architecture. Stacking-pattern analysis of core and well-log data helps to identify the significant sedimentary surfaces - sequence boundaries and maximum flooding surfaces - that bound the stratigraphic sequences. Sequence stratigraphy controls not only well-to-well correlations, but also the stratigraphic architecture and the development of sedimentary cycles that drive the facies distribution in the reservoir. Stratigraphic units at reservoir scale naturally provide the main framework for model layering, together with the identification of flow units, vertical permeability barriers, and well completions.
-
-
-
Effective Petrophysics models leading to Business Decisions
By John Owens
The business decision for this exercise is: "Where to drill an infill well?" It is proposed that the key petrophysical consideration involves the evolution of subsurface realisations combined with lessons learned during the production life of the field, as shared within the subsurface team. Attention is paid to three key learning points:
• What did the team "know" prior to first production, and how have subsequent drilling and production changed the understanding of the reservoir?
• How does the current understanding influence the decision on the infill well location?
• How should the team react to the unexpected during drilling of the infill well?
-
-
-
Effective Reservoir Management models leading to Business Decisions
By Mike King
Reservoir management can be thought of as a sequence of activities in which we:
1) Build a model, or class of models, that embodies our understanding of the reservoir description and reservoir processes, to obtain predictions of how the field will perform
2) Use these models to make a business decision, e.g., the placement of an infill well
3) Measure the reservoir performance after the action is taken, and either revise or validate the models used for subsequent predictions
Each step of this process can be enhanced through the use of multiple models. When we work with more than a single model, we are better able to represent our uncertainty in performance prediction and to expose the risk associated with the business decision. We are also better able to design a surveillance program by attempting to distinguish between the different subsurface models. Examples of the success (or failure) of these strategies will be drawn from North Sea experience with mature assets.
-
-
-
Interactive Exercise on a Field Data Set
Authors: Patrick Corbett, Glynn Williams, Olivier Gosselin, Thierry Coleau and Mike Christie
This interactive seminar will provide a field data set for working in small teams. The work will build on the morning workshop presentations from experts in their respective fields - geological modelling, petrophysics, geophysics and petroleum engineering. These experts, together with the convenors, will lead the participants through the exercise - doubtless with each team arriving at different outcomes. The teams will compare their results with previously worked-up models; in this way, insights into the various interpretations and interpretation methods can be gained. The objective of the work is to focus on a typical oil field business decision, e.g., identifying the optimum infill well drilling location(s). A presentation on some of the real challenges within the field will be given at the conclusion of the workshop. Through this session and the contributions of the morning sessions, it is intended that the cross-disciplinary understanding of all participants will benefit. The EAGE/SPE AGORA principle is that all technical voices are heard democratically - much in the spirit of the original Athens agora - before the appropriate decisions are reached.
-
-
-
Towards Joint Inversion of Electromagnetic, Seismic, and Production Data for Reservoir Characterization and Monitoring
Authors: Aria Abubakar, Tarek M. Habashy, Lin Liang, Guozhong Gao, Jianguo Liu, Maokun Li and Guangdong Pan
There is a variety of measurements that may illuminate the reservoir with varying coverage and resolution: electromagnetic (EM) data (controlled-source EM (CSEM), magnetotelluric (MT), surface-to-borehole (STB), and cross-well EM), seismic data (surface seismic, cross-well seismic and vertical seismic profiling (VSP)), gravity data (surface and borehole), and production history/well-test data. Each measurement on its own provides incomplete information because of the non-uniqueness and limited spatial resolution associated with its interpretation. However, when these measurements are integrated with each other and combined with others such as near-wellbore data, they may provide considerable value: enabling inference of pertinent reservoir properties, enhancing the predictive capacity of a reservoir model, and helping us make appropriate field management decisions with reduced uncertainty. In this presentation, we give an overview of joint inversion approaches for integrating EM, seismic, and production data. For reservoir characterization applications, we present both joint structural and joint petrophysical algorithms for integrating EM and seismic data (CSEM with surface seismic, and cross-well EM with cross-well seismic). For reservoir monitoring applications, we present EM data inversion algorithms (both cross-well and STB) constrained by the fluid-flow simulator (ECLIPSE). In the inversion of both EM and seismic data we employ a fully nonlinear approach (so-called full-waveform inversion) so that we can utilize all the information in the data. We also discuss the challenges, advantages, and disadvantages of these approaches using several test cases.
-
-
-
Integrating Crosswell Electromagnetic Imaging and Reservoir Data for Dynamic Modeling of Water Injection in Carbonate Reservoir
Crosswell electromagnetic (EM) tomography is a recently developed technology for estimating the formation resistivity distribution in the interwell volume. The data are acquired by fixing receivers in one well and measuring the magnetic field as a solenoid source, broadcasting a continuous sinusoidal signal, moves up a second well some distance away. The resistivity distribution is then estimated through non-linear inversion of the data with respect to an initial resistivity model. In this project, time-lapse crosswell EM images are used to monitor apparent saturation changes in a water injection pilot in basal, low-reservoir-quality units of a giant carbonate field in the Middle East. The evolution of water saturation is deduced from the inverted resistivity distributions. To obtain a detailed image of apparent saturation changes, the initial model incorporates realistic representations of the small-scale heterogeneities common to carbonate reservoirs, including detailed thickness variations and thin dense layers interbedded in some reservoir units. The highly constrained images are then compared to dynamic reservoir simulation results derived from models of various levels of complexity. Simulations show that if flow barriers are not included between the various reservoir units, the injected water moves upward across the units, which is inconsistent with the EM images. Successive adjustments were therefore applied to the dynamic model to honor the EM images as well as the injection pressure results. Properly constrained with seismic, geologic and flow data, the EM results provide useful information about the location and behavior of the fluid front in the formation and allow identification of the level of geological detail that needs to be preserved in a dynamic model.
-
-
-
Feasibility Analysis of Surface-to-reservoir Electromagnetics for Waterflood Monitoring
Authors: Daniele Colombo, Mike Jervis and Thierry Tonellot
We analyze, by means of a synthetic model, the feasibility of detecting the electromagnetic (EM) field variations related to waterflooding in a large Saudi Arabian Jurassic reservoir. We utilize a 3D structural reservoir model and derive the geoelectric parameters from careful analysis of well logs acquired at the Saudi Aramco Technology Test Site. The resistivity variations resulting from water flooding are derived using characteristic parameters and the injection water salinity of the field. We model a geometry consisting of a radial surface galvanic source and four EM receivers located at the reservoir level. The full EM field is modeled in the time domain, and the horizontal and vertical electric field components (Ex and Ez) and the horizontal crossline magnetic field component (dBy/dt) are interpreted and analyzed. Results indicate that all the modeled fields show substantial variations as a result of water saturation changes, with field strengths above the noise level expected for EM sensors. The results will next be validated by modeling more complex and realistic 3D patterns of water saturation in the reservoir, derived directly from reservoir simulation. Given that one of the modeled components is the vertical electric field, the electrical anisotropy of the overburden is expected to play a significant role in the response and will be taken into consideration in the next round of modeling. The modeling results also suggest that the type of borehole EM sensors currently available in the industry may not be adequate for surface-to-reservoir EM applications.
-
-
-
Stochastic Inversion of CSEM and Seismic Data for Reservoir Properties with the Neighbourhood Algorithm
Authors: M. Fliedner and S. Treitel
Stochastic ("Monte Carlo") methods like the genetic algorithm (GA) and simulated annealing (SA) have become increasingly popular for the inversion of geophysical data; only forward modelling is needed to evaluate the objective function. In addition to a "best" model, some stochastic methods yield statistical information about the range of acceptable models for a given error tolerance by estimating Bayesian integrals of the posterior probability density distribution (PPD). Having a statistically significant sampling of the model space and the associated error surfaces (PPDs), rather than a single "best" model, allows us to assess the reliability and resolution power of different inversions given the available data and prior knowledge of geologically reasonable constraints on the expected solution. This is particularly important when we try to combine different types of data, e.g., seismic and electromagnetic measurements. Such joint inversions are particularly useful for reservoir characterization, because a single type of measurement is not sensitive to all the parameters of interest. Seismic data provide a high-resolution view of the subsurface and are sensitive to rock porosity, but they are usually insensitive to the different fluids in a reservoir. Electromagnetic data, on the other hand, suffer from low resolution but are very sensitive to changes in saturation, owing to the high resistivity of hydrocarbons compared to brine (Hoversten et al., 2006). Through appropriate rock physics transforms, CSEM data can help to constrain the low-frequency trend of the seismic properties (Mukerji et al., 2009).
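The neighbourhood algorithm itself is not reproduced here, but the core idea shared by the stochastic methods mentioned above - sampling a posterior distribution using only forward evaluations of an objective function - can be illustrated with a minimal random-walk Metropolis sampler (a toy example of our own, not the authors' code):

```python
import numpy as np

def metropolis(logpost, x0, step, n_samples, rng):
    """Minimal random-walk Metropolis sampler: propose Gaussian steps
    and accept with probability min(1, posterior ratio). Only forward
    evaluation of the (log) posterior is required."""
    x = x0
    lp = logpost(x)
    out = np.empty(n_samples)
    for i in range(n_samples):
        xp = x + rng.normal(0.0, step)
        lpp = logpost(xp)
        if np.log(rng.random()) < lpp - lp:   # Metropolis acceptance rule
            x, lp = xp, lpp
        out[i] = x
    return out
```

The resulting ensemble of samples, rather than a single "best" model, is what permits the Bayesian uncertainty statements (PPD estimates) discussed in the abstract.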
-
-
-
Capacitive Electric Field Measurements for Geophysics
Authors: A.D. Hibbs, R.P. Dickey, K. Derby, T. Petrov, D. Lathrop, N. Rusakov, M.A. Krupka and J. Markel
Over the last ten years Quasar Federal Systems (QFS) has developed capacitive electric field sensors for use on land, in the air, and underwater. Airborne, these sensors have enabled the first measurement of the E-field in air at the μV/m level and the first three-axis measurement of the electric field. Underwater, the sensors have produced the lowest reported voltage noise while exhibiting exceptional stability and robustness over time. On land, highly accurate measurements of the electric field can be made in dry sand without the addition of water or electrolytes to increase the local ground conductivity. QFS has created a division, Quasar Geophysical Technologies, to commercialize its technology for geophysical science applications. The fundamental features, present performance and projected limits of capacitive E-field sensing will be reviewed. The benefits for specific problems in practical E-field measurements will be discussed and examples from recent field tests presented.
-
-
-
Stochastic Inversion of Seismic and Electromagnetic Data for CO2 Saturation Prediction
Stochastic inversion of seismic (AVA) and electromagnetic (CSEM) data is used to predict reservoir porosity and CO2 saturation. The inversion uses Markov Chain Monte Carlo (MCMC) sampling techniques coupled with statistical rock-physics models. The parameters estimated are Vp/Vs, acoustic impedance, density, porosity, water saturation and a lithology indicator. Smoothing is achieved by use of a spatial correlation length in a Markov Random Field representation of the lithology indicator. The algorithm is demonstrated using a detailed 2D synthetic model, constructed for benchmarking AVO inversion algorithms, that has been adapted to replace hydrocarbon with CO2 in the reservoir sands. Synthetic seismic and CSEM data are used to test the resolution of porosity and CO2 saturation predictions under a range of experimental variables. Three types of rock-physics models are considered: 1) linear regressions between variables, 2) Gaussian distribution fits to clusters of variables in two dimensions, and 3) N-dimensional multivariate covariance distributions, where N is the total number of inversion parameters. The choice of rock-physics model, the proximity of wells used for rock physics, and data noise levels all affect the quality of the porosity and CO2 saturation prediction. Predictions of porosity and CO2 saturation are better when the porosity and saturation are included in the inversion (one-step inversion) than when inverting only for geophysical parameters followed by a stochastic estimation of porosity and CO2 saturation given the geophysical parameters (two-step inversion).
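The second rock-physics model type listed above (a Gaussian fit to a cluster of two variables) can be sketched in a few lines. The synthetic "well data" below, with a toy porosity-impedance trend, are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
phi = rng.uniform(0.10, 0.30, 500)                  # porosity samples
ip = 9.0 - 15.0 * phi + rng.normal(0, 0.3, 500)     # acoustic impedance, toy linear trend + scatter

X = np.column_stack([phi, ip])
mu = X.mean(axis=0)                                 # mean vector of the cluster
cov = np.cov(X, rowvar=False)                       # 2x2 covariance of the fitted Gaussian

def ip_given_phi(p):
    # conditional expectation of impedance given porosity, from the fitted Gaussian;
    # this is the kind of link used to tie inversion parameters to reservoir properties
    return mu[1] + cov[0, 1] / cov[0, 0] * (p - mu[0])

print(mu, ip_given_phi(0.2))
```

A linear regression (model type 1) is the special case where only this conditional mean is kept; the full Gaussian additionally carries the scatter, which feeds the uncertainty estimates of the stochastic inversion.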
-
-
-
3D Reservoir Model and Resource Estimation for a North Sea Oil Field from Quantitative Seismic and CSEM Interpretation
Understanding reservoir properties and fluid distribution is the aim of petroleum geophysics, in particular in the field appraisal and development phases. The ultimate goal is not only to identify and delineate hydrocarbon-charged reservoirs, but to quantitatively determine the volume and distribution of the oil and gas contained. Since no single measurement has the required response properties to achieve this, it is now recognized that integration of different types of data with complementary sensitivity will be essential. In this study, we describe a quantitative, joint interpretation of 3D seismic and 3D controlled source electromagnetic (CSEM) data from the Troll western oil province (TWOP), an oil and gas field in the Norwegian North Sea.
-
-
-
Strategies for Reservoir Characterization and Production Monitoring using Controlled Source Electromagnetic Data
Authors: Evert Slob, Marwan Wirianto, Jürg Hunziker and Wim Mulder
Exploiting the fact that in a marine environment the source is continuously active while towed behind a boat has improved CSEM capability as a direct indicator of resistive reservoirs. We have taken this synthetic aperture source concept one step further to show on numerically modeled data that uncertainties in source location and orientation are eliminated using a processing procedure called interferometry by multi-dimensional deconvolution. This procedure also eliminates the effects of the sea surface. It could work well for acquisition according to present industry practice, under realistic uncertainties in receiver location and orientation, and realistic levels of noise. This is a data-driven procedure that requires properly recorded data. In case some data are not properly recorded due to receiver clipping, a hybrid model-driven/data-driven approach must be used. We show some simple 2D examples to illustrate the concept of this procedure, including its drawbacks and advantages for characterization and monitoring purposes.
-
-
-
3D Inversion of Time-lapse CSEM Data from Dynamic Reservoir Simulations of the Harding field, North Sea
Authors: Michael S. Zhdanov, Noel Black, Alexander V. Gribenko, Glenn A. Wilson and Ed Morris
Recent studies have indicated the feasibility of time-lapse controlled-source electromagnetic (CSEM) methods for the monitoring of offshore oil and gas fields. The time-lapse CSEM inverse problem is highly constrained, though inherently 3D, since the geometry of the reservoir is established prior to production from high-resolution seismic surveys; rock and fluid properties are measured from well logs; and multiple history-matched production scenarios are contained in dynamic reservoir models. Using Archie’s Law, rock and fluid properties from dynamic reservoir simulations of the Harding field in the North Sea were converted to resistivity, from pre-production in 1996 to decommissioning in 2016. CSEM data were simulated for each state. We demonstrate how 3D inversion can be used for monitoring the oil-water contact from pre-production to the end of oil production in 2011, and for monitoring the gas-water contact from 2011 to 2016 during gas production. In particular, we show that focusing regularization is able to recover sharp resistivity contrasts across the oil-water and gas-water boundaries, whereas smooth regularization fails to recover an adequate resistivity contrast.
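The Archie's Law step mentioned above, converting simulated porosity and water saturation into bulk resistivity, is compact enough to show directly. The cementation and saturation exponents and the brine resistivity below are typical textbook values, not the ones used for the Harding field:

```python
def archie_resistivity(phi, s_w, r_w=0.05, a=1.0, m=2.0, n=2.0):
    """Bulk resistivity (ohm-m) via Archie's Law: R_t = a * R_w / (phi**m * s_w**n).

    phi : porosity (fraction), s_w : water saturation (fraction),
    r_w : brine resistivity (ohm-m); a, m, n are empirical constants.
    """
    return a * r_w / (phi**m * s_w**n)

# Same rock, two production states: oil leg vs. water-swept zone.
print(archie_resistivity(0.25, 0.2))   # low s_w -> resistive (oil-bearing)
print(archie_resistivity(0.25, 0.9))   # high s_w -> conductive (swept)
```

The order-of-magnitude resistivity drop between the two states is what makes the time-lapse CSEM signal detectable across a moving oil-water contact.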
-
-
-
SAR Interferometry Applications: the Outlook for Sub Millimeter Measurements
By F. Rocca
Optical leveling campaigns, tiltmeters, GPS and InSAR are geodetic techniques used to detect and monitor surface deformation phenomena. In particular, InSAR data from satellite radar sensors are gaining increasing attention for their cost-effectiveness and unique technical features, which make it possible to monitor large areas and even to revisit the past. Moreover, more advanced InSAR techniques (PSInSAR™, SqueeSAR™) developed in the last decade are capable of providing millimeter precision, comparable to optical leveling, and a high spatial density of displacement measurements over long periods of time, without the need to install equipment or otherwise access the study area. Thanks to the high density and quality of the measurements, PSInSAR data can be successfully used in geophysical inversion, to measure the permeability of oil reservoirs and/or to evaluate the possibilities and risks due to seismic faulting in the sequestration of CO2. In these cases the precision, the sub-weekly frequency of the measurements and the time required for the data to be available are the most important aspects, more relevant than the spatial resolution.
-
-
-
Stereo Satellite Elevation Mapping Accuracy and Application
The new generation of 50cm resolution stereo satellite photos is demonstrated to have relative horizontal accuracies in the range of 10cm in 10km. Using sophisticated image matching, signal enhancement and noise attenuation methods “borrowed” from the field of oil and gas exploration seismic processing, elevation maps with better than 50cm vertical accuracy, on 1m centers, can be produced from these 50cm stereo satellite photos. We call our processing of Digital Elevation Models (DEMs) from stereo satellite photos “geophysical processing” to differentiate it from the conventional photogrammetric stereo photo elevation mapping methods. This new geophysical stereo satellite elevation processing method produces stereo satellite DEMs with significantly better horizontal resolutions and vertical accuracies than the conventional photogrammetric processes. The resolution and accuracy of these high resolution geophysical stereo satellite DEMs have been demonstrated with thousands of ground survey points and by direct comparison with LiDAR DEMs. In areas of sparse vegetation the 1m posted DEMs produced from 50cm stereo satellite photos have demonstrated elevation accuracies of better than 50cm RMSE. These stereo satellite DEMs have resolutions and accuracies similar to high quality LiDAR DEMs.
-
-
-
Monitoring Oil and Gas Facilities: Use of Natural Reflectors and Artificial Corners Reflectors
Authors: M. de Faragó, G. Cooksley and J. Garcia Robles
InSAR-based ground motion monitoring of oil and gas facilities, including pipelines, plants and LNG terminals, contributes to production planning and the safety of operations. Factors such as seismicity, landslides, coastal erosion or anthropogenic effects, such as the oil and gas activities themselves, may cause infrastructure to be affected by ground motion, which may in turn pose a threat to the surrounding population and wildlife, or to the efficiency of the infrastructure itself. The PSI technique is an efficient tool for assessing and monitoring the effects of the aforementioned factors on the infrastructure and the surrounding area.
-
-
-
SqueeSAR Surface Displacement Measurements for Reservoir Monitoring and Modelling in the InSalah Project
Authors: A. Ferretti, A. Fumagalli, F. Novali, C. Prati, F. Rocca and A. Rucci
Knowledge of the structure controlling fluid/gas flow at the reservoir layer is critical in many activities, such as petroleum/gas extraction and carbon capture and sequestration (CCS). To this end, time-lapse geophysical observations are considered an important instrument for better understanding fluid flow in the subsurface. In the last decade, a new remote-sensing technology called PSInSAR™, based on the use of satellite radar data, has been receiving increasing attention thanks to its capability to provide accurate, large-scale surface deformation measurements with millimetric precision. The utility of such data for reservoir monitoring and modeling has been proven in the InSalah project, one of the three best-known CCS projects. SAR data have been used to track the injected CO2 [1], to monitor possible fault reactivation and to estimate the effective permeability of a producing gas reservoir.
-
-
-
Multitemporal Lidar Monitoring of Landslides
By Franco Coren
This paper aims to demonstrate the possibility of successfully applying high-resolution multitemporal LiDAR to landslide monitoring in the special case of an active, large earthflow characterised by rapid to moderate rates of movement (the Valoria landslide, Northern Apennines, Italy). The Valoria landslide is a large, active earthflow which mostly involves low-plasticity scaly clays (Manzi et al., 2004; Corsini et al., 2006). It was completely reactivated in 2001, and since then it has been intermittently active, with displacements that in one season could be on the order of hundreds of meters. This recent evolution has caused a significant modification of the slope morphology, with quite distinct depletion and accumulation zones. Landslide occurrence is related to a variety of factors such as the underlying geology, mechanical properties of soil and rocks, degree of weathering, groundwater conditions, and the presence (or absence) of geological structures such as joints, faults, and shear zones (Fell et al., 2000). Because of this complexity, landslide monitoring is commonly adopted both in the early detection of risk factors and as an effective tool for landslide hazard management and analysis (Sassa & Canuti, 2008).
-
-
-
The Role of Gravity Gradiometry in the Remote Sensing Tool Kit
By Duncan Bate
The natural variations in Earth’s gravitational field can provide the explorationist with information on density variations in the subsurface. With careful consideration, and often integration with other data, this can in turn be used to improve our understanding of the geological setting and where valuable resources may be found. The use of gravity measurements has long been present in exploration geophysics. Originally this involved the field geophysicist walking the ground with the instrumentation to record the data. However, advances were made allowing the measurements to be taken from a moving platform far above the surface (sea, air, or satellite). As the gravity technique is passive, recording only the natural properties of the Earth, it is well suited to a remote sensing deployment. This reduces cost and allows large areas to be covered quickly, safely and with no direct contact with the ground.
-
-
-
Cross-validation and Integration with Ground Based Geophysical Data
Authors: A. Laake, C. Strobbia, A. Cutts, L. Velasco and M. Sheneshen
Reservoir mapping in the Gulf of Suez petroleum system is challenging because rifting fragmented the reservoirs by rift-parallel and transfer faults, leaving the reservoirs confined to stratigraphic, structural, and combined traps. We have developed a technique to address this challenge that integrates fault outcrop mapping using satellite image interpretation, seismic near-surface characterization techniques such as Rayleigh wave velocity mapping and ray parameter interferometry, as well as ant tracking of faults and geobody delineation on a prestack time-migrated (PSTM) cube. The technique utilizes a combination of geographic information system (GIS) and geological modelling software for surface/subsurface integration. The joint analysis of Rayleigh wave data with satellite imagery provides a near-surface structural geological model. The suite of near-surface geological products is enabled by the acquisition, processing, and interpretation of point-receiver seismic data. Detailed shallow structural geology could be imaged in the near surface, a data regime that is conventionally masked by acquisition noise. The shallow geological model comprises shallow lithological horizons as well as fault zones, the mapping of which may assist in the mitigation of shallow drilling risks. The integration of surface and subsurface structural mapping provides the tectonic framework for delineation of the reservoirs in the rift-faulted environment of the Gulf of Suez.
-
-
-
An Overview of Numerical Methods Suitable for Geophysical Imaging
Authors: J. Virieux, R. Brossier, H. Calandra, V. Etienne, S. Operto, A. Ribodetti and R.-E. Plessix
Modelling methods are nowadays at the heart of any geophysical interpretation approach. They are heavily relied upon by imaging techniques in elastodynamics and electromagnetism, where they are crucial for the extraction of subsurface characteristics from ever larger and denser datasets. While high-frequency or one-way approximations are very powerful and efficient, they reach their limits when complex geological settings and solutions of the full equations are required for high-resolution imaging. Three important formulations will be reviewed during this presentation: the spectral method, which is very efficient and accurate but generally restricted to simple, often layered, earth structures; the pseudo-spectral, finite-difference and finite-volume methods, based on the strong formulation of the partial differential equations, which are easy to implement and currently represent a good compromise between accuracy, efficiency and flexibility; and the continuous or discontinuous Galerkin finite-element methods, based on the weak formulation, which lead to more accurate earth representations and therefore to more accurate solutions, although with higher computational costs and more complex use. The choice between these different approaches is still difficult and depends on the application. Guidelines are given here through a discussion of the requirements for imaging/inversion.
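The simplest instance of the strong-formulation family reviewed above is an explicit finite-difference scheme, second order in time and space, for the 1D acoustic wave equation. The grid, velocity model and source wavelet below are illustrative, not from the presentation:

```python
import numpy as np

nx, nt = 300, 600
dx, dt = 10.0, 0.001              # grid spacing (m) and time step (s)
c = np.full(nx, 2000.0)           # velocity model (m/s) ...
c[nx // 2:] = 3000.0              # ... with a simple two-layer contrast

assert c.max() * dt / dx <= 1.0   # CFL stability condition for the explicit scheme

p_prev = np.zeros(nx)             # pressure at time step n-1
p = np.zeros(nx)                  # pressure at time step n
src = 50                          # source grid index

for it in range(nt):
    lap = np.zeros(nx)
    lap[1:-1] = (p[2:] - 2 * p[1:-1] + p[:-2]) / dx**2   # second spatial derivative
    p_next = 2 * p - p_prev + (c * dt)**2 * lap          # leapfrog time update
    # Ricker-like source term injected at one grid point
    t, f0 = it * dt, 15.0
    arg = (np.pi * f0 * (t - 0.1))**2
    p_next[src] += (1 - 2 * arg) * np.exp(-arg)
    p_prev, p = p, p_next

print(np.abs(p).max())            # finite values: the scheme is stable at this CFL number
```

The same update generalizes directly to 2D/3D and to higher-order stencils, which is why finite differences remain the workhorse compromise the abstract describes.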
-
-
-
Seismic Wave Extrapolation in Isotropic and Anisotropic Media using Lowrank Symbol Approximation
We consider the problem of constructing a wave extrapolation operator in a variable and possibly anisotropic medium. Our construction involves Fourier transforms in space combined with a lowrank approximation of the space-wavenumber wave-propagator matrix. A lowrank approximation implies selecting a small set of representative spatial locations and a small set of representative wavenumbers. We present a mathematical derivation of this method, a description of the lowrank approximation algorithm, and numerical examples which confirm the validity of the proposed approach. Wave extrapolation using lowrank approximation can be applied to seismic imaging by reverse-time migration in 3D heterogeneous isotropic or anisotropic media.
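The key numerical fact behind the method, that for a smooth velocity model the space-wavenumber propagator matrix has small numerical rank, can be checked in a small experiment. The sketch below uses a truncated SVD as a stand-in for the representative row/column selection described in the abstract; the 1D grid, velocity gradient and time step are all illustrative:

```python
import numpy as np

nx, nk = 200, 200
dt = 0.001
x = np.linspace(0, 2000, nx)
c = 1500.0 + 0.5 * x                      # smoothly varying velocity (m/s)
k = np.linspace(0, 0.3, nk)               # wavenumbers (1/m)

# Phase-shift propagator sampled on the full space-wavenumber grid:
W = np.exp(1j * np.outer(c, k) * dt)

U, s, Vh = np.linalg.svd(W, full_matrices=False)
r = int(np.sum(s > 1e-6 * s[0]))          # numerical rank at a relative tolerance
W_r = (U[:, :r] * s[:r]) @ Vh[:r]         # rank-r approximation of W

err = np.linalg.norm(W - W_r) / np.linalg.norm(W)
print(r, err)                             # small rank, small relative error
```

Because r is far smaller than nx or nk, applying W via its factors costs roughly r FFT-sized operations per time step instead of a dense matrix multiply, which is the efficiency gain the method exploits.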
-
-
-
3-D Parallel Frequency-domain Visco-acoustic Wave Modelling based on a Hybrid Direct/Iterative Solver
We present a parallel domain decomposition method based on a hybrid direct-iterative solver for 3D frequency-domain modelling of visco-acoustic waves. Frequency-domain seismic modelling reduces to the solution of a large and sparse system of linear equations for each frequency. The hybrid direct-iterative approach aims to overcome the memory requirements and limited scalability of direct-solver approaches on the one hand, and to iteratively solve a better-conditioned system than in global iterative approaches on the other. The domain decomposition is based upon the algebraic Schur complement method. The reduced Schur complement system is solved with the generalized minimal residual method (GMRES) and is preconditioned by an algebraic additive Schwarz preconditioner. The MUMPS direct solver is used to factorize the local impedance matrices defined on each subdomain. Simulations in salt models for frequencies up to 12.5 Hz show that the number of iterations increases linearly with frequency when the grid interval is matched to the frequency and the size of the subdomains is kept constant over frequencies. This makes the time complexity of the hybrid approach similar to that of global iterative solvers. Possible improvements of the method for multi-source simulation involve the use of a block iterative solver and two levels of parallelism.
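The algebraic Schur complement step can be sketched on a toy two-subdomain problem: interior unknowns of each subdomain are eliminated with a direct factorization, and only the reduced interface system remains to be solved (here directly, whereas the paper uses preconditioned GMRES). The matrix below is a small 1D Laplacian split at its midpoint, purely for illustration:

```python
import numpy as np

n = 21
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # SPD 1D Laplacian
b = np.ones(n)

ii = n // 2                      # single interface unknown
I1 = np.arange(0, ii)            # subdomain 1 interior
I2 = np.arange(ii + 1, n)        # subdomain 2 interior
G = np.array([ii])               # interface index set

def blk(r, c):
    return A[np.ix_(r, c)]

# Schur complement S = A_GG - sum_i A_GI * inv(A_II) * A_IG (local direct solves)
S = blk(G, G).copy()
rhs = b[G].copy()
for I in (I1, I2):
    S -= blk(G, I) @ np.linalg.solve(blk(I, I), blk(I, G))
    rhs -= blk(G, I) @ np.linalg.solve(blk(I, I), b[I])

xG = np.linalg.solve(S, rhs)     # reduced interface solve (GMRES target in the paper)

# Back-substitute the interior unknowns subdomain by subdomain
x = np.zeros(n)
x[G] = xG
for I in (I1, I2):
    x[I] = np.linalg.solve(blk(I, I), b[I] - blk(I, G) @ xG)

print(np.linalg.norm(A @ x - b))  # residual at machine-precision level
```

In the 3D solver the per-subdomain solves are MUMPS factorizations and the interface system is large, which is why it is attacked iteratively with an additive Schwarz preconditioner rather than formed and solved densely as here.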
-
-
-
2D/3D Seismic Modelling via a Massively Parallel Structured Approximate Direct Helmholtz Solver
Authors: Shen Wang, Jianlin Xia, Maarten V. De Hoop and Xiaoye Li
We consider the discretization and solution of the 2D/3D inhomogeneous acoustic wave equation in the frequency domain, known as the Helmholtz equation, including variable density, transverse isotropy (VTI & TTI), and attenuation scenarios. In particular, we are concerned with solving this equation on a large physical domain, for a large number of different forcing terms, in the context of 3D seismic modeling. The advantage of seismic modeling in the frequency domain lies in the fact that, for a single frequency, all solutions share a common Helmholtz operator. We resort to a parsimonious mixed-grid finite-difference scheme for discretizing the Helmholtz operator equipped with Perfectly Matched Layer (PML) boundaries, yielding a pattern-symmetric but non-Hermitian matrix. We make use of 2D/3D nested-dissection-based domain decomposition, and introduce an approximate direct solver by developing a new parallel Hierarchically Semi-Separable (HSS) matrix compression, factorization and solution approach. We cast our massive parallelization in the framework of the parallel multifrontal method. The assembly tree is partitioned into local trees, which are stored and eliminated locally at each process, and a global tree, whose elimination requires extensive communication among processes. The entire Helmholtz solver is a parallel hybrid between the multifrontal and HSS structures. The computational complexity associated with the factorization is almost linear in the size N of the matrix, provided the maximum numerical rank r of all off-diagonal blocks in the multifrontal procedure remains modest. We benchmark our Helmholtz solver against the state-of-the-art MUMPS solver, and show that our solver is at least one order of magnitude faster than MUMPS for the same problem and on the same computing platform.
We demonstrate the efficiency and accuracy of our solver by presenting various 2D (BP2004 and BP2007 TTI models) and 3D (SEAM model) numerical examples.
-
-
-
A Hybrid Technique for 3-D Modeling of High Frequency Teleseismic Body Waves in the Earth
In the last decade, the deployment of dense regional arrays such as the USArray transportable array has considerably improved our capacity to image the interior of the Earth.
-
-
-
Application of the Discontinuous Galerkin Finite-element Method to Seismic Modelling and Imaging
Authors: V. Etienne, S. Operto and J. Virieux
We present a discontinuous Galerkin finite-element method (DG-FEM) suitable for seismic modelling and seismic imaging in large-scale 3D elastic media. The method makes use of unstructured tetrahedral meshes locally refined according to the medium properties (h-adaptivity) and of interpolation orders that can change from one element to another according to an adequate criterion (p-adaptivity). These two features allow us to significantly reduce the numerical cost of the simulations. While the efficiency of DG-FEM has been largely demonstrated with high interpolation orders, we favour the use of low orders, more appropriate to the applications we are interested in. In particular, we address the issues of seismic modelling and seismic imaging in the case of complex geological structures requiring a fine medium discretisation.
-
-
-
A Stable TTI Reverse Time Migration
We propose a stable TTI acoustic wave equation system by revisiting the anisotropic elastic equations. Based on a VTI system of equations which is equivalent to its elastic counterpart, we introduce self-adjoint differential operators in rotated coordinates to stabilize the TTI acoustic wave equations. Compared with conventional formulations, the new equation system does not add numerical complexity and can be solved by an extremely high order central finite-difference scheme. We show examples in which our method provides stable and high-quality TTI reverse time migration images.
-
-
-
3D Forward Modeling for Full Waveform Inversion
Authors: Denes Vigh, Jerry Kapoor and Kun Jiao
Finite Difference (FD) modeling has been used to simulate seismic acquisition and to find out whether imaging artifacts are interfering with real images. 3D application of the modeling accelerated when Reverse Time Migration (RTM) became an industry-wide accepted migration algorithm. At the same time, survey evaluation and design (SED) made use of FD modeling instead of ray-based methods to prove or re-evaluate particular shooting geometries for illumination and imaging. RTM imaging evolved from isotropic to anisotropic and very quickly reached the tilted anisotropic stage; RTM continues to move forward to orthorhombic media, adding complexity to the migration to produce more accurate images. In the meantime, Full Waveform Inversion (FWI) has become a major player in updating and refining velocity fields; it also uses FD modeling techniques to emulate the acquired data at each stage of the iterations. FWI has to be in line with the latest RTM propagators if one wants to use them consistently in depth imaging. The examples show how FD modeling is used to closely model data for FWI in different acquisition environments.
-
-
-
Experiments on Elastic Wave Modelling in Isotropic and Anisotropic Media
Authors: S. Jin, Yiqing Ren and Shengwen Jin
Recent advances in seismic data processing with multi-component land and ocean-bottom-node (OBN) data have shown some contributions from shear waves and converted waves, as well as from anisotropy. To get a better understanding of elastic wave propagation in isotropic and anisotropic media, we compared the wavefields in acoustic and elastic media with and without anisotropy. The preliminary experiments on three synthetic models, i.e., a fluid/solid interface, a free surface with topography, and a near-surface low-velocity layer, demonstrate the complicated wave propagation in elastic media.
-
-
-
Iterative Krylov Solution Methods for Geophysical Electromagnetic Simulations on Throughput-oriented Graphical Processing Units
Authors: Michael Commer, Gregory A. Newman and Filipe R. N. C. Maia
Modern graphics processing units (GPUs) are designed for efficiently manipulating computer graphics. Their highly parallel architecture makes them also suitable for compute-intensive scientific applications. To provide access to the multithreaded computational resources and associated memory bandwidth of GPUs, graphics hardware manufacturers have introduced new application programming interfaces enabling numerical calculations in a fashion similar to parallel computing paradigms.
-
-
-
Implementation of the Full Waveform Inversion on GPU Cluster
Authors: Henri Calandra, Bertrand Denel, Laurent Choy and Pierre Eric Bernard
For more than 30 years, seismic reflection has been the main technology used in the Oil & Gas industry. The physics is well known and is based on solving approximations of the wave equation. In addition, geophysicists are now able to find numerical implementations that are best suited to the computer hardware. Improving velocity models is still very challenging; it is the key to an accurate seismic depth image. Mainly based on asymptotic approximations, seismic velocity analysis tools have limitations when dealing with complex velocity models and complex wave propagation paths. Recent progress in high performance computing gives the ability to revive the Full Waveform Inversion formulation for refining velocity models. This technology, known since the mid-1980s, is highly challenging due to the non-linearity and non-uniqueness of the solution. It is based on the minimization of an objective function measuring the difference between computed and observed data. The minimization process is very CPU intensive: it is an iterative method which requires, at each iteration, the computation of the derivative of the objective function with respect to the model parameters by cross-correlating the back-propagated residual wavefield with the corresponding forward-propagated source wavefield.
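The misfit-minimization loop described above can be caricatured with a linear forward operator F: the objective J(m) = 0.5 * ||F m - d||^2 has gradient F^T (F m - d), and applying the adjoint F^T to the residual is the linear-algebra analogue of back-propagating the residual wavefield and cross-correlating it with the forward source wavefield. Everything below (operator, model, step rule) is a synthetic toy, not the industrial FWI workflow:

```python
import numpy as np

rng = np.random.default_rng(2)
F = rng.normal(size=(50, 10))               # stand-in for the (linearized) forward modeling operator
m_true = rng.normal(size=10)                # "true" model parameters
d = F @ m_true                              # observed data

m = np.zeros(10)                            # starting model
step = 1.0 / np.linalg.norm(F.T @ F, 2)     # step bounded by the largest eigenvalue -> stable descent
for _ in range(2000):
    r = F @ m - d                           # data residual (the "residual wavefield")
    g = F.T @ r                             # gradient = adjoint applied to the residual
    m -= step * g                           # gradient-descent model update

print(np.linalg.norm(m - m_true))           # model error shrinks toward zero
```

In real FWI the same structure holds, except that F is the nonlinear wave-equation modeling operator, each F and F^T application is a full (and costly) wavefield simulation, and the non-linearity makes the iteration sensitive to the starting model, which is the challenge the abstract emphasizes.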
-
-
-
Shale Resource Development: From Exploration to Rejuvenation
By Usman Ahmed
No two shale reservoirs are the same; however, deployment of a sound life-cycle approach to shale gas development can lead to incremental return on investment. In the exploration phase, some of the critical issues to address include the use of seismic attributes to identify potential high TOC (total organic content) and levels of vitrinite reflectance. Acoustic impedance has been correlated to levels of TOC. During the appraisal phase, the operator typically focuses on reservoir assessment, trial investigation and sometimes consolidating land positions. Initial reservoir assessment and trial investigation are very important to define the resource base and exploitation strategy, and as such logging, coring and testing become important. Two items that define the shale gas development phase are horizontal wells and hydraulic fracturing to maximize reservoir contact. Geomechanical modeling plays a key role in defining the azimuth, orientation and length of laterals, plus the size and stages of the hydraulic fracture treatment. Optimization of the fracture treatments is then aided by real-time microseismic monitoring. Addressing all these technical issues will ultimately have to make economic sense. During the production phase, where full-scale production is ongoing, some of the key challenges include corrosion/scaling of wellbore and equipment, lift optimization (in case there is some fluid production), and perforation plugging, to name a few. Hence, some of the key solutions to deploy here include microbial control, scale and corrosion inhibition, friction reduction, H2S scavengers, and lift system monitoring and optimization. The production phase comes to a point where the field needs to be rejuvenated (the rejuvenation phase) to enhance the recovery. Some of the key challenges include refinement of reservoir assessment and deployment of techniques to exploit additional reserves.
Potential deployment solutions include intervention, remediation, re-fracturing and refined well placement (this may very well include reducing well spacing). Specific challenges and associated solutions will be different for different shale plays at each of the five life-cycle phases. Hence, clearly understanding and recognizing these specific challenges and associated solutions is key to incremental return on investment. Usman Ahmed is Vice President, Reservoir Themes & Solutions and Chief Reservoir Engineer. In this capacity Usman leads BHI’s reservoir-driven, cross-BHI solutions in unconventional reservoirs, mature oil and gas fields, performance completions, intelligent reservoirs and carbonates. Usman has more than 30 years of practical petroleum engineering experience from prior roles with Schlumberger, TerraTek and his own reservoir and production engineering consulting firm. He holds a B.Sc. and M.Sc. (both in Petroleum Engineering) from Texas A&M University. Usman has contributed to the industry through more than fifty publications, textbooks and patents and has held many technical and professional leadership roles.
-
-
-
Geology and Shale Gas Potential of European Sedimentary Basins: An Overview
Authors: Hans-Martin Schulz and Brian Horsfield
Many parts of Europe contain prime targets for shale gas exploration. But compared to North America, Europe has a much more complex and compartmentalized setting of geological units. Shale-gas resources of 510 Tcf were estimated for Western Europe in 1997 by H.-H. Rogner, but the latest estimation by IHS CERA offers a much more optimistic number of 1000 Tcf (“Europe May Match North America in Unconventional Gas Promise”). Palaeozoic and Mesozoic black shales that are attractive for shale gas exploration occur in many European basins where conventional production is declining, an underutilized gathering infrastructure exists and markets are accessible.
-
-
-
Night-time Hunting for Furtive Animals: Data Availability Challenges in International Exploration for Partially Understood Shale Resource Plays
The recent success of shale resource plays in North America has given rise to an interest in these plays around the world. Europe has seen an almost feverish rush towards securing acreage, and large tracts of land have been contracted out for shale exploration in Poland, France, Germany and elsewhere. Unfortunately, the exploration results from these licenses are not equally fast forthcoming and, thus far, have not always been very positive. Recently, critical voices have suggested that development of shale resources in Europe may happen at a significantly lower pace than has been the case in the US, for a combination of geological, environmental, economic and regulatory reasons.
-
-
-
Production Analysis of Gas Shale Wells: Different Solutions for Different Data Scenarios
Production analysis is required to evaluate well productivity, forecast production and calculate economics for budgetary reasons. Depending on the amount of data at hand, production analysis may be performed using classic statistical techniques, such as decline curve analysis, where production trends are found from historical data and extrapolated over an extended period of time, typically with terminal declines. This form of forecasting is common when wells are not well characterized. Extensions of these forms of decline include the linear-flow forms, as it is well established in the literature that cumulative production is a linear function of the square root of time. However, this is only valid over a certain period of time, until boundary-dominated flow prevails. Having more information about the actual reservoir allows the creation of numerical models which capture gas flow behavior in nano-Darcy rock. Time-dependent desorption, for instance, transient flow in the matrix cells, or the dependence of permeability on pressure may be modeled. It is then possible to history match clean-up periods and initial gas flow, to better forecast long-term performance. In cases where the hydraulic fracture network is characterized with microseismic data, better solutions with numerical models are also pertinent. We will demonstrate these various techniques through case examples.
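The linear-flow relationship mentioned above, cumulative production as a linear function of the square root of time, translates directly into a simple forecasting sketch: fit a line in sqrt(t) to the early history and extrapolate. The synthetic production history below is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(1, 365)                                # days on production
Q = 12.0 * np.sqrt(t) + rng.normal(0, 2.0, t.size)   # toy cumulative production with noise

# Least-squares fit of Q = a * sqrt(t) + b over the observed history
a, b = np.polyfit(np.sqrt(t), Q, 1)

# Extrapolated cumulative at two years, assuming linear flow still holds
forecast_2yr = a * np.sqrt(730) + b
print(a, forecast_2yr)
```

As the abstract stresses, this extrapolation is valid only while transient linear flow lasts; once boundary-dominated flow sets in, the sqrt(t) trend over-predicts and a numerical model is needed instead.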
-
-
-
New Method for an Easy Use of Stochastic Process-Based Models Such as Flumy to Reproduce a Fluvial Meandering Reservoir
Authors: Isabelle Cojan, Jacques Rivoirard, Fabien Ors and Didier Renard
FLUMY is a comprehensive modelling tool dedicated to meandering fluvial systems. The simulation is based on the migration of the channel centerline, driven by hydraulic equations. The model reproduces the growth of meander loops, including chute and neck cut-offs. Associated deposits correspond to point bars built up by the migration, and to sand and mud plugs filling the abandoned channel after neck cut-off. Aggradation is controlled by the frequency and intensity of overbank floods, resulting in coarse sandy deposits at the bottom of the channel, deposition of silty levees in the vicinity of the channel, and shale alluvium further away in the floodplain. Levee breaching results in crevasse splays and may generate avulsion, creating a new channel path. To be operational, process-based models such as FLUMY for fluvial meandering reservoirs make use of a limited number of key parameters, say a dozen. However, the choice of parameters is often a difficult issue because of the many interactions between them: sand body extension is largely related to the migration rate and the frequency of avulsion, the latter being influenced by the aggradation rate, which controls the floodplain topography and yields a specific sand ratio. In addition, most of the process parameters cannot be directly inferred from data or analogues.
-
-
-
Recent Advances in Geostatistical Tools in Order to Capture Realistic Geological Features
Authors: Pierre Biver and Sergey Naumov
In practical operational studies, facies stochastic modelling is often performed with traditional geostatistical algorithms such as Sequential Indicator Simulation or Truncated Gaussian Simulation. In this presentation, we will focus on recent improvements of geostatistical techniques used to improve geological realism, and we will stress the key points for their use in an operational context. As a basis of discussion for the workshop, we will briefly address: the use and improvement of Multiple-Point Statistics; the generalization of Truncated Gaussian Simulation with efficient Pluri-Gaussian truncation; and Boolean models with flexible objects.
-
-
-
The Importance of Re-Creating Realistic Reservoir Architecture in 3-D Sub-Surface Models: Lessons from Outcrop Studies
Authors: Paul Davies, Huw Williams, Simon Pattison and Andrea Moscariello
The objective of this study is to define the best approach to recreating realistic reservoir architecture in sub-surface models of wave-dominated shoreface/deltaic deposits. To achieve this goal, a fully deterministic 3-D model based on 10 measured sections and fully continuous photo-panoramas was first created, which exactly matches the sand and shale architecture in a well-exposed series of outcrops in Utah, USA. This model was used as the benchmark against which all test model results could then be judged. The exceptionally well-exposed, near-horizontal Campanian strata of the Book Cliffs in eastern Utah were chosen as the basis for this study as they provide ideal outcrop analogues which have previously been used to develop, test and refine many sedimentary and stratigraphic models, including the principles and concepts of sequence stratigraphy. To approximate the sparsity of real sub-surface data, only three input well data points were used in all the models. Three different stochastic modelling techniques have been tested to try to capture the sediment body continuity and architecture observed in the outcrop. In addition, a deterministic model was built using only the same three log sections, supplemented only by geological knowledge of the palaeogeography, but strictly following a set of correlation rules and guidelines derived from the outcrop.
-
-
-
Three-Dimensional Numerical Modelling of Clinoforms within Deltaic and Shoreface Reservoirs
Key factors influencing fluid flow and reservoir behaviour include facies architecture and heterogeneity distributions conditioned to stratal surfaces. Within shallow-marine reservoirs, clinoforms are one common type of stratal surface. Clinoforms are palaeoseaward-dipping surfaces whose geometry preserves the depositional morphology of the delta-front or shoreface slope, and whose position reflects the shoreline progradation history. Clinoform surfaces control aspects of facies architecture within parasequences, such as facies interfingering, which strongly affects the permeability architecture due to the association of facies types with major permeability contrasts (Howell et al., 2008; Sech et al., 2009; Jackson et al., 2009). Clinoform surfaces can also act as barriers or baffles to flow (Figure 1) where there is calcite cementation, mica concentration, intense bioturbation, or mudstone and siltstone deposition along them, which further modify the permeability architecture (Howell et al., 2008; Jackson et al., 2009). Under certain displacement conditions, it is therefore important to include clinoforms in reservoir models. However, standard reservoir modelling techniques are not well suited to capturing clinoform surfaces, particularly if they are numerous, below seismic resolution and/or difficult to correlate between wells. We present a new numerical algorithm that generates multiple clinoform surfaces within a volume defined by two bounding surfaces, for example a delta-lobe deposit or shoreface parasequence. The geometry and spacing of the clinoform surfaces are specified by the user, and the surfaces can be conditioned to well data where these are available.
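As a rough illustration of the kind of algorithm described (the parameterisation below is invented for illustration and is not the authors' actual algorithm), a set of dipping surfaces can be generated between two bounding surfaces along a 1D dip section, with a user-specified number of surfaces and a shape exponent:

```python
import numpy as np

def clinoform_surfaces(top, base, n_surfaces, exponent=2.0):
    """Generate n dipping surfaces between two bounding surfaces (1D section).

    top, base : elevation arrays along the dip direction (top >= base).
    Each clinoform k is offset progressively in the dip direction, and its
    height between top and base follows a power-law shape to mimic a
    palaeoseaward-dipping, prograding profile.
    """
    nx = len(top)
    x = np.linspace(0.0, 1.0, nx)
    surfaces = []
    for k in range(1, n_surfaces + 1):
        shift = k / (n_surfaces + 1)          # progradation offset
        frac = np.clip((x - shift) / (1.0 - shift + 1e-9), 0.0, 1.0)
        # frac = 0 -> on the top bounding surface, 1 -> down on the base
        z = top - (top - base) * frac ** exponent
        surfaces.append(z)
    return surfaces

top = np.full(50, 10.0)       # flat top of the parasequence (m)
base = np.full(50, 0.0)       # flat base (m)
surfs = clinoform_surfaces(top, base, n_surfaces=5)
```

Each generated surface stays between the bounding surfaces and dips monotonically basinward; conditioning to well data would add constraints on `shift` and `exponent` per surface.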
-
-
-
Rule-Based Modelling of Modern-Day and Past Carbonate Shoals Environments
Authors: Claude-Alain Hasler, Erwin W. Adams and Brigitte M. Vlaswinkel
Carbonate reservoirs tend to be particularly heterogeneous because, besides physical sedimentary processes, biological and chemical inputs play an important role in shaping the initial depositional architecture. In addition to these processes, diagenetic overprints, which can either follow or crosscut the primary depositional architecture, introduce another level of complexity by altering the primary depositional porosity and permeability. This combination of multiple processes interacting in both a temporal and a spatial sense creates carbonate reservoir characteristics with complex architectures and heterogeneities on various scales, i.e. geobodies. As a result, pattern-based modelling is not always applicable, especially because of the difficulties arising in the identification and definition of discrete geobodies. In order to better predict and model geobodies, research developments are steered toward process-oriented methods for interwell modelling. This extended abstract presents a process-oriented approach using cellular automata applied to interwell modelling.
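A toy sketch of interwell modelling with cellular automata is given below; the growth rule and probability are invented for illustration and are not the authors' rules:

```python
import random

def ca_step(grid, p_grow=0.3, rng=None):
    """One update of a 2D binary facies grid (1 = shoal body, 0 = background).

    Illustrative rule: a background cell becomes shoal with probability
    p_grow per occupied 4-neighbour, mimicking lateral accretion.
    """
    rng = rng or random.Random(0)
    ny, nx = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for i in range(ny):
        for j in range(nx):
            if grid[i][j] == 0:
                nbrs = sum(grid[a][b]
                           for a, b in ((i - 1, j), (i + 1, j),
                                        (i, j - 1), (i, j + 1))
                           if 0 <= a < ny and 0 <= b < nx)
                if nbrs and rng.random() < 1 - (1 - p_grow) ** nbrs:
                    new[i][j] = 1
    return new

# Seed a single shoal cell and grow it for a few steps
grid = [[0] * 9 for _ in range(9)]
grid[4][4] = 1
rng = random.Random(42)
for _ in range(4):
    grid = ca_step(grid, rng=rng)
```

In a real application the update rules would encode sedimentological and diagenetic processes and the grid would be conditioned to well observations.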
-
-
-
MPS Facies Modelling – A Promising Technique Tested in Practice
Authors: Christian Höcker and Adriaan Janszen
Since the synthesis by Caers & Zhang (2002), facies modelling with Multiple-Point Statistics (MPS) has been recognised as a method that can substantially change subsurface modelling. Its main advantages are that MPS can generate much more realistic and coherent facies patterns than two-point statistics methods, and that it can be driven with fewer and simpler parameters than genetic or object-based modelling, all while being easily conditioned to external data. Despite these promising characteristics and implementation in standard platforms for subsurface modelling, the technique has not yet been widely used in the oil and gas industry. With performance bottlenecks gradually being removed, the remaining hurdles to successful MPS simulations may be the lack of suitable training images, in particular for modelling 3D features, and of training or guidance to address the somewhat different conceptualisation of MPS modelling runs. We have applied MPS modelling to very different depositional environments with demanding requirements, ranging from carbonate platforms through glacial settings to channelised deposits in meander belts and turbiditic settings. Below, a number of topics are discussed which involve the handling of non-stationarity through auxiliary or trend properties. The results shown were obtained using Impala MPS in JewelSuite.
-
-
-
Surface-Based Reservoir Modelling
Conventional grid- or pixel-based reservoir modelling algorithms have been developed to account for uncertainty associated with sparse subsurface data and to produce geological models which can be upscaled onto regular, structured grids suitable for flow simulation. However, the use of a regular grid to model the distribution of facies types and petrophysical properties limits the spatial resolution of the geological model, so these methods can fail to capture key geological heterogeneities, particularly those at small length-scales and/or with complex geometries. The upscaling step also results in a loss of fidelity. Outside the world of reservoir modelling, other disciplines have moved away from using simple structured grids to characterize complex spatial geometries. A common approach is to use unstructured meshes to discretize space within volumes defined by surfaces. We present a surface-based approach to reservoir modelling in which any geological heterogeneity that impacts the spatial distribution of petrophysical properties is modelled as one or more discrete volumes bounded by surfaces.
-
-
-
Testing the Multiple-Point Statistics Modelling Approach Using Outcrop Data
Authors: Robert Kil and Andrea Moscariello
The island of Favignana provides an exceptional three-dimensional insight into the internal architecture of a Lower Pleistocene complex bioclastic calcarenitic wedge. Field data indicate that these calcarenites consist of a foramol association that originated during cool to temperate water conditions in a high-energy, storm-dominated, open-shelf environment. Predominant sedimentary structures vary from small-scale trough cross-bedding and large foresets to large scours filled by structureless, massive bioclastic material. An overall main transport direction to the SE indicates the prograding nature of these deposits. Based on the sedimentological characteristics and reservoir properties, this sedimentary complex is believed to be a relevant analogue for several important hydrocarbon fields worldwide in North Africa, the Middle East and South America, such as the Perla Field in Venezuela.
-
-
-
The Impact of More Determinism on a Reservoir Model
[...] design driven in some cases), the geologist applies a variety of statistically based techniques to derive the subsurface reservoir architecture. The stochastic model then serves as a medium through which we can have a tendency to convince ourselves that we have improved our understanding of the subsurface ("I used my stochastic modelling tool and integrated all my log data, therefore it must be geologically valid"). In this case study from the Niger Delta, deterministic geological models, established using outcrop analogue observation principles combined with core-based and log-based facies associations, are used to supersede the more common practice of stochastic (pixel-based) geological models. The aim of this paper is to demonstrate that, by thinking more about our conceptual geological model and less about the application of the statistical algorithm (and its inherent workflow) to the data we have available, it is possible to add orders of magnitude of determinism to the geological model, making the reservoir description more realistic and predictable.
-
-
-
Markov Chain / Transition Probability Approach for Geo-Cellular Modeling of Carbonate Depositional Geobodies: Proof-of-Concept Markov Random Field Simulator
Authors: Brigitte Vlaswinkel, Sam Purkis and Nuno Gracias
A reservoir model based on subsurface data as well as appropriate (reservoir) analogs is a critical field management tool, and therefore should accurately incorporate features affecting fluid storage, distribution and flow. However, the complexity of carbonate reservoirs in particular makes detailed direct characterisation of their 3D heterogeneity difficult, and this problem is worsened in the subsurface, where lateral constraint on facies architecture is typically poor to non-existent while information on vertical stacking may be rich. A tantalizing strategy to mitigate this disparity lies in the use of Walther’s Law, which offers the possibility of using vertical transitions to elucidate lateral juxtaposition motifs (Doveton 1994; Parks et al. 2000; Elfeki & Dekking 2005; Riegl & Purkis 2009). The implication is that a reservoir model, competent at least in terms of transition statistics, could be built against information harvested down-core. Coupled with simple Markov theory, it can be deduced that under such conditions comparable facies frequencies and transition probabilities will link vertical and lateral facies stacks. If one can be quantified, the other can be solved for. It therefore follows that 2D or even 3D Markov chain models can be developed by assuming that spatial variability in any direction can be characterised by a 1D Markov chain. Although this may seem like a tenuous theoretical leap, the assumption here is merely that Markov chains might characterise spatial variability not only in the vertical but also in other stratigraphic directions such as dip or strike (Carle et al. 1998). In this paper we present a proof-of-concept Markov Random Field Simulator (MRFS) and test its 3D geo-cellular modelling capabilities against a Cretaceous carbonate outcrop dataset.
The assumption that Walther’s Law can be applied has been verified successfully beforehand, based on two case studies that call upon (carbonate) data from the Cretaceous, Miocene and Modern (Purkis et al., submitted).
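The core idea, that vertical transition statistics can proxy for lateral ones, can be sketched as follows: estimate an upward transition matrix from a vertical facies log, then, invoking Walther's Law, reuse it to simulate a lateral facies sequence. The facies codes and log below are toy values, and a first-order Markov chain is assumed:

```python
import random

def transition_matrix(log, n_facies):
    """Estimate upward transition probabilities from a vertical facies log."""
    counts = [[0] * n_facies for _ in range(n_facies)]
    for a, b in zip(log, log[1:]):
        counts[a][b] += 1
    probs = []
    for row in counts:
        total = sum(row)
        probs.append([c / total if total else 1.0 / n_facies for c in row])
    return probs

def simulate(probs, start, length, rng):
    """Simulate a facies sequence from the transition matrix, e.g. laterally,
    reusing the vertically derived statistics."""
    seq = [start]
    for _ in range(length - 1):
        r, cum = rng.random(), 0.0
        for facies, p in enumerate(probs[seq[-1]]):
            cum += p
            if r < cum:
                seq.append(facies)
                break
        else:
            seq.append(len(probs) - 1)
    return seq

# Toy vertical log with 3 facies (0 = lagoon, 1 = shoal, 2 = reef)
log = [0, 0, 1, 1, 2, 2, 1, 0, 1, 2, 2, 1, 1, 0, 0, 1, 2]
P = transition_matrix(log, 3)
lateral = simulate(P, start=0, length=20, rng=random.Random(7))
```

A full Markov Random Field simulator would combine chains along several stratigraphic directions (vertical, dip, strike) rather than a single 1D chain.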
-
-
-
Seismic Monitoring of CO2 Injection into a Depleted Gas Reservoir – Otway Basin Project
Authors: M. Urosevic, R. Pevzner, V. Shulakov, A. Kepic, E. Caspari, S. Sharma and B. Gurevich
Within Stage I of the Otway Basin Project of the Australian Cooperative Research Centre for Greenhouse Gas Technologies (CO2CRC), approximately 65,000 tons of a CO2/CH4 mix in a ratio of 80/20 were injected into the Waarre C formation (the depleted Naylor gas reservoir) over the last two years. The CO2 was produced from a nearby natural accumulation, transported via pipeline and injected into a sandstone reservoir. The use of depleted gas fields for CO2 storage, as well as CO2-based enhanced gas recovery, is of global importance. Thus, the CO2CRC Otway Basin Pilot Project provides important experience in establishing whether such scenarios can be monitored by geophysical techniques, in particular time-lapse seismic methodology. Indeed, injection of CO2 into a depleted gas reservoir (within the residual gas saturation window) does not present favourable conditions for the application of geophysical monitoring techniques. Numerical simulation of the CO2 injection process at Otway shows that changes in the elasticity of the reservoir rock will be quite small and difficult to monitor even with the most powerful time-lapse (TL) seismic methodologies. Consequently, the design and implementation of the monitoring program had to address these issues. The monitoring program had two objectives: (1) to ensure detection of possible gas leakage out of the reservoir into other formations, and (2) to attempt to detect changes in seismic response due to CO2 injection into the reservoir. To increase the sensitivity of TL seismic we combined 3D VSP with 3D surface seismic. For a land seismic case, we achieved excellent repeatability with the 3D time-lapse surveys, which at the reservoir level produced normalised RMS difference values of about 20% for surface seismic and 10% for 3D VSP.
The location of the time-lapse anomaly detected at the reservoir level is broadly consistent with CO2 flow simulations. However, borehole seismic measurements showed that the time-lapse signal is too small to be reliably evaluated from repeated surface seismic measurements, as the anomaly is of a similar magnitude to the noise, making its unique attribution to the CO2 plume difficult. One of the important outcomes of these studies is the evaluation of land time-lapse seismic capabilities. New understanding and new methodologies for assessing the repeatability of 3D seismic data were developed during the Otway Basin tests. This helped us understand and validate the time-lapse signal from the reservoir. It also enabled us to demonstrate that the quality of time-lapse land surveys can be high enough to detect very small leakages, of up to five thousand tonnes, which is of essential importance to any monitoring program as it provides the possibility for rapid mitigation.
-
-
-
Monitoring and Modelling CO2 Injection at Sleipner
Authors: Rob Arts and Farid Jedari Eyvazi
CO2 has been injected into the Utsira Sand at Sleipner since 1996, with more than 11 million tonnes currently in the reservoir. Seismic monitoring surveys to follow the migration of the CO2 in the reservoir were carried out in 1999, 2001, 2002, 2004, 2006 and 2008. The CO2 plume is imaged on the seismic data as a prominent multi-tier feature comprising a number of bright sub-horizontal reflections, growing with time and interpreted as arising from up to nine discrete layers of high-saturation CO2, each up to a few metres thick. Quantitative seismic interpretation of the time-lapse data has included synthetic seismic modelling to derive CO2 distributions in the reservoir (Skov et al., 2002; Arts et al., 2004). Convolution-based modelling has shown that seismic reflection amplitudes are broadly related to layer thickness via a tuning relationship. However, acquisition geometry, lateral velocity changes, mode conversions and intrinsic attenuation are all likely to affect amplitudes, especially at the lower levels in the reservoir, and need to be incorporated within a rigorous quantitative analysis. Moreover, the predicted response relies heavily on a rock physics model (Arts et al., 2004) to convert simulated saturation data to seismic impedance data. In this study we present a reservoir simulation sensitivity study based on the existing models of the Utsira Sand to mimic the spreading behaviour of the CO2 as observed on the time-lapse seismic data. The transmissibility of each intra-reservoir shale layer has been estimated by matching the simulated quantity of CO2 with the seismically observed (estimated) quantity of CO2 per layer. The corresponding spreading patterns per level have been compared to the seismically observed accumulations.
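The tuning relationship from convolution-based modelling can be illustrated with a simple wedge sketch: a thin layer is represented by reflection coefficients of opposite sign at its top and base, convolved with a Ricker wavelet. All parameters (velocity, frequency, thicknesses) are illustrative, not Sleipner values:

```python
import numpy as np

def ricker(f, dt, n):
    """Zero-phase Ricker wavelet of peak frequency f (Hz)."""
    t = (np.arange(n) - n // 2) * dt
    a = (np.pi * f * t) ** 2
    return (1 - 2 * a) * np.exp(-a)

def wedge_amplitude(thickness, v=2000.0, f=30.0, dt=0.001, n=251):
    """Peak absolute amplitude of a thin layer with reflectivity +1 at its
    top and -1 at its base, after convolution with a Ricker wavelet."""
    w = ricker(f, dt, n)
    two_way = 2.0 * thickness / v            # two-way time through the layer
    lag = int(round(two_way / dt))
    refl = np.zeros(n + lag)
    refl[n // 2] = 1.0                       # top reflection
    refl[n // 2 + lag] = -1.0                # base reflection
    trace = np.convolve(refl, w, mode="same")
    return np.max(np.abs(trace))

# Amplitude first grows with thickness (constructive interference),
# peaks near the tuning thickness, then decays toward the isolated-
# reflector amplitude for thick layers.
amps = {h: wedge_amplitude(h) for h in (2, 5, 10, 15, 25, 40)}
```

Below tuning, amplitude is roughly proportional to thickness, which is the basis for estimating thin-layer thickness from reflection amplitude.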
-
-
-
Geophysical Monitoring of Small Scale CO2-injection into a Deep Saline Aquifer - the Ketzin Pilot Site
The first European onshore CO2 storage site has been set up at Ketzin (Germany). It is operated as a small-scale pilot site, with a special focus on the validation of various geophysical, geochemical and microbial monitoring technologies. The site has been developed within the EU-funded project CO2SINK, coordinated by the GFZ German Research Centre for Geosciences and supported by numerous national and EU initiatives. Injection started in June 2008. Since then, between 1,000 and 2,000 tons of CO2 per month have been injected into sandstone layers of the Triassic Stuttgart Formation at a depth of 620 m to 650 m. The total amount of injected CO2 reached 45,000 tons in February 2011. Injection is planned to continue until up to 100,000 tons have been injected into the Stuttgart Formation. Currently, food-grade CO2 is being injected at Ketzin; however, technical and contractual preparations are underway to inject a limited amount of CO2 from a sequestration pilot facility.
-
-
-
The IEAGHG Weyburn-Midale CO2 Monitoring and Storage Project: A Project Overview and a Rebuttal to Recent Accusations of a Leak from the Reservoir
The IEAGHG Weyburn-Midale CO2 Monitoring and Storage Project is completing its final year of research in the spring of 2011. The project, which is conducted in conjunction with two commercial enhanced oil recovery CO2 floods in south-eastern Saskatchewan, Canada, began in 2000 with a first phase that confirmed the suitability of the Weyburn oilfield formation for the long-term storage of CO2. In 2005, the final phase of the project began, offering a comprehensive research program covering: site characterization; measurement, monitoring and verification of the injected CO2; wellbore integrity; and risk and performance management. This final phase has also offered an integrated approach between this technical and non-technical research, and has included an examination of regulatory issues as well as public communications and outreach efforts. A comprehensive best practices manual – which will offer other jurisdictions and similar project proponents the best means of transitioning CO2-EOR operations into long-term storage – is to be published in late 2011.
-
-
-
Seven Years of CO2 Injection in the Nearly Depleted Gasfield K12-B
Authors: V.P. Vandeweijer, L.G.H. van der Meer and F.M.M. Mulders
Despite the efforts directed at the development and large-scale application of sustainable energy, the world still depends heavily on fossil fuels, and will do so for at least the next few decades. The challenge our world is facing is to develop technology options that allow for the continued use of fossil fuels without substantial emissions of CO2 (IEA, 2000), through cost-effective capture and storage, while at the same time maintaining industrial competitiveness in global markets. Subsurface storage of CO2 in geological systems could bridge the transition period required to develop sustainable sources of energy. This option is currently being studied and applied worldwide, including in demonstration projects. In 2004 a demonstration project commenced in which CO2 was re-injected into the gas reservoir at K12-B. This made K12-B the first location in the world where CO2 is being injected into the same reservoir from which it was produced as part of the natural gas. The K12-B gas field is located in the Dutch sector of the North Sea. The top of the reservoir lies approximately 3800 metres below sea level, and the ambient temperature of the reservoir is over 127 ºC. The K12-B gas field has been producing natural gas since 1987 and is currently operated by GDF SUEZ E&P Nederland B.V. The natural gas has a relatively high CO2 content of 13%. Since the start of gas production the CO2 component has been separated from the natural gas stream on-site and, since 2004, part of the separated CO2 has been re-injected into the gas field. Since then numerous tests have been performed to investigate various aspects of underground CO2 storage in nearly depleted gas fields, focusing mainly on well integrity and the behaviour of CO2 in the well and reservoir.
-
-
-
Monitoring and Quantification of Stored CO2 with Combined P-wave Velocity and Resistivity
Authors: Takahiro Nakajima, Ziqiu Xue and Toshifumi Matsuoka
[...] al., 2008; Xue et al., 2002, 2005; Xue & Ohsumi 2004a, 2004b, 2005; Onishi et al., 2006). In most CO2 storage sites, seismic surveys have been conducted to monitor the injected CO2. Recent injection projects, such as Sleipner and the Nagaoka site, have shown that seismic surveys give good results for monitoring the migration of CO2 in the reservoir (Arts et al., 2002, 2004; Bunge et al., 2000; Davis et al., 2002; Xue et al., 2006). In the Nagaoka project, studies have attempted to estimate the CO2 saturation around the observation wells using the results of well logging and laboratory studies (Kim et al., 2009b; Nakatsuka et al., 2009; Xue et al., 2006). When estimating CO2 saturation from seismic surveys, Gassmann's theory, which gives the bulk modulus of the saturated porous rock, has often been used (Gassmann, 1951). When the saturation is less than 20%, the P-wave velocity responds well, but above 20% the P-wave velocity becomes much less sensitive to CO2 saturation. P-wave velocity is not sensitive to gas saturation in the high-gas-saturation regime (Sg > 20%), for either homogeneous saturation or patchy saturation with patch size much smaller than the wavelength (Xue & Lei 2006; Lei & Xue 2006). To overcome this weakness of seismic monitoring for the estimation of CO2 saturation, accurate CO2 saturation needs to be estimated using resistivity. In this paper, laboratory experiments have been conducted to monitor P-wave velocity and resistivity simultaneously in porous sandstone during the CO2 injection process.
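The Gassmann-based saturation sensitivity described above can be sketched as follows. All rock and fluid properties are illustrative values, and homogeneous mixing via the Reuss (Wood) average is assumed:

```python
import math

def gassmann_ksat(kdry, kmin, kfl, phi):
    """Gassmann equation: bulk modulus of the fluid-saturated rock (Pa)."""
    b = (1 - kdry / kmin) ** 2
    return kdry + b / (phi / kfl + (1 - phi) / kmin - kdry / kmin ** 2)

def wood_kfl(sg, kgas, kbrine):
    """Reuss (Wood) average for a homogeneous gas/brine fluid mixture."""
    return 1.0 / (sg / kgas + (1 - sg) / kbrine)

def vp(sg, kdry=4e9, kmin=37e9, mu=5e9, phi=0.3,
       kgas=0.02e9, kbrine=2.4e9, rho_g=600.0, rho_b=1050.0, rho_m=2650.0):
    """P-wave velocity (m/s) vs gas saturation sg; illustrative properties."""
    kfl = wood_kfl(sg, kgas, kbrine)
    ksat = gassmann_ksat(kdry, kmin, kfl, phi)
    rho = (1 - phi) * rho_m + phi * (sg * rho_g + (1 - sg) * rho_b)
    return math.sqrt((ksat + 4.0 * mu / 3.0) / rho)

# Vp drops sharply between 0% and ~20% gas saturation, then is nearly flat
# (the density decrease can even raise Vp slightly at high saturation),
# which is why resistivity is needed to resolve high saturations.
drop_0_20 = vp(0.0) - vp(0.2)
drop_20_80 = vp(0.2) - vp(0.8)
```

Pairing this with a resistivity model (e.g. an Archie-type saturation relation) provides the complementary sensitivity at high saturation that the abstract argues for.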
-
-
-
Deep Saline Aquifers and Coal Seams as Potential Sites for CO2 Geological Storage in Italy
The complex tectonic setting, the high seismic risk over most of its territory, and the volcanic risk associated with a large part of it, together with the high population density, reduce the possibilities for wide-scale application of CO2 geological storage in Italy.
-
-
-
A Natural Analogue of a CO2 Reservoir: the Latera Caldera (Central Italy). Insights from Field Data to Numerical Modelling of Fluid Flow through a Fracture Network
Authors: S. Bigi, A. Alemanni, M. Battaglia, S. Lombardi, A. Campana, E. Borisova and M. Loizzo
Fractured rock volumes and fault zones in the upper crust are typically composed of complex networks of brittle discontinuities. Determining the architecture of the fracture network and the patterns and rates of fluid flow in these structural discontinuities is a three-dimensional problem. Modelling of fractures is generally performed using two basic approaches, probabilistic and deterministic. The resulting 3D models are used for numerical simulation of fluid flow, in order to identify the primary parameters controlling fault- and fracture-related fluid flow. 3D probabilistic models of fracture networks are based on field data (orientation, spatial distribution and density) used as input parameters, whereas deterministic models are based on kinematic and dynamic constraints. In either case, their accuracy is rather limited, due to the high number of parameters that control fracturing processes (lithology, pressure, stresses). In this work we reconstructed a 3D fracture network directly from field data at a site that is a natural CO2 reservoir and has been selected as a test site within the “CO2GeoNet” EC project, and then used it to simulate fluid flow with a numerical model.
-
-
-
Benefit of InSAR Satellite Displacement Data to Validate 3D Reservoir Pressure Computation Using One-Way Coupled 3D Fluid-Flow Geomechanical Modelling at Krechba (Preliminary Results)
Authors: J-P. Deflandre, A. Baroni and A. Estublier
To successfully deploy the Carbon Capture and Storage (CCS) concept at a site, a series of milestones has to be passed. One of them is to demonstrate the reliability of short-term performance assessment predictions at the CO2 geological storage site. Validation must be established through a monitoring-based verification approach. Once achieved, it makes it possible to anticipate the long-term fate of the CO2 within the storage complex perimeter and finally to make risk management efficient. The first step consists in modelling the CO2 migration within the saline aquifer storage and the resulting reservoir pressure field, whose evolution will help in managing major risk scenarios (unwanted fluid migration, effective stress redistribution, fault reactivation, etc.). In the frame of the CO2ReMoVe European project, the research work program aims at developing tools and methods to reach such an objective by testing and improving them on a series of pilot sites. At In Salah (southern Algeria), BP, Statoil and Sonatrach aim to re-inject the CO2 produced from a series of gas fields into the Krechba saline reservoir aquifer. Up to 1 million tonnes of CO2 per year will be stored over approximately 27 years. CO2 injection is achieved using three horizontal wells drilled in the northern part of the structure and started in 2004 at well Kb-501 (Ringrose et al. 2009). On this industrial-scale pilot case, IFPEN worked at testing and validating a coupled fluid-flow geomechanical modelling workflow (Deflandre et al. 2010). The approach presented here aims at verifying and improving the reliability of the 3D fluid-flow and geomechanical modelling by using independent InSAR satellite surface displacements for comparison with the computed surface displacements.
-
-
-
Measuring ground movement with InSAR data at In Salah
Authors: A. Ferretti, A. Fumagalli, F. Novali, C. Prati, F. Rocca, A. Rucci and S. Cespa
Knowledge of the structure controlling the fluid/gas flow at the reservoir layer is critical in many activities, such as petroleum/gas extraction and carbon capture and sequestration (CCS). To this end, time-lapse geophysical observations are considered an important instrument to better understand fluid flow in the subsurface. In the last decade, a new remote-sensing technology called PSInSAR™, based on the use of satellite radar data, has been receiving increasing attention thanks to its capability to provide accurate, large-scale surface deformation measurements with millimetric precision. The utility of such data for reservoir monitoring and modelling has been proved in the In Salah project, one of the three most famous CCS projects. SAR data have been used to track the injected CO2 [1], to monitor possible fault reactivation and to estimate the effective permeability of a producing gas reservoir.
-
-
-
Enterprise Information Management and Information Governance 2011
The benefits of holistic information management are clear: complete and trustworthy information allows business leaders to make better decisions faster. The perils of fragmented information management – unnecessary cost, risk and loss of productivity – are equally clear. We'll help you assess the "state of the art" of enterprise information management (EIM) and information governance and help you start or continue your own information management journey.
-
-
-
Cloud Computing: Will it affect E and P?
By Augustin Diz
The concept of remote computing for E&P began in earnest with the rise of the Dot Coms. However, the infrastructural requirements (bandwidth, disk space, and even user interfaces) did not help to make it an immediate success. The nature of compute services has been changing. Cloud Computing, a generic name for a host of services (from infrastructure to software hosting), is generating changes.
-
-
-
Adaption and use of Standard Data/Information Management methodology
By Eldar Bjørge
For many years the E&P industry has been focusing on the management of large amounts of structured data (seismic, well data, production data, etc.). The methods and standards have been developed within the E&P industry itself. Enablers have been roles and responsibilities (ownership and stewardship), standardization, data quality of large data sets, and IT technology (e.g. corporate data stores for quality-assured data). Emerging trends, like Enterprise Architecture and the integration of structured and unstructured data, have focused on a holistic corporate view of data management. To be able to model information at an enterprise level, Statoil looked into different possibilities and decided to use the DAMA methodology as the basis for defining IM terminology and IM functions. The DAMA framework was used to prioritize IM functions and produce best common practice documents. An Enterprise Architecture project has documented the connection between processes, information and tools for all main processes. Within Information Architecture we used the recommended best practice to establish an Enterprise Data Model for all important information objects and enterprise master data. The purpose has been to make information models at the enterprise and process level available to management, projects and service providers, to be used as a communication tool between business and technical personnel and as a tool for categorizing all information vital for strategic and operational business decisions or subject to legal requirements. These activities have initiated work on Master Data Management, with a focus on master data governance.
-
Managing E&P Data in Norway – The National Data Repository (NDR) Diskos. Achieving Success through Collaboration
By Eric Toogood
In 1995 the Norwegian Petroleum Directorate (NPD) and a number of key oil companies set up a joint database for the management of digital E&P data from the Norwegian Continental Shelf (NCS), the “Diskos Database” (Knudsen, 1998). This collaborative Norwegian approach to the management of E&P data by the Diskos group of companies has proved to be a great success. As oil companies in Norway have an obligation to share data when participating in production licences, there was a powerful incentive to reduce the redundancy of data storage by individual companies. The database is managed on commercial terms by a trusted third-party operator, selected through a competitive bid process according to Norwegian and EU procurement legislation. All data are available through secure broadband access. Data in Norway can be purchased and traded, which leads to complex challenges regarding security and the management of user access to data. It is therefore necessary to have a technical solution that allows for sophisticated data access controls at a very fine-grained level (i.e. shot-point level for seismic data and document level for reports). The PetroBank software was developed in the early 1990s for the Diskos group, and the database still runs on this technology. Initially only a few oil companies were active Diskos members, but now, in its 17th year, the number of oil companies and institutions using the Diskos solution has grown to well over 50.
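The kind of item-level entitlement described above (access resolved per shot-point or per document, not per dataset) can be sketched as follows. This is a minimal illustrative model, not PetroBank's actual implementation; all class and method names here are assumptions made for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Entitlement:
    """One company's licence to a subset of items within a dataset."""
    company: str
    dataset: str
    items: set = field(default_factory=set)  # e.g. shot-point numbers or document ids

class AccessController:
    """Resolves access at the individual-item level, per company."""
    def __init__(self):
        self._grants = []

    def grant(self, company, dataset, items):
        self._grants.append(Entitlement(company, dataset, set(items)))

    def visible_items(self, company, dataset, requested):
        """Return only those requested items this company is entitled to see."""
        allowed = set()
        for g in self._grants:
            if g.company == company and g.dataset == dataset:
                allowed |= g.items
        return sorted(set(requested) & allowed)

ac = AccessController()
# Hypothetical grant: shot-points 100-199 of one seismic line
ac.grant("OperatorA", "line-NCS-42", range(100, 200))
print(ac.visible_items("OperatorA", "line-NCS-42", [95, 150, 199, 250]))
# [150, 199]
```

The point of the sketch is that filtering happens inside the repository on every request, so a traded dataset can be exposed partially, shot-point by shot-point, without duplicating the data itself.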
-
Rollout of a Master Data Store Concept Using the Example of the Acreage Data Domain
More Less
The MDS Acreage project is the first rollout of the MDS concept in OMV. This presentation gives an overview of the MDS Acreage project, from the requirements specification to the final rollout. It highlights the lessons learned from all project phases and offers an outlook on further implementations. The goal of the project was to replace an existing management report for OMV’s acreage data and to integrate this essential asset information into the Master Data Store environment. The process from the initial idea to the final rollout took more than two and a half years. Part of the complexity of handling a project with such a long timeline lay in organizational changes within the company: staff were replaced or changed positions, and a completely new organization appeared in OMV and, in parallel, in E&P. Transferring knowledge and not losing track of the initial project idea was one of the critical success criteria.
-
Data Ownership Model in DONG E&P
More Less
For many years, many technical data users in the E&P industry have held the view that data was something the Data Management Departments should take care of, "just" ensuring that all the necessary data were in the right place, in the right format, at the right time. Alternatively, some users held the view that the data was theirs to store and use in the formats and places they chose. It is now generally accepted, at least in the G&G user environment, that data quality and data availability are a shared responsibility between the users and data management professionals. Breaking down the professional silos and realizing that the same data are needed many times in the E&P value chain has also put focus on agreed standards and quality assurance. In order to promote this shared responsibility and ensure compliance, a Data Ownership Model has been introduced and agreed upon in DONG E&P. The Data Ownership Model clarifies and formalizes the "ownerships" and responsibilities for the data standards and the data management processes throughout the data life cycle, and hence helps to maintain the integrity and quality of DONG E&P's data assets. The presentation introduces the Data Ownership Model and discusses the process through which it was developed; agreed between management, users and data management; and implemented in the organization.
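An ownership model of this kind is, at its core, a register mapping each data type to a named accountable owner (for standards) and a steward (for day-to-day quality). The sketch below is a hypothetical illustration in that spirit; the data types, roles and names are assumptions for the example, not DONG E&P's actual model.

```python
# Hypothetical ownership register: data type -> accountable roles.
OWNERSHIP_REGISTER = {
    "well_header":        {"owner": "Head of Drilling",    "steward": "Well Data Team"},
    "seismic_navigation": {"owner": "Chief Geophysicist",  "steward": "Seismic DM Team"},
    "production_volumes": {"owner": "Head of Operations",  "steward": "Production DM Team"},
}

def responsible_for(data_type, role="owner"):
    """Look up who is accountable for a given data type and role."""
    entry = OWNERSHIP_REGISTER.get(data_type)
    if entry is None:
        raise KeyError(f"No ownership defined for {data_type!r}")
    return entry[role]

print(responsible_for("well_header"))             # Head of Drilling
print(responsible_for("well_header", "steward"))  # Well Data Team
```

Making the register explicit and queryable is what turns "shared responsibility" from a slogan into something compliance checks and workflows can actually reference.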
-