72nd EAGE Conference and Exhibition - Workshops and Fieldtrips
- Conference date: 14 Jun 2010 - 17 Jun 2010
- Location: Barcelona, Spain
- ISBN: 978-90-73781-87-0
- Published: 13 June 2010
Improvements in Imaging and Reduction of Uncertainty in Velocity Determination by the Use of Wide Azimuth Surveys
Authors: A. Bartana and D. Kosloff
Seismic velocity determination has long suffered from insufficient data-acquisition coverage; for this reason, only the smooth, long-wavelength components of the velocity variation can be reliably recovered. We show by means of a theoretical study that multi-azimuth data have the potential to significantly improve velocity determination. In this study we examine the capability of multi-azimuth acquisition to resolve small velocity anomalies by means of a 3D synthetic example. The model consists of a layered structure containing two small velocity anomalies. The study compares the resolution obtained when the migrated gathers contain no azimuthal information with the case in which the gathers are binned by both offset and azimuth. The results show that conventional gathers yield only a blurred image of the velocity anomalies, whereas with multi-azimuth gathers the anomalies appear distinctly.
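As a concrete illustration of the binning step described above, here is a minimal sketch (not the authors' code) of how per-trace geometry might be grouped by offset and azimuth with NumPy; the bin edges and trace counts are arbitrary assumptions for the example.

```python
import numpy as np

# Hypothetical per-trace geometry: offsets in metres, azimuths in degrees.
rng = np.random.default_rng(0)
offset = rng.uniform(0.0, 6000.0, size=10_000)
azimuth = rng.uniform(0.0, 360.0, size=10_000)

# Conventional gathers: bin by offset only.
offset_edges = np.arange(0.0, 6001.0, 500.0)          # 500 m offset classes
offset_bin = np.digitize(offset, offset_edges) - 1

# Multi-azimuth gathers: bin by offset AND azimuth.
azimuth_edges = np.arange(0.0, 361.0, 45.0)           # 45-degree sectors
azimuth_bin = np.digitize(azimuth, azimuth_edges) - 1

# Combined bin index: each (offset class, azimuth sector) pair is one gather.
gather_id = offset_bin * (len(azimuth_edges) - 1) + azimuth_bin
print(f"{len(np.unique(gather_id))} offset/azimuth gathers "
      f"vs {len(np.unique(offset_bin))} offset-only gathers")
```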
Coil Shooting on Tulip discovery in Indonesia: a summary of the work done and lessons learned until now
By: M. Buia
Coil Shooting [French, Cole, 1984; Durrani, 1987] is a technique in which a marine towed-streamer vessel acquires an almost continuous sequence of circular "lines". The circular line geometry is repeated in the X and Y directions to build up fold, offset and azimuth distribution. This method allows for full-azimuth (FAZ) acquisition using a single vessel shooting on a continuous turn. The time between circular lines is of the order of minutes, as opposed to hours for conventional race-track acquisition, resulting in high acquisition uptime and efficiency. Eni Indonesia and WesternGeco shot, and processed through PSDM, a full 3D Circular Shooting (Coil) survey over the Tulip Discovery in Indonesia between August 2008 and February 2010. Compared with "traditional" streamer surveys, the circular geometry introduces several differences and a number of new challenges, including proper offset/azimuth stacking. This paper presents the steps of the whole project: design, onboard illumination QC and the final imaging results of this new full-azimuth seismic effort.
CRS - More than a stack: A workflow from time to depth
Authors: D. Gajewski, M. Baykulov and S. Dümmong
Stacking is one of the most stable processes in reflection seismic data processing. Although the stacked section provides a distorted picture of the subsurface, it has remained the first image in the processing chain since the CMP concept was introduced more than half a century ago. The stability of the stacking process results from the limited assumptions made in its derivation; in particular, no assumption about the type of model is made. This applies equally to the extension of the CMP concept, the Common Reflection Surface (CRS) method. Not just one but several CMP locations are considered when determining the stacking attributes, which automatically accounts for the dip of the events and improves the structural quality of the stack. Moreover, since several CMPs are considered, more traces contribute to the stack. The stack is just one product of this procedure: the stacking attributes, or CRS attributes, are determined for each sample of the data. These attributes (three in the 2-D case) have many important applications in seismic data processing, such as velocity model building, multiple suppression, pre-stack data enhancement and data regularization. What started out as a stack has evolved into a reflection seismic data processing workflow from time to depth, producing structural images of high fidelity.
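For orientation, the three 2-D CRS attributes mentioned above are conventionally the emergence angle α and the wavefront curvature radii R_N and R_NIP, which enter the widely used hyperbolic CRS traveltime operator (e.g. Jäger et al., 2001). The standard form is quoted here for context rather than taken from this abstract:

```latex
t^2(\Delta x_m, h) =
\left[ t_0 + \frac{2\sin\alpha}{v_0}\,\Delta x_m \right]^2
+ \frac{2\, t_0 \cos^2\alpha}{v_0}
\left[ \frac{\Delta x_m^2}{R_\mathrm{N}} + \frac{h^2}{R_\mathrm{NIP}} \right]
```

where Δx_m is the midpoint displacement, h the half-offset and v_0 the near-surface velocity.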
Neural-network based multi-azimuth processing
Authors: A. Huck, P. de Groot, T. Manning and W. Rietveld
This paper describes the results of a series of experiments with neural networks, dip-steered noise reduction filters and other techniques aimed at combining multi-azimuth data. The seismic data were first pre-processed by applying dip-steered noise reduction filters, amplitude correction and inter-volume trace matching for dynamic shift corrections. The individual azimuthal stacks were then combined using first unsupervised and then supervised neural networks in custom-made, semi-automated workflows.
The main conclusion of this study is that incremental improvements were achieved by consecutively aligning the azimuth volumes, stacking with unsupervised networks and stacking with supervised networks. Alignment proved to be a mandatory step. Unsupervised segmentation provided a useful segment volume that highlights the areas affected by stacking issues, and the same segmentation was also used for re-stacking the seismic data. The main improvements were achieved by selecting the relative weights used for stacking. Supervised neural-network stacking was further used to smooth the transitions between segments. The "MLP weighted" output is considered better than the input multi-azimuth (MAZ) stack, and is well suited to interpretation since no processing-related artifacts were introduced. The workflow was also adapted to the pre-stack domain, but no additional gains were obtained.
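A minimal sketch of the weighted-stacking idea mentioned above, assuming hypothetical per-sample weights (an illustration, not the authors' neural-network workflow):

```python
import numpy as np

def weighted_stack(azimuth_volumes, weights):
    """Combine aligned azimuth volumes with per-sample relative weights.

    azimuth_volumes : array of shape (n_azimuths, ...) -- aligned stacks
    weights         : same shape -- e.g. produced by a trained network
    """
    volumes = np.asarray(azimuth_volumes, dtype=float)
    weights = np.asarray(weights, dtype=float)
    norm = weights.sum(axis=0)
    return (weights * volumes).sum(axis=0) / np.where(norm == 0, 1.0, norm)

# Toy usage: three azimuth volumes of 100 traces x 200 samples;
# equal weights reduce to a plain mean stack.
vols = np.random.default_rng(1).standard_normal((3, 100, 200))
w = np.ones_like(vols)
stack = weighted_stack(vols, w)
assert np.allclose(stack, vols.mean(axis=0))
```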
Multi-dimensional data reconstruction and noise attenuation for optimal wide azimuth stack
Authors: G. Poole and R. Wombell
In recent years the value of wide-azimuth acquisition has been well documented. As well as significant improvements in the imaging of complex structures due to improved illumination, these data have also demonstrated benefits in the suppression of coherent noise, random noise and multiple energy. Two of the key factors controlling the quality of wide-azimuth datasets are high-density regular sampling and good signal-to-noise ratio. Using simple synthetics, we demonstrate the importance of regular sampling to the stack response. We then show how data can be regularised and interpolated with 5D Fourier reconstruction to stack out more noise and improve the stack response of primary energy. In addition, we highlight how multi-dimensional denoising techniques can be used to enhance weak energy where the signal-to-noise ratio is a problem.
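The core of Fourier reconstruction, fitting Fourier coefficients to irregular samples and re-evaluating them on a regular grid, can be sketched in one spatial dimension (the production method cited works in five). This is a generic least-squares illustration with made-up numbers, not the authors' algorithm:

```python
import numpy as np

rng = np.random.default_rng(2)

# Irregularly sampled spatial signal: two wavenumbers plus noise.
x_irreg = np.sort(rng.uniform(0.0, 1000.0, size=80))      # metres
d = (np.sin(2 * np.pi * 0.004 * x_irreg)
     + 0.5 * np.sin(2 * np.pi * 0.009 * x_irreg)
     + 0.05 * rng.standard_normal(x_irreg.size))

# Least-squares fit of a small set of Fourier coefficients.
k = np.arange(-12, 13) / 1000.0                           # trial wavenumbers (1/m)
A = np.exp(2j * np.pi * np.outer(x_irreg, k))             # forward operator
coeff, *_ = np.linalg.lstsq(A, d, rcond=None)

# Evaluate the fitted model on a regular 12.5 m grid.
x_reg = np.arange(0.0, 1000.0, 12.5)
d_reg = (np.exp(2j * np.pi * np.outer(x_reg, k)) @ coeff).real
print(d_reg[:5])
```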
Least Squares Migration of Stacked Supergathers
Authors: G. T. Schuster, W. Dai and G. Zhan
We show that phase-encoded shot gathers can be stacked together to form supergathers and efficiently migrated using an iterative least-squares migration (LSM) method. The major problem of cross-talk can be largely eliminated by iterative stacking of the phase-encoded migrations and a multisource preconditioning factor, where random static shifts are used as the phase-encoding function. Empirical results with synthetic seismic data suggest that increasing the number of stacked shot gathers requires an attendant increase in the number of LSM iterations. A key merit of phase-encoded LSM of supergathers is that, under ideal conditions, computational cost, I/O and storage requirements can be reduced by several orders of magnitude compared with conventional LSM.
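The random-static-shift encoding itself is simple to illustrate. Below is a hedged sketch (toy data, not the authors' implementation) of building one supergather by applying an independent random time shift to each shot gather before summation:

```python
import numpy as np

rng = np.random.default_rng(3)

def encode_supergather(shot_gathers, max_shift, dt):
    """Sum shot gathers after random static time shifts (phase encoding).

    shot_gathers : (n_shots, n_receivers, n_samples) array
    max_shift    : maximum static shift in seconds
    dt           : sample interval in seconds
    """
    n_shots = shot_gathers.shape[0]
    shifts = rng.integers(0, int(max_shift / dt) + 1, size=n_shots)
    super_gather = np.zeros(shot_gathers.shape[1:])
    for gather, s in zip(shot_gathers, shifts):
        super_gather += np.roll(gather, s, axis=-1)   # circular shift as a stand-in
    return super_gather, shifts * dt

# Toy usage: 16 shots, 48 receivers, 500 samples at 4 ms.
shots = rng.standard_normal((16, 48, 500))
supergather, applied_shifts = encode_supergather(shots, max_shift=0.2, dt=0.004)
print(supergather.shape, applied_shifts[:4])
```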
Beam, wavelets and enhanced seismic attributes for interpretation
Authors: K. Sherwood and J. Sherwood
In beam migration it is possible to maintain a one-to-one mapping between a coherent event in unmigrated time and the corresponding event in migrated depth. The mapping is accompanied by many seismic attributes, including the dip/azimuth of the reflector, the angle of incidence at the reflector, the raypath to the reflector, coherency and wavefront curvature. During reconstruction, one or more of these attributes can be used to filter the data, creating unique seismic volumes that aid model building and interpretation. The benefits derived from these volumes can lead to significant improvements in the imaging of challenging areas.
Anti-alias Optimal Interpolation with Priors
Authors: M. Vassallo, A. Özbek, A. K. Özdemir, D. Molteni and Y. K. Alp
We introduce a new technique, referred to as Optimal Interpolation with Priors (OIP), for the interpolation of irregularly sampled signals using prior estimates of their spectral content; the interpolation is optimal in the least-squares sense. After introducing the technique and describing its basic advantages over other state-of-the-art regularization techniques, we demonstrate its potential to interpolate signals that are spatially aliased, based on realistic prior information. We also propose an algorithm for obtaining a reliable prior estimate of the signal spectrum. The combined use of this algorithm and OIP, referred to henceforth as Anti-Alias OIP (AA-OIP), can be applied to datasets irregularly sampled in multi-dimensional spaces.
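The abstract does not give the estimator, but interpolation that is least-squares optimal given a prior on the signal is conventionally written as the Wiener-type linear estimate below, where A samples the signal at the irregular positions, P encodes the prior (e.g. spectral) covariance and σ² the noise level. This generic form is quoted for context only, not taken from the paper:

```latex
\hat{\mathbf{x}} = \mathbf{P}\mathbf{A}^{\mathsf{H}}
\left( \mathbf{A}\mathbf{P}\mathbf{A}^{\mathsf{H}} + \sigma^{2}\mathbf{I} \right)^{-1} \mathbf{d}
```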
Advances in onshore imaging
Authors: J.-W. de Maag, H. Rynja, E. van Dedem, P. Milcik and M. van de Rijzen
Onshore data typically pose additional challenges for processing and imaging compared with offshore data: limited access for acquisition, greater variation in source and receiver coupling, more severe random noise conditions, the presence of coherent (shear) noise such as ground roll, more complicated multiple systems, processing no longer being done from a flat datum, and signal distortion from a rapidly varying shallow overburden. To overcome these challenges, several advances towards a better stack are being made, some of which are discussed below. The examples shown are from two onshore datasets: a sparser Libyan survey and a high-density wide-azimuth survey acquired in the south of Oman.
Seismic stacking in a wider perspective
Stacking can be seen as part of the well-known correlation process: ‘stacking is zero-shift cross-correlation’. Hence, the stack yields one element out of a larger data volume. By computing this larger data volume (‘generalized stacking’), the original unstacked input can be fully recovered (‘generalized destacking’). Looking at the physics behind these mathematical transformations, generalized stacking represents a focussing process and generalized destacking a defocussing process. In this paper it is proposed to extend the traditional stack to a focal transformation, and to formulate the focal transformation in terms of constrained inversion.
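The opening statement can be checked numerically in a few lines: summing the traces of a gather equals the zero-lag cross-correlation of the trace ensemble with a unit pilot (a toy verification, assuming a simple CMP gather):

```python
import numpy as np

rng = np.random.default_rng(4)
gather = rng.standard_normal((24, 400))   # 24 traces x 400 time samples

# Conventional stack: sum over traces at every time sample.
stack = gather.sum(axis=0)

# Zero-shift cross-correlation with an all-ones pilot: at lag zero,
# sum_i gather[i, t] * 1 reproduces the stack exactly.
pilot = np.ones(gather.shape[0])
zero_lag = pilot @ gather                 # correlation across traces at lag 0

assert np.allclose(stack, zero_lag)
```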
Virtual Outcrop Models: Case Study from the Paleozoic Sandstone Reservoir and Aquifer Analogs, Saudi Arabia
Authors: O. Abdullatif and M. Makkawi
Excellently exposed Paleozoic sandstone strata in central and southern Saudi Arabia provide good outcrop analogs for many subsurface formations, including hydrocarbon reservoirs and groundwater aquifers. The study of these outcropping rocks provides an invaluable opportunity to examine different scales of sedimentary heterogeneity and to understand their impact on reservoir and aquifer quality and behavior in the subsurface. This can help to refine and better constrain reservoir and aquifer geological models based on subsurface information.
The Ainsa quarry outcrop revisited via orientation models built from LIDAR data
Authors: P. Arbués, D. García-Sellés, O. Falivene, Ò. Gratacós and J. A. Muñoz
The Ainsa quarry outcrop is located 1.5 km south of the town of Ainsa, in the southern Pyrenees, Spain. The strata in the exposure are Eocene and were deposited in a submarine slope setting undergoing synchronous thrusting and related folding; they have tectonic dips of about 22º to the WSW. The succession comprises a 20 m thick turbidite sandstone body (Figure 1) sandwiched between mudstone-dominated mass-transport deposits, together representing the Ainsa-1 turbidite channel complex. The outcrop has been the subject of numerous studies that have contributed to the global understanding of turbidite systems (Mutti and Normark, 1987; Schuppers, 1995; Clark and Pickering, 1996), and the succession has also been regarded as an analogue for reservoirs offshore West Africa. The outcrop section is about 400 m long, and in map view it traces a very open angle, limiting the validity of 3-D reconstructions away from the outcrop face to mere extrusion. However, the quarry face is clean and allows the observation of multiple bedding surfaces, especially the sharp soles of turbidite sandstones on top of mudstone beds (Figure 1). These surfaces were studied from their LIDAR point-cloud expression. The result, a local 3-D reconstruction, will be used to revisit an existing depositional and architectural interpretation of the outcrop (Arbués et al., 2008) that had been built using conventional outcrop characterization techniques.
LIDAR-based 3D reconstruction and modelling of a flat-topped non-rimmed carbonate platform: Aptian, Maestrat Basin, Spain
Authors: T. Bover-Arnal, O. Gratacós, D. García-Sellés, O. Falivene, R. Salas and J. A. Muñoz
Outcrop-scale reconstruction of the depositional geometries and facies distribution of carbonate systems improves our knowledge of their heterogeneity distribution, stacking patterns and stratal architecture. The collected data and derived models can be used as analogues for characterizing and modelling potential subsurface reservoirs. Traditional sedimentological analyses of cropping-out carbonate systems have limited accuracy depending on exposure conditions, accessibility or past erosive processes, and there is a need to complement classical sedimentological approaches with quantitative characterizations and models of sedimentary bodies. In this respect, the processing of three-dimensional (3D) point clouds captured by terrestrial LIght Detection And Ranging (LIDAR) technology, combined with a real-time kinematic global positioning system, offers field geologists the possibility of constructing virtual 3D digital outcrop models (DOMs), which allow more accurate analyses, reconstructions and quantification of the outcropping facies distribution than conventional digital terrain models. We present a LIDAR 3D DOM of an Aptian flat-topped non-rimmed carbonate platform margin from the western Maestrat Basin (Spain). The DOM served as the starting point for a 3D reconstruction that shows the relationship between the depositional architecture and facies distribution of the carbonate system. The reconstruction not only highlights the value of digital outcrop models for characterizing virtual attributes not observable in the outcrops because of the limitations of 2D views of the exposures, but also allows outcrop-scale sequence stratigraphic analyses to be refined. In addition, the 3D sequence stratigraphic framework obtained, together with the 3D facies distribution model generated, can be used as an analogue for the characterization of subsurface carbonate reservoirs with similar depositional profiles. The workflow of this study followed these steps:
1) Acquisition of the outcrop 3D point dataset using ground-based terrestrial LIDAR equipped with a differential GPS; 44 overlapping scans, each with an associated high-resolution photograph, were needed to cover the entire outcrop of the flat-topped non-rimmed carbonate system characterized.
2) Mapping of stratigraphic surfaces, and of pseudowells describing 5 lithofacies, onto each individual photograph using a CAD-based tool; the mapping is carried out directly on the photographs because manipulating the images and interpreting the details is easier than digitizing directly onto the point cloud.
3) Projection of the features mapped on the photographs into the corresponding point cloud in order to georeference them.
4) Global georeferencing of the locally georeferenced individual point clouds and attached interpretations by means of the UTM coordinates of each scan.
5) Use of the mapped stratigraphic boundaries to reconstruct the surfaces bounding the stratigraphic units.
6) Population of the internal facies distribution conditioned to the pseudowells.
This methodology allows information to be extracted efficiently from point clouds, and resulted in the construction of a high-resolution 3D geological model displaying the stratal architecture and facies heterogeneity of sedimentary bodies, confined within a 3D sequence stratigraphic framework.
Application of ground-based LIDAR for the study of the Huesca Fluvial Fan (Northern Ebro Basin, Spain): modelling the Montearagón outcrop
Authors: R. Calvo, P. Arbués, D. García, P. Cabello and E. Ramos
The emergence of a new technique usually awakens strong interest within the scientific community in testing its potential applications in different fields, but any new tool or methodology should be validated prior to its systematic application. The validation process must be carried out in places that are sufficiently well known to verify that the results are consistent with those expected. This work aims to incorporate the Light Detection And Ranging (LIDAR) technique into the study of sedimentary outcrops (Bellian et al., 2005). The methodology used combines classical field study with geometric information extracted from the analysis of a LIDAR-based Virtual Outcrop (VO). The chosen study area is the Montearagón outcrop. As illustrated in Fig. 1, Montearagón is located on the northern margin of the Ebro Foreland Basin (Arenas et al., 2001; Luzón, 2005), 11 km northeast of Huesca (Aragón). The outcropping materials correspond to the Oligo-Miocene Huesca Fluvial Fan (Hirst, 1991; Nichols, 2004; Donselaar et al., 2008) and are mainly composed of sandstone bodies of different typologies enclosed in a matrix of floodplain shale and silt (Fig. 2). The main objective of this work is to create a 3D model of the Huesca Fluvial Fan, built using the proposed methodology and by integrating several outcrops that represent various sectors of the fan. Studying outcropping ancient fluvial systems such as the Huesca Fluvial Fan is of great interest to the oil industry, as it provides insight into the behaviour of similar buried fluvial reservoirs that are hard to image and to model accurately.
Characterization of an analogue of fractured reservoir using LIDAR, GPR and conventional data
Authors: M. Coll, D. García-Sellés, M. Grasmueck, G. P. Eberli, J. Lamarche and K. Pomar
The Solvay quarry displays karstified and heavily fractured strata of peritidal platform carbonates of late Barremian age that can serve as an analogue to subsurface fractured reservoirs. In addition to providing a potential analogue, this study also aims to improve the methodology used in building digital outcrop models (DOMs). The originality of the applied methodology is the integration of conventional outcrop analysis, LIDAR (Light Detection and Ranging) and GPR (Ground Penetrating Radar) data. The goal is to produce an accurate and efficient DOM that resolves the three-dimensional sub-seismic heterogeneity of the fracture distribution in the strata. Stratigraphic and fracture analysis with conventional methods was performed on about 2 km of exposed cliff faces, which were subsequently scanned with the LIDAR equipment. Transverse and longitudinal 2D GPR lines and 6 GPR cubes were acquired on the quarry floor to correlate between the quarry walls. The 2D GPR data were statically corrected using the GPS horizontal coordinates of the transects, the high-resolution topography provided by the LIDAR data, and a replacement velocity of 0.098 m/ns. GPR and LIDAR data were loaded into 3D CAD software to interpret each horizon and to reconstruct the structural framework. To characterize the fracture distribution, scanline measurements were performed along the quarry walls, the 3D migrated GPR data were interpreted by delineating high-amplitude zones originating from focused diffractions that define fracture surfaces (Grasmueck et al., 2005), and the LIDAR point clouds were processed to reveal the main plane families that form the rough wall surface. Two of the GPR cubes show the coexistence of four sub-vertical fracture families trending N-S, E-W, NW-SE and NE-SW. The NE-SW fracture family is not detected in the outcrop using the scanline method because these fractures are parallel to the direction of the quarry wall; however, the LIDAR algorithm found two plane families oriented close to this fracture family, related to morphological features of NE-SW joints such as twist hackles. The 3D fractures constructed from the GPR data make it possible to filter and interpret the planes computed from the LIDAR data and to determine the sampling bias due to scanline orientation. The LIDAR data and the scanline measurements then yield a continuous distribution of the fracture families along the quarry, allowing dip and azimuth variations to be characterized.
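The static correction mentioned above reduces, in its simplest elevation-statics form, to a two-way time shift per trace. A minimal sketch using the replacement velocity quoted in the abstract (the datum elevation here is an arbitrary assumption):

```python
import numpy as np

V_REPLACEMENT = 0.098   # m/ns, as quoted in the abstract
DATUM = 540.0           # m, hypothetical flat datum elevation

def elevation_static_ns(surface_elevation_m):
    """Two-way static shift (ns) moving a GPR trace from the surface to the datum."""
    return 2.0 * (surface_elevation_m - DATUM) / V_REPLACEMENT

# Toy usage: LIDAR-derived elevations along a transect.
elevations = np.array([541.2, 540.8, 540.1, 539.6])
print(elevation_static_ns(elevations))   # shifts in nanoseconds
```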
A workflow for the automatic characterization of geological surfaces from terrestrial LIDAR data
Authors: O. Falivene, D. García-Sellés, P. Arbués, O. Gratacós, J. A. Muñoz and S. Tavani
Point clouds acquired with terrestrial LIDAR are used as a digital support to georeference outcrop characterizations accurately and precisely, as well as to resolve accessibility problems and improve outcrop characterizations. LIDAR data allow efficient visualization and analysis of the outcrop on the computer, and are also useful for revisiting field data in the office or for teaching purposes. The common practice for virtual outcrop interpretation is visual identification and manual digitization of point sets or polylines using 3-D CAD-like modules. Other, less generic, approaches are oriented towards the automated or semi-automated extraction of geological features, based either on the processing of intensity or other attributes of the virtual outcrop (RGB, hyperspectral) or on geometric parameters calculated from positions. Here we propose a workflow for the automatic characterization of planar surfaces (typically stratigraphic bedding or fractures) from LIDAR data. The workflow uses the point cloud directly, so no decimation, smoothing, or intermediate triangulated or gridded surfaces are required, and it is designed to minimize user interaction to allow simple, fast, objective and semi-automated use. The result of the workflow is the reconstruction of the planar surfaces identified in the point cloud as TIN surfaces, organized into families according to their orientations. These surfaces can be used as seeds for building surface-based models of the outcrop, or can be analysed further to investigate their characteristics (geometry, morphology, spacing, dimensions, intersections, etc.). The workflow is based on planar regressions carried out for each point in the point cloud, which allow subsequent filtering of points based on normal-vector orientation, planar-regression quality, the relative locations of points, or differences between their normal vectors. This is aimed at individualizing planar patches with geological significance. A coarse-grid search strategy is implemented to speed up neighbouring-point searches and to allow multimillion-point clouds to be handled. The workflow is illustrated using synthetic and natural examples.
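A minimal sketch of the per-point planar regression step, using a k-nearest-neighbour search in place of the coarse-grid strategy the abstract describes (parameters are illustrative):

```python
import numpy as np
from scipy.spatial import cKDTree

def point_normals(points, k=20):
    """Fit a plane to each point's k nearest neighbours; return unit
    normal vectors plus a simple planar-regression quality measure.

    points : (n, 3) array of XYZ LIDAR coordinates
    """
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    normals = np.empty_like(points, dtype=float)
    quality = np.empty(len(points))
    for i, neigh in enumerate(idx):
        nbrs = points[neigh] - points[neigh].mean(axis=0)
        # Smallest singular vector of the centred neighbourhood = plane normal.
        _, s, vt = np.linalg.svd(nbrs, full_matrices=False)
        normals[i] = vt[-1]
        # Planarity: residual singular value small relative to the others.
        quality[i] = 1.0 - s[-1] / s.sum()
    return normals, quality

# Toy usage: a noisy horizontal plane should give near-vertical normals.
rng = np.random.default_rng(5)
pts = np.column_stack([rng.uniform(0, 10, 500), rng.uniform(0, 10, 500),
                       0.02 * rng.standard_normal(500)])
n, q = point_normals(pts)
print(np.abs(n[:, 2]).mean(), q.mean())   # ~1.0 and close to 1
```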
Collection, processing, interpretation and modelling of digital outcrop data using VRGS: An integrated approach to outcrop modelling
By: D. Hodgetts
Much focus has been given to the hardware and data-collection techniques for digital outcrop analogue work. Software development has, however, lagged behind, with many geoscientists relying on applications designed for civil engineering or surveying purposes. Though these approaches have yielded interesting and often impressive results, without dedicated software applications the true power of digital outcrop data will never be realised. For the past 7 years, software dedicated to 3D digital outcrop work has been under development at Manchester University, with a focus on using very large datasets (collected from LiDAR or other digital sources) effectively and efficiently, and, importantly, on integrating these approaches with more traditional data-collection methods such as sedimentary logging and field mapping. The software facilitates the processing of point-cloud data from LiDAR and satellite sources (such as digital elevation models), the triangulation of those data into meshes, and interpretation on both point clouds and meshes where appropriate. Interpretation tools include typical polyline mapping tools, structural measurement tools and sedimentary logging tools, as well as more automated interpretation and mesh/point-cloud classification approaches. Because of the rapid, large-scale data collection possible with modern surveying systems and the abundance of publicly available satellite imagery and DEM data, digital outcrop datasets can be very large, which makes interpreting and extracting surfaces, structures and geostatistics from these data time-consuming. One solution for reducing the time needed for interpretation and classification is the application of artificial intelligence. Artificial neural networks (ANNs) attempt to replicate the learning process used by humans and other animals, and potentially provide very powerful ways of classifying data. Examples will be given showing the application of ANN approaches to the classification of point-cloud and mesh data, in particular addressing the problem of extracting structural data on plane orientations such as fracture and bedding planes. The application of other soft-computing and artificial intelligence approaches will also be presented. The integration of multiple data sources into one environment facilitates the development of new modelling approaches. A predictive approach to surface modelling will be presented that relies on structural data from dip-azimuth measurements on bedding planes and on polyline interpretations of key stratal surfaces; rather than using traditional interpolation/extrapolation approaches, it converges a triangulated mesh, based on the control data, onto a solution matching the input data. Finally, with the rapid evolution of computer hardware, particularly the development of high-power graphics-card-based computing, the application of modern graphics-card features to the processing, visualisation and rendering of large digital outcrop datasets will be demonstrated. These hardware advancements will prove of significant benefit to the geosciences, but only if software applications are written to take advantage of them.
Advances in Virtual Outcrop Geology
Authors: J. A. Howell, S. Buckley, T. Kurz, A. Rittersbacher and A. Sima
Virtual outcrops, in which geological exposures are digitally captured for study in a workstation, provide a new and rapidly emerging tool for the collection and analysis of field data. The advantages of virtual outcrops are primarily twofold: the rapid collection of accurate, spatially constrained measurements of geological features, and the improved visualization of outcrops, which allows better correlation and mapping coupled with an ability to illustrate and communicate field observations to a wider audience. Virtual outcrop geology is a rapidly expanding field of study which has grown over the last 10 years from photogrammetry and basic digital mapping to the advanced data collection and visualization methods utilized today. This presentation addresses two recent developments: the collection and utilization of very large datasets, and the integration of hyperspectral imagery to allow the remote mapping of lithology and mineralogy. To date, the majority of photo-realistic virtual outcrops have been generated from ground-based lidar systems. While producing excellent results, these systems are limited by mobility and range, especially when studying very large outcrops. A solution to this problem is to mount the lidar system in a helicopter and scan the outcrop obliquely. This allows the rapid collection of very large volumes of data and has the added advantage of optimizing the angle at which both the scan and the associated photos are taken, reducing the occurrence of scan shadows. Very large virtual outcrops covering tens of kilometres can be collected in hours. Despite the speed of acquisition, heli-based data present a new set of challenges, not least the creation of very large datasets which cannot be visualized using conventional software. The acquisition, processing and utilization of these data will be illustrated with examples from fluvial and shallow-marine systems in Utah. Airborne hyperspectral imagery is an established remote-sensing method which utilizes the absorption characteristics of light outside the visible range (near infrared) to identify mineralogy and other surface features (vegetation, land use, etc.). Mounting a similar camera on a tripod and obliquely scanning geological outcrops allows the remote mapping of lithology and mineralogy. The acquisition of oblique data from surfaces with significant topography has presented challenges for the processing of such data. Integration with the detailed terrain mapping provided by the lidar has allowed the spectral absorption response to be modelled, and meaningful virtual outcrops, textured with quantitative mineralogical information, to be produced. The result is a virtual outcrop textured with false-colour images that record mineralogy and can be rapidly and accurately investigated for quantitative information.
3D Digital Outcrop Modeling and aquifer/reservoir characterization of a slope system tufa complex. La Peña del Manto, Soria (Spain)
Field surveys have been performed on an excellent outcrop of a Quaternary perched-springline (slope system) tufa complex (La Peña del Manto, Soria, Spain), integrating LIDAR (LIght Detection And Ranging), DGPS (differential global positioning system), GPR (ground-penetrating radar) and ERT (electrical resistivity tomography) technologies with conventional field studies. The latter include: 1) detailed GIS-based geological and geomorphological mapping; 2) description and characterization of sedimentary facies; 3) logging of stratigraphic sections; 4) palaeocurrent measurement; 5) sampling for petrographic, microtextural, geochemical and geochronological analysis; together with 6) sampling for petrophysical characterization (porosity and hydraulic conductivity analysis) of the different lithofacies, to be carried out in a following step of the research project. PETREL software (Schlumberger) is being used to integrate the dataset and to build a digital outcrop model (DOM) and a 3D facies model of this slope tufa complex. These models will allow the accurate reconstruction of sedimentary geometries and the quantification of the spatial distribution of lithofacies and their physical properties. They are envisaged as a highly valuable tool for unravelling the sedimentological development and evolution of the cascade tufa complex and for its aquifer characterization, providing key insights for understanding the Quaternary geomorphological evolution of the fluvial drainage network of the area. In addition, the results will help to improve current knowledge and understanding of tufa sedimentary systems (comparatively much less studied than other carbonate sedimentary systems) and will provide valuable information on aquifer and reservoir analogs of comparable sedimentary bodies.
Methods for Analyzing High Resolution 3D Digital Outcrop Geology: Deepwater Jackfork Sandstone at Big Rock Quarry, Arkansas
Authors: C. L. V. Aiken, M. I. Olariu, M. Wang, J. P. Bhattacharya and J. F. Ferguson
Qualitative facies distributions and quantitative bed/channel dimensions in three-dimensional virtual outcrops, derived using ground-based remote sensing and the analysis of terrain surfaces, form a basis for geologic mapping and interpretation of the deepwater deposits at Big Rock Quarry, Arkansas, located in the southeastern part of the Ouachita Mountains in North Little Rock (Fig. 1). Three-dimensional views of the lower part of the upper Jackfork Group (Olariu et al., 2008) allow three-dimensional reconstruction of the facies architectural elements: stacked channels that lack levees and overflow deposits, forming a submarine channel complex deposited at the base of slope, estimated at 9.6 km by 16 to 24 km and pinching out 4 km north of the quarry. Flow indicators are oriented west-southwest.
Automated Methods for Fully Exploring and Interpreting LIDAR Data Points
By: S. Viseur
LIDAR scanning combined with digital photograph mapping techniques (Bellian et al., 2005) has become a privileged tool for obtaining a 3D georeferenced reconstruction of an outcrop, often termed a DOM (Digital Outcrop Model). For several years, many geoscience applications have used DOMs as support for manual interpretations of strata or fractures and for facies mapping. However, the LIDAR tool produces huge datasets that quickly become difficult to manipulate interactively and hence to interpret. A new challenge in geomodelling is therefore to extract geological features from a DOM in an automated way. Different kinds of strategies have been proposed in the literature, based on either LIDAR points or DOMs. For example, some authors have proposed using maximum curvature values (Ahlgren et al., 2003) to obtain statistics about fracture networks (orientations, density). Automated detection methods have been presented in Kudelski et al. (2009), Viseur (2008) and Viseur et al. (2009); these are applied to DOMs and aim at extracting, as polygonal lines, the strata or fracture paths observed along the outcrop. Other authors (Garcia-Selles et al., 2008; Franceschi et al., 2009) use properties computed from LIDAR data points (geometrical attributes) or directly available from them (intensity) to highlight or detect geological features. Finally, approaches based on ‘ant tracking’ algorithms applied to the colours of the mapped pictures have been proposed (Monsen et al., 2007). In this paper, a series of algorithms is presented, integrated into a workflow to fully explore and interpret numerical outcrops, from data points to the construction of horizon or fracture surfaces. Indeed, working on DOMs requires building surfaces from very dense multivalued XYZ data points, which is time-consuming and generally leads to mesh decimation in order to obtain triangulated surfaces that are light enough to manipulate; these operations may damage the information contained in the topographic geometry. Working directly on the XYZ data points may therefore be a good alternative, and allows the display of subtle relief signals. Moreover, LIDAR instruments are undergoing new developments, and LIDAR data points with RGB flags are increasingly provided. The proposed approach first extracts, as polygonal lines, the limits of geological objects from the LIDAR data points; surfaces may then be built to model the detected fractures or strata interfaces.
GeoAnalysis Tools - ArcScene Extension for the Analysis of 3D Geological Outcrop Models
Authors: L. White, C. Aiken, M. Alfarhan and J. Cline
GeoAnalysis Tools is an ESRI ArcScene extension developed by Geological & Historical Virtual Models, LLC, based upon research performed in collaboration with the University of Texas at Dallas. GeoAnalysis Tools allows for the interactive analysis of the orientation and deformation features of a 3D model of a geological outcrop. The model can be either a photorealistic solid model, constructed by draping photographs on a triangulated irregular network (TIN) derived from a LIDAR point cloud, or the point cloud itself. Basic field measurements such as strike and dip, trend and plunge, and bedding thickness can be made on the 3D model. The program provides for the extrusion of features in a trend-plunge direction to help elucidate the nature of the deformation. Down-plunge cross-sections are rapidly created from traces of features such as bedding contacts and displayed in ArcMap.
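For context, the strike-dip measurement such a tool makes reduces to fitting a plane through picked points on the model. A minimal sketch (not the extension's actual code) using three picks:

```python
import numpy as np

def strike_dip(p1, p2, p3):
    """Strike azimuth and dip (degrees) of the plane through three XYZ picks.

    Coordinates are assumed right-handed with x = east, y = north, z = up.
    """
    n = np.cross(np.subtract(p2, p1), np.subtract(p3, p1))
    if n[2] < 0:                      # make the normal point upward
        n = -n
    dip = np.degrees(np.arctan2(np.hypot(n[0], n[1]), n[2]))
    dip_dir = np.degrees(np.arctan2(n[0], n[1])) % 360.0   # azimuth of steepest descent
    strike = (dip_dir - 90.0) % 360.0                       # right-hand rule
    return strike, dip

# Toy usage: a plane dipping 30 degrees due east gives strike 0, dip 30.
print(strike_dip([0, 0, 0], [0, 10, 0], [10, 0, -np.tan(np.radians(30)) * 10]))
```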
Challenges and Pitfalls of Modelling Old and Deep Petroleum Systems: Examples from North Africa and the Middle East
Authors: J. Craig, D. Grigo, A. Rebora, G. Serafini and E. Tebaldi
Older and deeply buried petroleum systems are usually characterised by complex geological histories, and this is certainly the case for the Neoproterozoic and Palaeozoic petroleum systems of North Africa and the Middle East. In these systems, the efficiency of the source rocks and the potential to generate, migrate and trap hydrocarbons in a time frame that allows the hydrocarbons to be retained are often the most critical risks. Hydrocarbons can usually only be trapped for a few tens of millions of years, because even the most perfect seals are permeable over longer periods of time. One of the most critical issues determining the efficiency of older and/or deeply buried petroleum systems is therefore their burial history, and specifically the existence of a ‘late’ burial phase that allows hydrocarbons to be generated, expelled, migrated and trapped in a suitably recent time frame. Exceptions, such as the Neoproterozoic petroleum systems of the Amadeus and Officer basins of Australia or the Late Neoproterozoic-Early Cambrian petroleum systems of Oman and the Indian subcontinent, generally occur where evaporite super-seals are present and/or where the post-trapping history is dominated by extreme tectonic stability.
Automated reconstructions of sedimentary basins in frontier area
Authors: L. H. Rüpke and D. W. Schmid
The self-consistent reconstruction of the thermal, tectonic and stratigraphic evolution of sedimentary basins is a challenging task. Good results have been obtained (e.g. Bellingham and White, 2002; Fjeldskaar et al., 2004; Kooi et al., 1992; Poplavskii et al., 2001; Rüpke et al., 2008) based on McKenzie's pioneering work (1978). However, with current petroleum prospects moving further and further into frontier areas, characterized by deep water and extreme stretching of the lithosphere, the McKenzie approach no longer suffices to obtain a valid reconstruction. The additional physics required includes depth-dependent stretching, the formation of new oceanic crust, and mineral phase transitions. We have implemented all standard mechanisms, as well as these frontier-area-relevant ones, in a software package called TECMOD2D, which allows automated thermo-tectono-stratigraphic reconstructions of sedimentary basins. Key to this is the coupling of a forward model to an inverse scheme for automated parameter updates. The forward model resolves simultaneously lithosphere processes (e.g. thinning, flexure, temperature) and sedimentary basin processes (e.g. sedimentation, compaction, maturation). The inverse algorithm automatically updates the crustal and mantle thinning factors as well as the paleo-water depth until the input stratigraphy is fitted to a desired accuracy.
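For orientation, the McKenzie (1978) model the abstract builds on predicts post-rift thermal subsidence of the familiar exponential form below (standard textbook notation, quoted for context and not taken from this paper; β is the stretching factor, a the lithosphere thickness, κ the thermal diffusivity):

```latex
S(t) \approx E_0 \,\frac{\beta}{\pi}\sin\!\left(\frac{\pi}{\beta}\right)
\left( 1 - e^{-t/\tau} \right),
\qquad
E_0 = \frac{4 a \rho_m \alpha_v T_1}{\pi^2 (\rho_m - \rho_w)},
\qquad
\tau = \frac{a^2}{\pi^2 \kappa}
```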
Basin modelling of the Ghana transform margin: implications for the structural, thermal and hydrocarbon evolution of the Tano Basin
Authors: L. H. Rüpke, D. W. Schmid, E. H. Hartz and B. Martinsen
This study explores the structural and thermal evolution of the Ghana transform margin. The main objective is to explore how the opening of the Atlantic Ocean and the subsequent interaction with the Mid-Atlantic Ridge (MAR) have affected the margin's structural and thermal evolution. Two representative evolution scenarios are described: a reference case that neglects the influence of continental break-up, and a second scenario that does account for a possible heat influx during the passage of the MAR as well as magmatic underplating. These two scenarios have further been analyzed for their implications for the hydrocarbon potential of the region. The scenario analysis builds on a suite of 2D realizations performed with TECMOD2D, a modeling software package for automated basin reconstructions. Taking the presently observed stratigraphy as input, the structural and thermal evolution of the basin is automatically reconstructed through the coupling of a lithosphere-scale forward model with an inverse algorithm for model parameter optimization. We find that lateral heat transport from the passing MAR, in combination with flexure of the lithosphere, can explain the observed uplift of the margin. These results were obtained for a broken-plate elasticity solution with relatively large values for the effective elastic thickness (Te = 15) and the necking level (15 km). Lateral heat flow from the oceanic lithosphere is clearly visible in elevated basement heat-flow values up to 50 km away from the OCT. This influx of heat does not, however, seem to have significantly affected the maturation history along the margin: only the deepest sediments close to the OCT show slightly elevated vitrinite reflectances in simulations that account for the passage of the MAR. In conclusion, it appears that lateral heat transport from the oceanic lithosphere is instrumental in shaping the Ghana transform margin but has only limited control on the maturation history.
What can we expect from process-based source rock modelling: Examples from high and low resolution data sets
By: U. Mann
To be able to quantify geological processes in a distinct part of a sedimentary basin, two prerequisites are essential: first, a reasonable description of the most relevant processes taking place, and second, the description of these processes in three dimensions. The process-based source rock modelling software OF-Mod 3D aims at predicting source rock units in sedimentary basins in terms of distribution and properties. It simulates the most important processes relevant to organic matter accumulation in sediments, and the interactions between them. The modelled processes are: supply and distribution of marine and terrigenous organic matter, degradation in the water column, burial efficiency at the sea floor under oxic and anoxic (oxygen-minimum zones, anoxic bottom water) conditions, and dilution of the organic matter with siliciclastic sediments. The results can be calibrated to, or simply compared with, analytical data from well samples. The advantage of such process-based modelling of organic sedimentation is that the process descriptions substitute to some degree for missing data, giving the modelling predictive power. In addition, complex parameter interactions are taken into account, and the influence of each control parameter can be identified easily. In terms of petroleum systems modelling, it is also notable that the process-based forward modelling approach yields initial, not maturity-altered, source rock properties. This is important because geochemical data from exploration wells are often heavily maturity-altered and thus provide no further information on the source rock properties needed as input to hydrocarbon generation and migration modelling.
Which equations to pick: a comparison of equations for calculating marine organic carbon deposition
Any quantitative description of a geological process requires a mathematical model describing the relevant processes, as well as values for the input parameters. The processes involved in the deposition and preservation of marine organic matter include the flux of the primary-produced organic matter from the sea surface to the sea floor, the burial efficiency of the material that reaches the bed, and finally the amount of total marine organic carbon that is preserved in the deposit. These three processes are commonly modelled using empirical equations, mostly derived from fits to modern datasets. A range of equations exists for each process, derived by different authors from different datasets (although older datasets are commonly included in newer derivations). This means that a range of answers can be expected when different combinations of equations are used to describe marine organic carbon deposition in a given area. The input parameters for the equations describing these processes are primary productivity, water depth, sedimentation rate, and oxygen conditions at the bed. For present-day simulations, the input parameters are available from measurements. This is unfortunately not the case for simulations of the geological past, but in that case they can be estimated from data measured in cores. The parameter estimation can be done using the same empirical equations as are used for the process descriptions; as a range of equations exists, a range of estimated values can again be expected. Several equations and combinations of equations were used to investigate the range of answers that different approaches give. The different equations were used in a Monte Carlo simulation of the calculation of marine organic carbon values, to estimate values of primary productivity from published core data (the other input parameters were obtained from the core measurements), and to simulate the spatial distribution of marine organic matter with the forward model OF-Mod (Organic Facies Model).
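To make the "range of answers" point concrete, here is a minimal Monte Carlo sketch in the spirit the abstract describes. The carbon-flux relation follows Suess (1980); the burial-efficiency relation is a hypothetical power-law placeholder standing in for the family of published fits, so all numbers are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 10_000

# Input parameters drawn from assumed ranges (illustrative only).
pp = rng.uniform(50.0, 250.0, n)        # primary productivity, gC/m2/yr
z = rng.uniform(200.0, 2000.0, n)       # water depth, m
sr = rng.uniform(1.0, 50.0, n)          # sedimentation rate, cm/kyr

# Organic carbon flux to the sea floor after Suess (1980).
c_flux = pp / (0.0238 * z + 0.212)      # gC/m2/yr

# Burial efficiency vs sedimentation rate: HYPOTHETICAL power-law
# placeholder for the several published empirical fits.
be = np.clip(0.03 * sr**0.6, 0.0, 1.0)  # fraction buried

c_buried = c_flux * be
print(f"buried C flux: median {np.median(c_buried):.2f}, "
      f"5-95% range {np.percentile(c_buried, [5, 95])} gC/m2/yr")
```

Swapping in a different flux or burial-efficiency equation and re-running reproduces exactly the spread of answers the abstract investigates.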
Numerical Modelling of Hydrothermal Fault-Related Dolomitization
Authors: F. H. Nader, J.-M. Daniel, O. Lerat and B. Doligez
Classical diagenesis studies make use of a wide range of methods and analytical techniques in order to propose conceptual models that explain specific, relatively well time-framed diagenetic processes (such as dolomitisation) and their impact on reservoirs. Modern techniques usually combine petrographic analyses (by means of conventional, cathodoluminescence, fluorescence and scanning electron microscopy), geochemical measurements (major/trace elements, microprobe, stable oxygen and carbon isotopes, Sr radiogenic isotopes) and fluid-inclusion analyses, providing independent arguments to support the proposed model. Still, conceptual models are qualitative and do not yield "real" data for direct use by reservoir engineers in rock-typing and geomodelling. This contribution provides new insights into the numerical modelling of hydrothermal dolomitisation.
Challenges in the modelling of hydrocarbon systems from seismic cubes
By: Ø. Sylta
Seismic data have long been used to build geologic models for basin modeling purposes. The basin models used in migration studies have typically been built as 2D section profiles (Figure 1), but over the last 10 years we have seen 3D stratigraphic geometries being built from interpreted seismic horizons. The interpretations are depth-converted and merged into a 3D structural framework, and the "inside" of the layers is thereafter populated with flow properties from geological libraries. These libraries are often very elementary in their representation of the flow properties, resulting perhaps in overly simple hydrocarbon migration flow patterns in the modeled basins.
Integration between Pore Pressure Prediction and Petroleum System Modelling Methodologies
Authors: P. Sibin, M. Della Martera, M. Tonetti and C. Andreoletti
The need to satisfy the world's demand for oil & gas is pressing oil companies to drill in conditions that are becoming harder and harder in terms of the geopressure environment. In exploration, pore pressure prediction is critical for the evaluation of vertical and lateral sealing and the estimation of the maximum possible hydrocarbon column in place, and consequently for ranking prospects and evaluating their economics. Moreover, overpressures have caused serious problems during drilling operations, in the past as well as at the present time; for all these reasons, geopressure prediction is important in defining the best well design, in order to reduce non-productive time (NPT), costs and reservoir damage. To build an appropriate model and face this complex problem in the right way, the necessary information has to be collected, interpreted, elaborated and evaluated by several disciplines; for this kind of problem, geology, geophysics and engineering have to be strongly integrated to give their best. It has been known for many years that information about geopressure can be derived from seismic velocities, and several such relationships exist and have been applied with considerable success. Generally, however, in complex and deep areas, conventional velocity fields derived from seismic time processing are not accurate enough for a correct pore pressure prediction. For this reason, several methods have been developed to obtain more appropriate velocity fields.
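One widely used velocity-to-pressure relationship of the kind the abstract alludes to is Eaton's (1975) method. The sketch below applies it with the common exponent of 3; the gradients and the linear normal-compaction velocity trend are illustrative assumptions, not values from the paper:

```python
import numpy as np

def eaton_pore_pressure(depth_m, v_obs, v_norm, ovb_grad=22.6, hydro_grad=10.0):
    """Pore pressure (MPa) from interval velocities via Eaton's method.

    depth_m    : depths in metres
    v_obs      : observed interval velocity (m/s)
    v_norm     : normal-compaction trend velocity (m/s)
    ovb_grad   : overburden gradient, MPa/km (illustrative value)
    hydro_grad : hydrostatic gradient, MPa/km (fresh water ~10 MPa/km)
    """
    s = ovb_grad * depth_m / 1000.0          # overburden stress, MPa
    p_hyd = hydro_grad * depth_m / 1000.0    # hydrostatic pressure, MPa
    return s - (s - p_hyd) * (v_obs / v_norm) ** 3

# Toy usage with a hypothetical linear normal-velocity trend.
depth = np.array([1000.0, 2000.0, 3000.0])
v_normal = 1800.0 + 0.6 * depth                  # m/s, assumed trend
v_observed = np.array([2400.0, 2700.0, 2900.0])  # slower than trend -> overpressure
print(eaton_pore_pressure(depth, v_observed, v_normal))
```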
Seismic imaging solutions by multi-geophysical measurements and joint inversion
Authors: D. Colombo and T. Keho
A wide range of near-surface geological features challenge seismic acquisition and processing in arid environments: sand dunes, collapsed karsts, dry river beds, sabkhas, outcropping refractors, high-velocity near-surface layers, velocity reversals, layered basalts and rough topography, to cite a few. These features introduce sharp velocity changes in the vertical and horizontal directions that are difficult to model using seismic data alone (e.g. velocity inversions, karsts). As a consequence, their imprint remains in the seismic images from the surface down to reservoir depths. The problems introduced by unresolved near-surface velocity anomalies range from a lack of seismic image quality to the misidentification of prospective low-relief structures and erroneous depth conversions. Conventional statics and seismic acquisition practices often fail in areas with complex near-surface conditions; therefore new, even unconventional, approaches should be considered to address the near-surface challenge. Among these, non-seismic methods such as precision gravity, shallow electromagnetics (EM) and/or electrical resistivity techniques can be effective in reconstructing near-surface features correlated to seismic velocity anomalies.
Integration of seismic, well, potential-field and geological data for ore prospecting in the Iberian Pyrite Belt
Authors: J. Carvalho, P. Sousa, J. X. Matos and C. Pinto
Ore prospecting using gravimetric and magnetic data has been one of the traditional approaches of recent decades, often complemented with electric and electromagnetic methods. However, because of the non-uniqueness inherent in potential-field modelling, constraints provided by structural methods such as seismic reflection are often used. During the exploration for massive sulphide polymetallic minerals in the Figueira de Cavaleiros sector of the Iberian Pyrite Belt, located in the Sado Tertiary Basin, several gravimetric and magnetic anomalies were considered interesting targets. To reduce the ambiguity of the gravimetric modelling and to confirm the geological model of the area, two seismic reflection profiles were acquired. The interpretation of these profiles was assisted by three mechanical boreholes, two of them located in the research area, enabling a seismostratigraphic interpretation. Unfortunately, the gravimetric modelling suggests that the anomaly has a lithological and structural origin and is not related to massive sulphides. Nevertheless, good agreement between the seismic and potential-field data was achieved, and this work yielded new insights into the geological model for the region, with accurate data on the Tertiary cover and Palaeozoic basement.
Possibilities for multidisciplinary, integrated approaches in near-surface geophysics
By: R. Ghose
We show that it is possible, under certain boundary conditions, to integrate different methods or disciplines based on the underlying physics in order to address near-surface characterization challenges. The benefits are improved efficiency and marked enhancements in accuracy and reliability. Very divergent disciplines (e.g. small-strain seismic VS and large-strain geotechnical CPT qc) can be integrated provided there is convexity in the property domain. It is important to reduce the different observations to comparable scales. The integration approaches based on poroelasticity theories show promising results on field data, even at low frequencies, and appear to be robust against noise and against uncertainties in the data and physical models.
The use of structurally coupled cooperative inversion in conjunction with cluster analysis towards a comprehensive subsurface characterization
Authors: T. Günther, C. Rücker and M. Müller-Petke
The use of multiple physical principles and datasets is a common rule in geophysics for narrowing the range of possible interpretations. In most cases, however, this is done at the interpretation level; a more rigorous reduction of ambiguity can be achieved by coupling at the inversion level. To combine data that are not directly related to each other, two main options exist:
- the use of petrophysical relations to map the output parameters to a common parameter set;
- the structural coupling of otherwise independent inversions, based on the model characteristics.
We use the latter, for which various approaches have been presented. Günther & Rücker (2006) used a generalized smoothness-constrained inversion scheme, and on this basis Günther & Bentley (2006) presented a structural coupling between resistivity and velocity using the gradients of the individual models. An IRLS function is used to predict weights for the model boundaries based on the co-located model gradients of the other method. As a result, we obtain two physical properties on the same discretisation. Further methods such as cluster analysis can then be used to produce a comprehensive subsurface model. Fuzzy c-means clustering yields not only the cluster membership of each model cell; a matching function can also be derived that is of valuable help in the interpretation.
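The fuzzy c-means step is generic enough to sketch. Below is a minimal, self-contained implementation (an illustration, not the authors' code) operating on co-located property pairs such as resistivity and velocity per model cell:

```python
import numpy as np

def fuzzy_c_means(x, n_clusters=3, m=2.0, n_iter=100, seed=0):
    """Generic fuzzy c-means: returns cluster centres and memberships.

    x : (n_cells, n_properties) array, e.g. co-located resistivity
        and velocity values on the shared discretisation.
    m : fuzziness exponent (> 1).
    """
    rng = np.random.default_rng(seed)
    u = rng.random((n_clusters, len(x)))
    u /= u.sum(axis=0)                      # memberships sum to 1 per cell
    for _ in range(n_iter):
        um = u ** m
        centres = (um @ x) / um.sum(axis=1, keepdims=True)
        d = np.linalg.norm(x[None, :, :] - centres[:, None, :], axis=2) + 1e-12
        u = d ** (-2.0 / (m - 1.0))         # standard FCM membership update
        u /= u.sum(axis=0)
    return centres, u

# Toy usage: two-property model cells falling into two groups.
x = np.vstack([np.random.default_rng(1).normal(0, 0.1, (50, 2)),
               np.random.default_rng(2).normal(1, 0.1, (50, 2))])
centres, memberships = fuzzy_c_means(x, n_clusters=2)
print(centres)          # near (0, 0) and (1, 1)
```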
Surface-Subsurface Integration Reveals Faults in Gulf of Suez Oilfields
Authors: A. Laake, M. Sheneshen, C. Strobbia, L. Velasco and A. Cutts
…ling software for the surface-subsurface integration. The joint analysis of Rayleigh-wave data with satellite imagery provides a near-surface structural geologic model, which can be interpreted for shallow drilling risks related to fault outcrops. The suite of near-surface geological products – Rayleigh-wave velocity mapping, short-offset ray-parameter interferometry and shallow fault mapping – is enabled by the acquisition, processing and interpretation of point-receiver seismic data. For the first time, detailed structural geology comprising faults and lithology changes was imaged in the near surface, a data regime that is conventionally contaminated by the seismic acquisition footprint.
From independent data to comprehensive models
Authors: M. Mueller-Petke, T. Guenther and U. Yaramanci
Geophysical exploration has become more and more multi-parameter and multi-method driven over recent decades. Such datasets allow subsurface properties to be obtained, connected and interpreted more reliably. However, the potential of these datasets is often left unused, and interpretation is reduced to independent inversions. We give an overview of the basic principles of, and differences between, approaches that use the full potential of such datasets, and we show examples using data from Magnetic Resonance Sounding (MRS) and geoelectrics.
In-situ permeability from physics-based integration of poroelastic reflection coefficients
Authors: K. van Dalen and R. Ghose
A reliable estimate of the in-situ permeability of a porous layer in the subsurface is extremely difficult to obtain. We have observed that, in the field seismic frequency band, the poroelastic behaviour of different seismic wave modes can differ in such a way that their combination gives unique estimates of in-situ permeability and porosity simultaneously. We have integrated the angle- and frequency-dependent poroelastic reflection coefficients of different seismic wave modes, and have tested the results through numerical simulations. The estimated values of permeability and porosity appear to be robust against uncertainties in the assumed poroelastic attenuation mechanism. Potential applications of this approach exist in hydrocarbon exploration, hydrogeology and geotechnical engineering.
Recent advances and open problems in the integration of near-surface geophysical data
Authors: A. Vesnaver, D. Nieto, L. Baradello, M. Romanelli and A. Vuan
The integration of different geophysical techniques is the best way to reduce the ambiguities of any single prospecting method when characterizing geobodies by their rock properties. Seismic imaging is the main tool for delineating deep targets in 3D, but its quality may increase when near-surface effects are compensated for by gravity or electromagnetic methods (den Boer et al., 2000; Dell'Aversana, 2003; Colombo et al., 2008, 2010; among others). Classical refraction statics, in fact, break down when velocities are not monotonically increasing or when the shallow formations are very inhomogeneous. In the last decade, new contributions have been emerging from unusual information sources such as vibrator controllers (Al-Ali et al., 2003; Ley et al., 2006) and geological maps and satellite imagery (Vesnaver et al., 2006b, 2009; Laake et al., 2008). Here we review some of these recent results and highlight a few problems that require further analysis. We also describe an ongoing experiment for expanding the bandwidth of active seismic surveys by integrating them with passive ones.
-
-
-
In-situ soil properties from transmission seismic measurements using frequency-dependent wave attributes
Authors A. Zhubayev and R. Ghose
A new concept for a physics-based integration of the velocity and attenuation of seismic waves in the shallow subsoil is proposed and tested. Theories of poroelasticity that explain frequency-dependent seismic wave propagation are explored. The integration leads to the simultaneous estimation of two or more important soil properties under undisturbed, in-situ conditions, which is otherwise difficult if not impossible to achieve. The results of application to field data look promising.
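A minimal sketch, with hypothetical placeholder relations, of why combining two frequency-dependent attributes constrains two soil properties: the joint sensitivity matrix is well conditioned, whereas either attribute alone leaves a near-null direction in the (porosity, saturation) plane.

    import numpy as np

    def velocity(phi, s, f):
        # placeholder frequency-dependent velocity relation, km/s
        return 1.5 - 0.8 * phi + 0.05 * s + 1e-4 * f

    def attenuation(phi, s, f):
        # placeholder attenuation relation with different sensitivities
        return 0.01 + 0.05 * phi + 0.04 * s - 1e-5 * f

    freqs = np.array([20.0, 50.0, 100.0])
    phi0, s0, eps = 0.3, 0.8, 1e-6

    def jacobian(func):
        # finite-difference sensitivities w.r.t. porosity and saturation
        d_phi = (func(phi0 + eps, s0, freqs) - func(phi0, s0, freqs)) / eps
        d_s = (func(phi0, s0 + eps, freqs) - func(phi0, s0, freqs)) / eps
        return np.column_stack([d_phi, d_s])

    J_joint = np.vstack([jacobian(velocity), jacobian(attenuation)])
    print(np.linalg.cond(jacobian(velocity)))  # huge: one attribute is ambiguous
    print(np.linalg.cond(J_joint))             # modest: two attributes resolve both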
-
-
-
Static and dynamic aspects of near surface characterization through physics-based integration of GPR, ERT, SIP and SP data in the time-lapse mode
Authors G. Cassiani, A. Binley, A. Brovelli, R. Deiana, P. Dietrich, A. Flores, A. Kemna, E. Rizzo and U. Werban
The use of geophysics for the characterization of the near surface increasingly requires that data be analysed quantitatively, so that they offer meaningful information to the specific discipline under investigation. This is true for all applications, including environmental studies, hydrology, soil science and geotechnics. This tendency effectively supersedes the classical approach to geophysics as a pure imaging technique, and requires an in-depth understanding of the information contained in each specific physical measurement. Irrespective of the application, the geophysical response of the near surface is essentially controlled by a combination of geological (“static”) and ambient (“dynamic”) factors, the latter including moisture content and temperature variations. Separating static and dynamic factors is the key step towards a quantitative use of near-surface geophysics, as individual disciplines and applications may be interested selectively in one or more of the static or dynamic aspects, or in combinations of them. Physico-mathematical modelling is often a fundamental tool for discriminating between static and dynamic aspects and extracting the factors of specific interest for the application at hand. A link between measured geophysical quantities and the corresponding quantities of practical interest can only be established in the form of quantitative constitutive relationships. As many applications can benefit from the joint use of multivariate geophysical measurements (e.g. ERT, GPR, SIP), it would be highly advantageous to develop constitutive laws that depend on a few parameters that can be independently measured and that have a common, albeit different, impact on several geophysical data types. In this contribution we illustrate this general framework with a number of applications, including catchment hydrology, digital soil mapping, contaminated site characterization and subsurface hydrology.
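As a concrete illustration of such constitutive relationships (the abstract does not specify which laws the authors use), two well-known published relations link distinct geophysical observables to one shared "dynamic" property, volumetric moisture content: Archie's law for ERT resistivity and the Topp et al. (1980) equation for GPR dielectric permittivity.

    import numpy as np

    def archie_resistivity(theta, phi=0.4, rho_w=20.0, m=1.5, n=2.0):
        # Archie's law: bulk resistivity from porosity and water saturation
        sw = theta / phi  # water saturation
        return rho_w * phi ** (-m) * sw ** (-n)

    def topp_permittivity(theta):
        # Topp et al. (1980): dielectric permittivity from moisture content
        return 3.03 + 9.3 * theta + 146.0 * theta**2 - 76.7 * theta**3

    for theta in (0.10, 0.20, 0.30):
        print(theta, archie_resistivity(theta), topp_permittivity(theta))

Because both observables respond to the same moisture content, jointly honouring ERT and GPR data through such laws is what separates the dynamic factor from the static, geological background.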
-
-
-
Seismic body and surface wave data integration for near surface characterisation
Authors L. V. Socco, D. Boiero, S. Foti and C. Piatti
Seismic methods are widely used in near-surface characterisation, and different seismic datasets relating to body and surface waves are very often acquired at the same site. In the majority of cases these data are acquired and interpreted separately to provide different information, disregarding the synergies between methods in both acquisition and inversion. In particular, the joint or constrained inversion of different datasets may overcome the intrinsic limitations of individual techniques and provide a more reliable and consistent final velocity model. Moreover, the complementary information from different datasets provides a more comprehensive site characterisation.
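A minimal sketch of a constrained inversion in the spirit described above (not the authors' algorithm): surface-wave dispersion data are inverted for layer shear velocities while a penalty term keeps the model consistent with an interface depth picked from body-wave refraction data. The dispersion operator below is a hypothetical placeholder.

    import numpy as np
    from scipy.optimize import least_squares

    def forward_dispersion(model, freqs):
        vs1, vs2, depth = model
        # placeholder: phase velocity tends from vs2 (low f) to vs1 (high f)
        w = np.exp(-freqs * depth / 100.0)
        return 0.92 * (w * vs2 + (1.0 - w) * vs1)

    freqs = np.linspace(5.0, 60.0, 30)
    obs = forward_dispersion([200.0, 450.0, 8.0], freqs)  # synthetic data
    depth_refr, weight = 8.5, 5.0  # refraction-derived depth and its weight

    def residuals(model):
        fit = forward_dispersion(model, freqs) - obs
        constraint = weight * (model[2] - depth_refr)  # body-wave constraint
        return np.append(fit, constraint)

    sol = least_squares(residuals, x0=[300.0, 300.0, 5.0])
    print(sol.x)  # Vs1, Vs2 and a depth pulled toward the refraction pick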
-
-
-
The Limits of Automatic History Matching
By B. Davies
After thirty-plus years of development of commercial reservoir simulators, and twenty-plus years of research into history matching, manual and otherwise, we continue to be surprised by what our new wells encounter in the dynamically changing subsurface, and by what our old wells produce. What are the limits of what is achievable by automatic history matching? One approach to this question is to posit the existence of an infallible automatic history matcher of some description, and then to consider what the implications would be for oilfield operational practice. The author looks at the ways in which ostensibly revolutionary technological breakthroughs are actually adopted and normalised by practicing engineers, and at the long-term implications for the delivery of their early promise of savings in time, money or skilled labor. Several examples are introduced from the recent history of other information-driven industries, and from the author's field experience in the delivery and application of predictive reservoir models in the different phases of the reservoir lifecycle. In many cases, so-called "automatic" history matchers find their greatest utility not as black boxes that deliver a perfect model, but as guides to the intelligent use of more conventional manual matching techniques. What are the differences between a "perfect matcher" and a "helpful matching assistant"? Can both these design goals be achieved in a single piece of software, or is a different architectural approach required? Finally, the author speculates about the implications of these findings for the future of reservoir modelling practice, and considers how the non-specialist might be better served by the technology providers.
-
-
-
Conditioning the models with … uncertainties
By T. V. Nguyen
Geomodelling has become the principal tool for geological representation of the subsurface over the last 10-15 years. It has undergone important technical and commercial development and growth, and as a consequence has also led the way to a rapid evolution of reservoir modelling and of the use of dynamic data. In particular, the contribution of geostatistical methods has been a key factor for success. Figure 1 shows the classical streamline process from data processing/analysis and interpretation through geomodelling and reservoir simulation to the final evaluation of IHIP and production/reserves. The interesting point is that many feedback loops exist today, not only from the reservoir model to the geomodel but also from the geomodel back to the various earlier data processing/analysis and interpretation steps. These feedback loops clearly identify the need to go further and further upstream in order to better condition the final reservoir model to field monitoring and production history data. The practice of geomodelling, and now of these feedback loops, increases the need for team integration and a cross-discipline approach. One important reason for this need is the presence of uncertainties within the different types of data, generally due to the scarcity and quality of the acquisition. Processing and interpretation can sometimes become so difficult that only data integration can help to relieve the situation.
-
-
-
Time-lapse seismic provides key constraints to dynamic models
By P. Hatchell
Time-lapse seismic is one of the few technologies that provide a full-field areal picture of what is happening in the subsurface, and it is routinely used to update static and dynamic models. The technology is mature in some parts of the world (marine, high-porosity settings) and progress is continually being made in more difficult areas (onshore, HPHT, near infrastructure, lower porosity). Under the right conditions, time-lapse seismic is a proven method for detecting and imaging differences due to changed fluid saturation and pore pressure inside the reservoir, and deformations, such as those related to reservoir compaction, outside the reservoir. This capability often provides information on: (i) the progress of an injected fluid front, (ii) the ingress of an aquifer, (iii) the expansion of a gas cap, (iv) gas evolved due to depletion below bubble point, and (v) the distribution of reservoir compaction. The areal and vertical resolution of this information is typically at the scale of tens of meters. The technology thereby addresses important uncertainties in our knowledge of reservoir connectivity and heterogeneity.
-
-
-
From History to Prediction – Techniques for Conditioning Reservoir Models to Dynamic Data
Reservoir simulation models should always be built for specific business goals. It is an accepted rule that models used for production forecasts should reproduce the production history. Most history matching processes are the result of a complex team effort, but the objectives for using simulation models and the required level of detail are quite diverse. Applications range from prospect evaluation with limited available calibration data to the design of detailed production planning scenarios for mature fields with highly constraining well production data. In either case, the applied techniques, workflow requirements and level of complexity will naturally differ. In recent years, assisted history matching techniques and optimization workflows have been established and included in best-practice guidelines at an increasing number of companies in the oil and gas industry. The application of assisted history matching techniques is often motivated by the need to handle increasingly complex problem statements, as well as by the desire to improve workflow efficiency and transparency. Initially, the focus was on finding single best models. Modelling paradigms, however, are changing. More recently, the industry has shown stronger interest in understanding a distribution of alternative scenarios that more realistically captures the uncertainty envelope. This step is non-trivial, since there is no natural extension from the paradigm of single best history-matched models with deterministic forecasting capabilities to that of establishing a distribution of alternative production forecasts. This poses a major challenge to the reservoir engineering workflow and raises the question of how to handle multiple models with alternative outcomes. This talk reviews selected techniques used in history matching workflows and discusses practical considerations for finding a compromise between “accurate” history-matched models with deterministic forecasting capabilities and the newer paradigm of sufficient coverage of the uncertainty space for establishing uncertainty distributions.
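A minimal sketch of the multiple-models idea described above, not any specific assisted-history-matching product: sample many parameter sets, keep every model whose simulated history falls within a misfit tolerance, and use the accepted ensemble to spread the forecasts. The simulate() function is a hypothetical placeholder for a reservoir simulator.

    import numpy as np

    rng = np.random.default_rng(0)
    t_hist, t_fcst = np.arange(10.0), np.arange(10.0, 20.0)
    obs = 100.0 * np.exp(-0.08 * t_hist)  # observed production history

    def simulate(perm, drive, t):
        # placeholder: decline rate controlled by two uncertain parameters
        return 100.0 * drive * np.exp(-0.1 * perm * t)

    accepted = []
    for _ in range(5000):
        perm = rng.uniform(0.4, 1.6)    # uncertain permeability multiplier
        drive = rng.uniform(0.8, 1.2)   # uncertain aquifer drive strength
        misfit = np.mean((simulate(perm, drive, t_hist) - obs) ** 2)
        if misfit < 2.0:                # tolerance defining a "match"
            accepted.append(simulate(perm, drive, t_fcst))

    forecasts = np.array(accepted)      # one forecast per matched model
    print(len(forecasts), forecasts[:, -1].min(), forecasts[:, -1].max())

The spread of the accepted forecasts, rather than any single "best" curve, is what characterizes the uncertainty envelope discussed in the talk.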
-
-
-
The evolution of HPC and its opportunities and challenges for Seismic Imaging
By N. Bienati
One of the most important factors for the Oil & Gas industry (as for any other industry) is the ability to make predictions about the future. In this workshop we are concerned in particular with forecasts about the future of HPC and its impact on the seismic imaging industry. Needless to say, everyone can predict that hardware performance will continue to increase; the interesting question is: by how much? One reliable answer can come from the Top500 list. Professor Hans Meuer of the University of Mannheim, one of the fathers of the Top500, has shown (Meuer, 2008) that, according to historical data, the performance of the system at the bottom of the list follows a linear trend on a logarithmic scale (see Figure 1). The rate of growth is around 2x every 13 months, faster than Moore’s law, which assumes 2x every 18 months. The close fit of the data to this trend gives good confidence in using it for extrapolation, and the result is the prediction that between 2015 and 2016 all the systems in the list will exceed a performance of 1 Petaflop/s. Likewise, it is not unreasonable to predict that this figure will become the minimum standard for all the major players in the seismic imaging industry, both oil companies and service companies. This is certainly good news for seismic imaging applications such as Reverse Time Migration, Full Waveform Inversion and seismic modelling, which are amongst the most compute-intensive.
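A minimal sketch of the extrapolation arithmetic, using only the growth rate quoted in the text (2x every 13 months); the 2008 baseline value below is an assumed illustrative figure, not taken from the abstract.

    import math

    def extrapolate(perf0, years, doubling_months=13.0):
        # log-linear growth: performance doubles every `doubling_months`
        return perf0 * 2.0 ** (12.0 * years / doubling_months)

    perf_2008 = 0.009  # assumed entry-level Top500 performance, Pflop/s
    for year in (2010, 2012, 2014, 2016):
        pf = extrapolate(perf_2008, year - 2008)
        print(year, round(pf, 3), "Pflop/s")

With this assumed baseline, the curve crosses 1 Pflop/s between 2015 and 2016, consistent with the prediction in the abstract.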
-
-
-
Trends in Multicore Processors
By A. González
Moore’s law has fuelled a dramatic evolution of the microprocessor and will keep doing so in forthcoming generations. Microprocessor designers have leveraged improvements in process technology to enhance the microarchitecture of processors in different ways. In this quest to deliver higher performance, the whole industry has recently started a journey into the land of multicores. Multicores are very effective at increasing computing density, by increasing the number of processing units generation after generation. The scalability of multicore processors faces multiple challenges that will require significant innovation in applications, programming paradigms and tools, and architectures. In this talk I will describe some of the research avenues being pursued to address these challenges.
-
-
-
Programming Seismic Algorithms for GPUs
Authors S. Morton, T. Cullison, I. Terentyev and S. Ma
Graphics processing units (GPUs) have been shown to be capable of efficiently running computationally demanding seismic imaging algorithms, and the recent significant increase in expenditure by the petroleum industry on GPU clusters indicates that these systems are cost-effective. With this hurdle cleared, the adoption of GPUs is probably limited mainly by our ability to program seismic algorithms for GPUs. At Hess Corporation, we have moved the most computationally intensive parts of our seismic imaging codes from CPUs to GPUs over the past few years. The effort involved has varied widely from code to code, from a cost of a man-month to nearly a man-year. Our one-way wave-equation migration for GPUs is a direct port of the computational algorithm used on CPUs. The Kirchhoff code required manual optimization of many of its components. An optimized reverse-time migration library was constructed by screening a set of automatically generated kernels. In this talk we will present the computational algorithms for these seismic imaging codes and discuss our software approaches and performance results.
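A minimal NumPy sketch (not Hess's code) of the kind of kernel such ports revolve around: one acoustic finite-difference time step, the innermost loop of reverse-time migration, expressed as array operations that map naturally onto a GPU's data-parallel model.

    import numpy as np

    def fd_time_step(p_prev, p_curr, vel2_dt2):
        # second-order-in-time, second-order-in-space acoustic update:
        # p_next = 2*p_curr - p_prev + (v*dt/dx)^2 * laplacian(p_curr)
        lap = (-4.0 * p_curr
               + np.roll(p_curr, 1, axis=0) + np.roll(p_curr, -1, axis=0)
               + np.roll(p_curr, 1, axis=1) + np.roll(p_curr, -1, axis=1))
        return 2.0 * p_curr - p_prev + vel2_dt2 * lap

    n = 512
    p_prev, p_curr = np.zeros((n, n)), np.zeros((n, n))
    p_curr[n // 2, n // 2] = 1.0          # point source
    vel2_dt2 = np.full((n, n), 0.1)       # (v*dt/dx)^2, within the CFL limit
    for _ in range(100):
        p_prev, p_curr = p_curr, fd_time_step(p_prev, p_curr, vel2_dt2)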
-
-
-
Accelerating seismic processing applications with FPGAs
By O. Pell
Microprocessors have been hitting the limits of attainable clock frequencies for the past few years, resulting in the current multi-core processor solutions provided by the major microprocessor vendors. Multiple cores on a chip must share the same pins to reach the memory system and the communication channels to other machines. This leads to a “memory wall”, since the number of pins per chip does not scale with the number of cores, and a “power wall”, since chips must still be cooled within the same physical space. Many geophysically important applications, such as finite-difference modelling, downward-continuation-based migration and sparse matrix solvers, already exhibit significantly worse than linear scaling on multiple cores, a problem that is only going to worsen as the major microprocessor vendors move beyond quad-core chips to many-core architectures. Maxeler streaming accelerators implemented on Field Programmable Gate Arrays (FPGAs) allow us to bypass the memory wall by minimizing access to external memory and explicitly forwarding data on-chip at very high bandwidth (over 10 TB/s on the latest chips). The high performance attainable with such architectures has been established for a range of applications (for example [1], [2], [3]). At the same time, since FPGA performance is achieved through massive parallelism at relatively low clock frequencies (hundreds of MHz), we avoid the “power wall” and can configure our FPGA-based HPC systems very densely, with accompanying savings in operational costs for power, space, maintenance, etc.
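A minimal back-of-the-envelope sketch of why stencil codes hit the memory wall; the hardware numbers below are illustrative assumptions, not figures from the abstract. Arithmetic intensity (flops per byte moved from external memory) is compared with the machine balance, showing why on-chip data forwarding pays off.

    # 2D 5-point stencil: ~6 flops per output point; with no on-chip reuse
    # each point streams ~5 reads + 1 write of 4-byte floats.
    flops_per_point = 6.0
    bytes_no_reuse = 6 * 4.0           # naive: every neighbour from DRAM
    bytes_full_reuse = 2 * 4.0         # ideal: read once, write once

    peak_flops = 100e9                 # assumed: 100 Gflop/s multicore chip
    mem_bandwidth = 20e9               # assumed: 20 GB/s of shared DRAM pins

    balance = peak_flops / mem_bandwidth  # flops the chip can do per byte
    for label, b in (("no reuse", bytes_no_reuse),
                     ("full on-chip reuse", bytes_full_reuse)):
        intensity = flops_per_point / b
        frac = min(1.0, intensity / balance)
        print(label, round(intensity, 2), "flop/byte ->",
              round(100 * frac, 1), "% of peak")

Under these assumed numbers the naive stencil sustains only a few percent of peak, which is exactly the gap that explicit on-chip forwarding on an FPGA is designed to close.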
-
-
-
Seismic Imaging and HPC: how to preserve our investment and prepare for the future?
By H. Calandra
An extraordinary challenge the oil industry must face in hydrocarbon exploration is to develop leading-edge technologies to reconstitute the three-dimensional structure of the Earth. The seismic imaging industry is made possible by the progress of computer capacity, allowing more and more data to be processed in a shorter and shorter time; we have benefited from this extraordinary progress for almost 40 years and will continue to do so in the years to come. The seismic imaging industry is also made possible by the tremendous progress of data acquisition technology. But again, this technology could not have been developed if we were unable to process the huge amounts of data generated by seismic acquisition with the help of large HPC systems. For more than 30 years, seismic reflection has been the main technology used in our industry. The physics is well known and is based on solving different approximations of the wave equation. Anticipating and taking advantage of constantly evolving computer technology, geophysicists have been able to find the numerical implementations best adapted to the computer hardware: from 2D to 3D, post- to pre-stack, asymptotic to band-limited, one-way to RTM, and from ray tomography to wave tomography and full waveform inversion. All these evolutions have followed the progress of HPC very closely.
-