72nd EAGE Conference and Exhibition - Workshops and Fieldtrips
- Conference date: 14 Jun 2010 - 17 Jun 2010
- Location: Barcelona, Spain
- ISBN: 978-90-73781-87-0
- Published: 13 June 2010
Seismic imaging solutions by multi-geophysical measurements and joint inversion
Authors D. Colombo and T. Keho
A wide range of near-surface geological features challenge seismic acquisition and processing in arid-land environments: sand dunes, collapsed karsts, dry river beds, sabkhas, outcropping refractors, high-velocity near-surface layers, velocity reversals, layered basalts and rough topography, to cite a few. These features introduce sharp velocity changes in the vertical and horizontal directions that are difficult to model using seismic data alone (e.g. velocity inversions, karsts). As a consequence, their imprints remain in the seismic images from the surface down to reservoir depths. The problems introduced by unresolved near-surface velocity anomalies range from poor seismic image quality to misidentification of prospective low-relief structures and erroneous depth conversions. Conventional statics and seismic acquisition practices often fail in areas with complex near-surface conditions, so new and even unconventional approaches should be considered to address the near-surface challenge. Among these, non-seismic methods such as precision gravity, shallow electromagnetics (EM) and/or electrical resistivity techniques can be effective in reconstructing near-surface features correlated with seismic velocity anomalies.
Integration of seismic, well, potential-field and geological data for ore prospecting in the Iberian Pyrite Belt
Authors J. Carvalho, P. Sousa, J. X. Matos and C. Pinto
Ore prospecting using gravimetric and magnetic data has become one of the traditional approaches in recent decades, often complemented with electric and electromagnetic methods. However, due to the non-uniqueness inherent to potential-field modelling, constraints provided by structural methods such as seismic reflection are often used. During the exploration for massive sulphide polymetallic minerals in the Figueira de Cavaleiros sector of the Iberian Pyrite Belt, located in the Sado Tertiary Basin, several gravimetric and magnetic anomalies were considered interesting targets. In order to reduce the ambiguity of the gravimetric modelling and to confirm the geological model of the area, two seismic reflection profiles were acquired. The interpretation of these profiles was assisted by three mechanical boreholes, two of them located in the research area, in order to make a seismostratigraphic interpretation. Unfortunately, the gravimetric modelling suggests that the anomaly has a lithological and structural origin and is not related to massive sulphides. Nevertheless, a good agreement between the seismic and potential-field data was achieved, and new insights into the geological model for the region were obtained from this work, with accurate data about the Tertiary cover and Palaeozoic basement.
Possibilities for multidisciplinary, integrated approaches in near-surface geophysics
By R. Ghose
We show that it is possible, under certain boundary conditions, to integrate different methods or disciplines based on the underlying physics to address near-surface characterization challenges. The benefits are improved efficiency and marked enhancements in accuracy and reliability. Very divergent disciplines (e.g. small-strain seismic VS and large-strain geotechnical CPT qc) can be integrated provided there is a convexity in the property domain. It is important to reduce different observations to comparable scales. Integration approaches based on poroelasticity theories show promising results on field data even at low frequencies and appear to be robust against noise and against uncertainties in the data and physical models.
The use of structurally coupled cooperative inversion in conjunction with cluster analysis towards a comprehensive subsurface characterization
Authors T. Günther, C. Rücker and M. Müller-Petke
The use of multiple physical principles and data types is a common rule in geophysics to narrow the range of possible interpretations. In most cases, however, this is done at the interpretation level. A more rigorous reduction of ambiguity can be achieved by coupling at the inversion level. To combine data that are not directly related to each other, two main approaches exist:
- use of petrophysical relations to map the output parameters onto a common parameter set
- structural coupling of otherwise independent inversions based on the model characteristics
We use the latter approach, for which various schemes have been presented. Günther & Rücker (2006) used a generalized smoothness-constrained inversion scheme, and on this basis Günther & Bentley (2006) presented a structural coupling between resistivity and velocity using the gradients of the individual models. An IRLS function is used to predict weights for the model boundaries based on co-located model gradients of the other method. As a result, we obtain two physical properties on the same discretisation. Further methods such as cluster analysis can then be used to produce a comprehensive subsurface model. Fuzzy c-means clustering yields not only the cluster membership for each model cell, but also a matching function that is of valuable help in the interpretation.
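The fuzzy c-means step mentioned in this abstract can be illustrated with a short, self-contained sketch. This is not the authors' implementation; it is a minimal example, assuming two co-located model parameters on a common discretisation (the `resistivity` and `velocity` values below are invented) and using the standard fuzzy c-means update equations to obtain per-cell memberships.

```python
import numpy as np

def fuzzy_c_means(X, n_clusters=3, m=2.0, n_iter=100, tol=1e-6, seed=0):
    """Standard fuzzy c-means on a feature matrix X (n_cells x n_features).

    Returns (centers, memberships); memberships[i, k] is the degree to
    which cell i belongs to cluster k (each row sums to 1)."""
    rng = np.random.default_rng(seed)
    n, _ = X.shape
    U = rng.random((n, n_clusters))          # random initial memberships
    U /= U.sum(axis=1, keepdims=True)        # normalised per cell

    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # distances of every cell to every cluster centre
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.fmax(d, 1e-12)
        # membership update: u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
        U_new = 1.0 / ((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0))).sum(axis=2)
        if np.linalg.norm(U_new - U) < tol:
            U = U_new
            break
        U = U_new
    return centers, U

# Hypothetical co-located models on the same mesh (values are made up):
resistivity = np.array([12.0, 15.0, 300.0, 350.0, 40.0, 45.0])        # ohm.m
velocity    = np.array([400.0, 450.0, 2200.0, 2400.0, 900.0, 950.0])  # m/s
# cluster in log/standardised space so both properties contribute equally
X = np.column_stack([np.log10(resistivity), np.log10(velocity)])
X = (X - X.mean(axis=0)) / X.std(axis=0)

centers, U = fuzzy_c_means(X, n_clusters=3)
print("memberships per cell:\n", np.round(U, 2))
```

The soft memberships (rather than hard labels) are what make a matching function between the two property models possible.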
Surface-Subsurface Integration Reveals Faults in Gulf of Suez Oilfields
Authors A. Laake, M. Sheneshen, C. Strobbia, L. Velasco and A. Cutts
…ling software for the surface-subsurface integration. The joint analysis of Rayleigh-wave data with satellite imagery provides a near-surface structural geological model, which can be interpreted for shallow drilling risks related to fault outcrops. The suite of near-surface geological products – Rayleigh-wave velocity mapping, short-offset ray-parameter interferometry and shallow fault mapping – is enabled by the acquisition, processing and interpretation of point-receiver seismic data. For the first time, detailed structural geology comprising faults and lithology changes was imaged in the near surface, a data regime that is conventionally contaminated by the seismic acquisition footprint.
From independent data to comprehensive models
Authors M. Mueller-Petke, T. Guenther and U. Yaramanci
Geophysical exploration has become increasingly multi-parameter and multi-method driven over the last decades. Such data sets allow subsurface properties to be obtained, connected and interpreted more reliably. However, the potential of these data sets is often left unused and interpretation is reduced to independent inversions. We give an overview of the basic principles of, and differences between, approaches that exploit the potential of such data sets, and show examples using data from Magnetic Resonance Sounding (MRS) and geoelectrics.
In-situ permeability from physics-based integration of poroelastic reflection coefficients
Authors K. van Dalen and R. Ghose
A reliable estimate of the in-situ permeability of a porous layer in the subsurface is extremely difficult to obtain. We have observed that, in the field seismic frequency band, the poroelastic behaviour of different seismic wave modes can differ in such a way that their combination yields unique estimates of in-situ permeability and porosity simultaneously. We have integrated the angle- and frequency-dependent poroelastic reflection coefficients of different seismic wave modes and tested the results through numerical simulations. The estimated values of permeability and porosity appear to be robust against uncertainties in the employed poroelastic attenuation mechanism. Potential applications of this approach exist in hydrocarbon exploration, hydrogeology and geotechnical engineering.
Recent advances and open problems in the integration of near-surface geophysical data
Authors A. Vesnaver, D. Nieto, L. Baradello, M. Romanelli and A. Vuan
The integration of different geophysical techniques is the best way to reduce the ambiguities of any single prospecting method when characterizing geobodies by their rock properties. Seismic imaging is the main tool for delineating deep targets in 3D, but its quality may increase when near-surface effects are compensated for by gravity or electromagnetic methods (den Boer et al. 2000, Dell'Aversana 2003, Colombo et al. 2008, 2010, among others). Classical refraction statics, in fact, break down when velocities are not monotonically increasing or the shallow formations are very inhomogeneous. In the last decade, new contributions have emerged from unusual information sources such as vibrators' controllers (Al-Ali et al. 2003, Ley et al. 2006), geological maps and satellite imagery (Vesnaver et al. 2006b, 2009, Laake et al. 2008). Here we review some of these recent results and highlight a few problems that require further analysis. We also describe an ongoing experiment for expanding the bandwidth of active seismic surveys by integrating them with passive ones.
In-situ soil properties from transmission seismic measurements using frequency-dependent wave attributes
Authors A. Zhubayev and R. Ghose
A new concept for a physics-based integration of the velocity and attenuation of seismic waves in the shallow subsoil is proposed and tested. The theories of poroelasticity explaining frequency-dependent seismic wave propagation have been explored. The integration leads to the simultaneous estimation of two or more important soil properties in undisturbed conditions, which is otherwise difficult if not impossible to achieve. The results of application to field data look promising.
Static and dynamic aspects of near surface characterization through physics-based integration of GPR, ERT, SIP and SP data in the time-lapse mode
Authors G. Cassiani, A. Binley, A. Brovelli, R. Deiana, P. Dietrich, A. Flores, A. Kemna, E. Rizzo and U. Werban
The use of geophysics for the characterization of the near surface increasingly requires that data be analysed quantitatively, so that they offer meaningful information for the specific discipline under investigation. This is true for all applications, including environmental studies, hydrology, soil science and geotechnics. This tendency essentially supersedes the classical approach to geophysics as a pure imaging technique and requires in-depth understanding of the information contained in each specific physical measurement. Irrespective of the specific application, the geophysical response of the near surface is essentially controlled by a combination of geological ("static") and ambient ("dynamic") factors; the latter include moisture content and temperature variations. The separation of static and dynamic factors is the key step towards a quantitative use of near-surface geophysics, as individual disciplines and applications may be interested selectively in one or more of the static or dynamic aspects, or in combinations of them. Physico-mathematical modelling is often a fundamental tool that helps to discriminate between static and dynamic aspects, extracting the factors of specific interest for the application at hand. A link between measured geophysical quantities and the corresponding quantities of practical interest can only be established in the form of quantitative constitutive relationships. As many applications can benefit from the joint application of multivariate geophysical measurements (e.g. ERT, GPR, SIP, etc.), it would be highly advantageous to develop constitutive laws that depend on a few parameters that can be independently measured and that have a common, albeit different, impact on several geophysical data types. In this contribution we illustrate the above general framework with a number of applications, including catchment hydrology, digital soil mapping, contaminated-site characterization and subsurface hydrology.
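The constitutive relationships mentioned in this abstract are not specified there; purely as an illustration, the sketch below implements two widely used textbook examples, assuming ERT yields bulk resistivity and GPR yields relative dielectric permittivity: Archie's law (resistivity to water saturation) and the Topp et al. (1980) relation (permittivity to volumetric water content). All measurement values and petrophysical parameters below are placeholders, not values from the study.

```python
import numpy as np

def archie_water_saturation(rho_bulk, porosity, rho_water, a=1.0, m=2.0, n=2.0):
    """Archie's law: S_w = (a * rho_water / (porosity**m * rho_bulk))**(1/n).

    rho_bulk  : bulk resistivity, e.g. from ERT [ohm.m]
    porosity  : fractional porosity [-]
    rho_water : pore-water resistivity [ohm.m]
    a, m, n   : tortuosity, cementation and saturation exponents (placeholders)
    """
    return (a * rho_water / (porosity**m * rho_bulk)) ** (1.0 / n)

def topp_water_content(eps_r):
    """Topp et al. (1980) empirical relation between relative permittivity
    (e.g. derived from GPR velocity) and volumetric water content theta [-]."""
    return -5.3e-2 + 2.92e-2 * eps_r - 5.5e-4 * eps_r**2 + 4.3e-6 * eps_r**3

# hypothetical co-located measurements
rho_bulk = np.array([400.0, 150.0, 100.0])  # ohm.m, from ERT
eps_r    = np.array([6.0, 12.0, 20.0])      # relative permittivity, from GPR

S_w   = archie_water_saturation(rho_bulk, porosity=0.35, rho_water=10.0)
theta = topp_water_content(eps_r)
print("Archie S_w :", np.round(S_w, 2))
print("Topp theta :", np.round(theta, 3))
```

Relations of this kind are what turn resistivity or permittivity images into the moisture-content ("dynamic") quantities that hydrological applications actually need.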
Seismic body and surface wave data integration for near surface characterisation
Authors L. V. Socco, D. Boiero, S. Foti and C. Piatti
Seismic methods are widely used in near-surface characterisation, and very often different seismic datasets relating to body and surface waves are acquired at the same site. In the majority of cases these data are acquired and interpreted separately to provide different information, disregarding the synergies between the different methods in both acquisition and inversion. In particular, the joint or constrained inversion of different datasets may overcome the intrinsic limitations of individual techniques and provide a more reliable and consistent final velocity model. Moreover, the different information coming from different datasets provides a comprehensive site characterisation.
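The abstract does not detail its inversion scheme; purely to illustrate the joint-inversion idea, the toy sketch below fits one shared model vector to two synthetic datasets (standing in for, say, body-wave travel times and surface-wave dispersion data) by minimising a weighted combination of the two misfits. The forward operators, data, noise levels and weights are all invented for the example.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)

# shared model: layer velocities of a 3-layer toy subsurface [m/s]
m_true = np.array([300.0, 800.0, 1500.0])

# two invented linear "forward operators" standing in for two datasets
G1 = rng.random((8, 3))    # e.g. body-wave travel-time sensitivities
G2 = rng.random((12, 3))   # e.g. surface-wave dispersion sensitivities

d1 = G1 @ m_true + rng.normal(0, 5.0, 8)    # noisy dataset 1
d2 = G2 @ m_true + rng.normal(0, 2.0, 12)   # noisy dataset 2

def joint_residuals(m, w1=1.0, w2=1.0):
    """Stacked, weighted residuals of both datasets for one shared model."""
    r1 = (G1 @ m - d1) / 5.0   # normalise by assumed data errors
    r2 = (G2 @ m - d2) / 2.0
    return np.concatenate([w1 * r1, w2 * r2])

m0 = np.full(3, 500.0)                     # starting model
sol = least_squares(joint_residuals, m0)   # joint fit to both datasets
print("true model :", m_true)
print("joint model:", np.round(sol.x, 1))
```

The point of the sketch is only that both datasets constrain the same model vector simultaneously, which is what distinguishes joint inversion from interpreting the two surveys separately.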
The Limits of Automatic History Matching
By B. Davies
After thirty-plus years of development of commercial reservoir simulators, and twenty-plus years of research into history matching, manual and otherwise, we continue to be surprised by what our new wells encounter in the dynamically changing subsurface, and by what our old wells produce. What are the limits of what is achievable by automatic history matching? One approach to this question is to posit the existence of an infallible automatic history matcher of some description and then to consider what the implications would be for oilfield operational practice. The author looks at the ways in which ostensibly revolutionary technological breakthroughs are actually adopted and normalised by practising engineers, and the long-term implications for the delivery of their early promise of savings in time, money or skilled labour. Several examples are introduced from the recent history of other information-driven industries and from the author's field experience in the delivery and application of predictive reservoir models in the different phases of the reservoir lifecycle. In many cases, so-called "automatic" history matchers find their greatest utility not as black boxes that deliver a perfect model but as guides to the intelligent use of more conventional manual matching techniques. What are the differences between a "perfect matcher" and a "helpful matching assistant"? Can both design goals be achieved in a single piece of software, or is a different architectural approach required? Finally, the author speculates about the implications of these findings for the future of reservoir modelling practice and considers how the non-specialist might be better served by the technology providers.
Conditioning the models with … uncertainties
By T. V. Nguyen
Geomodelling has become the principal tool for geological representation of the subsurface over the last 10-15 years. It has undergone substantial technical and commercial development and growth and, as a consequence, has also led the way to a rapid evolution of reservoir modelling and of the use of dynamic data. In particular, the contribution from geostatistical methods has been a key factor for success. Figure 1 shows the classical streamlined process from data processing/analysis and interpretation, through geomodelling and reservoir simulation, to the final evaluation of IHIP and production/reserves. The interesting point is that many feedback loops exist today, not only from the reservoir model to the geomodel but also from the geomodel back to the earlier data processing/analysis and interpretation steps. These feedback loops clearly identify the need to go further and further upstream in order to better condition the final reservoir model to the field monitoring and production-history data. The practice of geomodelling, and now of these feedback loops, increases the need for team integration and a cross-discipline approach. One important reason for this need is the presence of uncertainties within the different types of data, generally due to the scarcity and quality of the acquired data. Processing and interpretation can sometimes become so difficult that only data integration can help to relieve the situation.
Time-lapse seismic provides key constraints to dynamic models
By P. Hatchell
Time-lapse seismic is one of the few technologies that provide a full-field areal picture of what is happening in the subsurface, and it is routinely used to update static and dynamic models. It is a mature technology in some parts of the world (marine, high porosity) and progress is continually being made in more difficult settings (onshore, HPHT, near infrastructure, lower porosity). Under the right conditions, time-lapse seismic is a proven method to detect and image differences due to changed fluid saturation and pore pressure inside the reservoir, as well as deformations outside the reservoir such as those related to reservoir compaction. This capability often provides information on: (i) the progress of an injected fluid front, (ii) the ingress of an aquifer, (iii) the expansion of a gas cap, (iv) gas evolved due to depletion below bubble point, and (v) the distribution of reservoir compaction. The areal and vertical resolution of this information is typically at the scale of tens of metres. This technology addresses important uncertainties in our knowledge of reservoir connectivity and heterogeneity.
From History to Prediction - Techniques for Conditioning Reservoir Models to Dynamic Data
Reservoir simulation models should always be built for specific business goals. It is an accepted rule that models used for production forecasts should reproduce the production history. Although most history matching processes are the result of a complex team effort, the objectives for using simulation models and the required level of detail are quite diverse. Applications range from prospect evaluation with limited available calibration data to designing detailed production-planning scenarios for mature fields with highly constraining well production data. In either case, the applied techniques, workflow requirements and level of complexity will naturally differ. In recent years, assisted history matching techniques and optimization workflows have been established and included in best-practice guidelines in an increasing number of companies in the oil and gas industry. The application of assisted history matching techniques is often motivated by the need to handle increasingly complex problem statements as well as by the desire to improve workflow efficiency and transparency. Initially, the focus was on finding single best models. Modelling paradigms, however, are changing. More recently, the industry has shown stronger interest in understanding a distribution of alternative scenarios, which more realistically captures the uncertainty envelope. This step is non-trivial, since there is no natural extension from the paradigm of single best history-matched models with deterministic forecasting capabilities to the paradigm of establishing a distribution of alternative production forecasts. This poses a major challenge to the reservoir engineering workflow and raises the question of how to handle multiple models with alternative outcomes. This talk reviews selected techniques used in history matching workflows. It discusses practical considerations for finding a compromise between "accurate" history-matched models with deterministic forecasting capabilities and the newer paradigm of sufficient coverage of the uncertainty space for establishing uncertainty distributions.
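The contrast between a single best-matched model and a distribution of forecasts can be illustrated with a toy rejection-sampling scheme (in the spirit of ensemble/GLUE-style approaches; this is not a technique named in the abstract). The "simulator", data and tolerance below are all invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_rate(k, t):
    """Toy 'reservoir simulator': exponential production decline whose
    decline rate depends on a single uncertain parameter k."""
    return 1000.0 * np.exp(-k * t)

t_hist = np.arange(0.0, 5.0)                          # 5 years of history
obs = simulate_rate(0.30, t_hist) + rng.normal(0, 20.0, t_hist.size)

# sample many candidate models and keep those that match the history
k_prior = rng.uniform(0.1, 0.6, 5000)
misfit = np.array([np.sqrt(np.mean((simulate_rate(k, t_hist) - obs) ** 2))
                   for k in k_prior])
accepted = k_prior[misfit < 30.0]                     # tolerance is arbitrary

# single "best" model versus a distribution of forecasts at year 10
k_best = k_prior[np.argmin(misfit)]
forecasts = simulate_rate(accepted, 10.0)
print("accepted models      :", accepted.size)
print("best-model forecast  :", round(float(simulate_rate(k_best, 10.0))))
print("forecast P10/P50/P90 :", np.percentile(forecasts, [10, 50, 90]).round(0))
```

The spread of the accepted-model forecasts, not the single best match, is what carries the uncertainty envelope discussed in the talk.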
The evolution of HPC and its opportunities and challenges for Seismic Imaging
By N. Bienati
One of the most important factors for the Oil & Gas industry (as for any other industry) is the ability to make predictions about the future. In particular, in this workshop we are concerned with forecasts about the future of HPC and its impact on the seismic imaging industry. Needless to say, everyone can predict that hardware performance will continue to increase; the interesting question is: by how much? One reliable answer can come from the Top500 list. Indeed, Professor Hans Meuer of the University of Mannheim, one of the fathers of the Top500, has shown (Meuer, 2008) that, according to historical data, the performance of the system classified at the bottom of the list follows a linear trend on a logarithmic scale (see Figure 1). The rate of growth is around 2x every 13 months, faster than Moore's law, which assumes 2x every 18 months. The close fit of the data to this trend gives good confidence in using it for extrapolation. The result of such an extrapolation is the prediction that between 2015 and 2016 all the systems in the list will exceed a performance of 1 Petaflop/s. Likewise, it is not unreasonable to predict that this figure will be the minimum standard for all the major players in the seismic imaging industry, both oil companies and service companies. This is certainly good news for seismic imaging applications such as Reverse Time Migration, Full Waveform Inversion and seismic modelling, which are amongst the most compute-intensive.
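The growth-rate argument above can be reproduced with a few lines of arithmetic. The sketch below is illustrative only: it takes the stated doubling time of 13 months and an assumed mid-2010 entry-level performance (the baseline value is a placeholder, not a figure from the talk) and extrapolates forward.

```python
DOUBLING_MONTHS = 13      # growth rate quoted for the bottom of the Top500 list
BASELINE_TFLOPS = 25.0    # placeholder entry-level performance in mid-2010 [Tflop/s]

def extrapolated_tflops(years_ahead, doubling_months=DOUBLING_MONTHS,
                        baseline=BASELINE_TFLOPS):
    """Exponential extrapolation: performance doubles every `doubling_months`."""
    doublings = (years_ahead * 12.0) / doubling_months
    return baseline * 2.0 ** doublings

for year in (2012, 2014, 2016):
    perf = extrapolated_tflops(year - 2010)
    print(f"{year}: ~{perf:,.0f} Tflop/s ({perf / 1000:.1f} Pflop/s)")
```

With a ~25 Tflop/s starting point, the 2x-per-13-months trend does indeed cross the 1 Pflop/s mark around 2016, which is the extrapolation quoted in the abstract.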
Trends in Multicore Processors
By A. González
Moore's law has fuelled a dramatic evolution in microprocessors and will keep doing so in forthcoming generations. Microprocessor designers have leveraged the improvements in process technology to enhance the microarchitecture of processors in different ways. In this quest to deliver higher performance, the whole industry has recently started a journey into the land of multicores. Multicores are very effective at increasing computing density, by increasing the number of processing units generation after generation. The scalability of multicore processors faces multiple challenges that will require significant innovation in applications, programming paradigms and tools, and architectures. In this talk, I will describe some of the research avenues that are being pursued to address these challenges.
Programming Seismic Algorithms for GPUs
Authors S. Morton, T. Cullison, I. Terentyev and S. Ma
Graphics processing units (GPUs) have been shown to be capable of efficiently running computationally demanding seismic imaging algorithms, and the recent significant increase in expenditure by the petroleum industry on GPU clusters indicates that these systems are cost effective. With this hurdle cleared, the adoption of GPUs is probably limited mainly by our ability to program seismic algorithms for GPUs. At Hess Corporation, we have moved the most computationally intensive parts of our seismic imaging codes from CPUs to GPUs over the past few years. The effort involved has varied widely from code to code, from a cost of a man-month to nearly a man-year. Our one-way wave-equation migration for GPUs is a direct port of the computational algorithm used on CPUs. The Kirchhoff code required manual optimization of many of its components. An optimized reverse-time migration library was constructed by screening a set of automatically generated kernels. In this talk we will present the computational algorithms for these seismic imaging codes and discuss our software approaches and performance results.
Accelerating seismic processing applications with FPGAs
By O. Pell
Microprocessors have been hitting the limits of attainable clock frequencies for the past few years, resulting in the current multi-core processor solutions provided by the major microprocessor vendors. Multiple cores on a chip must share the same pins to reach the memory system and the communication channels to other machines. This leads to a "memory wall", since the number of pins per chip does not scale with the number of cores, and a "power wall", since the chips must still be cooled within the same physical space. Many geophysically important applications, such as finite-difference modelling, downward-continuation-based migration and sparse matrix solvers, already exhibit significantly worse than linear scaling on multiple cores, a problem that is only going to worsen as the major microprocessor vendors move beyond quad-core chips to many-core architectures. Maxeler streaming accelerators implemented on Field Programmable Gate Arrays (FPGAs) allow us to bypass the memory wall by minimizing access to external memory and explicitly forwarding data on-chip at very high bandwidth (over 10 TB/s on the latest chips). The high performance attainable with such architectures has been established for a range of applications (for example [1], [2], [3]). At the same time, since FPGA performance is achieved by massive parallelism at relatively low clock frequencies (hundreds of MHz), we avoid the "power wall" and can configure our FPGA-based HPC systems very densely, with accompanying savings in operational costs for power, space, maintenance, etc.
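The "memory wall" argument can be made concrete with a back-of-the-envelope calculation. The sketch below is illustrative only: it estimates the memory bandwidth needed to keep a given number of cores busy on a naive 3D finite-difference stencil, using placeholder per-core throughput, stencil size and DRAM bandwidth rather than figures from the talk.

```python
# Rough "memory wall" estimate for a naive high-order 3D finite-difference stencil.
# All numbers below are illustrative placeholders, not figures from the talk.

BYTES_PER_SAMPLE = 4                    # single precision
STENCIL_READS = 25                      # 3*8 + 1 points read per output (no cache reuse)
STENCIL_WRITES = 1
FLOPS_PER_SAMPLE = 2 * STENCIL_READS    # one multiply-add per stencil point

def required_bandwidth_gbs(cores, gflops_per_core):
    """Memory traffic [GB/s] needed so that `cores` cores, each sustaining
    `gflops_per_core` Gflop/s, are never starved by the naive stencil."""
    samples_per_s = cores * gflops_per_core * 1e9 / FLOPS_PER_SAMPLE
    bytes_per_s = samples_per_s * (STENCIL_READS + STENCIL_WRITES) * BYTES_PER_SAMPLE
    return bytes_per_s / 1e9

AVAILABLE_GBS = 30.0                    # placeholder ~2010 multi-core socket DRAM bandwidth
for cores in (1, 2, 4, 8, 16):
    need = required_bandwidth_gbs(cores, gflops_per_core=8.0)
    status = "OK" if need <= AVAILABLE_GBS else "memory-bound"
    print(f"{cores:2d} cores need ~{need:6.0f} GB/s ({status})")
```

Under these placeholder numbers the off-chip bandwidth is already saturated by two cores, which is the worse-than-linear multicore scaling the abstract describes and the motivation for keeping intermediate data on-chip in a streaming architecture.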
Seismic Imaging and HPC: how to preserve our investment and prepare for the future?
By H. Calandra
An extraordinary challenge the oil industry must face in hydrocarbon exploration is to develop leading-edge technologies to reconstruct the three-dimensional structure of the Earth. The seismic imaging industry has been made possible by the progress of computer capacity to process more and more data in a shorter and shorter time; this extraordinary progress of the computer has served us for almost 40 years and will continue to do so in the years to come. The seismic imaging industry has also been made possible by the tremendous progress of data acquisition technology. But, again, that technology would not have been developed if we had not been able to process the huge amount of data generated by seismic acquisition with the help of large HPC systems. For more than 30 years, seismic reflection has been the main technology used in our industry. The physics is well known and is based on solving different approximations of the wave equation. Anticipating and taking advantage of the constant evolution of computer technology, geophysicists have been able to find the numerical implementations best adapted to the computer hardware: from 2D to 3D, post- to pre-stack, asymptotic to band-limited, one-way to RTM, ray tomography to wave tomography and full waveform inversion. All these evolutions have followed very closely the progress of HPC.
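As a concrete illustration of "solving different approximations of the wave equation", the sketch below time-steps the 2D constant-density acoustic wave equation with a second-order finite-difference scheme. It is a minimal, generic example (grid size, velocity model and source are invented), not the implementation discussed in the talk.

```python
import numpy as np

# 2D constant-density acoustic wave equation: p_tt = v^2 (p_xx + p_zz),
# discretised with 2nd-order centred differences in space and time.

nx, nz = 200, 200
dx = 10.0                      # grid spacing [m]
v = np.full((nz, nx), 2000.0)  # invented constant velocity model [m/s]
dt = 0.4 * dx / v.max()        # time step satisfying a conservative CFL limit
nt = 500

p_prev = np.zeros((nz, nx))    # pressure at t - dt
p_curr = np.zeros((nz, nx))    # pressure at t
src_iz, src_ix = nz // 2, nx // 2

def ricker(t, f0=15.0, t0=0.08):
    """Ricker wavelet used as the source time function."""
    a = (np.pi * f0 * (t - t0)) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

for it in range(nt):
    # 5-point Laplacian (periodic boundaries via np.roll, acceptable for a toy)
    lap = (np.roll(p_curr, 1, 0) + np.roll(p_curr, -1, 0) +
           np.roll(p_curr, 1, 1) + np.roll(p_curr, -1, 1) - 4.0 * p_curr) / dx**2
    p_next = 2.0 * p_curr - p_prev + (v * dt) ** 2 * lap
    p_next[src_iz, src_ix] += ricker(it * dt) * dt**2   # inject source
    p_prev, p_curr = p_curr, p_next

print("max |p| after", nt, "steps:", float(np.abs(p_curr).max()))
```

Schemes of this kind, scaled up to 3D, higher orders and more complete physics, are the kernels whose cost drives the HPC requirements discussed throughout this workshop.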