81st EAGE Conference and Exhibition 2019
- Conference date: June 3-6, 2019
- Location: London, UK
- Published: 03 June 2019
Red Ghosts and Broadband Processing
By Shuki Ronen

Summary: Surface ghosts cause spectral notches. There is always a notch at 0 Hz, which limits the low-frequency content of the data. The objective of deghosting in data processing is to undo the physical process of surface ghosting. When scattering from irregular surfaces, such as ocean swell, is taken into account, the reflection (and transmission) coefficients become frequency dependent.
This is true for both sources and receivers. Near seismic sources, in addition to swell, particle accelerations may be comparable to, or even exceed, the earth's gravity, which is what keeps the water beneath the air. Surface tension also plays a role.
The displacement can be large enough for the stress-strain relations to become nonlinear. These nonlinear effects are frequency dependent, causing the reflection coefficient to be smaller at higher frequencies. A frequency-independent reflection would be white, but frequency-dependent ghosting is red, more so near the source than near the receiver. It is useful to understand the physics of ghosting so that the relations between the up- and down-going wavefields are understood and the up/down separation for deghosting can be made more robust.
Seismic Exploration of a Deep, Possibly Super-critical, Hydrothermal Reservoir in the Larderello Area
Authors: W. Rabbel, S. Buske, T. Jusri, D. Köhn, J. Lehr, H.B. Motra, L. Schreiter, M. Thorwart and the DESCRAMBLE Working Group

Summary: The K-Horizon is a well-known seismic reflection structure of regional extent in the Larderello area. The geological nature of the K-Horizon is nevertheless unknown. It is supposed to be a geothermal reservoir where supercritical conditions (~450°C temperature) may be possible. We demonstrate the high potential, in principle, of 3D seismic imaging and inversion for characterizing the location and structure of the deep reservoir. Depth uncertainties can be strongly reduced by incorporating vertical seismic profiling, laboratory and field studies of seismic anisotropy, and numerical thermal modeling into the interpretational procedure. The K-Horizon seems to represent a pair of thin alternating high- and low-velocity zones, the uppermost of which can be identified with a drilled dry zone of strongly decreased strength.
Future Technologies for Geothermal Exploration: the Perspective of the Strategic Research and Innovation Agenda of ETIP-DG
By A. Manzella

Summary: The presentation will be an occasion to describe what is envisaged by the Strategic Research and Innovation Agenda of the Technology and Innovation Platform on Deep Geothermal, and to discuss future perspectives of R&I for geothermal exploration.
Before Machine Learning: handling seismic data with Python and segyio
By J. Kvalsvik

Summary: We have truly entered the era of machine learning, and new and exciting models and techniques are developed and designed every day.
One of the challenges of ML, however, is the rich variety of data formats in circulation. It is very difficult to build machine learning models if it is difficult to get at the data.
SEG-Y has been an industry standard for over 40 years now, and the data in SEG-Y files are quite valuable. Leveraging these data is a key component of many machine learning projects, yet the format still represents a challenge.
This is a hands-on workshop describing and demonstrating practical use of the free Python library segyio (https://github.com/equinor/segyio) for reading and writing SEG-Y. It has become quite popular in the open geoscience community, and is designed from scratch to be a suitable building block for new applications.
The workshop will focus on dialogue and discussion, and address some common use cases for machine learning and implementing SEG-Y support in new machine learning and geoscience projects.
On the Challenges of Time-Lapse EM in a Production Environment: Lessons Learned From a Real-World Trial
Authors: R. Streich, A. Schaller, G. Drijkoningen and L. Liu

Summary: We will present results from a field trial of onshore time-lapse EM in a production environment, with special focus on the influence of well casings and pipelines on the acquired data, and lessons learned for future applications of 4D EM.
Efficient Monte Carlo Uncertainty Quantification Through Problem-dependent Proposals
By K. Mosegaard

Summary: The solution of an inverse problem is a process in which an algorithm asks questions of the data. In some cases the questions are yes/no questions (accepting or rejecting a model proposed by a Markov chain Monte Carlo (MCMC) algorithm), and in other cases the questions are more complex, as in a deterministic algorithm's quest for gradients or curvatures. However, no algorithm can ask the right question without an efficient interrogation strategy. Such a strategy comes from what we call ‘prior information’, either about the solution to be found or about the nature of the forward relation. The latter strategy is particularly important and is, for MCMC algorithms, expressed through the ‘proposal distribution’. We shall explore the importance of proposal strategies, and show that dramatic improvements can be made if information-rich strategies are employed.
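The effect of the proposal distribution can be seen in a toy random-walk Metropolis sampler. This is a minimal sketch with an assumed standard-normal target, not the author's method: a proposal that is far too narrow or far too wide explores the target poorly, while a well-scaled one mixes efficiently.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_posterior(m):
    # Toy 1D target: standard normal (stands in for prior x likelihood).
    return -0.5 * m**2

def metropolis(step_size, n=20000):
    """Random-walk Metropolis; step_size is the proposal width."""
    m, logp = 0.0, log_posterior(0.0)
    accepted, samples = 0, []
    for _ in range(n):
        m_new = m + step_size * rng.standard_normal()   # draw proposal
        logp_new = log_posterior(m_new)
        if np.log(rng.random()) < logp_new - logp:      # accept/reject
            m, logp = m_new, logp_new
            accepted += 1
        samples.append(m)
    return np.array(samples), accepted / n

# Too-narrow and too-wide proposals both waste the chain's questions;
# a well-scaled one recovers the target's spread (std ~ 1) quickly.
for step in (0.01, 2.4, 50.0):
    s, rate = metropolis(step)
    print(f"step={step:6}: acceptance={rate:.2f}, sample std={s.std():.2f}")
```

An "information-rich" proposal, in the abstract's sense, goes further than tuning a width: it shapes the proposal using knowledge of the forward relation, so that proposed models are rarely wasted.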
The Data Acquisition and Study program into Induced Seismicity in the Groningen Gas Field, N.E. Netherlands
By J. Van Elk

Summary: The Mw 3.6 earthquake near the village of Huizinge on 16 August 2012 prompted the field operator, NAM, to expand its research programme into the induced seismicity in the Groningen field. I will introduce the programme and the issue of induced seismicity in Groningen to the EAGE community in this EAGE workshop on induced seismicity in London.
The study programme consists of two parts. The main research effort focuses on the assessment of the hazard and risk to which the community living above the field is exposed. This is a very targeted programme, consisting of monitoring of the seismicity in the field area and of studies supporting the modelling of the link between the cause, the production of gas from the Groningen field, and the effect, building damage and potential risk to people in and around these buildings. This part of the research was executed under tight deadlines set by the regulator and the Minister of Economic Affairs and Climate Policy. Each of the studies in this part of the programme aimed at improving the modelling of elements in the cause-and-effect chain.
Additionally, a scientific programme was set up with the aim of better understanding the physical processes leading to the destabilisation of the faults in the field and the induced seismicity. The components of this research are (1) geophysical data acquisition, analysis and modelling, (2) geomechanical modelling of a single fault and at the field scale, and (3) laboratory experiments on the complex coupled processes controlling the rupture of faults and friction during fault movement. Synthesis of these field and laboratory observations with the geomechanical models of faults has much improved understanding of the coupling between gas production, reservoir deformation, fault rupture and seismic wave generation. Moreover, huge amounts of data have been produced and are available for advancing this understanding further, in the context of Groningen and of induced seismicity in general.
Management of Induced Seismicity During Hydraulic Fracturing in Real Time
By H. Clarke

Summary: It is well established that hydraulic fracturing can cause reactivation of pre-existing subsurface geological structures, resulting in induced seismicity. In response, various jurisdictions have imposed mitigating regulations, primarily in the form of Traffic Light Schemes, whereby injection is reduced or stopped if seismic events exceed a certain magnitude threshold. The threshold magnitudes vary significantly between jurisdictions: for example, in Alberta the red light is set at M = 4, whereas in the UK the red light is set at M = 0.5, a difference in earthquake moment of over 175,000 times.
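The quoted factor follows from the standard moment-magnitude relation (Hanks & Kanamori), in which seismic moment scales as 10^(1.5 Mw), so the moment ratio between two thresholds depends only on their magnitude difference:

```python
# Seismic moment M0 scales as 10^(1.5 * Mw) (Hanks-Kanamori relation),
# so the ratio between the Alberta (M 4.0) and UK (M 0.5) red-light
# thresholds is 10^(1.5 * 3.5), consistent with "over 175,000 times".
ratio = 10 ** (1.5 * (4.0 - 0.5))
print(f"{ratio:,.0f}")  # 177,828
```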
Selection of Representative Models: An Example of a Fluvial Sandstone Reservoir
By K. Qureshi

Summary: The typical workflow for the creation of final static models is composed of many steps based on inputs from surface seismic interpretation, geological well markers, 1D analysis of logs and a depositional model. Similarly, geological modeling processes such as facies and petrophysical modeling can simulate different properties based on the input data and parameters. Uncertainty is inherent in all these inputs and processes. For example, depth-converted seismic horizons use velocity models, which can impact the reservoir thickness and fault throws; facies modeling influences the geometry, proportions and spatial distribution of bodies; petrophysical modeling influences porosity and permeability; and fault analysis influences the transmissibility between fault blocks. Some of these uncertainties influence hydrocarbon volumes, some influence the dynamic behavior, and some influence both. Using the latest tools for geological screening, we can select representative models from the uncertainty runs. Such screening supports the use of numerical experiments to calculate measures of both volume and dynamic connectivity, which, combined with input-parameter analysis, serves the model selection. The selected representative models can be used for detailed reservoir simulation (Larue, 2006). We tested the above method using a synthetic dataset of a dome-shaped structure with a fluvial sandstone reservoir penetrated by a number of wells.
Education in the Future - Connected Learner Experience
Authors: K. Jesudasan, A. Khan, A. Bhaduri and M. Suman

Summary: The industry has seen unprecedented change, and capability development of the workforce has lagged for various reasons. The shift in demographics and the rapid adoption of new digital technologies in the industry require a new and more complete approach to learning and development, one that can address a younger workforce and leverage digital technologies. A digital connected learner experience can cater to the needs of the workforce of the future. It leverages a digital content library to provide a structured yet personalized learner experience and enables accelerated development. This digital learning ecosystem would integrate with existing systems and aggregate different sources of digital content to create a more complete learning environment centered around the user.