79th EAGE Conference and Exhibition 2017 - Workshops
- Conference date: 12 Jun 2017 - 15 Jun 2017
- Location: Paris, France
- ISBN: 978-94-6282-219-1
- Published: 12 June 2017
Experimental Investigation of Thermal Marangoni Effect on Bypassed Oil Recovery
Authors: M. Masoudi, B. Rostami, M. Khosravi and P. Abolhosseini
In this paper, the effect of an interfacial tension (IFT) gradient caused by temperature changes (the Bénard-Marangoni phenomenon) is investigated as a novel EOR method. Glass micromodels were used to properly understand and visualize the mechanism. Carbon dioxide and methane were injected separately into an n-decane-saturated micromodel. Gas injection experiments were conducted under different conditions, and the impact of temperature and pressure was investigated. Cold gas injection was compared with isothermal gas injection as the zero level of Marangoni flow, and the impact of Marangoni flow was compared with other active production mechanisms. The results reveal that the IFT gradient due to the temperature change at the oil-gas interface induces a convective flow that improves oil recovery and compensates for the negative effects of other mechanisms by overcoming the capillary forces. They show the significant impact of thermal Marangoni convection on the recovery of bypassed oil and introduce Bénard-Marangoni convection as an important mechanism of bypassed oil recovery, especially in low-pressure reservoirs.
IoT-based Wireless Networking for Geoscience Applications
Authors: H. Jamali-Rad and X. Campman
Nowadays, sensors are everywhere in the oil and gas industry, which leads to the creation of "big data". In many cases this data must be aggregated and coordinated in real time, sometimes in harsh environments. To address this, we have defined a unified wireless sensing framework comprising three modules: cheap, low-power, long-range wireless sensors inherently compatible with the Internet of Things (IoT); advanced scalable wireless networking protocols; and cloud-based data storage and analytics for analysis and decision making. As a showcase, we present our seismic field test results with low-power wide-area networks (LPWANs) in the Netherlands.
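LPWAN uplinks carry only a few dozen bytes per message, so sensor readings are typically packed into compact binary payloads before transmission. The sketch below is a hypothetical illustration of that constraint (the field layout and units are invented, not taken from the abstract):

```python
import struct

def pack_reading(sensor_id, timestamp, value_mg):
    """Pack one geophone reading into a 10-byte LPWAN uplink payload:
    2-byte sensor id, 4-byte epoch seconds, 4-byte signed amplitude (milli-g)."""
    return struct.pack(">HIi", sensor_id, timestamp, value_mg)

def unpack_reading(payload):
    """Inverse of pack_reading, as the cloud-side ingest would apply it."""
    return struct.unpack(">HIi", payload)

payload = pack_reading(42, 1497225600, -1250)
```

A real deployment would add message framing and error handling defined by the chosen LPWAN stack; the point here is only the byte budget.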
Performing Successful Data Science in the Geoscience Domain
Authors: D. Irving and J. McConnell
To perform data science with scientific data, we must represent the scientific problem space to allow analytics. This requires a blend of traditional physics-based algorithms with modern advanced analytics, performed on datasets large enough to yield statistically robust insights. These exposed insights in the data must be explained by scientists, driving creative thinking, in contrast to application-driven workflows where line-of-sight to original data is typically absent. We show how an open approach to data parsing, storage and integration drives better understanding of data, and moreover enables the deployment and development of open source tools for processing, analysing and visualising data and insights. Dealing with measurement data brings challenges of quality, sparsity and irregular sampling, in datasets that must be integrated in the spatial, time and frequency domains. This is time-consuming work, often taking 80% of the time of each analytical study, and so we recommend that data from the geoscience domain should be curated in a "load once, use many times" paradigm. Higher-level parameters can then be created to capture the scientific insights of multi-physics systems for use in one-off or operationalised descriptions of a system. After implementing this level of abstraction, the geoscientific world is ready for data science.
Fostering a high-impact machine learning ecosystem in subsurface science and engineering
By M. Hall
The field of machine learning is experiencing a boom. The International Energy Agency predicts the 2020 analytics market in upstream petroleum alone will more than double to $10 billion. Previous such hype cycles, especially the one at the end of the 1980s, ended in a mass extinction event: artificial intelligence companies died off, funding seas dried up, and 'expert systems' became dirty words. Meanwhile, however, research continued under codenames like 'informatics', 'machine learning', 'big data', and 'data analytics'. Today, as the AI spring gives way to an AI summer, how can we give our projects the best chance of having the impact we believe they can have? As the petroleum industry moves into its autumnal years, I propose eight strategies for the computational science and engineering community to bring about the profound changes to its safety and operating efficiency that we all believe we can achieve. These strategies are well tested in other fields, and many of them have at least been tried in subsurface science and engineering.
Deep Learning on Hyperspectral Data for Land Use and Vegetation Mapping
Authors: N. Audebert, B. Le Saux, S. Lefevere, C. Taillandier and D. Dubucq
Remote sensing technology is a remarkable tool for exploring and measuring Earth's surface features. Total and ONERA have set up a collaborative partnership named New Advanced Observation Method Integration (NAOMI) that aims at adapting and developing new remote sensing techniques specifically targeted at hydrocarbon exploration and environmental protection. In this context, we apply deep learning to the classification of hyperspectral data. To detect different land uses and materials in aerial hyperspectral images, neural networks prove to be very efficient tools, as they are able to learn discriminant features that improve classification performance.
Machine learning can extract the information needed for modelling and data analysis from unstructured documents
Authors: H. Blondelle, A. Juneja, J. Micaelli and P. Neri
Since its early days, the exploration and production industry has handled large volumes of data, mainly measurements, to build subsurface models used for strategic or technical decisions. More recently, data analytics technologies have emerged to complement the modelling tools, with notable successes in the domain of field monitoring. But broader adoption of new analytical tools is hampered by limited access to the large share of relevant data that is stored in unstructured formats. This issue is not new: modelling tools faced the same difficulty, but at a lower order of magnitude, because each tool has a limited set of input data. Manual extraction of information from unstructured documents by skilled technicians, to feed sophisticated enterprise data models and modelling tools, was acceptable even if it represented a poor use of a trained professional's time. Despite these efforts, it is estimated that only 20% of the information available in our industry is stored in structured, searchable databases. Analytical tools require much more than that to perform adequately. With the emergence of new analytics tools, our industry now has a much greater appetite for data than ever before. Is machine learning the means to satisfy it?
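The simplest form of information extraction from unstructured documents is pattern matching. A minimal, purely illustrative sketch (the report text, pattern, and field name are invented, and real systems described in the abstract would use machine learning rather than a single regex):

```python
import re

# Hypothetical pattern: pull total-depth values like "TD at 3,450 m"
# or "total depth 2980m" out of free-text well reports.
DEPTH_RE = re.compile(r"(?:TD|total depth)\D{0,10}([\d,]+)\s*m", re.IGNORECASE)

def extract_depths_m(text):
    """Return total-depth values (in metres) found in free text."""
    return [int(m.group(1).replace(",", "")) for m in DEPTH_RE.finditer(text)]

report = "Well A-1: TD at 3,450 m. Well A-2 reached total depth 2980m in May."
depths = extract_depths_m(report)
```

Hand-written patterns like this are exactly the brittle, labour-intensive step that learned extractors aim to replace.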
Unsupervised identification of electrofacies employing machine learning
Authors: I. Emelyanova, M. Pervukhina, M. Clennell and C. Dyt
Machine learning techniques are widely used in petrophysics and geophysics to solve complex and non-linear problems of practical importance. In particular, numerous applications for identifying electrofacies from well logs have been reported. However, there is no unique approach for reliable automatic classification of electrofacies, as the accuracy of the applied algorithms may vary depending on the data and initial conditions. To overcome instability in the outcomes of different algorithms, we suggest applying several clustering techniques to log data, in a way similar to the popular supervised-classification method of ensemble learning. Such an ensemble of clustering outputs is then integrated into unique classes (electrofacies) for subsequent automated identification of lithofacies. Here we apply three different clustering algorithms, namely Spectral Clustering, Self-Organizing Maps and k-means, in order to reliably classify electrofacies at the petroleum exploration well Lauda-1, drilled in the Northern Carnarvon Basin (Western Australia). The clustering outputs integrated into electrofacies are validated against an expert facies classification. We show that some facies identified by the expert are not distinguished as separate classes, at least for the chosen well and selected logs. The established electrofacies can then be assigned to conventional lithofacies; this requires an expert system that is currently under development.
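One generic way to integrate several clustering outputs into consensus classes is a co-association (evidence-accumulation) matrix. This is a sketch of that general idea, not the authors' implementation, and the sample labelings are invented:

```python
from itertools import combinations

def coassociation(labelings, n):
    """Co-association matrix: fraction of clustering runs in which samples
    i and j fall in the same cluster."""
    co = [[0.0] * n for _ in range(n)]
    for labels in labelings:
        for i, j in combinations(range(n), 2):
            if labels[i] == labels[j]:
                co[i][j] += 1
                co[j][i] += 1
    k = len(labelings)
    return [[c / k for c in row] for row in co]

def consensus_classes(labelings, n, threshold=0.5):
    """Merge samples into final classes when they co-cluster in more than
    `threshold` of the runs (union-find over the co-association graph)."""
    co = coassociation(labelings, n)
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for i, j in combinations(range(n), 2):
        if co[i][j] > threshold:
            parent[find(i)] = find(j)
    return [find(i) for i in range(n)]

# Three clustering runs over five log samples (cluster ids are arbitrary):
runs = [[0, 0, 1, 1, 2], [1, 1, 0, 0, 2], [0, 0, 1, 1, 1]]
classes = consensus_classes(runs, 5)
```

Samples that agree across most runs end up in the same consensus class even though each algorithm labels clusters differently.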
Machine Learning Based Workflows in Exploration and Production
Authors: J. Limbeck, M. Araya, G. Joosten, A. Eales, P. Gelderblom and D. Hohl
In this presentation we cover a mapping between existing E&P workflow components and their data-science-based counterparts, as we have developed or envision them. We present one example from the geophysics domain, where deep neural nets are used to accelerate the seismic interpretation process (GeoDNN), and one example from the reservoir engineering domain (AutoSum), where machine learning is used to analyze a large ensemble of reservoir models.
Automated facies prediction in drillholes using machine learning
Authors: M. Blouin, A. Caté, L. Perozzi and E. Gloaguen
Machine learning is a popular topic in geosciences at the moment. It allows the management and interpretation of data in quantities and varieties (numbers of variables) that a human being could not handle. Rock physical properties acquired along drillholes can be used to generate predictions about the nature and characteristics of the rock while wireline logging is taking place. In this paper, we investigate the accuracy of facies prediction using machine learning algorithms, by automatically interpreting geological rock types along drillholes from rock physical properties. A data-processing workflow is proposed to enhance the predictive power of the geophysical measurements, a model calibration approach is outlined, and predictions on test data are presented. Results show more than 80% correspondence between the automated prediction and the geologist's interpretation.
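The ">80% correspondence" figure is simply interval-by-interval agreement between predicted and interpreted facies. A minimal sketch with invented facies codes:

```python
def agreement(predicted, interpreted):
    """Fraction of depth intervals where the automated facies prediction
    matches the geologist's interpretation."""
    hits = sum(p == t for p, t in zip(predicted, interpreted))
    return hits / len(interpreted)

# Hypothetical facies logs over ten intervals (sst/sh/carb codes are invented)
pred = ["sst", "sst", "sh", "sh", "carb", "sh",  "sst", "sst", "sh", "carb"]
true = ["sst", "sst", "sh", "sh", "carb", "sst", "sst", "sst", "sh", "carb"]
score = agreement(pred, true)
```

In practice a confusion matrix per facies is more informative than a single score, since rare facies can be systematically missed while overall agreement stays high.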
Optimising storage for high-speed access to large volumes of data – recent advances and future directions
Historically, seismic and other data have been stored on tape media of various types. The density of data on tape has increased over time, and access times have improved as well, but they remain slower than disk. Disk storage comes in various guises, from cheap commodity drives to highly resilient systems. A cost-benefit analysis shows that high-speed intelligent disks are more cost-effective than multiple cheap disks. Additional work has demonstrated that using object storage on disk significantly improves access time compared with standard B-tree storage. The advent of cheap solid-state storage has changed the way we view storage, and more and more systems are appearing with solid-state-only storage (laptops, phones, tablets and high-end analytics machines). Has the time now come to ditch tapes and disks and switch to solid state?
ForM@Ter: a data and services centre for Solid Earth
Authors: E. Ostanciaux, M. Mandea, M. Diament and O. Jamet
To ease the use of satellite and in situ Earth observation data, the French scientific community is developing four centres corresponding to the main physical compartments of the Earth: ERIS (atmosphere), ForM@Ter (solid Earth), ODATIS (ocean) and THEIA (land surfaces). These centres are developed in the framework of a single research infrastructure included on the French "large" research infrastructures roadmap, to be implemented in the coming years. The first ForM@Ter target focuses on surface deformation from SAR and optical imagery data. The associated services are implemented according to the needs expressed by the French scientific community, to support the use of huge data volumes such as those provided by the Sentinel missions. Within this context, we present the Ground Deformation Monitoring service, developed to make it easier for scientific and private users to exploit radar and optical data for ground motion monitoring applications; it contributes to the implementation of the ESFRI EPOS research infrastructure. A massive radar data processing service is also being implemented, with the objective of providing displacement-map time series over large areas. It will be established using MUSCATE, a CNES computing infrastructure. These two services are based on PEPS, a CNES infrastructure hosting Sentinel products.
Technical Descriptions in Long-term 115 °C Borehole Digital Micro-seismic Monitoring at the PTRC Aquistore CO2 Sequestration Project
Authors: C. Nixon, D. Schmitt, R. Kofman, D. White et al.
A preliminary overview of digital downhole microseismic monitoring at the SaskPower Aquistore carbon capture and storage project in the Williston Basin is presented. The digital downhole monitoring system, which has been successfully deployed for up to six weeks at 2800 m and 115 °C, is described, along with technical experiences, including difficulties and solutions encountered in sustained operation at extreme conditions. 750 gigabytes of high-quality seismic monitoring data were obtained and reviewed for seismic events by moveout and signal-to-noise ratio. Teleseismic events, mine blasting, and dynamite orientation shots were all easily identified. The present lack of induced seismicity looks promising for carbon capture and storage in the Williston Basin, but the monitoring data are still being reviewed more carefully with automated selection algorithms.
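Screening traces by signal-to-noise ratio, as described above, amounts to comparing RMS amplitude after a candidate onset with RMS amplitude before it. A toy sketch (the trace and onset index are invented):

```python
import math

def rms(xs):
    """Root-mean-square amplitude of a sample window."""
    return math.sqrt(sum(x * x for x in xs) / len(xs))

def snr(trace, onset):
    """Signal-to-noise ratio: RMS after the picked onset over RMS before it."""
    return rms(trace[onset:]) / rms(trace[:onset])

# Toy trace: low-amplitude noise followed by a stronger arrival at sample 4
trace = [0.1, -0.1, 0.1, -0.1] + [1.0, -1.0, 1.0, -1.0]
ratio = snr(trace, 4)
```

Events would then be kept or rejected by thresholding this ratio, with moveout across receivers as an independent check.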
Insights from applying Machine Learning techniques to Geosciences data from the Oil and Gas industry
By M. Tibbetts
Arundo is a USA- and Norway-based data science services and software solutions company applying machine learning techniques across a wide range of heavy-asset industries. Our data scientists typically come from academic backgrounds where machine learning is regularly utilized, such as particle physics. We have recently been working with partner organizations from the oil and gas industry, applying our data science expertise to geoscience data from operating fields, including seismic. We will summarize our experience of using machine learning with such data, and how our wider experience of those techniques has been applied to geoscience use cases. We expect to present results of analyses carried out with our partner organizations for the first time. Finally, we will discuss how subsurface geoscience data and machine learning can be used to optimize maintenance and operations in the oil and gas industry.
Sensitivity analysis of synthetic seismograms in a sedimentary basin with respect to uncertain seismological parameters
Authors: F. De Martin, P. Thierry, D. Keyes and E. Chaljub
This study focuses on understanding the variability and sensitivity of synthetic earthquake ground motions at sedimentary-basin scale, associated with the epistemic uncertainties of the seismic wave propagation model. The key question at stake is: what is the spatiotemporal variability and sensitivity of surface seismograms with respect to seismological parameters? To answer it, we describe the whole workflow, including pre- and post-processing around the main earthquake simulation engine. Its stages are: (1) definition of the initial model perturbation to generate a given set of input parameters; (2) simulation, including runtime filtering plus post-simulation filtering and decimation to reduce the amount of output data; and (3) uncertainty quantification analysis on a parallel file system, including a Hadoop evaluation and multi-level MPI communicators, to obtain the final results of this global big-data application.
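Stage (1) of such a workflow, generating an ensemble of perturbed input models, can be sketched as a simple Monte Carlo draw around a base parameterization. This is a generic illustration under assumed Gaussian relative perturbations, not the authors' scheme, and the parameter names and values are invented:

```python
import random

def perturb_parameters(base, rel_sigma=0.1, n=5, seed=1):
    """Draw n perturbed input models around a base parameterization,
    applying an independent Gaussian relative perturbation to each value."""
    rng = random.Random(seed)
    return [{k: v * (1.0 + rng.gauss(0.0, rel_sigma)) for k, v in base.items()}
            for _ in range(n)]

# Hypothetical basin-model parameters: shear velocity, Q factor, density
base_model = {"vs_m_per_s": 800.0, "qs": 50.0, "density_kg_m3": 2100.0}
ensemble = perturb_parameters(base_model)
```

Each member of the ensemble would then be fed to the simulation engine, and stage (3) would analyse the spread of the resulting seismograms.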
Automatic similarity measures to manage geoscience databases
By A. Fuga
In geoscience data management, the volume and complexity of data flows, as well as historical mergers of companies and databases, have raised harmonization, reconciliation and geo-referencing issues. Data duplication in databases wastes storage space and casts doubt on the different versions of the same seismic exploration data. The harmonization issue also arises when loading newly acquired or purchased data into the reference database. To ensure data quality, good data access and storage savings, this integration must be done without creating duplicates and without lowering database quality. To meet the urgency of data integration requests, this research and development work has led to the design of a methodology and software based on automatically computed multiscale contextual similarity. A new workflow has been adopted at TOTAL for harmonizing 2D and 3D seismic navigation lines, reconciling well databases, geo-referencing technical documents, and more. This work has demonstrated the capability to save 75% of the time that data loaders, data managers or geophysicists spend on classical harmonization methods. Moreover, database visualization algorithms have been created for database harmonization and characterization, opening perspectives towards a more global and visual data management approach.
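At its simplest, duplicate detection between database records reduces to a normalized similarity score plus a threshold. A minimal sketch using the standard library's `difflib.SequenceMatcher` (the line names and threshold are invented; the paper's multiscale contextual similarity is considerably richer):

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Normalized string similarity in [0, 1] between two record names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def likely_duplicates(names, threshold=0.85):
    """Flag pairs of line names similar enough to be the same survey."""
    pairs = []
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            if similarity(names[i], names[j]) >= threshold:
                pairs.append((names[i], names[j]))
    return pairs

lines = ["NAV_2D_LINE_017", "nav_2d_line_017 ", "GRAV_SURVEY_03"]
dups = likely_duplicates(lines)
```

Name similarity alone is rarely enough; production systems also compare geometry, acquisition dates and other context before merging records.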
Channel Characterization Using Support Vector Machine
Authors: A. Mardan, A. Javaherian and M. Mirzakhanian
Rapid growth in the size of seismic data and the number of attributes increases the importance of pattern recognition techniques in interpreting seismic data. Unsupervised methods, including k-means, self-organizing maps (SOM) and generative topographic maps (GTM), let interpreters perform a preliminary interpretation and extract reasonably useful information with little prior knowledge of the studied area. On the other hand, supervised learning methods such as neural networks (NN) and support vector machines (SVM) require some prior information from the studied area to seed the existing facies, using these seeded samples as input to the algorithm. In this study, to detect channel facies in one of the southwestern hydrocarbon fields of Iran, we used k-means and SVM, training the second algorithm with the primary information extracted by the first. Results show that the channels in the studied area have two different facies that can be detected by the applied algorithms.
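The two-stage idea (unsupervised clustering seeds a supervised classifier) can be sketched with a tiny 1-D k-means and a nearest-centroid classifier standing in for the SVM. Everything here is invented for illustration, including the attribute values:

```python
import random

def kmeans_1d(xs, k=2, iters=50, seed=0):
    """Minimal 1-D k-means; returns centroids and per-sample labels."""
    rng = random.Random(seed)
    cents = rng.sample(xs, k)
    for _ in range(iters):
        labels = [min(range(k), key=lambda c: abs(x - cents[c])) for x in xs]
        for c in range(k):
            members = [x for x, l in zip(xs, labels) if l == c]
            if members:
                cents[c] = sum(members) / len(members)
    return cents, labels

def nearest_centroid(x, cents):
    """Classify a new sample by its nearest cluster centroid (a simple
    stand-in here for the SVM trained on the k-means-seeded samples)."""
    return min(range(len(cents)), key=lambda c: abs(x - cents[c]))

# Toy seismic attribute: low values ~ background facies, high ~ channel fill
attr = [0.1, 0.2, 0.15, 0.9, 1.0, 0.95]
cents, labels = kmeans_1d(attr, k=2)
```

In the paper the k-means output seeds an SVM over multi-attribute vectors; the sketch keeps only the seeding logic.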
Advanced Machine Learning for Unconventional Plays
Improving capital efficiency in oil and gas exploration and production, particularly in unconventional (UNC) plays, has become vitally important for the industry. Since the existing geological and petrophysical methodologies and technologies that enjoy good success in conventional plays are evidently not as effective when applied to UNC plays, more effective approaches are in high demand. In oil and gas exploration, the most critical phase is the early land appraisal and initial development of so-called green fields, where the available data is usually scarce. This poses a great challenge to both domain experts and machine learning practitioners. How can machine learning and its related techniques be applied to early land appraisal and sweetspotting to greatly improve capital efficiency? This paper describes our recent advances in developing a machine learning sweetspotting workflow and presents our results and findings in identifying higher-production-potential areas in an example. The workflow uses an imputation scheme and a more generic and powerful ensemble learning technique that combines the strengths of a set of different machine learning algorithms. Consequently, our new workflow has achieved very good results in terms of R² (0.83) from leave-one-out cross-validation.
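The imputation step matters because green-field datasets are sparse. A minimal stand-in is column-mean imputation; the abstract does not say which scheme the authors use, and the column names and values below are invented:

```python
def impute_means(rows):
    """Replace missing values (None) in each column by the column mean —
    a minimal stand-in for the imputation step of such a workflow."""
    cols = list(zip(*rows))
    means = []
    for col in cols:
        present = [v for v in col if v is not None]
        means.append(sum(present) / len(present))
    return [[v if v is not None else means[j] for j, v in enumerate(row)]
            for row in rows]

# Sparse green-field data: porosity, TOC, brittleness (None = not measured)
data = [[0.08, 2.0, None], [0.10, None, 0.55], [0.12, 4.0, 0.65]]
filled = impute_means(data)
```

More sophisticated schemes (k-nearest-neighbour or model-based imputation) preserve between-variable structure that mean filling destroys, which is why the choice of scheme affects downstream sweetspotting accuracy.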
Carbonate Reservoir Cementation Factor Modeling Using Wireline Logs and Artificial Intelligence Methodology
Authors: F. Anifowose, C. Ayadiuno and F. Rashedian
An approach, combining statistical and artificial intelligence techniques, to modeling the rock cementation factor in a Saudi Arabian carbonate reservoir using wireline logs is presented. The objective is to obtain a more accurate prediction of the rock cementation factor, denoted by the exponent m in Archie's equation, as a variable log, using multivariate linear regression (MLR), artificial neural networks, and support vector machines. Published equations by Nugent, Lucia and Shell are empirical derivations based on porosity logs and assumptions that may not be applicable in other geological settings. Typically, log analysts use the average of the m values obtained from special core analysis (SCAL) measurements. Such constant values do not account for formation heterogeneity, resulting in inaccurate water saturation and pore volume estimates with high operational and economic costs. In this study, six wireline logs from seven wells were combined with their corresponding core-measured m values to build and optimize the proposed models, which predict m values for new wells or uncored sections of existing wells. The predicted m values produced by the MLR model closely matched the available m data from SCAL measurements. This study fulfills the pressing need for a variable m as a more accurate input to water saturation models.
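Why a variable m matters can be seen directly from Archie's equation, Sw = ((a·Rw)/(φ^m·Rt))^(1/n). The sketch below compares the water saturation implied by a constant core-average m against a different local m at the same log readings (all numbers are hypothetical):

```python
def archie_sw(rw, rt, phi, m, a=1.0, n=2.0):
    """Water saturation from Archie's equation:
    Sw = ((a * Rw) / (phi**m * Rt)) ** (1/n)."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

# Same (hypothetical) log readings, two cementation exponents:
# a constant core-average m versus a locally varying m
rw, rt, phi = 0.03, 20.0, 0.15
sw_avg = archie_sw(rw, rt, phi, m=2.0)
sw_var = archie_sw(rw, rt, phi, m=2.3)
```

Even a 0.3 shift in m changes Sw noticeably at typical carbonate porosities, which is the heterogeneity effect a constant m cannot capture.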
Adjusting well plan trajectory through 3D seismic litho-fluid classification: a case study
Authors: K. Kazemi and M. Delnava
Adjusting geological well plan trajectories with geophysical methods can be an effective way to prepare more precise and accurate plans for drilling new wells. The main objective of this study was to check the proposed well path against geophysical sections and to position the well trajectory with respect to the target layers, reducing drilling risks and costs as much as possible. First, a pre-stack simultaneous seismic inversion was conducted to generate acoustic impedance (AI) and Vp/Vs ratio cubes. Next, elastic and petrophysical well log data were evaluated to determine different litho-fluid classes, comprising hydrocarbon sand (HC Sst.), wet sand (Wet Sst.) and shale. Bayesian-derived probability density functions (PDFs) for each litho-fluid class were calculated from well log computations of AI and Vp/Vs. Using the PDFs and the pre-stack seismic inversion results, probability cubes for the individual litho-fluids, in addition to a final litho-fluid cube, were calculated. Based on these results, the well plan trajectory was adjusted to pass through the well-defined HC Sst (the target reservoir layer). The results of this study illustrate the usefulness of litho-fluid cubes derived from 3D seismic data in reducing drilling risks and costs.
Subsurface Integrity Management of UGS
Authors: F. Favret, R. Del Potro and M. Diez
Well integrity has been specifically documented, in particular in NORSOK D-010 and in ISO/TS 16530, covering leak paths and the subsequent risk assessment, well barrier elements (WBE), monitoring and maintenance. Several companies have developed their own methodologies for well integrity management, including new software to help operators plan and optimize maintenance based on lessons learned from well equipment failures. However, to ensure the safety of a storage facility, well integrity management has to be complemented by storage integrity management (Bonnier et al., 2015). Getting information on storage status during operation is difficult, since these huge storage volumes cannot be accessed for direct in-situ controls. For example, in leached salt caverns, indirect observations are possible, but the logistics required prevent continuous observation. Here we present two complementary indirect methods that contribute to the assessment of storage integrity: microseismic monitoring and PVT modelling. This joint approach is applicable to all storage types. A case of a multi-salt-cavern gas storage facility is presented.
How widespread is induced seismicity in Canada and the USA?
Authors: M. van der Baan and F. Calixto
The seismicity rate in Oklahoma over the last 5-8 years is correlated with increased large-scale hydrocarbon production. Contrary to Oklahoma, analysis of oil and gas production versus seismicity rates in six other US states and three Canadian provinces finds no state- or province-wide correlation between increased seismicity and hydrocarbon production, despite 8-16-fold increases in production in some states, including North Dakota (Bakken Formation) and Pennsylvania and West Virginia (Marcellus Shale). However, in various areas seismicity rates have increased locally.
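Testing for a region-wide association like this amounts to correlating two annual time series. A generic sketch with a hand-rolled Pearson coefficient; the series below are purely illustrative and not the study's data:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative (invented) annual series: production index vs event count —
# production grows many-fold while seismicity stays flat
prod   = [1, 2, 4, 8, 12, 16]
quakes = [3, 2, 4, 3, 4, 3]
r = pearson(prod, quakes)
```

A weak r on such series is consistent with the paper's "no state-wide correlation" finding, though the study itself would also need to handle locally clustered increases that a single regional coefficient averages away.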
Case study: Lacq pilot CO2 storage in an old gas field - microseismic monitoring
Authors: J. Barnavol and X. Payre
[...]or reservoir seal integrity, and supply all the seismological metrics and attributes necessary for seismic risk management and population information. During the monitoring period (2009–2015), 2637 events were detected and 717 were located inside the surveillance perimeter. Event magnitudes ranged between -2.6 and +1.1, in two distinct clusters: one in the vicinity of the injection point (reservoir perimeter) and another close to the Meillon/Saint-Faust fault complex, located 2 km north of the injection site (local perimeter). The seismic monitoring of this CO2 storage pilot shows a strong influence of the injection within the reservoir perimeter (≈1 km around the injection point). At a larger scale (local perimeter), the study could not reliably assess possible seismicity-rate changes during injection operations. Nevertheless, this multi-scale seismic network design fully achieves the goals assigned to it in terms of risk management, population information and injection mapping.
Dynamics of Fault Activation by Hydraulic Fracturing in Overpressured Shales
By D. Eaton
Fluid-injection processes can induce earthquakes by increasing pore pressure and/or shear stress on faults. Natural processes, including the transformation of organic material (kerogen) into hydrocarbons, can similarly cause fluid overpressure. Here we document examples where earthquakes induced by hydraulic fracturing are strongly clustered within areas characterized by a pore-pressure gradient in excess of 15 kPa/m. By contrast, induced earthquakes are virtually absent in the same formations elsewhere. Monte Carlo analysis indicates that there is negligible probability that this spatial correlation developed by chance. A detailed analysis was undertaken within a region in Alberta, Canada, where uniquely comprehensive data characterize dynamic interactions between seismicity and well completions. Seismicity is strongly clustered in space and time, exhibiting spatially varying persistence and activation thresholds. The largest event (ML 4.4) can be reconciled with a previously postulated upper bound on magnitude only if the cumulative effect of multiple treatment stages is considered. Induced seismicity from hydraulic fracturing reveals contrasting signatures of fault activation by stress effects and by fluid diffusion. Patterns of seismicity indicate that stress changes during operations can activate fault slip at an offset distance of more than 1 km, whereas pressurization by hydraulic fracturing into a fault yields episodic seismicity that can persist for months.
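The 15 kPa/m criterion is easy to operationalize: divide pore pressure by true vertical depth and flag formations above the threshold (hydrostatic gradient is roughly 10 kPa/m for comparison). The well data below are invented for illustration:

```python
def pressure_gradient_kpa_per_m(pore_pressure_kpa, depth_m):
    """Vertical pore-pressure gradient in kPa/m."""
    return pore_pressure_kpa / depth_m

def overpressured(pore_pressure_kpa, depth_m, threshold=15.0):
    """Flag formations exceeding the ~15 kPa/m gradient the abstract
    associates with clustered induced seismicity."""
    return pressure_gradient_kpa_per_m(pore_pressure_kpa, depth_m) > threshold

# Hypothetical wells: (pore pressure in kPa, true vertical depth in m)
wells = [(48_000, 3_000), (33_000, 3_000)]
flags = [overpressured(p, z) for p, z in wells]
```

The first well (16 kPa/m) would be flagged; the second (11 kPa/m) is only mildly above hydrostatic.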
Effective stress drop of fluid-induced seismicity
Authors: T. Fischer and S. Hainzl
In this paper we test how the effective stress drop compares to the static stress drop of a single earthquake rupturing the same fault portion. To this purpose, we compare the spatiotemporal evolution of the seismic moment release and analyze the uncertainties of the resulting stress drop estimates. We show that the effective stress drop is comparable to earthquake stress drops only in specific cases. In particular, effective stress drop values significantly underestimate earthquake stress drops in the presence of aseismic deformation. Furthermore, the values are scale-independent only if pre-stress and post-stress conditions are uniform in space. Our analysis of data from injection-induced seismicity, natural earthquake swarms and aftershock sequences shows that in most cases the effective stress drop estimate is rather stable during the cluster evolution. Slightly increasing estimates for injection-induced seismicity are indicative of local forcing of the system, while overall low effective stress drop values hint at the important role of aseismic deformation. While normal values of up to 1 MPa are found for seismicity associated with geothermal reservoir stimulation, anomalously small effective stress drops occur in the fracking of tight sands and shales, which may indicate aseismic deformation during these treatments.
Scaling of induced seismicity: implications for the role of geological setting in seismic hazard
Authors: G. Viegas, A. Baig and T. Urbancic
In this study we present contrasting stress drop estimates for injection-induced events in two regions of the Western Canadian Sedimentary Basin. Horn River Basin events show stress drops lower than Duvernay Basin events by a factor of 10 to 20. We propose that the observed stress drop differences are caused by different regional stress characteristics, assuming the seismic events are generated during similar injection programs. One potential difference is that the Fox Creek region is characterized by the presence of reefs in the Leduc Formation that cross-cut the Duvernay shale formations, which form drapes over the reef and off-reef facies (Stoakes, 1980). We suggest that differences in stress drops reflect differences in the regional stress state, with events occurring in more highly stressed regions having higher stress drops, that is, being able to release larger quantities of stored elastic strain. Higher-stress-drop earthquakes play a significant role in seismic hazard, as they generate higher-frequency strong ground motions that can potentially cause more damage.
Mechanisms driving earthquake faulting during a case of injection-induced seismicity
Authors: M. Diez and R. del Potro
Induced seismicity is currently one of the main geomechanical and environmental challenges faced by the underground industry. Current efforts focus mainly on seismic monitoring and risk assessment, while progress on unraveling the mechanisms that drive induced seismicity has been addressed to a lesser extent. Here we explore stability conditions for unstable slip, and potential dynamic weakening mechanisms, to explain earthquake faulting in a case of injection-induced seismicity. Injection of natural gas into the Castor underground gas storage facility, offshore Spain, which generated a pressure increase of about 2 bar, induced a seismic swarm that culminated in a series of Mw ~4 earthquakes two weeks after shut-in. We focus our attention on frictional weakening and on the dynamic weakening effect of shear-heating-induced thermal pressurization to explain Mw > 3 earthquake faulting during the Castor sequence. The mechanisms we explore may help improve our understanding of cases of injection-induced seismicity in regions of low natural seismicity, where the external forcing, or amplitude of the stress perturbation, is relatively small.
Verification of Network Design for Induced Seismicity
Induced seismicity monitoring for hydrocarbon or geothermal energy extraction is usually designed to meet political or environmental goals and limitations (e.g. UK limitations on magnitude-of-completeness thresholds). Operators therefore seek monitoring network designs that meet or exceed these goals and limitations. Generally, before starting any microseismic monitoring, the array geometry has to be designed to achieve optimal network performance and to satisfy all requirements on the quality of the recorded data. The design of a monitoring array should follow several rules. For surface monitoring networks, proper detection and location of events requires a station spacing of approximately twice the expected depth of the seismic events. In addition, seismic noise in some areas may significantly decrease the network's performance. Last but not least, network performance depends on the assumed velocity and attenuation models. All of these factors significantly affect performance, and it is a challenge to verify that the predicted performance will be achieved in practice. Assessing network performance in seismically quiet areas is particularly challenging, as there is no seismicity to benchmark it on before operations start (e.g. Gaucher, 2016).
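The station-spacing rule of thumb stated above (spacing at most about twice the expected event depth) directly sizes a surface network. A back-of-the-envelope sketch for a square grid over a square permit area (the area and depth are invented):

```python
import math

def max_station_spacing(expected_depth_m):
    """Rule of thumb from the abstract: for surface monitoring, station
    spacing should be at most about twice the expected event depth."""
    return 2.0 * expected_depth_m

def stations_needed(area_side_m, expected_depth_m):
    """Stations on a square grid covering a square area at that spacing."""
    spacing = max_station_spacing(expected_depth_m)
    per_side = math.ceil(area_side_m / spacing) + 1
    return per_side ** 2

n = stations_needed(20_000, 2_500)   # 20 km x 20 km area, 2.5 km deep events
```

Real designs then tighten this estimate with site-noise surveys and the assumed velocity and attenuation models, as the abstract notes.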
-
-
-
Examining the capability of statistical models to mitigate induced seismicity during hydraulic fracturing of shale gas reservoirs
Authors J. Verdon and J. Kendall
In this paper we test the ability of statistical methods to estimate the expected size of the largest event during stimulation, applying these approaches to two datasets collected during hydraulic stimulation of a North American Devonian Shale. We apply these methods in a prospective manner, using the microseismicity recorded during the early phases of a stimulation stage to make forecasts about what will happen as the stage continues. We do so to put ourselves in the shoes of an operator or regulator, where decisions must be taken based on data as it is acquired, rather than in a post hoc analysis once a stimulation stage has been completed. We find that the proposed methods are able to provide a reasonable forecast of the largest event to occur during each stage. This means that these methods can be used as the basis of a mitigation strategy. Applying such a strategy to our case studies, we find that the majority of stages would have been allowed to continue as planned, while the need for mitigation would have been identified for all of the stages that ended up inducing larger events.
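The abstract does not spell out the forecasting relation itself, but one widely used statistical approach of this kind (a hedged sketch, not necessarily the authors' exact method) combines a Gutenberg-Richter b-value fitted to the early microseismicity with the running event count: among N events above the completeness magnitude M_c, the expected largest magnitude is roughly M_c + log10(N)/b. All numbers below are hypothetical:

```python
import math

def forecast_max_magnitude(n_events, m_c, b_value):
    """Expected largest magnitude among n_events events above
    completeness m_c, for a Gutenberg-Richter population with the
    given b-value (a standard order-statistics result)."""
    return m_c + math.log10(n_events) / b_value

# Hypothetical early-stage catalogue: 200 events above M -1.5, b = 1.0
print(forecast_max_magnitude(200, -1.5, 1.0))  # ~0.8
```

An operator could re-evaluate such a forecast as each new event arrives and trigger mitigation once it approaches a regulatory threshold, which is the prospective, decision-oriented use the paper describes.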
-
-
-
Microseismic Geomechanical Evaluation of Fault Slip Associated with Hydraulic Fracturing
By S. Maxwell
The paper describes applying a coupled hydraulic-mechanical model and the resulting seismicity catalogue to explore repeated fault activation and seismicity patterns associated with multi-stage hydraulic fracturing. The model is also used to explore seismic risk mitigation by changing the viscosity of the fracturing fluid.
-
-
-
Integration of geomechanical modeling with induced seismic source mechanisms to assess deformation and stress changes
Authors D. Angus, G. Viegas, T. Urbancic and A. Baig
The recording of induced seismicity plays a significant role in monitoring geo-industrial activities, providing improved understanding of the failure mechanisms as well as a quantitative tool for risk assessment. The induced seismicity is due to local perturbations to the in situ stress field from such industrial activities. Significant work is being directed at developing fracture models that enable the assessment of local deformation and stress evolution due to stimulation design. Passive seismic monitoring provides an additional data source to study the stress field evolution, by imaging the in situ response of the rock mass and characterizing the failure mechanisms and higher-order inelastic response. We integrate microseismic data with a geomechanical model to quantify deformation and stress field evolution. Our approach utilizes moment tensor solutions to represent localized discrete rupture zones. The geomechanical algorithm evaluates the Green’s functions for each rupture, and subsequently calculates the co- and post-seismic deformation using linear superposition. We apply this workflow to a passive seismic dataset to validate the technique as well as to provide insight into the observed seismicity response due to a large (~M5) event.
-
-
-
Production Induced Seismicity in the Netherlands - From quick-scan to advanced models
Approximately one out of six producing onshore gas fields (both the Groningen Field and small fields in the north-western part of the country) in the Netherlands experiences production-induced seismicity. Poro-elasticity and related differential compaction have been identified as the prime mechanisms causing this seismicity. The observed onset of induced seismicity in the Netherlands occurred after a considerable pressure drop in the gas fields. A large range of methods has been applied to study the background of Dutch seismicity, ranging from quick scans to advanced 3D geomechanical modelling studies. We have shown that both simplified 2D and full-field 3D geomechanical models can be used to model the onset of reactivation and to identify faults which are prone to be reactivated. One of the approaches we used was the inclusion of dynamic rupture modelling in traditional geomechanics workflows. In dynamic rupture modelling, traditional static friction laws are replaced by dynamic slip evolution. This enabled us to include realistic nucleation and propagation of the seismic events, as e.g. the extent of the rupture area and the slip displacements could be determined. We present an overview of the different methods that have been used to better understand induced seismicity in the Dutch onshore gas fields, including our dynamic rupture modelling method and an outlook to what this could mean in the future.
-
-
-
Numerical modelling of production-induced stress changes and fault reactivation in Rotliegend gas fields of the North German Basin
Authors C. Haug, A. Henk and J. Nüchter
Production-induced seismicity is an increasing challenge to the E&P industry. Related poroelastic stress changes in reservoirs with complex geometries, their interference with tectonic stresses and their interaction with faults cannot be sufficiently explained by analytical models. Here, we develop generic numerical models to study production-induced stress changes and fault reactivation in and near compartmentalized gas reservoirs. The models are inspired by features of Rotliegend gas fields of the Northern German Basin but do not describe a specific reservoir. In a linear-elastic model series I, stress changes during pore pressure drawdown are investigated for different model parameters. Field properties leading to an increased tendency of fault reactivation are, among others, a high Biot-Willis coefficient, a locally reduced overburden load and a large reservoir thickness. In model series II, a contact surface pair simulating a simplified fault is incorporated, and the mechanical response of the contact surface to different schemes of absolute stress development is simulated. For high friction coefficients the contact surface stays stable with production, while for µ<0.6 the contact surface slips after a stage of poroelastic stress increase. Modelling results provide insight into the mechanisms that control production-induced poroelastic stress changes in compartmentalized reservoirs with complex geometries.
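The slip behaviour of the contact surface in model series II follows the standard Coulomb friction criterion: slip occurs once the resolved shear stress exceeds the friction coefficient times the effective normal stress. A minimal sketch of that check (stress values are illustrative, not taken from the paper):

```python
def fault_slips(shear_stress, normal_stress, pore_pressure, mu):
    """Coulomb criterion: slip when tau > mu * (sigma_n - p),
    with stresses and pore pressure in consistent units (e.g. MPa)."""
    return shear_stress > mu * (normal_stress - pore_pressure)

# Illustrative stresses in MPa: for the same stress state, a fault
# with high friction stays locked while a weaker fault slips.
tau, sigma_n, p = 30.0, 70.0, 20.0
for mu in (0.8, 0.6, 0.4):
    print(mu, fault_slips(tau, sigma_n, p, mu))
```

In the paper's models the stress state itself evolves with poroelastic drawdown, so the same threshold check is evaluated along a stress path rather than at a single point.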
-
-
-
Pore-scale Processes in Amott Spontaneous Imbibition Tests
Authors M. Rücker, W.B. Bartels, M.A. Boone, T. Bultreys, H. Mahani, S. Berg, A. Georgiadis, S.M. Hassanizadeh and V. Cnudde
We observed the redistribution of the oil phase in the pore space of the rock in real time, in water-wet and mixed-wet (aged in crude oil) carbonate samples. During the imbibition of the water phase, both pore-filling events connected to the surrounding brine and snap-off events connected only through water films were detected. The distribution of the oil across different pore sizes, as well as the different event types, helps to identify the wettability state of the system and to understand how pore-scale processes lead to oil production at the larger scale.
-
-
-
Differential imaging of porous plate capillary drainage in laminated sandstone rock using X-ray micro-tomography
Authors Q. Lin, B. Bijeljic, H. Rieke and M. Blunt
The experimental determination of representative capillary pressure curves as a function of saturation is of utmost importance for determining the initial reservoir fluid distribution and the subsequent flow properties under production. We design an experimental procedure to image porous plate capillary drainage using X-ray micro-tomography based on differential imaging, for a laminated sandstone micro core (4.86 mm in diameter). The pore structure, including the sub-resolution micro-pores, was characterised and quantified using both the initial dry scan and the scan fully saturated with potassium iodide (KI) doped brine (30 wt%). During the porous plate capillary drainage, nitrogen (N2) was injected at a constant pressure and the capillary pressure was controlled by the pressure drop through the core sample. A capillary pressure curve against saturation over the full range from 0 to 1.17 MPa is provided from the image analysis and is compared with the Special Core Analysis (SCAL) for the original core (35 mm in diameter). We are also able to discern that brine remained predominantly within the sub-resolution micro-pores, such as regions of fine lamination. Moreover, brine covering the rock grain surfaces and in the corners of the macro-pores can also be visualised.
-
-
-
Estimation of source time functions and yields of explosions directly from seismograms using the cube-root scaling law
I estimate the source time functions and yields of explosions directly from seismograms. The method requires seismograms at a single receiver for two events of different size at the same source location and eliminates the path effect between source and receiver by finding a ratio filter that shapes the seismogram of the smaller event to the seismogram of the larger. If the noise is small, the convolution of the filter with the source time function of the smaller event yields the source time function of the larger event. The two source time functions are also related by the well-known scaling law in which the injected volume is proportional to the yield and the time constant is proportional to the cube-root of the yield. These two independent equations are solved for the two source time functions by a trial-and-error method that gives the ratio of the yields. The two seismograms are then deconvolved to recover two estimates of the Green’s function. Applying the method to seismograms from the 2009 and 2013 North Korean underground nuclear tests gives yields of 6 and 16 kt, respectively. This method has applications in seismic exploration on land using a dynamite source.
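The cube-root scaling law invoked here gives two independent handles on the yield ratio: the source-time-function areas (injected volumes) scale linearly with yield W, while the time constants scale with W^(1/3). A small numeric sketch of that consistency check, with illustrative numbers rather than the actual test data:

```python
def yield_ratio_estimates(area_ratio, time_ratio):
    """Two independent yield-ratio estimates from the cube-root
    scaling law: injected volume ~ W, so the area ratio estimates
    W2/W1 directly; time constant ~ W**(1/3), so the time-constant
    ratio cubed gives a second estimate of W2/W1."""
    return area_ratio, time_ratio ** 3

# Illustrative: a yield ratio of 16/6 implies an area ratio of 16/6
# and a time-constant ratio of (16/6)**(1/3); both routes must agree.
est_from_area, est_from_tau = yield_ratio_estimates(16 / 6, (16 / 6) ** (1 / 3))
print(est_from_area, est_from_tau)
```

The trial-and-error search in the paper effectively adjusts the two source time functions until both estimates are consistent with the measured ratio filter.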
-
-
-
Passive seismic imaging of structure discontinuities around the active fault using scattered earthquake waveforms
For a seismically active fault with a relatively dense seismic network, seismic tomography using first-arrival times from earthquakes is routinely applied to image fault zone structures. To characterize structure discontinuities around the active fault, seismic scattering imaging using scattered waves can be applied. This technique has been used extensively with active seismic sources, especially in the oil/gas industry. However, scattering imaging using waveforms from earthquakes is rare. Here we present imaging results using scattered SH waves from earthquakes around the SAFOD site, California. Near-vertical reflectors are clearly imaged around the San Andreas fault (SAF), similar to results obtained using scattered P-P waves. However, for the strike-slip focal mechanisms of earthquakes along the SAF, first-arrival polarities for P waves differ between regions, whereas SH waves have the same polarity. Moreover, because the wavelength of P waves is longer than that of S waves, the resolution of the imaging result is higher when using scattered S waves. For this study, we found coherent phases (scattered SH waves) after the direct SH waves. Overall, the imaging results using different types of waves are very similar, supporting the reliability and usefulness of using passive seismic events to image structure discontinuities.
-
-
-
A unified inversion scheme for diffractions and passive events
Authors B. Schwarz, A. Bauer and D. Gajewski
We present a unified inversion scheme that can likewise be applied to passive events and to different active-source diffraction data configurations. For the active case, we specifically target the weak but highly illuminating diffracted background through automated adaptive subtraction of the dominant reflected contributions, while only making use of the available near-offset channel. Based on local stacking and coherence evaluation, the recorded diffracted events are treated as passive-source wavefields, which are characterized in terms of local properties of wavefronts emerging at the registration surface. In an industrial field data example offshore Israel, we show that the diffraction-based wavefront inversion of only the near-offset channel leads to results that are in reasonable agreement with available geological interpretations. By application to an academic low-fold dataset recorded near Santorini, we demonstrate that the inversion scheme offers the opportunity to construct laterally resolved depth-velocity models even in the absence of large-offset recordings. In addition, we suggest a simple, fully data-driven strategy to globally link the independently performed local coherence measurements that share the same origin in depth. Through this global characterization of events, we motivate event-consistent statistics, which allow estimating mean scattering or passive-source locations and conveniently assessing location uncertainties.
-
-
-
Wave field inversion of ambient seismic noise
Authors S.A.L. de Ridder and J.R. Maddison
We formulate a full wave field inversion for ambient seismic noise recorded by large and dense seismograph arrays. Full wave field inversion exploits the constraints on the gradients of the wave field that array data inherently possess (the sum is greater than its parts). Consequently, we can relax the spatial and temporal constraints on the wave field source functions in the seismological inverse problem. The result is that we become insensitive to the noise sources in, and the changing character of, the ambient seismic field. In principle the formulation holds equally for ambient noise wave fields and for wave fields excited by controlled sources. We support the theory with examples in one dimension in the time domain, and in two dimensions in the frequency domain; the latter is of interest for inverting surface-wave ambient noise for phase-velocity maps. We include checkerboard tests for geometries mimicking USArray and ocean-bottom cables.
-
-
-
Imaging seismic anisotropy in a shale gas reservoir by combining microseismic and 3D surface reflection seismic data
Authors W. Gajek, J. Verdon, M. Malinowski and J. Trojanowski
A strong VTI fabric can dominate the influence of weaker azimuthal anisotropy on seismic wave propagation, causing ambiguities and making it challenging to invert geophysical observations for fracture orientations and densities. We employ the shear-wave splitting (SWS) technique on a microseismic dataset collected in a vertical borehole during a hydraulic stimulation of a shale gas target in northern Poland to image the fracture strike and density masked by a strong VTI signature. In order to overcome the influence of the VTI fabric and enhance the inversion stability, we integrate the SWS data with parameters obtained from the surface 3D seismic survey. We successfully image a pre-existing vertical fracture set not aligned with the maximum horizontal stress. The obtained results are consistent with the fracture strike and crack density interpreted from well-log data.
-
-
-
Seismic Uncertainty and Ambiguity
Authors K. Mosegard, A. Zunino, N. Frandsen and P. Christiansen
The link between seismic data and subsurface properties suffers from an intrinsic ambiguity: many reservoir models fit the same data within the noise. In some pathological cases, this may bias the interpretation of the structure of the earth models used in exploration and reservoir management. Inversion techniques for the large seismic data sets encountered in the oil industry are well established and are assumed to be reliable. Although this is generally true, thanks to integrated knowledge from geology and other geophysical data, there is in some cases still a significant risk that traditional approaches end up finding only part of the models that can explain the observed data, overlooking potentially different scenarios and, moreover, hampering a correct uncertainty quantification. This phenomenon is often observed in practice when different inversion contractors arrive at significantly different results from the same data sets. The impact of this unavoidable non-uniqueness should be assessed when performing inversion of seismic data. We investigate the magnitude of the ambiguity problem in seismic modelling of chalk reservoirs by explicitly taking ambiguity into account in the inverse problem. Our study is based on a carefully selected test case from the Danish North Sea sector.
-
-
-
Multiphysics uncertainty analysis and considerations: a toolkit for interpreters
By M. Mantovani
Employing non-seismic data is intended to reduce geophysical equivalence. Traditionally, the mutual agreement imposed on the properties is supposed to reduce the degrees of freedom in shaping the final model of the earth. Nevertheless, in many cases it is difficult to quantify how much the non-seismic data reduce the geophysical ambiguity. The presentation attempts to exemplify a typical situation of velocity determination with seismic-only solvers, and to provide some reference numbers for estimating the reduction of geophysical equivalence and the decrease in the uncertainty of results when both seismic and non-seismic data are available.
-
-
-
Tomographic model uncertainties and their effect on imaged structures
Authors M. Reinier, J. Messud, P. Guillaume and T. Rebert
We demonstrate a recently developed method for computing tomography model uncertainties and mapping them into the migrated domain. After the final tomography, the method generates a series of equi-probable velocity model perturbations within a standard-deviation confidence level. This allows computing standard-deviation-like attributes for velocity and anisotropy parameters and for key horizons. An application to a West of Shetland dataset highlights the value of the estimated uncertainties.
-
-
-
Anisotropic Earth model building and some sources of uncertainty in the results
By O. Zdraveva
Anisotropic earth model building (EMB) is a challenging task: even when we use best-quality modern workflows with non-seismic data and information to better constrain the problem, results are inherently non-unique. Methods for quantifying the uncertainty of earth models for seismic imaging exist, and their successful application has been demonstrated in the past. All of them assume that the available model is a close representation of the true earth and is accurate enough, i.e. it explains at least all available seismic and borehole data. In addition, they rely on extra knowledge and information about the area under investigation to be brought in to form priors. This abstract discusses and illustrates the EMB sensitivity and uncertainty associated with: (1) the absence of sufficient measurements complementary to surface seismic; (2) inaccuracy in salt geometry and subsalt velocities; and (3) Q-compensation methods and parameterization. It emphasizes the need to validate earth models thoroughly before conducting uncertainty analysis and, if needed, to further update the models and ensure that all limitations and assumptions of the data conditioning and EMB validation are factored into the priors for further uncertainty analysis.
-
-
-
Uncertainty estimation by probabilistic first arrival time tomography using Markov Chain Monte Carlo sampling
Authors A. Gesret, J. Belhadj, T. Romary, M. Noble and N. Desassis
We present several applications of probabilistic first-arrival time tomography by Markov chain Monte Carlo sampling, dedicated to uncertainty estimation. In the first part, we introduce a new velocity model parameterization based on Johnson-Mehl tessellation that allows applying the probabilistic approach to typical seismic refraction data. We also present results of the tomography applied to a real data set recorded in the context of hydraulic fracturing, and illustrate how the velocity model uncertainties can be properly taken into account when locating seismic events.
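As a toy illustration of the Metropolis-style sampling underlying such probabilistic tomography (a deliberately reduced single-parameter version; the paper's Johnson-Mehl parameterization is far richer): sample a homogeneous slowness from noisy first-arrival times and read the model uncertainty off the posterior spread.

```python
import math
import random

def metropolis_slowness(distances, times, sigma, n_iter=20000, seed=1):
    """Metropolis sampling of a single slowness s (model t = s * d)
    under Gaussian picking noise; returns the chain of samples."""
    rng = random.Random(seed)

    def log_like(s):
        return -sum((t - s * d) ** 2
                    for d, t in zip(distances, times)) / (2 * sigma ** 2)

    s, ll = 0.5, log_like(0.5)  # deliberately poor starting model
    chain = []
    for _ in range(n_iter):
        s_new = s + rng.gauss(0.0, 0.01)
        ll_new = log_like(s_new) if s_new > 0 else -math.inf
        # accept with probability min(1, exp(ll_new - ll))
        if rng.random() < math.exp(min(0.0, ll_new - ll)):
            s, ll = s_new, ll_new
        chain.append(s)
    return chain

# Synthetic first arrivals: true slowness 0.25 s/km, 0.01 s pick noise
dists = [1.0, 2.0, 3.0, 4.0, 5.0]
times = [0.26, 0.49, 0.76, 1.01, 1.24]
post = metropolis_slowness(dists, times, sigma=0.01)[5000:]
mean = sum(post) / len(post)
spread = (sum((s - mean) ** 2 for s in post) / len(post)) ** 0.5
print(mean, spread)
```

The posterior spread is the uncertainty estimate: in a full tomography it is this spread, per model region, that can be propagated into event location uncertainties.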
-
-
-
Model uncertainty analysis for de-risking seismic image accuracy
Authors A. Bell, L. Russo, T. Martin and D. van der Burg
With exploration moving towards areas of increasing geological complexity, reservoir evaluation is often based on the interpretation of a single seismic image. Recovering a suitable velocity model for pre-stack depth migration plays a crucial role in the creation of this image, on which economic evaluations are drawn. Typical depth imaging projects deliver final velocity model attributes and their associated seismic image. The amount of uncertainty associated with this image is poorly understood; the only quantitative measures of reliability are provided through analysis of volumetric residual move-out and by comparison with available auxiliary data. We aim to address this situation by using the same tomography method we use to derive the model parameters. The workflow allows us to establish the resolution of the tomography in establishing the model parameters. It also enables us to determine the degree to which models may be perturbed prior to the tomography and still be recovered. Using the workflow in conjunction with these criteria, we generate a population of solution models which equally conform to the observed data. We then analyse the variance of this model population to derive confidence attributes to assign to both the target model and its associated seismic image.
-
-
-
Complexity in common image gather behaviour in offshore continental settings arising from velocity model building and imaging choices
Authors G. O'Brien, M. Igoe, J. Doherty, P. Matrice and R. Mecklenburgh
Common image gather complexity arising from uncertainties in the subsurface velocity model feeds into exploration risk through degradation in seismic attributes and facies prediction. In offshore continental settings where the overburden is geologically complex, the seismic wavefield exhibits complex behaviour when propagating through highly varying geological structures, which impacts the resultant seismic images and attributes. Using a representative synthetic model, the imaging choices and velocity model uncertainties are explored in light of maximizing facies prediction using seismically derived attributes.
-
-
-
What mistakes are we making while interpreting salt? Could FWI help?
Authors J. Dellinger, A. Brenders, X. Shen, I. Ahmed, J. Sandschaper and J. Etgen
Model studies indicate that our conventional salt-interpretation workflow, consisting of cascaded sequences of flooding with either salt or sediment velocities, migration, and picking, produces two distinct types of velocity errors: 1) small but ubiquitous positioning errors of the margins of the salt, and 2) large “chunky” errors where salt boundary reflections were grossly misinterpreted. Full-Waveform Inversion might be a solution, but to achieve success may require new kinds of data, improved algorithms, or most likely both.
-
-
-
Reduction of depth uncertainties using common offset RTM (COR) Gathers
Authors S. Liu, G. Rodriguez and F. Hao
Subsalt velocity estimation has presented significant challenges in the past. Ray-based methods suffer from the poor S/N ratios that result from sparse ray coverage beneath salt bodies. The use of common offset RTM (COR) gathers has been shown to decrease uncertainties in subsalt residual moveout estimation, which can then be used more reliably by tomographic algorithms to invert for more accurate velocities. Furthermore, COR gathers have been shown to improve salt velocity estimation in areas with sediment inclusions. Better ties to well information (sonic logs, formation markers) have validated the improved resultant velocity models.
-
-
-
Post-Migration Processing and Imaging in the Local Angle Domain
By Z. Koren
In many areas of interest, the available seismic data combined with the most advanced seismic modeling/imaging tools, well information, potential field data, and geological and geophysical constraints are still not sufficient to uniquely determine the complexity of the subsurface geological medium. There has been a continuous effort to enrich these data components in order to converge to a minimum set of plausible geological models that can be considered throughout the O&G exploration, development and production stages. The reliability of seismic imaging in complex geological areas depends on many factors. One of the most important is the ability to use the available recorded seismic data to illuminate subsurface image points from a wide range of directions and opening angles/azimuths between the incident and scattered waves. This multi-dimensional illumination challenge mainly depends on the density and extent of the seismic acquisition system and on the complexity and accuracy of the inverted subsurface geological model. Moreover, seismic imaging is classified into many categories, depending on the specific goal at each stage. For example, structure-oriented imaging for locating large-scale potential reservoirs differs significantly from high-resolution imaging at the reservoir for fracture detection. In this work I demonstrate the advantages of using a novel multi-dimensional local angle domain (LAD) system for enriching information from the available recorded seismic data, in order to obtain more reliable information about continuous and discontinuous subsurface target objects. In particular, I’ll briefly demonstrate the potential of using the mapped seismic data for different post-migration processing/imaging solutions: velocity model updating and re-migration; amplitude correction accounting for illumination, geometrical spreading, and absorption/dispersion (Q-correction); in-situ data reconstruction; and specular/diffraction imaging.
-