79th EAGE Conference and Exhibition 2017 - Workshops
- Conference date: 12 Jun 2017 - 15 Jun 2017
- Location: Paris, France
- ISBN: 978-94-6282-219-1
- Published: 12 June 2017
Experimental Investigation of Thermal Marangoni Effect on Bypassed Oil Recovery
Authors: M. Masoudi, B. Rostami, M. Khosravi and P. Abolhosseini
In this paper, the effect of the interfacial tension (IFT) gradient caused by temperature changes (the Bénard-Marangoni phenomenon) is investigated as a novel EOR method. To properly understand and visualize the mechanism, glass micromodels were used. Carbon dioxide and methane were injected separately into an n-decane-saturated micromodel. Gas injection experiments were conducted under different conditions, and the impacts of temperature and pressure were investigated. Cold gas injection was compared with isothermal gas injection as the zero level of Marangoni flow, and the impact of Marangoni flow was compared with other active production mechanisms. The results reveal that the IFT gradient due to the temperature change at the oil-gas interface induces a convective flow that improves oil recovery and compensates for the negative effects of other mechanisms by overcoming the capillary forces. The results show the significant impact of thermal Marangoni convection on the recovery of bypassed oil and introduce Bénard-Marangoni convection as an important mechanism of bypassed oil recovery, especially in low-pressure reservoirs.
IoT-based Wireless Networking for Geoscience Applications
Authors: H. Jamali-Rad and X. Campman
Nowadays, sensors are everywhere in the oil and gas industry, generating "Big Data". In many cases this data must be aggregated and coordinated in real time, sometimes in harsh environments. To address this, we have defined a unified wireless sensing framework that comprises three modules: cheap, low-power, long-range wireless sensors inherently compatible with the Internet of Things (IoT); advanced scalable wireless networking protocols; and cloud-based data storage/analytics for analysis and decision making. As a showcase, we present our seismic field test results with low-power wide-area networks (LPWANs) in the Netherlands.
Performing Successful Data Science in the Geoscience Domain
Authors: D. Irving and J. McConnell
To perform data science with scientific data, we must represent the scientific problem space to allow analytics. This requires a blend of traditional physics-based algorithms with modern advanced analytics, performed on datasets large enough to yield statistically robust insights. These exposed insights in the data must be explained by scientists, driving creative thinking, in contrast to application-driven workflows where line-of-sight to original data is typically absent. We show how an open approach to data parsing, storage and integration drives better understanding of data, and moreover, enables the deployment and development of open source tools for processing, analysing and visualising data and insights. Dealing with measurement data brings challenges of quality, sparsity and irregular sampling, in datasets that must be integrated in the spatial, time and frequency domains. This is time-consuming work, often taking 80% of the time of each analytical study, and so we recommend that data from the geoscience domain should be curated in a "load once, use many times" paradigm. Higher-level parameters can then be created to capture the scientific insights of multi-physics systems for use in one-off or operationalised descriptions of a system. After implementing this level of abstraction, the geoscientific world is ready for data science.
Fostering high-impact machine learning ecosystem in subsurface science and engineering
By M. Hall
The field of machine learning is experiencing a boom. The International Energy Agency predicts the 2020 analytics market in upstream petroleum alone will more than double to $10 billion. Previous such hype cycles, especially the one at the end of the 1980s, ended in a mass extinction event: artificial intelligence companies died off, funding seas dried up, and 'expert systems' became dirty words. Meanwhile, however, research continued under codenames like 'informatics', 'machine learning', 'big data', and 'data analytics'. Today, as the AI spring gives way to an AI summer, how can we give our projects the best chance of having the impact we believe they can have? As the petroleum industry moves into its autumnal years, I propose eight strategies for the computational science and engineering community to bring about the profound changes to its safety and operating efficiency that we all believe we can achieve. These strategies are well tested in other fields, and many of them have at least been tried in subsurface science and engineering.
Deep Learning on Hyperspectral Data for Land Use and Vegetation Mapping
Authors: N. Audebert, B. Le Saux, S. Lefevere, C. Taillandier and D. Dubucq
Remote sensing technology is a remarkable tool for exploring and measuring Earth's surface features. Total and ONERA have set up a collaborative partnership named New Advanced Observation Method Integration (NAOMI), which aims at adapting and developing new remote sensing techniques specifically targeted at hydrocarbon exploration and environmental protection. In this context, we apply deep learning to the classification of hyperspectral data. To detect different land uses and materials in aerial hyperspectral images, neural networks prove to be very efficient tools, as they are able to learn discriminant features that improve classification performance.
Machine Learning can extract the information needed for modelling and data analysis from unstructured documents
Authors: H. Blondelle, A. Juneja, J. Micaelli and P. Neri
Since its early days, the exploration and production industry has handled large volumes of data, mainly measurements, to build subsurface models used for strategic or technical decisions. More recently, data analytics technologies have emerged to complement the modelling tools, with notable successes in the domain of field monitoring. But the broader adoption of new analytical tools is made difficult due to the limited access to the large percentage of relevant data that is stored in unstructured formats. This issue is not new: modelling tools faced the same difficulty, but with a lower order of magnitude because each tool has a limited set of input data. Manual information extraction by skilled technicians from unstructured documents to feed sophisticated enterprise data models and modelling tools was acceptable even if it represented a poor use of a trained professional's time. Despite these efforts, it is estimated that only 20% of the information available in our industry is stored in structured, searchable databases. Analytical tools require much more than that to perform adequately. With the emergence of new analytics tools, our industry now has a much greater appetite for data than it has ever had before. Is Machine Learning the means to satisfy it?
Unsupervised identification of electrofacies employing machine learning
Authors: I. Emelyanova, M. Pervukhina, M. Clennell and C. Dyt
Machine learning techniques are widely used in petrophysics and geophysics to solve complex, non-linear problems of practical importance. In particular, numerous applications for identifying electrofacies from well logs have been reported. However, there is no unique approach for reliable automatic classification of electrofacies, as the accuracy of the applied algorithms may vary depending on the data and initial conditions. To overcome instability in the outcomes of individual algorithms, we suggest applying several different clustering techniques to the log data, in a way similar to ensemble learning, the popular supervised classification method. This ensemble of clustering outputs is then integrated into unique classes (electrofacies) for subsequent automated identification of lithofacies. Here we apply three different clustering algorithms, namely Spectral Clustering, Self-Organizing Maps and k-means, in order to reliably classify electrofacies at the petroleum exploration well Lauda-1, drilled in the Northern Carnarvon Basin (Western Australia). The clustering outputs integrated into electrofacies are validated against an expert facies classification. We show that some facies identified by the expert are not distinguished as separate classes, at least for the chosen well and selected logs. The established electrofacies can further be assigned to conventional lithofacies; this requires an expert system which is currently under development.
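The label-fusion step the abstract describes can be illustrated with a co-association matrix: count how often each pair of samples falls in the same cluster across runs, then merge pairs that co-cluster in a majority of runs. A minimal numpy sketch with hypothetical label vectors (not the authors' code or data; a real ensemble would cluster the co-association matrix hierarchically rather than with this simple union-find pass):

```python
import numpy as np

# Hypothetical label vectors for 6 depth samples from three clustering
# algorithms (e.g. spectral clustering, SOM, k-means); label values are arbitrary.
runs = np.array([
    [0, 0, 1, 1, 2, 2],
    [1, 1, 0, 0, 2, 2],
    [0, 0, 2, 2, 1, 1],
])

# Co-association matrix: fraction of runs in which samples i and j co-cluster.
n = runs.shape[1]
coassoc = np.zeros((n, n))
for labels in runs:
    coassoc += labels[:, None] == labels[None, :]
coassoc /= len(runs)

# Merge samples that co-cluster in a majority of runs (simple union-find).
parent = list(range(n))
def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

for i in range(n):
    for j in range(i + 1, n):
        if coassoc[i, j] > 0.5:
            parent[find(i)] = find(j)

consensus = [find(i) for i in range(n)]  # one consensus class id per sample
```

Here all three runs agree on the pairings, so the six samples collapse into three consensus electrofacies regardless of the arbitrary label values each algorithm used.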
Machine Learning Based Workflows in Exploration and Production
Authors: J. Limbeck, M. Araya, G. Joosten, A. Eales, P. Gelderblom and D. Hohl
In this presentation we cover a mapping between existing E&P workflow components and their data-science-based counterparts, as we have developed or envision them. We present one example from the geophysics domain, where deep neural nets are used to accelerate the seismic interpretation process (GeoDNN), and one example from the reservoir engineering domain (AutoSum), where machine learning is used to analyze a large ensemble of reservoir models.
Automated facies prediction in drillholes using machine learning
Authors: M. Blouin, A. Caté, L. Perozzi and E. Gloaguen
Machine learning is a popular topic in geosciences at the moment. It allows the management and interpretation of data in quantities and varieties (numbers of variables) that a human being could not handle. Rock physical properties acquired along drillholes can be used to generate predictions about the nature and characteristics of the rock while wireline logging is taking place. In this paper, we investigate the accuracy of facies prediction using machine learning algorithms, automatically interpreting geological rock types along drillholes from rock physical properties. A data-processing workflow is proposed to enhance the predictive power of the geophysical measurements, a model calibration approach is outlined, and predictions on test data are presented. Results show more than 80% correspondence between the automated prediction and the geologist's interpretation.
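A minimal supervised sketch of the kind of prediction described here — facies labels from rock physical properties — using a random forest on synthetic stand-in data (the classifier choice, property names and values are our illustrative assumptions, not the authors'):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for rock physical properties logged along a drillhole:
# columns = density, magnetic susceptibility, resistivity; three rock types.
n_per_class = 100
X = np.vstack([
    rng.normal(loc=[2.7, 0.1, 1.0], scale=0.1, size=(n_per_class, 3)),  # facies 0
    rng.normal(loc=[3.0, 0.5, 0.5], scale=0.1, size=(n_per_class, 3)),  # facies 1
    rng.normal(loc=[2.5, 0.0, 2.0], scale=0.1, size=(n_per_class, 3)),  # facies 2
])
y = np.repeat([0, 1, 2], n_per_class)

# Hold out 30% of the intervals as "geologist-interpreted" test data.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))  # correspondence with held-out labels
```

On such well-separated synthetic classes the accuracy is near perfect; the paper's >80% figure on real drillhole data is the more realistic benchmark.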
Optimising storage for high-speed data access to large volumes of data – recent advances and future direction
Historically, seismic and other data have been stored on tape media of various types. The density of data on tape has increased over time, and access times have improved as well, but they are not as fast as disk. Disk storage comes in various guises, from cheap commodity to highly resilient. A cost-benefit analysis shows that high-speed intelligent disks are more cost-effective than multiple cheap disks. Additional work has demonstrated that using object storage on disk significantly improves access times compared with standard B-tree storage. The advent of cheap solid-state storage has changed the way we view storage, and more and more systems are appearing with solid-state-only storage (laptops, phones, tablets and high-end analytics machines). Has the time now come to ditch tapes and disks and switch to solid state?
ForM@Ter: a data and services centre for Solid Earth
Authors: E. Ostanciaux, M. Mandea, M. Diament and O. Jamet
To ease the use of satellite and in situ Earth observation data, the French scientific community is developing four centres corresponding to the main physical compartments of the Earth: ERIS (atmosphere), ForM@Ter (solid Earth), ODATIS (ocean) and THEIA (land surfaces). These centres are developed within the framework of a single research infrastructure included on the French "large research infrastructures" roadmap, to be implemented in the coming years. The first ForM@Ter target focuses on surface deformation from SAR and optical imagery data. The associated services are implemented according to the needs expressed by the French scientific community to support the use of the huge data volumes such as those provided by the Sentinel missions. Within this context, we present the Ground Deformation Monitoring service, developed for scientific and private users to facilitate the exploitation of radar and optical data for ground motion monitoring applications; it contributes to the implementation of the ESFRI EPOS research infrastructure. A massive radar-data processing service is also being implemented, with the objective of providing displacement-map time series over large areas. It will be established using MUSCATE, a CNES computing infrastructure. These two services are based on PEPS, a CNES infrastructure hosting Sentinel products.
Technical Descriptions in Long-term 115 °C Borehole Digital Micro-seismic Monitoring at the PTRC Aquistore CO2 Sequestration Project
Authors: C. Nixon, D. Schmitt, R. Kofman, D. White et al.
A preliminary overview of digital downhole microseismic monitoring at the SaskPower Aquistore carbon capture and storage project in the Williston Basin is presented. The digital downhole monitoring system, which has been successfully deployed for up to six weeks at 2800 m and 115 °C, is described, along with technical experiences including difficulties and solutions in sustained operation at extreme conditions. 750 gigabytes of high-quality seismic monitoring data were obtained and reviewed for seismic events by moveout and signal-to-noise ratio. Teleseismic events, mine blasting and dynamite orientation shots were all easily identified. The present lack of induced seismicity is promising for carbon capture and storage in the Williston Basin, but the monitoring data are still being reviewed more carefully with automated selection algorithms.
Insights from applying Machine Learning techniques to Geosciences data from the Oil and Gas industry
By M. Tibbetts
Arundo is a US- and Norway-based data science services and software solutions company applying machine learning techniques across a wide range of heavy-asset industries. Our data scientists typically come from academic backgrounds where machine learning is regularly used, such as particle physics. We have recently been working with partner organizations from the oil and gas industry, applying our data science expertise to geoscience data, including seismic, from operating fields. We will summarize our experience of using machine learning with such data and how our wider experience of those techniques has been applied to geoscience use cases. We expect to present, for the first time, results of analyses carried out with our partner organizations. Finally, we will discuss how subsurface geoscience data and machine learning can be used to optimize maintenance and operations in the oil and gas industry.
Sensitivity analysis of synthetic seismograms in sedimentary basin with respect to uncertain seismological parameters
Authors: F. De Martin, P. Thierry, D. Keyes and E. Chaljub
This study focuses on understanding the variability and sensitivity of synthetic earthquake ground motions at sedimentary-basin scale associated with the epistemic uncertainties of the seismic wave propagation model. The key question at stake is: what is the spatiotemporal variability and sensitivity of seismograms at the surface with respect to seismological parameters? To achieve this objective, we describe the whole workflow, including pre- and post-processing around the main earthquake simulation engine. The stages are as follows: (1) definition of the initial model perturbation to generate a given set of input parameters; (2) simulation, including runtime filtering and post-simulation filtering and decimation to reduce the amount of output data; and (3) uncertainty quantification analysis on a parallel file system, including a Hadoop evaluation and multi-level MPI communicators, to obtain the final results of this global big-data application.
Automatic similarity measures to manage geoscience databases
By A. Fuga
In geoscience data management, the volume and complexity of data flows, as well as historical mergers of companies and databases, have raised harmonization, reconciliation and geo-referencing issues. Data duplication in databases wastes storage space and casts doubt on the different versions of the same seismic exploration data. The harmonization issue also appears when loading newly acquired or purchased data into the reference database. To ensure data quality as well as good data access and storage savings, this integration needs to be done without creating duplicates and without lowering database quality. To meet the urgency of data integration requests, this research and development work has led to the design of a methodology and software based on automatically computed multiscale contextual similarity. A new workflow has been adopted at TOTAL for the harmonization of 2D and 3D seismic navigation lines, well database reconciliation, geo-referencing of technical documents, and more. This work has demonstrated the capability to save 75% of the time data loaders, data managers or geophysicists spend on classical harmonization methods. Moreover, database visualization algorithms have been created for database harmonization and characterization, opening perspectives towards a more global and visual data management approach.
Channel Characterization Using Support Vector Machine
Authors: A. Mardan, A. Javaherian and M. Mirzakhanian
Rapid growth in the size of seismic data and the number of attributes has increased the importance of pattern recognition techniques in seismic interpretation. Unsupervised methods, including k-means, self-organizing maps (SOM) and generative topographic maps (GTM), let interpreters perform a preliminary interpretation and extract reasonably useful information without much prior data from the studied area. On the other hand, supervised methods such as neural networks (NN) and support vector machines (SVM) require some prior information from the studied area to seed the existing facies and use these seeded samples as input to the algorithm. In this study, to detect channel facies in one of the southwest hydrocarbon fields of Iran, we used k-means and SVM, training the latter with the information extracted by the former. Results show that the channels in the studied area comprise two different facies that can be detected by the applied algorithms.
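The two-step scheme — unsupervised k-means providing seed labels for a supervised SVM — can be sketched as follows (synthetic attribute data and scikit-learn models as stand-ins; not the authors' implementation):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Synthetic stand-in for two seismic-attribute populations
# (e.g. channel vs background facies) in a 2-D attribute space.
channel = rng.normal([0.0, 0.0], 0.3, size=(150, 2))
background = rng.normal([2.0, 2.0], 0.3, size=(150, 2))
X = np.vstack([channel, background])

# Step 1 (unsupervised): k-means yields pseudo-labels with no prior picks.
pseudo = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Step 2 (supervised): an SVM is trained on the k-means pseudo-labels,
# giving a decision boundary that can classify new attribute vectors.
svm = SVC(kernel="rbf").fit(X, pseudo)
agreement = (svm.predict(X) == pseudo).mean()
```

The payoff of the second step is the trained decision function: `svm.predict` can be applied trace-by-trace to a full seismic volume, whereas the k-means step alone only labels the samples it was fitted on.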
Advanced Machine Learning for Unconventional Plays
Improving capital efficiency in oil and gas exploration and production, particularly in unconventional (UNC) plays, has become vitally important for the industry. Since the existing geological and petrophysical methodologies and technologies that enjoy good success in conventional plays are evidently not as effective when applied to UNC plays, more effective approaches are in high demand. However, in oil and gas exploration, the most critical phase is the early land appraisal and initial development of so-called green fields, where the available data is usually scarce. This poses a great challenge to both domain experts and machine learning practitioners. How can machine learning and its related techniques be applied to help in early land appraisal and sweetspotting, to greatly improve capital efficiency? This paper describes our recent advances in developing a machine learning sweetspotting workflow and presents our results and findings in identifying higher production potential areas in an example. The workflow uses an imputation scheme and a more generic and powerful ensemble learning technique which combines the strengths of a set of different machine learning algorithms. Consequently, our new workflow has achieved very good results, with an R² of 0.83 from leave-one-out cross-validation.
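The workflow's three ingredients — imputing missing entries, an ensemble that combines different learners, and leave-one-out cross-validation scored with R² — can be sketched on synthetic stand-in data (the specific imputer, estimators and data here are our assumptions, not the paper's):

```python
import numpy as np
from sklearn.ensemble import (GradientBoostingRegressor, RandomForestRegressor,
                              StackingRegressor)
from sklearn.impute import SimpleImputer
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)

# Synthetic stand-in for a sparse green-field dataset: 40 wells, 4 attributes,
# production driven by two of them, with ~15% of the entries missing.
n = 40
X = rng.normal(size=(n, 4))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.3, size=n)
X[rng.random(X.shape) < 0.15] = np.nan

# Imputation feeds a stacking ensemble that combines different learners.
model = make_pipeline(
    SimpleImputer(strategy="mean"),
    StackingRegressor(
        estimators=[("rf", RandomForestRegressor(n_estimators=30, random_state=0)),
                    ("gb", GradientBoostingRegressor(random_state=0))],
        final_estimator=Ridge()),
)

# Leave-one-out predictions, scored with a single global R^2.
pred = cross_val_predict(model, X, y, cv=LeaveOneOut())
r2 = r2_score(y, pred)
```

Leave-one-out is the natural choice when wells number in the tens: each well is predicted by a model that never saw it, while nearly all the scarce data is still used for training.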
Carbonate Reservoir Cementation Factor Modeling Using Wireline Logs and Artificial Intelligence Methodology
Authors: F. Anifowose, C. Ayadiuno and F. Rashedian
An approach comprising statistical and artificial intelligence techniques for modeling the rock cementation factor in a Saudi Arabian carbonate reservoir using wireline logs is presented. The objective is to obtain a more accurate prediction of the rock cementation factor, denoted by the exponent m in Archie's equation, as a variable log using multivariate linear regression (MLR), artificial neural networks and support vector machines. Published equations by Nugent, Lucia and Shell are empirical derivations based on porosity logs and assumptions that may not be applicable in other geological settings. Typically, log analysts use the average of m values obtained from special core analysis (SCAL) measurements. Such constant values do not account for formation heterogeneity, resulting in inaccurate water saturation and pore volume estimates with high operational and economic costs. In this study, six wireline logs from seven wells were combined with their corresponding core-measured m values to build and optimize the proposed models to predict m for new wells or uncored sections of existing wells. The m values predicted by the MLR model closely matched the available m data from SCAL measurements. This study fulfills the pressing need for a variable m as a more accurate input to water saturation models.
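The MLR idea can be made concrete: fit m from the six wireline logs by least squares, then feed the variable m into Archie's water-saturation equation, Sw = ((a/φ^m)(Rw/Rt))^(1/n). A numpy sketch on synthetic logs (the regression coefficients and petrophysical constants are illustrative only, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for six wireline logs at cored depths, plus
# core-measured cementation exponents m (coefficients are invented).
n = 200
logs = rng.normal(size=(n, 6))
true_coef = np.array([0.15, -0.10, 0.05, 0.0, 0.08, -0.04])
m_core = 2.0 + logs @ true_coef + rng.normal(scale=0.02, size=n)

# Multivariate linear regression: m ≈ b0 + logs @ b, via least squares.
A = np.column_stack([np.ones(n), logs])
coef, *_ = np.linalg.lstsq(A, m_core, rcond=None)
m_pred = A @ coef  # variable m along the well

# The variable m feeds Archie's equation for water saturation:
#   Sw = ((a / phi**m) * (Rw / Rt)) ** (1 / n_sat)
a, n_sat, Rw = 1.0, 2.0, 0.05   # tortuosity, saturation exponent, brine resistivity
phi, Rt = 0.20, 10.0            # porosity and deep resistivity at one depth
Sw = ((a / phi ** m_pred[0]) * (Rw / Rt)) ** (1.0 / n_sat)
```

Replacing a constant, core-averaged m with `m_pred` evaluated depth-by-depth is exactly what lets the saturation estimate track formation heterogeneity.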
Adjusting well plan trajectory through 3-D seismic litho-fluid classification: a case study
Authors: K. Kazemi and M. Delnava
Adjusting geological well plan trajectories with geophysical methods can be an effective way to prepare more precise and accurate plans for drilling new wells. The main objective of this study was to check the proposed well path against geophysical sections and to position the well trajectory with respect to the target layers, reducing drilling risks and costs as much as possible. First, a pre-stack simultaneous seismic inversion was conducted to generate acoustic impedance (AI) and Vp/Vs ratio cubes. Next, elastic and petrophysical well log data were evaluated to determine different litho-fluid classes, including hydrocarbon sand (HC Sst.), wet sand (Wet Sst.) and shale. Bayesian-derived probability density functions (PDFs) for each litho-fluid class were calculated from well-log computations of AI and Vp/Vs. Using the PDFs and the pre-stack seismic inversion results, probability cubes for the individual litho-fluids, in addition to a final litho-fluid cube, were calculated. Based on these results, the well plan trajectory was adjusted to pass through the well-defined HC Sst (the target reservoir layer). The results of this study illustrate the usefulness of litho-fluid cubes derived from 3D seismic data in reducing drilling risks and costs.
Subsurface Integrity Management of UGS
Authors: F. Favret, R. Del Potro and M. Diez
Well integrity has been specifically documented, in particular in NORSOK D-010 and ISO/TS 16530, covering leak paths and subsequent risk assessment, well barrier elements (WBE), monitoring and maintenance. Several companies have developed their own methodologies for well integrity management, including new software developed to help operators plan and optimize maintenance based on lessons learned from well equipment failures. However, to ensure the safety of a storage facility, well integrity management has to be complemented by storage integrity management (Bonnier et al., 2015). Getting information on storage status during operation is difficult, since these huge storage volumes cannot be accessed for direct in-situ controls. For example, in leached salt caverns, indirect observations are possible but the required logistics prevent continuous observation. Here we present two complementary indirect methods that contribute to the assessment of storage integrity: microseismic monitoring and PVT modelling. This joint approach is applicable to all storage types. A case of a multi-salt-cavern gas storage facility is presented in this document.
How widespread is induced seismicity in Canada and the USA?
Authors: M. van der Baan and F. Calixto
The seismicity rate in Oklahoma over the last 5-8 years correlates with increased large-scale hydrocarbon production. In contrast to Oklahoma, analysis of oil and gas production versus seismicity rates in six other US states and three Canadian provinces finds no state- or province-wide correlation between increased seismicity and hydrocarbon production, despite 8-16-fold increases in production in some states, including North Dakota (Bakken Formation) and Pennsylvania and West Virginia (Marcellus Shale). However, in various areas, seismicity rates have increased locally.
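The region-wide comparison amounts to correlating annual production with annual seismicity rate. A toy illustration of the "no region-wide link" case with invented numbers (not the paper's data):

```python
import numpy as np

# Hypothetical annual series for one region: production rises ten-fold
# while the seismicity rate stays essentially flat.
production = np.array([1.0, 2.0, 4.0, 6.0, 8.0, 10.0])  # relative units
quakes = np.array([3, 4, 2, 3, 4, 3])                   # events per year

# Pearson correlation; a value near zero indicates no region-wide coupling.
r = np.corrcoef(production, quakes)[0, 1]
```

The paper's point is exactly this contrast: Oklahoma-style series show a strong positive r, while the other states and provinces examined do not, even though local pockets of induced seismicity exist within them.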
Case study: Lacq pilot CO2 storage in old gas field - microseismic monitoring
Authors: J. Barnavol and X. Payre
…or reservoir seal integrity and supply all seismological metrics and attributes necessary for seismic risk management and population information. During the monitoring (2009-2015), 2637 events were detected and 717 were located inside the surveillance perimeter. Event magnitudes ranged between -2.6 and +1.1, in two distinct clusters: one in the vicinity of the injection point (reservoir perimeter) and another close to the "Meillon/Saint-Faust" fault complex, located 2 km north of the injection site (local perimeter) (figure below). The seismic monitoring at this CO2 storage pilot shows a strong influence of the injection within the reservoir perimeter (≈1 km around the injection point). At the larger scale (local perimeter), the study did not allow reliable assessment of possible seismicity-rate changes during injection operations. Nevertheless, this "multi-scale" seismic network design fully achieves its assigned goals in terms of risk management, population information and injection mapping.
Dynamics of Fault Activation by Hydraulic Fracturing in Overpressured Shales
By D. Eaton
Fluid-injection processes can induce earthquakes by increasing pore pressure and/or shear stress on faults. Natural processes, including transformation of organic material (kerogen) into hydrocarbon, can similarly cause fluid overpressure. Here we document examples where earthquakes induced by hydraulic fracturing are strongly clustered within areas characterized by pore-pressure gradient in excess of 15 kPa/m. By contrast, induced earthquakes are virtually absent in the same formations elsewhere. Monte Carlo analysis indicates that there is negligible probability that this spatial correlation developed by chance. A detailed analysis was undertaken within a region in Alberta, Canada where uniquely comprehensive data characterize dynamic interactions between seismicity and well completions. Seismicity is strongly clustered in space and time, exhibiting spatially varying persistence and activation threshold. The largest event (ML 4.4) can be reconciled with a previously postulated upper bound on magnitude, only if the cumulative effect of multiple treatment stages is considered. Induced seismicity from hydraulic fracturing reveals contrasting signatures of fault activation by stress effects and fluid diffusion. Patterns of seismicity indicate that stress changes during operations can activate fault slip to an offset distance of > 1 km, whereas pressurization by hydraulic fracturing into a fault yields episodic seismicity that can persist for months.
Effective stress drop of fluid induced seismicity
Authors: T. Fischer and S. Hainzl
In this paper we test how the effective stress drop compares to the static stress drop of a single earthquake rupturing the same fault portion. To this end, we compare the spatiotemporal evolution of the seismic moment release and analyze the uncertainties of the resulting stress drop estimates. We show that the effective stress drop is only comparable to earthquake stress drops in specific cases. In particular, effective stress drop values significantly underestimate earthquake stress drops in the presence of aseismic deformation. Furthermore, the values are only scale-independent if pre-stress and post-stress conditions are uniform in space. Our analysis of data from injection-induced seismicity, natural earthquake swarms and aftershock sequences shows that in most cases the effective stress drop estimate is rather stable during cluster evolution. Slightly increasing estimates for injection-induced seismicity are indicative of the local forcing of the system, while overall low effective stress drop values hint at the important role of aseismic deformation. While typical values of up to 1 MPa are found for seismicity associated with geothermal reservoir stimulation, anomalously small effective stress drops occur for hydraulic fracturing of tight sands and shales, which may indicate aseismic deformation during these treatments.
Scaling of induced seismicity: implications for the role of geological setting on seismic hazard
Authors: G. Viegas, A. Baig and T. Urbancic
In this study we present contrasting stress drop estimates of injection-induced events in two regions of the Western Canadian Sedimentary Basin: Horn River Basin events show stress drops a factor of 10 to 20 lower than Duvernay events. We propose that the observed stress drop differences are caused by different regional stress characteristics, assuming the seismic events are generated during similar injection programs. A potential difference is that the Fox Creek region is characterized by the presence of reefs in the Leduc Formation that cross-cut the Duvernay shale formations, which drape over the reef/off-reef facies (Stoakes, 1980). We suggest that differences in stress drop reflect differences in the regional stress state, with events occurring in more highly stressed regions having higher stress drops, i.e. being able to release larger quantities of stored elastic strain. Higher-stress-drop earthquakes play a significant role in seismic hazard, as they generate higher-frequency strong ground motions which can potentially cause more damage.
Mechanisms driving earthquake faulting during a case of injection-induced seismicity
Authors: M. Diez and R. del Potro
Induced seismicity is currently one of the main geomechanical and environmental challenges faced by the underground industry. Current efforts focus mainly on seismic monitoring and risk assessment, while the mechanisms that drive induced seismicity have been addressed to a lesser extent. Here we explore stability conditions for unstable slip, and potential dynamic weakening mechanisms, to explain earthquake faulting in a case of injection-induced seismicity. Injection of natural gas into the Castor Underground Gas Storage facility, offshore Spain, which generated a ~2 bar pressure increase, induced a seismic swarm that culminated in a series of Mw ~4 earthquakes two weeks after shut-in. We focus on frictional weakening and on the dynamic weakening effect of shear-heating-induced thermal pressurization to explain Mw >3 earthquake faulting during the Castor sequence. The mechanisms we explore may help improve understanding of injection-induced seismicity in regions of low natural seismicity, where the external forcing, i.e. the amplitude of the stress perturbation, is relatively small.
Verification of Network Design for Induced Seismicity
Induced seismicity monitoring for hydrocarbon or geothermal energy extraction is usually designed to meet political or environmental goals and limitations (e.g. UK limitations on magnitude-of-completeness thresholds). Operators therefore seek monitoring network designs that meet or exceed these goals or limitations. Generally, before starting any microseismic monitoring, the array geometry has to be designed to achieve optimal network performance and to fulfil all demands for obtaining the required data quality. Design of the monitoring array should follow several rules. For surface monitoring networks, proper detection and location of events requires station spacing approximately twice as large as the expected depth of the seismic events. In addition, seismic noise in some areas may significantly decrease the monitoring network's performance. Last but not least, network performance depends on the assumed velocity and attenuation models. All of these factors significantly affect network performance, and it is a challenge to verify that the predicted performance will be achieved in practice. Additionally, assessing network performance in seismically quiet areas is particularly challenging, as there is no seismicity to benchmark against before the start of operations (e.g. Gaucher, 2016).
-
-
-
Examining the capability of statistical models to mitigate induced seismicity during hydraulic fracturing of shale gas reservoirs
Authors J. Verdon and J. KendallIn this paper we test the ability of statistical methods to estimate the expected size of the largest event during stimulation, applying these approaches to two datasets collected during hydraulic stimulation of a North American Devonian Shale. We apply these methods in a prospective manner: using the microseismicity recorded during the early phases of a stimulation stage to make forecasts about what will happen as the stage continues. We do so to put ourselves in the shoes of an operator or regulator, where decisions must be taken based on data as they are acquired, rather than in a post hoc analysis once a stimulation stage has been completed. We find that the proposed methods are able to provide a reasonable forecast of the largest event to occur during each stage. This means that these methods can be used as the basis of a mitigation strategy. Applying such a strategy to our case studies, we find that the majority of stages would have been allowed to continue as planned, while the need for mitigation would have been identified for all of the stages that ended up inducing larger events.
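The abstract does not spell out its statistical model, but a common ingredient of such prospective forecasts is a Gutenberg-Richter extrapolation from the early-stage catalogue. The sketch below is a generic illustration of that idea, not necessarily the authors' method; the magnitudes and event counts are invented.

```python
import math

def b_value(magnitudes, m_min):
    """Aki (1965) maximum-likelihood b-value for events above completeness m_min."""
    mags = [m for m in magnitudes if m >= m_min]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - m_min)

def expected_m_max(m_min, b, n_events):
    """Expected largest magnitude among n_events above m_min, assuming the
    catalogue follows an unbounded Gutenberg-Richter distribution."""
    return m_min + math.log10(n_events) / b

# Hypothetical early-stage catalogue for one stimulation stage
early = [-1.8, -1.5, -1.2, -1.6, -0.9, -1.4, -1.1, -1.7, -1.3, -1.0]
b = b_value(early, -1.8)
# Forecast the largest event if the stage ends with ~200 events of this size
print(round(b, 2), round(expected_m_max(-1.8, b, 200), 2))
```

A mitigation rule could then compare this forecast against a regulatory magnitude threshold as each new event arrives, stopping the stage when the forecast exceeds it.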
-
-
-
Microseismic Geomechanical Evaluation of Fault Slip Associated with Hydraulic Fracturing
By S. MaxwellThe paper describes applying a coupled hydraulic-mechanical model and the resulting seismicity catalogue to explore repeated fault activation and seismicity patterns associated with multi-stage hydraulic fracturing. The model is also used to explore seismic risk mitigation by changing the viscosity of the fracturing fluid.
-
-
-
Integration of geomechanical modeling with induced seismic source mechanisms to assess deformation and stress changes
Authors D. Angus, G. Viegas, T. Urbancic and A. BaigThe recording of induced seismicity plays a significant role in monitoring geo-industrial activities, providing improved understanding of failure mechanisms as well as a quantitative tool for risk assessment. The induced seismicity is due to local perturbations to the in situ stress field from such industrial activities. Significant work is being directed at developing fracture models that enable the assessment of local deformation and stress evolution due to stimulation design. Passive seismic monitoring provides an additional data source to study the stress field evolution, by imaging the in situ response of the rock mass over time and characterizing the failure mechanisms and high-order inelastic response of the rock mass. We integrate microseismic data with a geomechanical model to quantify deformation and stress field evolution. Our approach utilizes moment tensor solutions to represent localized discrete rupture zones. The geomechanical algorithm evaluates the Green’s functions for each rupture, and subsequently calculates the co- and post-seismic deformation using linear superposition. We apply this workflow to a passive seismic dataset to validate the technique as well as to provide insight into the observed seismicity response due to a large (~M5) event.
-
-
-
Production Induced Seismicity in the Netherlands - From quick-scan to advanced models
More LessApproximately one out of six producing onshore gas fields (both the Groningen Field and small fields in the north-western part of the country) in the Netherlands experiences production-induced seismicity. Poro-elasticity and related differential compaction have been identified as the prime mechanisms causing this seismicity. The observed onset of induced seismicity in the Netherlands occurred after a considerable pressure drop in the gas fields. A large range of methods has been applied to study the background of Dutch seismicity, ranging from quick scans to advanced 3D geomechanical modelling studies. We have shown that both simplified 2D and full-field 3D geomechanical models can be used to model the onset of reactivation and to identify faults that are prone to be reactivated. One of the approaches we used was the inclusion of dynamic rupture modelling in traditional geomechanics workflows. In dynamic rupture modelling, traditional static friction laws are replaced by dynamic slip evolution. This enabled us to include realistic nucleation and propagation of the seismic events, as, e.g., the extent of the rupture area and the slip displacements could be determined. We present an overview of the different methods that have been used to better understand induced seismicity in the Dutch onshore gas fields, including our dynamic rupture modelling method and an outlook on what this could mean in the future.
-
-
-
Numerical modelling of production-induced stress changes and fault reactivation in Rotliegend gas fields of the North German Basin
Authors C. Haug, A. Henk and J. NüchterProduction-induced seismicity is an increasing challenge for the E&P industry. Related poroelastic stress changes in reservoirs with complex geometries, their interference with tectonic stresses and their interaction with faults cannot be sufficiently explained by analytical models. Here, we develop generic numerical models to study production-induced stress changes and fault reactivation in and near compartmentalized gas reservoirs. The models are inspired by features of Rotliegend gas fields of the Northern German Basin but do not describe a specific reservoir. In a linear-elastic model series I, stress changes during pore pressure drawdown are investigated for different model parameters. Field properties leading to an increased tendency of fault reactivation are, among others, a high Biot-Willis coefficient, a locally reduced overburden load and a large reservoir thickness. In model series II, a contact surface pair simulating a simplified fault is incorporated and the mechanical response of the contact surface to different schemes of absolute stress development is simulated. For high friction coefficients the contact surface stays stable with production, while for µ<0.6 the contact surface slips after a stage of poroelastic stress increase. Modelling results provide insight into the mechanisms that control production-induced poroelastic stress changes in compartmentalized reservoirs with complex geometries.
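The fault-stability test behind results like "slips for µ < 0.6" is the standard Coulomb criterion on Terzaghi effective normal stress. A minimal sketch follows; the stress values are chosen purely for illustration and are not from the paper's Rotliegend-inspired models.

```python
def slip_tendency(tau, sigma_n, pore_pressure, cohesion=0.0):
    """Critical friction coefficient at which a surface starts to slip:
    ratio of (shear stress - cohesion) to Terzaghi effective normal stress."""
    sigma_eff = sigma_n - pore_pressure
    return (tau - cohesion) / sigma_eff

def is_stable(tau, sigma_n, pore_pressure, mu):
    """Coulomb criterion (cohesionless): stable while tau <= mu * (sigma_n - p)."""
    return tau <= mu * (sigma_n - pore_pressure)

# Illustrative stress state (MPa): 30 shear, 80 normal, 25 pore pressure
print(round(slip_tendency(30.0, 80.0, 25.0), 3))  # critical mu ~ 0.545
print(is_stable(30.0, 80.0, 25.0, 0.6))           # True: holds at mu = 0.6
print(is_stable(30.0, 80.0, 25.0, 0.5))           # False: slips at mu = 0.5
```

In a production scenario the modelled poroelastic stress path changes tau, sigma_n and p together, which is why the full numerical models in the paper are needed rather than a single static check like this.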
-
-
-
Pore-scale Processes in Amott Spontaneous Imbibition Tests
Authors M. Rücker, W.B. Bartels, M.A. Boone, T. Bultreys, H. Mahani, S. Berg, A. Georgiadis, S.M. Hassanizadeh and V. CnuddeWe observed the redistribution of the oil phase in the pore space of the rock in real time, in water-wet and mixed-wet (by ageing in crude oil) carbonate samples. During the imbibition of the water phase, both pore-filling events connected to the surrounding brine and snap-off events connected only through water films were detected. The distribution of the oil over different pore sizes, as well as the different event types, helps to identify the wettability state of the system and to understand how pore-scale processes lead to oil production at the larger scale.
-
-
-
Differential imaging of porous plate capillary drainage in laminated sandstone rock using X-ray micro-tomography
Authors Q. Lin, B. Bijeljic, H. Rieke and M. BluntThe experimental determination of representative capillary pressure curves as a function of saturation is of utmost importance for determining the initial reservoir fluid distribution and the subsequent flow properties under production. We design an experimental procedure to image porous plate capillary drainage using X-ray micro-tomography based on differential imaging for a laminated sandstone micro core (4.86 mm in diameter). The pore structure, including the sub-resolution micro-pores, was characterised and quantified using both the initial dry scan and the scan fully saturated with Potassium Iodide (KI) doped brine (30 wt%). During the porous plate capillary drainage, nitrogen (N2) was injected at a constant pressure and the capillary pressure was controlled by the pressure drop through the core sample. A full capillary pressure curve against saturation, from 0 to 1.17 MPa, is obtained from the image analysis and compared with the Special Core Analysis (SCAL) for the original core (35 mm in diameter). We are also able to discern that brine remained predominantly within the sub-resolution micro-pores, such as in regions of fine lamination. Moreover, brine covering the rock grain surfaces and in the corners of the macro-pores can also be visualised.
-
-
-
Estimation of source time functions and yields of explosions directly from seismograms using the cube-root scaling law
More LessI estimate the source time functions and yields of explosions directly from seismograms. The method requires seismograms at a single receiver for two events of different size at the same source location and eliminates the path effect between source and receiver by finding a ratio filter that shapes the seismogram of the smaller event to the seismogram of the larger. If the noise is small, the convolution of the filter with the source time function of the smaller event yields the source time function of the larger event. The two source time functions are also related by the well-known scaling law in which the injected volume is proportional to the yield and the time constant is proportional to the cube-root of the yield. These two independent equations are solved for the two source time functions by a trial-and-error method that gives the ratio of the yields. The two seismograms are then deconvolved to recover two estimates of the Green’s function. Applying the method to seismograms from the 2009 and 2013 North Korean underground nuclear tests gives yields of 6 and 16 kt, respectively. This method has applications in seismic exploration on land using a dynamite source.
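The cube-root scaling law invoked in this abstract ties the two source time functions together: injected volume scales with yield W and the time constant with W^(1/3), so the amplitude must scale with W^(2/3) for the time integral (volume) to scale with the full yield ratio. A small sketch of that stretch-and-scale relation; the triangular pulse and yield ratio are illustrative, not from the North Korean test data.

```python
def scale_source_time_function(s_small, yield_ratio, dt):
    """Stretch-and-scale a sampled source time function by the cube-root law.

    Time axis stretches by k = yield_ratio**(1/3); amplitude scales by
    yield_ratio / k = yield_ratio**(2/3), so the time integral (injected
    volume) scales by the full yield ratio. Linear interpolation on dt grid.
    """
    k = yield_ratio ** (1.0 / 3.0)
    n_out = round(len(s_small) * k) + 1
    out = []
    for i in range(n_out):
        t = i * dt / k                           # map back to small-event time
        j = int(t / dt)
        if j + 1 >= len(s_small):
            out.append(0.0)
            continue
        frac = t / dt - j
        val = (1.0 - frac) * s_small[j] + frac * s_small[j + 1]
        out.append(yield_ratio / k * val)        # amplitude scaling
    return out

# Triangular pulse, yield ratio 8 -> duration doubles, volume scales by 8
small = [0.0, 1.0, 2.0, 1.0, 0.0]
large = scale_source_time_function(small, 8.0, dt=1.0)
```

The trial-and-error search in the abstract effectively inverts this relation: it finds the yield ratio whose stretch-and-scale makes the two source time functions consistent with the observed ratio filter.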
-
-
-
Passive seismic imaging of structure discontinuities around the active fault using scattered earthquake waveforms
More LessFor a seismically active fault with a relatively dense seismic network, seismic tomography using first-arrival times from earthquakes is routinely applied to image fault zone structures. To characterize structural discontinuities around the active fault, seismic scattering imaging using scattered waves can be applied. This technique has been extensively used with active seismic sources, especially in the oil/gas industry. However, scattering imaging using waveforms from earthquakes is rare. Here we present imaging results using scattered SH waves from earthquakes around the SAFOD, California. Near-vertical reflectors are clearly imaged around the San Andreas fault (SAF), similar to results obtained using scattered P-P waves. However, for the strike-slip focal mechanisms of earthquakes along the SAF, first-arrival polarities for P waves differ between regions, whereas SH waves have the same polarities. Moreover, because the wavelength of P waves is longer than that of S waves, the resolution of the imaging result is higher when using scattered S waves. For this study, we found coherent phases (scattered SH waves) after the direct SH waves. Overall, the imaging results using different types of waves are very similar, thus supporting the reliability and usefulness of using passive seismic events to image structural discontinuities.
-
-
-
A unified inversion scheme for diffractions and passive events
Authors B. Schwarz, A. Bauer and D. GajewskiWe present a unified inversion scheme that can likewise be applied to passive events and different active-source diffraction data configurations. For the active case, we specifically target the weak but highly illuminating diffracted background through automated adaptive subtraction of the dominant reflected contributions, while only making use of the available near-offset channel. Based on local stacking and coherence evaluation, the recorded diffracted events are treated as passive source wavefields, which are characterized in terms of local properties of wavefronts emerging at the registration surface. In an industrial field data example offshore Israel, we show that the diffraction-based wavefront inversion of only the near-offset channel leads to results that are in reasonable agreement with available geological interpretations. By application to an academic low-fold dataset recorded near Santorini, we demonstrate that the inversion scheme offers the opportunity to construct laterally resolved depth-velocity models even in the absence of large-offset recordings. In addition, we suggest a simple, fully data-driven strategy to globally link the independently performed local coherence measurements that share the same origin in depth. Through this global characterization of events, we propose event-consistent statistics, which allow us to estimate mean scattering or passive source locations and to conveniently assess location uncertainties.
-
-
-
Wave field inversion of ambient seismic noise
Authors S.A.L. de Ridder and J.R. MaddisonWe formulate a full wave field inversion for ambient seismic noise recorded by large and dense seismograph arrays. Full wave field inversion exploits the constraints on the gradients of the wave field that array data inherently possess (the sum is greater than its parts). Consequently, we can relax the spatial and temporal constraints on the wave field source functions in the seismological inverse problem. The result is that we become insensitive to the noise sources in, and changing character of, the ambient seismic field. In principle the formulation holds equally for ambient noise wave fields and for wave fields excited by controlled sources. We support the theory with examples in one dimension in the time domain, and in two dimensions in the frequency domain; the latter is of interest for inverting surface-wave ambient noise for phase-velocity maps. We include checkerboard tests for geometries mimicking USArray and Ocean Bottom Cables.
-
-
-
Imaging seismic anisotropy in a shale gas reservoir by combining microseismic and 3D surface reflection seismic data
Authors W. Gajek, J. Verdon, M. Malinowski and J. TrojanowskiA strong VTI fabric can dominate the influence of weaker azimuthal anisotropy on seismic wave propagation, causing ambiguities and making it challenging to invert geophysical observations for fracture orientations and densities. We employ the shear-wave splitting (SWS) technique on a microseismic dataset collected in a vertical borehole during a hydraulic stimulation of a shale gas target in northern Poland to image the fracture strike and density masked by a strong VTI signature. In order to overcome the VTI fabric influence and enhance the inversion stability, we integrate SWS data with parameters obtained from the surface 3D seismic survey. We successfully image a pre-existing vertical fracture set not aligned with the maximum horizontal stress. The obtained results are consistent with the fracture strike and crack density interpreted from well-log data.
-
-
-
Seismic Uncertainty and Ambiguity
Authors K. Mosegard, A. Zunino, N. Frandsen and P. ChristiansenThe link between seismic data and subsurface properties suffers from an intrinsic ambiguity, i.e., many reservoir models fit the same data within the noise. In some pathological cases, this may cause biases in the interpretation of the structure of the earth models used in exploration and reservoir management. Inversion techniques for the large seismic data sets encountered in the oil industry are well established and are assumed to be reliable. Although this is generally true, thanks to integrated knowledge from geology and other geophysical data, there is in some cases still a significant risk that traditional approaches may end up finding only part of the models which can explain the observed data, overlooking potentially different scenarios and, moreover, hampering a correct uncertainty quantification. This phenomenon is often observed in practice when different inversion contractors arrive at significantly different results from the same data sets. The impact of this unavoidable non-uniqueness should be assessed when performing inversion of seismic data. We investigate the magnitude of the ambiguity problem in seismic modelling of chalk reservoirs by explicitly taking ambiguity into account in the inverse problem. Our study is based on a carefully selected test case from the Danish North Sea sector.
-
-
-
Multiphysics uncertainty analysis and considerations: a toolkit for interpreters
By M. MantovaniThe use of non-seismic data is intended to reduce geophysical equivalence (non-uniqueness). Traditionally, the mutual agreement imposed on the properties is supposed to reduce the degrees of freedom in shaping the final model of the earth. Nevertheless, in many cases it is difficult to quantify how much influence the non-seismic data have on the reduction of the geophysical ambiguity. The presentation attempts to exemplify a typical situation of velocity determination with seismic-only solvers, and provides some reference numbers for estimating the reduction of geophysical equivalence and the decrease in the uncertainty of results when seismic and non-seismic data are combined.
-
-
-
Tomographic model uncertainties and their effect on imaged structures.
Authors M. Reinier, J. Messud, P. Guillaume and T. RebertWe demonstrate a recently developed method for computing tomography model uncertainties and mapping them into the migrated domain. After the final tomography, the method generates a series of equi-probable velocity model perturbations within a standard-deviation confidence level. This allows computing standard-deviation-like attributes for velocity and anisotropy parameters and for key horizons. An application to a West of Shetland dataset highlights the value of the estimated uncertainties.
-
-
-
Anisotropic Earth model building and some sources of uncertainty in the results
By O. ZdravevaAnisotropic earth model building (EMB) is a challenging task: even when we use best-quality modern workflows with non-seismic data and information to better constrain the problem, results are inherently non-unique. Methods for quantifying the uncertainty of earth models for seismic imaging exist and their successful application has been demonstrated in the past. All of them assume that the available model is a close representation of the true earth and is accurate enough, i.e. it explains at least all available seismic and borehole data. In addition, they rely on some extra knowledge and information for the area under investigation to be brought in to form priors. This abstract discusses and illustrates the EMB sensitivity and uncertainty associated with: (1) the absence of sufficient measurements complementary to surface seismic data; (2) inaccuracy in salt geometry and subsalt velocities; and (3) Q-compensation methods and parameterization. It emphasizes the need to validate earth models thoroughly before conducting uncertainty analysis and, if needed, to further update the models and to ensure that all limitations and assumptions of the data conditioning and EMB validation are factored into the priors for further uncertainty analysis.
-
-
-
Uncertainty estimation by probabilistic first arrival time tomography using Markov Chain Monte Carlo sampling
Authors A. Gesret, J. Belhadj, T. Romary, M. Noble and N. DesassisWe present several applications of probabilistic first-arrival time tomography by Markov Chain Monte Carlo sampling dedicated to uncertainty estimation. In the first part, we introduce a new velocity model parameterization based on Johnson-Mehl tessellation that allows applying the probabilistic approach to typical seismic refraction data. We also present results of the tomography on a real data set recorded in the context of hydraulic fracturing, and illustrate how the velocity model uncertainties can be properly taken into account when locating seismic events.
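As a toy illustration of the Markov Chain Monte Carlo machinery (not the authors' Johnson-Mehl parameterization), the sketch below samples a single-layer velocity from synthetic first-arrival times with a random-walk Metropolis algorithm and summarises the posterior spread. All offsets, noise levels and priors are invented.

```python
import math
import random

random.seed(0)
offsets = [100.0, 200.0, 300.0, 400.0]            # m, source-receiver distances
true_v = 2000.0                                    # m/s, hidden "true" velocity
sigma = 0.005                                      # s, picking-noise std
data = [d / true_v + random.gauss(0.0, sigma) for d in offsets]

def log_likelihood(v):
    """Gaussian misfit of straight-ray first-arrival times t = d / v."""
    return -sum((t - d / v) ** 2 for d, t in zip(offsets, data)) / (2.0 * sigma ** 2)

samples, v = [], 1500.0                            # start away from the truth
ll = log_likelihood(v)
for _ in range(20000):
    v_prop = v + random.gauss(0.0, 50.0)           # random-walk proposal
    if 500.0 < v_prop < 5000.0:                    # uniform prior bounds
        ll_prop = log_likelihood(v_prop)
        if random.random() < math.exp(min(0.0, ll_prop - ll)):  # Metropolis rule
            v, ll = v_prop, ll_prop
    samples.append(v)

post = samples[5000:]                              # discard burn-in
mean_v = sum(post) / len(post)
std_v = (sum((s - mean_v) ** 2 for s in post) / len(post)) ** 0.5
print(round(mean_v, 1), round(std_v, 1))           # posterior mean and spread
```

The posterior spread (std_v) is exactly the kind of velocity uncertainty that the abstract proposes to propagate into event location.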
-
-
-
Model uncertainty analysis for de-risking seismic image accuracy
Authors A. Bell, L. Russo, T. Martin and D. van der BurgWith exploration moving towards areas of increasing geological complexity, reservoir evaluation is often based on the interpretation of a single seismic image. Recovering a suitable velocity model employed in pre-stack depth migration plays a crucial role in the creation of this image, on which economic evaluations are drawn. Typical depth imaging projects provide final velocity model attributes and their associated seismic image. The amount of uncertainty associated with this image is poorly understood; the only quantitative measures of reliability are provided through analysis of volumetric residual move-out and by comparison with available auxiliary data. We aim to address this situation by using the same tomography method we use to derive the model parameters. The workflow allows us to establish the resolution of the tomography in establishing the model parameters. It also enables us to determine the degree to which models may be perturbed prior to the tomography and still be recovered. Using the workflow in conjunction with these criteria, we generate a population of solution models which conform equally well to the observed data. We then analyse the variance of this model population to derive confidence attributes to assign to both the target model and its associated seismic image.
-
-
-
Complexity in common image gather behaviour in offshore continental settings arising from velocity model building and imaging choices
Authors G. O'Brien, M. Igoe, J. Doherty, P. Matrice and R. MecklenburghCommon image gather complexity arising from uncertainties in the subsurface velocity model feeds into exploration risk through degradation in seismic attributes and facies prediction. In offshore continental settings where the overburden is geologically complex, the seismic wavefield exhibits complex behaviour when propagating through such highly varying geological structures, which impacts the resultant seismic images and attributes. Using a representative synthetic model, the imaging choices and velocity model uncertainties are explored in light of maximizing facies prediction using seismically derived attributes.
-
-
-
What mistakes are we making while interpreting salt? Could FWI help?
Authors J. Dellinger, A. Brenders, X. Shen, I. Ahmed, J. Sandschaper and J. EtgenModel studies indicate that our conventional salt-interpretation workflow, consisting of cascaded sequences of flooding with either salt or sediment velocities, migration, and picking, produces two distinct types of velocity errors: 1) small but ubiquitous positioning errors of the margins of the salt, and 2) large “chunky” errors where salt boundary reflections were grossly misinterpreted. Full-Waveform Inversion might be a solution, but to achieve success may require new kinds of data, improved algorithms, or most likely both.
-
-
-
Reduction of depth uncertainties using common offset RTM (COR) Gathers
Authors S. Liu, G. Rodriguez and F. HaoSubsalt velocity estimation has presented significant challenges in the past. Ray-based methods suffer from poor S/N ratios that result from sparse ray coverage beneath salt bodies. The use of common offset RTM (COR) gathers has been shown to decrease uncertainties in subsalt residual moveout estimation, which can be used more reliably by tomographic algorithms to invert for more accurate velocities. Furthermore, COR gathers have been shown to improve salt velocity estimation in areas with sediment inclusions. Better ties to well information (sonic logs, formation markers) have validated the improved resultant velocity models.
-
-
-
Post-Migration Processing and Imaging in the Local Angle Domain
By Z. KorenIn many areas of interest, the available seismic data combined with the most advanced seismic modeling/imaging tools, well information, potential field data, and geological and geophysical constraints are still not sufficient to uniquely determine the complexity of the subsurface geological medium. There has been a continuous effort to enrich these data components in order to converge to a minimum set of plausible geological models that can be considered throughout the O&G exploration, development and production stages. The reliability of seismic imaging in complex geological areas depends on many factors. One of the most important is the ability to use the available recorded seismic data to illuminate subsurface image points from a wide range of directions and opening angles/azimuths between the incident and scattered waves. This multi-dimensional illumination challenge mainly depends on the density and extent of the seismic acquisition system and on the complexity and accuracy of the inverted subsurface geological model. Moreover, seismic imaging is classified into many categories, depending on the specific goal at each stage. For example, structure-oriented imaging for locating large-scale potential reservoirs differs significantly from high-resolution imaging at the reservoir for fracture detection. In this work I demonstrate the advantages of using a novel multi-dimensional local angle domain (LAD) system for enriching the information obtained from the available recorded seismic data, in order to obtain more reliable information about continuous and discontinuous subsurface target objects. In particular, I briefly demonstrate the potential of using the mapped seismic data for different post-migration processing/imaging solutions: velocity model updating and re-migration, amplitude correction, accounting for illumination, geometrical spreading, and absorption/dispersion (Q-correction), in-situ data reconstruction, and specular/diffraction imaging.
-
-
-
Seismic matching pursuit decomposition based on the attenuated Ricker wavelet dictionary
More LessThe wavelet dictionary is the critical part of the matching pursuit decomposition of seismic traces. In this study, considering the effects of wave attenuation, we develop a new wavelet atom based on the traditional Ricker wavelet, and thus form a new dictionary: the attenuated Ricker wavelet dictionary. Based on the proposed dictionary, we use orthogonal matching pursuit to decompose an input seismic trace into a set of components, each of which establishes a relation between the quality factor Q and the spread time of the wavelet atom. We can then derive the Q model of the underground medium using smoothing algorithms. By compensating the amplitude attenuation of each decomposed wavelet atom, the input seismic trace can be compensated. We test our algorithm preliminarily with a synthetic trace and a field seismic profile; the seismic wave attenuation is well compensated and the resolution is improved.
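A compact sketch of matching pursuit over a Ricker-wavelet dictionary, the mechanism this abstract builds on. For brevity it uses standard (unattenuated) Ricker atoms, whereas the paper's contribution is an attenuated variant parameterised by Q; the trace, frequencies and sampling below are illustrative.

```python
import math

def ricker(f, t0, n, dt):
    """Sampled Ricker wavelet with peak frequency f (Hz), centred at t0 (s)."""
    wav = []
    for i in range(n):
        u = (math.pi * f * (i * dt - t0)) ** 2
        wav.append((1.0 - 2.0 * u) * math.exp(-u))
    return wav

def matching_pursuit(trace, dt, freqs, n_iter):
    """Greedy matching pursuit: repeatedly subtract the best-fitting atom."""
    residual = list(trace)
    picks = []
    for _ in range(n_iter):
        best = None
        for f in freqs:
            for i0 in range(len(trace)):
                atom = ricker(f, i0 * dt, len(trace), dt)
                energy = sum(a * a for a in atom)
                corr = sum(a * r for a, r in zip(atom, residual))
                score = corr * corr / energy          # projection energy
                if best is None or score > best[0]:
                    best = (score, f, i0, corr / energy, atom)
        _, f, i0, coef, atom = best
        residual = [r - coef * a for r, a in zip(residual, atom)]
        picks.append((f, i0 * dt, coef))              # (frequency, time, amplitude)
    return picks, residual

# Two-atom synthetic trace; recover frequency, timing and amplitude of each
dt, n = 0.002, 200
trace = [a + b for a, b in zip(ricker(30.0, 0.10, n, dt),
                               ricker(30.0, 0.25, n, dt))]
picks, res = matching_pursuit(trace, dt, [20.0, 30.0, 40.0], 2)
```

In the attenuated-dictionary version, each atom would additionally carry a Q-dependent spread, so the picked parameters feed directly into the Q-model estimation the abstract describes.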
-
-
-
Towards Realistic Synthetic Models of Middle East Carbonate Reservoirs for Evaluating Reservoir Engineering Workflows
Authors A. Darishchev and D. GuérillotThis paper proposes building realistic synthetic models of carbonate reservoirs typical of those frequently occurring in the Middle East, particularly in the State of Qatar. Based on published data, these models represent the Upper Jurassic Arab C and D reservoirs which are among the most abundant and economically important. They are well studied and most of the reservoir data is available in literature. In these models, the rock and fluid properties are based on actual fields. Developing and upgrading these models with partners and sponsors will lead to advanced reservoir engineering workflows for enhancing oil recovery, screening and benchmarking various innovative scenarios of field development and uncertainty assessment. This stimulates effective collaboration, knowledge and expertise sharing and establishes better practices in carbonate reservoir engineering.
-
-
-
Combining seismic and petrophysics to improve facies proportions modeling
Authors J. Chautru, H. Binet and M. BourgesA new workflow to propagate expected Facies proportions from wells into the 3D space is proposed, considering seismic data as stronger constraints than a simple drift or auxiliary variable in cokriging calculations. The idea is to combine seismic attributes transformed into Porosity, calibrated to well data, with the Porosity distribution per Facies determined from core plugs and Porosity logs. The workflow consists of two steps, the first corresponding to calculations at the seismic vertical scale, the second being a resolution enhancement to the well-data scale (between 1 meter and 1 foot). In the first step, well data are averaged at the seismic vertical scale to ensure consistency between the different data sources. At this vertical scale, the average Porosity at a given location in the reservoir is obtained from a mixture of already defined Facies, whose expected proportions are related to the Porosity distribution inside each Facies. The distribution of expected Facies proportions at any point is calculated, ensuring that the average Porosity calculated from seismic attributes at that point is honoured. Then, the vertical evolution of Facies proportions is calculated at a sub-metric scale, from well data and the local average proportions previously determined.
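The first workflow step, honouring the seismically derived average porosity with a facies mixture, reduces in the simplest two-facies case to one linear equation. A minimal sketch with invented porosities (the actual workflow works with full per-facies porosity distributions, not just means):

```python
def two_facies_proportions(phi_seismic, phi_sand, phi_shale):
    """Solve w * phi_sand + (1 - w) * phi_shale = phi_seismic for the sand
    proportion w, clamped to the physical range [0, 1]."""
    w = (phi_seismic - phi_shale) / (phi_sand - phi_shale)
    return max(0.0, min(1.0, w))

# Sand mean porosity 0.25, shale 0.05; seismic-scale porosity 0.17
w_sand = two_facies_proportions(0.17, 0.25, 0.05)
print(round(w_sand, 2), round(1 - w_sand, 2))  # 0.6 sand, 0.4 shale
```

With more than two facies the same constraint (proportions sum to one, mixture honours the seismic porosity) is underdetermined, which is why the workflow brings in the per-facies porosity distributions from core and log data.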
-
-
-
Core Tests and Field Case Studies of Successful and Unsuccessful Low-Salinity Waterfloods from Four Oil Fields
Authors V. Khisamov, V. Akhmetgareev and T. ShilovaThe aim of the work is to define successful and unsuccessful cases of low-salinity waterfloods (LSW) and to determine the incremental recovery if compared with high-salinity water injection. Four oil fields have been studied. Double-coreflood tests showed: a) fines: three-fold decrease of water relative permeability (krw) for the Pervomaiskoye field and five-fold decrease for the Bastrykskoye field; b) wettability alteration in carbonates: three-fold reduction of residual oil saturation (Sor) for the Romashkinskoye field; however, no changes were noted for the Arkhangelskoye field. 3D models of the fields were built and history matched. To simulate salinity-dependent effect of fines migration and wettability alteration, options of krw and Sor decrease were used. Field performance analysis showed that LSW resulted in increase of incremental oil recovery by 3.5% in the Pervomaiskoye field, but in the Bastrykskoye field, the LSW effect was negligible. Analysis of two LSW pilots in the Romashkinskoye field showed improvement of the incremental oil recovery by 2.7% due to LSW, while LSW pilot in the Arkhangelskoye field was unsuccessful. For sandstones, LSW into aquifer yields no positive effect, nor does it at the initial and final stages of reservoir development. For carbonates, the effect depends on viscosity of oil.
-
-
-
Application of Nanoparticles in Adsorption Reduction of Polymer in Chemical EOR Processes, Experimental Approach
Authors S. Mellat and M. JamialahmadiThe implementation of tertiary Enhanced Oil Recovery methods is required due to the huge amount of oil retained in the reservoir after the primary recovery stage. Polymer flooding has proved to be an effective chemical EOR method which improves sweep efficiency and increases recovery; however, one of the major problems that can make polymer flooding inefficient and economically infeasible is polymer adsorption on the rock surface. Applicable methods for reducing polymer adsorption in carbonate porous media have not yet been widely investigated. In this paper, the effects of adding nano-silica and nano-alumina on the adsorption reduction of polymer in Partially Hydrolyzed Polyacrylamide (PHPA) solutions were experimentally studied. The adsorption density of six test points was calculated based on a calibration curve generated from the electrical conductivity of the solution. The Critical Micelle Concentration (CMC) was also determined (1580 ppm at 28 °C). The results show that the adsorption density decreases by 16% in the presence of nano-silica and by 24% in the presence of nano-alumina; therefore, the addition of nanoparticles to the polymeric solutions remarkably reduces the polymer concentration loss.
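The calibration-curve step described in the abstract can be sketched as a linear fit of polymer concentration against electrical conductivity, followed by a mass balance for the adsorption density. All values below are hypothetical, not the paper's measurements.

```python
def fit_linear(xs, ys):
    """Ordinary least-squares fit y = a*x + b (the calibration curve)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def adsorption_density(c0, c_eff, volume_ml, rock_mass_g):
    """Polymer retained per gram of rock (mg/g), from the concentration
    loss (ppm = mg/L) over the injected solution volume."""
    return (c0 - c_eff) * (volume_ml / 1000.0) / rock_mass_g

# Hypothetical calibration: conductivity (uS/cm) vs concentration (ppm)
cond = [200.0, 400.0, 600.0, 800.0]
conc = [0.0, 500.0, 1000.0, 1500.0]
a, b = fit_linear(cond, conc)
c_eff = a * 520.0 + b                  # effluent measured at 520 uS/cm
print(round(adsorption_density(1500.0, c_eff, 100.0, 50.0), 3))  # 1.4 mg/g
```

Comparing this adsorption density with and without nanoparticles in the solution is what yields reduction figures like the 16% and 24% reported above.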
-
-
-
Effect of Wettability Alteration on Production Improvement in Gas Condensate Reservoirs: a Review Paper
Authors M. Sheydaeemehr, I. Shafiei Sarvestani and M. Pasdar
Wettability alteration is a novel approach in gas condensate reservoirs: the wettability of the reservoir rock is altered from strongly liquid-wet to preferentially gas-wet or intermediate-wet by treating it with chemicals, an effect that has been proved at laboratory scale. Both gas and condensate cumulative productions improve significantly after wettability alteration, and the effect on gas-condensate production improvement is more pronounced in the intermediate-wetting state. Wettability alteration can significantly increase well productivity at relatively low cost because only the near-well region needs to be treated. This paper presents a review of recent studies on the effects of wettability alteration on gas and condensate production improvement in gas condensate reservoirs.
-
-
-
Salt deformation history and the significance of salt-related structures, offshore Gulf of Gabes, Tunisia
Authors H. Ghuedifi, I. Hamdi Nasr, M. Hedi Inoubli, N. Barhoumi et al.
A regional seismic profile shows the predominance of salt walls in the Gulf of Gabes, flanked by elongate minibasins. The overburden and the underlying sub-salt basement have become welded together, meaning that the salt layer has been completely removed. Salt welds may form in response to complete evacuation of salt due to salt movement, or to viscous flow and dissolution; in consequence, salt growth may have stopped at the present stage. The rim basins between the salt walls reveal an important subsidence history. The salt diapir in the study area is characterized by upward-converging sides and antithetic crestal faults, which can result either from levelling movements that reduce the structural relief created by salt growth, following Cloos (1928) and Dennis and Kelley (1980), or from salt-weld formation. Thickness variations of the geological strata towards the salt diapir provide evidence of synsedimentary activity.
-
-
-
The fracture prediction in M carbonate oilfield
By Y. Nie
The M carbonate oil field is located in the Zagros low-angle belt. Having experienced multi-period tectonic movement, the fracture system and fracture scale of the belt are extremely complex, and the structural fractures form good reservoir space. Reservoir fracture prediction is therefore the key to effectively developing the M carbonate reservoir. According to the characteristics of the M carbonate reservoir, this paper puts forward a set of fracture prediction methods based on post-stack and pre-stack seismic data. Firstly, we apply post-stack seismic data to simulate the stress field and analyze the fracture-developed zones quantitatively; secondly, we apply AVAZ analysis to pre-stack seismic data to identify fracture density as well as orientation; finally, we verify the fracture results against production wells. This research provides a detailed analysis of fractured reservoirs in the study area, which in turn unveils the fracture development rules and spatial distribution characteristics, providing a guide for future oil and gas development.
-
-
-
Experimental investigation of oil-water two-phase slug flow in inclined pipes
This paper presents an experimental investigation of oil-water two-phase slug flow characteristics in inclined pipes at inclination angles of +10°, +20° and +30°. Flow photographs and oil hold-up were obtained from the experiment.
-
-
-
Numerical Modeling of Carbonated Water Injection into Oil Reservoirs Using Buckley-Leverett Theory Considering the Capillary Pressure
Authors M. Ahmadi, D. Zendehboudi and D. James
The Buckley-Leverett theory is broadly employed in the upstream oil industry, but it suffers from the assumption of zero capillary pressure. This paper develops the Buckley-Leverett theory with the capillary pressure taken into account. This modification of the original form of the Buckley-Leverett equation extends the applications of the theory towards more accurate results in terms of fluid dynamics in porous media. This work considers a water-oil system with two different boundary conditions: constant carbonated water injection rate and constant bottom-hole pressure, which simulate real conditions of Carbonated Water Injection (CWI) operations in oil reservoirs. The two-phase numerical modeling of CWI is performed in MATLAB® using the Implicit Pressure Explicit Saturation (IMPES) method. A parametric sensitivity analysis is conducted in which the effects of different parameters (e.g., mobility ratio, injection rate, fluid and rock characteristics, time step, and grid size) on CWI performance, numerical dispersion, and round-off errors are studied. In this work, a black-oil simulator is developed in which wettability, capillary pressure, and relative permeability are extensively investigated. This research aims to further the understanding of CWI processes in oil reserves in terms of recovery mechanisms and practical implications.
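The capillary-extended Buckley-Leverett equation described above can be illustrated with a minimal explicit finite-difference sketch. This is not the authors' MATLAB/IMPES code: the capillary-pressure term is reduced to a simple saturation diffusion with an assumed coefficient, and the Corey-type relative permeabilities and viscosity ratio are chosen purely for illustration.

```python
import numpy as np

def fw(Sw, M=2.0):
    """Water fractional flow for quadratic (Corey-type) relative
    permeabilities; M is the oil/water viscosity ratio (assumed)."""
    krw, kro = Sw**2, (1.0 - Sw)**2
    return krw / (krw + kro / M)

def buckley_leverett_pc(nx=200, nt=400, dt=1e-3, eps=1e-3):
    """Dimensionless 1D solve of dSw/dt + dfw/dx = eps * d2Sw/dx2,
    where the right-hand side is a crude stand-in for the
    capillary-pressure term discussed in the abstract."""
    dx = 1.0 / nx
    Sw = np.full(nx, 0.1)            # initial water saturation
    Sw[0] = 1.0                      # water injected at x = 0
    for _ in range(nt):
        F = fw(Sw)
        adv = (F[1:] - F[:-1]) / dx  # upwind advection (flow in +x)
        Sw[1:] -= dt * adv
        # explicit capillary-diffusion correction on interior nodes
        Sw[1:-1] += dt * eps * (Sw[2:] - 2 * Sw[1:-1] + Sw[:-2]) / dx**2
        Sw[0] = 1.0
    return np.clip(Sw, 0.0, 1.0)
```

With these parameters the advective CFL number is about 0.4, so the explicit scheme stays stable; the diffusion term smears the Buckley-Leverett shock over a narrow zone, which is the qualitative effect of including capillary pressure.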
-
-
-
Data Space Reflectivity and the Migration based Travel Time approach to FWI
By G. Chavent
Full wave inversion (FWI) has received renewed interest in the last ten years, due to its potential ability to perform imaging and migration velocity analysis simultaneously. But it quickly appeared that its resolution by local optimization methods is hampered by the cycle-skip problem unless a very good initial guess of the background velocity is available. One approach suggested to overcome this difficulty is Migration Based Travel Time (MBTT), which uses a data-space change of reflectivity unknown to alleviate the cycle-skip problem of FWI. The method was originally presented in the time domain and with rather obscure notations, which made it difficult to understand for a geophysical audience. The author's reading of the literature on FWI has confirmed that the method was essentially not understood. The objective of this talk is therefore to present the MBTT approach and its key feature, the data-space reflectivity, with unified and hopefully more geophysical notations, and to give, as far as possible, geophysical insight into the motivations that presided over its inception, and into the reasons that make MBTT able to determine both the long and short wavelengths of the velocity starting from a poor initial velocity model.
-
-
-
Extended Waveform Inversion
By B. Symes
Full waveform inversion has evolved into a processing commodity with a firm foothold in the exploration workflow. However, FWI still faces several major challenges. One of these is the occasional stagnation of inversion algorithms where no local model perturbation improves data fit. This "cycle-skip" phenomenon can hide kinematic information inherent in the data that would permit large updates in wave velocity fields. One approach to alleviating cycle-skip, extended inversion, transfers the data kinematics to an extended model with more degrees of freedom than physical models. Variants fall into two categories, according to whether the extra degrees of freedom are added to subsurface parameter fields or to source representations. A common theme is that the model should be extended so that the data can be fit throughout the inversion process, thus rendering cycle-skipping impossible. Model updates reduce a penalty that measures the distance from the extended model to the original, physical model subspace. Some penalties are mathematically equivalent to a traveltime tomography objective. I will give a rough taxonomy of extended waveform inversion, with several examples, and describe some of the successes achieved and challenges facing this class of methods.
-
-
-
Optimizing the coefficients of the leading terms of the Born Series: FWI+MVA+more
The scattering series theoretically uses a model-perturbation framework to explain the difference between the seismic data modeled for a background model and those measured in the field, corresponding to the real Earth. These perturbations include short-wavelength features like those predicted by full waveform inversion (FWI) gradients, and long-wavelength features often constrained by migration velocity analysis (MVA) objectives. The Born series, however, is not a convergent series: if the perturbations are large, we probably will not be able to explain the data difference. Thus, using the leading terms of the Born series in an iterative process, in which they are scaled properly, allows us to avoid such limitations and update both the short- and long-wavelength components of the velocity model. In fact, the FWI update is manifested in the first term of the Born series, and the MVA update is represented by the transmission (first Fresnel zone) part of the second term. In this case, FWI and MVA are code names for dividing the optimized update into reflectivity-based portions and those adequate for the background, respectively. Examples on synthetic and real data demonstrate this logic.
-
-
-
FWI for elastic media: macrovelocity reconstruction
Authors V. Cherveda, G. Chavent and K. Gadykshin
Currently, the common approach to FWI is nonlinear least-squares minimization of the standard data-misfit functional, which characterizes the L2 residual between the observed data and the data synthesized for a current velocity model. This approach has been developed and studied in a great number of publications. However, there are still problems with reliably reconstructing the macro-velocity component via FWI for realistic frequency bandwidths and offsets. The reason is the structure of the forward map, which transforms the velocity model to the data: it is almost quadratic, with a well-conditioned matrix, with respect to perturbations of reflectors, but has very complicated nonlinear behavior with respect to propagator perturbations. Intuitively, this can be explained by the so-called "cycle-skip" effect, where phase shifts between the observed and synthetic data may result in local minima. To mitigate the problem, a multiscale inversion strategy in the time and frequency domains was introduced earlier, in which the frequency of the input data is increased progressively and the inversion result for a lower frequency becomes the initial guess for the higher frequency. However, such a sequential inversion approach also fails due to the lack of low frequencies in the data.
-
-
-
Extending the reach of FWI with reflection data: Potential and challenges
Authors A. Gomes and N. Chazalnoel
In this work, we present a Reflection FWI (RFWI) workflow to update the velocity model using the low-wavenumber component of the FWI gradient of reflection data, the so-called rabbit ears. This is achieved by alternately using high-wavenumber and low-wavenumber components of the gradient to update the density and velocity models, respectively. We apply this method to a deep-water survey on the Mexican side of the Gulf of Mexico. The initial model is obtained after diving-wave FWI and deep ray-based tomography; however, some discontinuities remain at the deep Wilcox and Cretaceous events. After RFWI application, we observe a significant improvement in these events, both in the migrated image and in the gathers. Finally, we discuss our observations on the requirements and limitations of RFWI, such as its poor vertical resolution, and draw a parallel between this method, conventional FWI and ray-based tomography, identifying potential improvements for RFWI and discussing how to combine these approaches.
-
-
-
Application of tomographic FWI (TFWI) to large-scale field datasets: challenges and insights.
Authors B. Biondi, A. Almomin and R. Sarkar
In the past few years we have demonstrated, with blind-test synthetic examples and 3D field-data examples, that a time-lag extension of the velocity model can lead to robust convergence of FWI (we call our method Tomographic FWI, or TFWI) even when the starting model is far from the correct one. Computational cost and convergence rate are still major obstacles to routine TFWI application on large-scale projects. We will discuss recent advances in these areas.
-
-
-
Updating velocity fields beyond the diving waves
Conventional least-squares based full waveform inversion (FWI) is not suitable for constructing a low-wavenumber background model when the recorded data are dominated by reflected energy. We present a new approach to address the challenge of building a kinematically correct background model with FWI for reflection-dominated seismic data. The new approach decomposes a subsurface model into a smooth background, which is updated by minimizing a new objective function, and a rough reflectivity, which is computed through a migration or least-squares migration at the current background. With such a model decomposition strategy and Born modeling, we are able to directly compute the reflection-based low-wavenumber components of a conventional FWI gradient. To guarantee that these low-wavenumber components contribute to updating the background model in the correct directions, we developed a new optimization strategy consisting of two essential components: first, computing an offset-dependent matching filter to match the predicted Born wavefield and the observed reflections; second, measuring the incoherency of this offset-dependent filter along offset and time, and then updating the background to minimize this incoherency. Real data demonstrate the success of the proposed algorithm in constructing kinematically correct models.
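The offset-dependent matching filter at the heart of this strategy can be sketched as a per-trace least-squares (Wiener) filter. The filter length, damping, and trace layout below are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def matching_filter(pred, obs, nf=11, eps=1e-6):
    """Least-squares causal filter f (length nf) such that the
    convolution pred * f best matches obs in the l2 sense."""
    n = len(pred)
    C = np.zeros((n, nf))            # convolution matrix of pred
    for j in range(nf):
        C[j:, j] = pred[:n - j]
    # damped normal equations (Wiener filter with white-noise stabilization)
    return np.linalg.solve(C.T @ C + eps * np.eye(nf), C.T @ obs)

def incoherency(filters):
    """Sum of squared differences between filters at neighboring offsets;
    an RWI-style background update would drive this towards zero."""
    return float(np.sum(np.diff(filters, axis=0) ** 2))
```

If the predicted and observed reflections differ only by a constant shift and scale, the recovered filter is a single scaled spike, and a stack of identical filters has zero incoherency; kinematic errors in the background make the filters drift with offset, which is the quantity the proposed objective penalizes.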
-
-
-
Wave-Equation Migration Velocity Analysis in the Surface Common Offset Domain: Application to Viking Graben Field Data
Authors P. Nandi and U. Albertin
The quality of subsurface seismic images depends greatly on the accuracy of the velocity model used for imaging, but the highly nonlinear relationship between recorded data and the velocity model makes estimating accurate velocities a nontrivial task. Two methods, full-waveform inversion (FWI) and wave-equation migration velocity analysis (WEMVA), are commonly used to estimate velocities. Both are iterative and wave-equation based, and estimate velocities by minimizing an objective function to achieve a desired level of accuracy. FWI can fail in complex environments when the initial model is an inaccurate representation of the actual subsurface velocities. Methods based on WEMVA, however, use pre-stack image focusing to converge to a more global solution regardless of the starting model. We develop WEMVA in the surface common-offset domain by parsing the data into separate bins, each containing a limited range of offsets, and processing each bin independently. We circumvent cycle-skipping, which can lead to erroneous results, by using a sufficient number of bins to limit the amount of residual moveout between neighboring traces. We present the results of common-offset-domain WEMVA on a 2D marine field data set recorded offshore Norway in the Viking Graben area.
-
-
-
Mitigating the gradient artefacts of Migration Velocity Analysis by Gauss-Newton update
Authors R. Soubaras and B. Gratacos
This paper shows how the artefacts present in the gradient of the cost function in Migration Velocity Analysis can be strongly attenuated by using a second-order Gauss-Newton scheme: the velocity gradient artefacts are strongly attenuated when the total gradient is deconvolved by the approximate total Hessian. At each iteration, a least-squares migration is produced rather than a migration. The proposed algorithm is illustrated on the Marmousi dataset.
-
-
-
Image-domain versus data-domain velocity analysis based on true-amplitude subsurface extended migration
Authors H. Chauris, Y. Li and E. Cocher
We compare the traditional image-domain migration velocity analysis technique with a new approach developed in the data domain. From a reflectivity model obtained after true-amplitude migration and multiplied by an annihilator, new data are modelled under the Born approximation in the extended domain. The new objective function is simply the l2-norm of the data to be minimised. We discuss the advantages and limitations of the new approach versus the image-domain approach on two synthetic 2D models. The two approaches differ in the nature of the oscillations observed in the velocity gradients and in the relative weights contributing to the velocity update.
-
-
-
Reflection Waveform Inversion method: solutions to the reflectivity-background coupling problem and consequences on the convergence
Authors R. Valensi, R. Baina and V. Duprat
A new formulation of the Full-Waveform Inversion (FWI) method called Reflection Full Waveform Inversion (RFWI) has recently been introduced to enable background velocity updates from reflection events. However, it has been observed (even with state-of-the-art optimization methods) that numerous iterations of the RFWI workflow are necessary to converge towards a correct background model, which makes the cost of the current approach prohibitive for real-scale applications. In this contribution, using an analytic model, we explain how the high number of RFWI iterations is related to the coupling between the background and reflectivity models. We propose two solutions to this issue. One considers a joint optimization in an extended space with independent de-migration and migration background models. The other is a variable projection technique which enables optimization in a reduced model parameter space where the background and reflectivity models are always consistent. In our tests, both solutions provide convergence rates at least one order of magnitude faster than the conventional RFWI method. Furthermore, we provide a geometrical interpretation of the relation between these solutions and explain why the variable projection methodology should be preferred.
-
-
-
From RWI to JFWI: including diving waves in reflection-based velocity model building
Authors R. Brossier, W. Zhou, S. Operto, J. Virieux and P. Yang
In this presentation, we will review the main limitations of classical FWI for imaging at depth, and the main concepts behind RWI and JFWI. Based on synthetic and real-data applications to a 2D OBC dataset collected across a gas cloud, we will show the added value of such schemes and also their limitations, mainly associated with the misfit function definition (cycle-skipping and amplitude dependency).
-
-
-
Frontier Play Concepts and the Tools for Regional Screening
Authors O. Sutcliffe, A. Davies, D. Hay and M. Simmons
The ability to infill "white space" using testable geological models is necessary for identifying frontier plays. Three principal tools can help explorationists with this task: gross depositional environment (GDE) maps to define the organisation of facies, regional depth models to aid maturity and reservoir predictions, and geodynamic models to assess the nature and timing of tectonic events. These resources are used to populate a global database of >3,150 proven and unproven predicted plays. This database identified >50 basins that have multiple unproven plays, representing a significant target for frontier exploration. Through the integration of these tools, basins will be high-graded and the locations of the next exploration frontiers identified.
-
-
-
A method for estimating yet-to-find in hydrocarbon plays based on historical results of exploration
By D. Quirk
The background and description of a method for calculating, reporting and checking yet-to-find petroleum resources.
-
-
-
The Value Statement for Applying a Fully Quantitative Approach to Play Mapping
Authors P. Brown, I. Longley and R. Young
We will show the impact on real-time decision-making of understanding the chance dependencies between prospects, particularly in undrilled or lightly drilled geologic plays. The impact of potential success on a family of undrilled prospects is often not fully understood or appreciated. GIS technology, along with consistent play mapping using a fully quantified approach (including the splitting of chance into play and prospect components), sometimes reveals that the optimal drilling inventory is not being selected to maximize potential volumes and value.
-
-
-
The Norwegian shelf - 50 years of exploration, and still attractive
By G. Soiland
The Norwegian Petroleum Directorate (NPD) is a government institution responsible for maintaining a complete inventory of petroleum resources in Norway. This is done in accordance with established resource classification routines, regular reporting from the oil companies and the NPD's geological mapping. The NPD compiles and publishes annual figures on field reserves, contingent resources and YTF resources. Assessments of undiscovered resources are made by applying a stochastic play-model method. The remaining undiscovered resources reflect the exploration potential given today's knowledge and understanding; as new exploration activity proves up new petroleum plays, the potential is adjusted accordingly. This is an estimate of what will be technically possible to find and produce if all prospects are identified and drilled.
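A stochastic play-model assessment of the kind described here can be sketched as a simple Monte Carlo over prospect outcomes. The success chance, lognormal size parameters, and prospect count below are invented for illustration and are not NPD figures:

```python
import numpy as np

def ytf_monte_carlo(n_prospects=20, p_success=0.3, mu_log=3.0,
                    sigma_log=1.0, n_trials=100_000, seed=0):
    """Each prospect succeeds with probability p_success and, if
    successful, contributes a lognormal volume; returns P90/P50/P10
    of the aggregated play yet-to-find (P90 = low case, industry
    convention)."""
    rng = np.random.default_rng(seed)
    hits = rng.random((n_trials, n_prospects)) < p_success
    vols = rng.lognormal(mu_log, sigma_log, (n_trials, n_prospects))
    ytf = (hits * vols).sum(axis=1)
    p90, p50, p10 = np.percentile(ytf, [10, 50, 90])
    return p90, p50, p10
```

In a real assessment the chance factors and size distributions would differ per play, and dependencies between prospects (shared play risk) would widen the spread between the low and high cases.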
-
-
-
A play-based approach to future oil and gas exploration on the UK Continental Shelf
Authors P. Herrington, J. Bagguley and D. Quirk
The Oil & Gas Authority (OGA) has the remit to ensure exploration activity in the UK is carried out effectively by managing future potential, optimizing value for all stakeholders and ensuring all operations are safe. As part of these objectives, a 2½-year project carrying out play analyses and yet-to-find (YTF) evaluations for petroleum-prone areas of the offshore continental shelf has started. The plans will be presented in this paper.
-
-
-
Historical review of Play Fairway analysis using modern Play Based Exploration techniques in the Central North Sea, UKCS.
Authors C. Jepps, A. Shafi, F. Lottaroli and L. Meciani
The aim of this project, undertaken in 2015, was to review the accuracy of a petroleum prospectivity study of the UKCS Central North Sea conducted by Lasmo in 1993, by comparing its predicted results to discoveries made since, and to re-predict the results of the original study using modern Play Based Exploration software and best practices. Four plays were analyzed for prospectivity: the Triassic (TRI), Middle Jurassic (J2), Upper Jurassic Shallow (J3s) and Upper Jurassic Deep (J3d). The results show that between 1993 and 2015 an additional 2,082 MMBOE was discovered in the study area for the plays in question. The results validate the YTF concept, as all the estimated YTF values based on the 1993 dataset (in the range 2,048-2,476 MMBOE) are "in the right ballpark". However, if the YTF estimate of the modern study is accurate, it is likely that all the modelled YTF values have underestimated the remaining hydrocarbon potential. In addition, the results show that the use of modern technologies and best practices produced a more accurate result.
-
-
-
The 2014 North Carnarvon Basin License Round – a real-world example of the application of modern play tools and techniques in a competitive mature basin exploration arena.
Authors J. Bradshaw and I. Longely
The 2014 Australian offshore gazettal round was announced in May 2013, with thirteen of the 31 blocks in the North Carnarvon Basin, Australia's premier offshore oil and gas basin. Twelve bids were subsequently received on six blocks, which were later awarded with firm and contingent total work program commitments of $33 million and $150 million, respectively.
-
-
-
Play mapping in the East Java Basin, Indonesia: A Methodology for Future Exploration in
Authors I. Longley, C. Kenyon, A. Livsey and J. Goodall
The East Java basin of Indonesia is a long-established petroleum province notable for having delivered significant volumes in the modern exploration era. The basin contains an unusually large variety of clastic and carbonate reservoir-seal pairs, reflecting tectonostratigraphic settings ranging from extensional rift basins to stable platform areas to compressional and wrench-related inversion, a consequence of proximity to the convergent margin to the south. The large variety of plays and the potential for further discoveries make the basin an ideal candidate for a systematic split-risk CRS mapping approach to determine the remaining exploration potential, both within proven and unproven play areas and in both conventional and stratigraphic traps. CRS maps and stacks have been produced for nineteen plays over the greater East Java area, as far north as the Assem Assem Basin and eastwards to the southwestern arm of Sulawesi. The methodology is repeatable and applicable to other western Indonesian basins as the focus shifts towards traps with significant stratigraphic trapping components, and should help increase our understanding of the factors responsible for failure, hopefully leading to better predictive capabilities for success.
-
-
-
Evaluating a vintage Play Fairway Exercise using subsequent exploration results: did it work?
Authors F. Lottaroli, J. Craig and A. Cozzi
In this paper we compare the results of a vintage (1995) play assessment of onshore Colombia (Upper Magdalena Basin) with the actual outcomes of exploration activity over the following 20 years, by reconstructing the vintage risk assessment in GIS and re-computing yet-to-find using different methodologies. The vintage play chance maps were constructed largely on the basis of geological concepts and guesses, supported by outcrop geology and sparse subsurface data (mainly wells). Methodologies (modern and vintage) and results of the YTF calculations are compared and discussed.
-
-
-
Multidisciplinary Integration of Prospectivity and Value in Play Analysis - An example from onshore Kazakhstan
Authors P. Ventris, J. de Jongh, E. Dujoncquoy and S. Archer
Basin evaluations necessarily focus significant effort on the subsurface. The typical workflow involves building an integrated understanding of the basin's structural origin and development, the sedimentary fill and the movement of fluids within the basin. Plays are then defined and play analysis undertaken on this foundation of a thorough understanding of the basin architecture and fill. In the context of this presentation, a play is defined as a reservoir-seal pair with the potential to access charge from one or more of the predicted or proven source rock intervals.
-
-
-
A map-based, integrated geological and economic approach for play analysis
Authors K. Nifuku, K. Ogino, K. Nakaoka, Y. Okano, T. Ito and T. Todoroki
Play-based exploration is an effective approach to the exploration of emerging and frontier plays in a basin. It is essential to evaluate and understand the geological potential of the plays, for instance geological risk and field size, together with their economic potential, in order to identify the best areas to invest. This paper introduces a workflow for a map-based, integrated geological and economic approach to play analysis, based on our assessment of the deepwater Northern Gulf of Mexico.
-
-
-
Extension of the Apulian Platform in the Northwest Greek Offshore: paleogeography update and impact on hydrocarbon prospectivity, Greece
Authors V. Carayon, L. Montadert, J. Allard, A. Fournillon et al.
This presentation concerns the shallow-water carbonate platforms and associated build-ups present in the NW Greek offshore. Its goal is to delineate the offshore extension of the Apulian platform-type series resting on the western edge of the Adria microplate, within the convergent boundary zone between the African and Eurasian plates. The study, based on careful seismic-stratigraphic analysis of multiclient 2D broadband data acquired and processed by Petroleum Geo-Services, has enabled the identification of two isolated shallow-water platforms between the Otranto Strait and the Strophades Islets. The northern "Apulian Ridge" and the southern "Strophades Ridge", located on either side of the Kefalonia dextral strike-slip fault, are atoll-like rimmed carbonate platforms that developed under passive-margin conditions during the Mesozoic. During the Cenozoic these atolls were topped by two sets of shallow-water build-ups attributed to (1) the Upper Oligocene-Lower Miocene, quite unique in the Mediterranean as they formed during the climax of the Alpine orogeny, and (2) the Upper Miocene, demised during the Messinian Salinity Crisis. Despite crucial unknowns about rock ages and properties, some of the identified Mesozoic build-ups or resedimented base-of-slope carbonates and Cenozoic fringing build-ups represent valuable future exploration targets.
-
-
-
Play Analysis of the Gamtoos Basin, off the south coast of South Africa: From Concept to Portfolio
Authors A. Davids, C. van Bloemenstein and J. Roux
A play analysis was carried out in the underexplored Gamtoos Basin, South Africa, where a total of 10 wells have been drilled, of which five encountered oil and/or gas shows. An integrated approach incorporating geology, geophysics, sedimentology and geochemistry led to the mapping of petroleum elements, including source, reservoir, charge and seal risk, as well as the geological chance of success (Pg). This culminated in the identification of 80 leads, which were ranked from low to high risk. A full risk analysis of the Synrift 1a play fairway is described in detail, while the portfolios of the other plays are only summarised. The analysis resulted in a better understanding of the regional geology, structural development and hydrocarbon potential of the basin.
-
-
-
A Chance Calculus for Play-Based Exploration
By C. Stabell
This paper presents a rules-based calculus for prospect chance estimation. A key idea is that estimates are based on a combined categorization of our knowledge level (DATA dimension) and our model of the geological context (MODEL dimension): a chance calculus that combines data and judgement with simple rules. Earlier work (Milkov, 2014) is extended by using two-level chance rules that distinguish between shared play chance estimates and conditional prospect chance estimates, and by a second, separate step that handles seismic anomalies. The chance calculus provides estimates that are consistent and cover all chance factors and all conventional exploration situations. Explicit MODEL categorization not only provides a transparent, unambiguous basis for the chance estimates, but also eliminates double risking between shared and conditional chance estimates.
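The separation between shared play chance and conditional prospect chance can be illustrated with a toy function. The factor names and values are invented for illustration; this is not Stabell's rule set:

```python
def pg(play_factors, prospect_factors):
    """Combine shared play-level and conditional prospect-level chance
    factors into a prospect chance of success. A factor risked at play
    level must not appear again at prospect level -- re-using it is
    the 'double risking' the calculus is designed to eliminate."""
    shared = set(play_factors) & set(prospect_factors)
    if shared:
        raise ValueError(f"double-risked factors: {sorted(shared)}")
    p = 1.0
    for v in (*play_factors.values(), *prospect_factors.values()):
        p *= v
    return p
```

For example, `pg({"source": 0.8, "reservoir": 0.9}, {"trap": 0.5, "seal": 0.7})` multiplies to 0.252, while repeating "source" in both dictionaries raises an error instead of silently compounding the same risk twice.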
-
-
-
How to counter the effects of upside bias in prospect inventories used for yet-to-find estimations
By D. Quirk
Downside discovery sizes result when the pre-drill prospect model fails, and occur in as much as 50% of successful exploration wells. This paper shows a way of incorporating the downside in prospect evaluations, based largely on historical data.
-
-
-
Production in the USA. How long will it take to decline by half if there is no new drilling?
Authors H. Kuzma and J. Conradson
1.3 million individual well production forecasts are aggregated to estimate the time it will take for production to decline by half in the USA if no new wells are drilled. Drilling is currently at less than a third of its 2014 level. Considerable capital is required to replace declining production, capital which might not be readily available. Half production for the USA is forecast to be reached before 2022; for shale basins such as the Bakken and Eagle Ford, it will be reached in the next 1-3 years. This means that hundreds of thousands of new wells will have to be drilled to restore production to 2014 levels. The limiting factor in exploiting American shale reserves may be access to capital, not geology.
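The aggregation step can be sketched in a few lines, assuming (purely for illustration) a simple exponential decline per well; the authors' per-well forecasts are far more detailed than this:

```python
import numpy as np

def time_to_half(qi, di, t_max=30.0, nt=3001):
    """Sum exponential declines q_i(t) = qi * exp(-di * t) over all
    wells and return the first time at which total production falls
    to half of its initial rate."""
    t = np.linspace(0.0, t_max, nt)
    q = (qi[:, None] * np.exp(-np.outer(di, t))).sum(axis=0)
    # q is strictly decreasing, so negate it for searchsorted
    return t[np.searchsorted(-q, -0.5 * q[0])]
```

A single well with decline rate ln(2) per year halves in exactly one year; mixing fast- and slow-declining wells gives an aggregate half-time between the individual half-lives, which is why basin-level forecasts require summing every well rather than fitting one average curve.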
-
-
-
Progresses in the exploration of structural uncertainties
Authors G. Caumon, G. Godefroy and F. Lallier
This talk provides some motivations for generating multiple stochastic structural models and reviews recent trends in stochastic structural modelling methods. We discuss in particular the main advances and challenges in confronting spatial data and structural concepts in cases where multiple structural scenarios with different numbers of faults and different fault connectivity must be considered. We also propose a way to formulate the stochastic structural modelling problem and discuss its potential to manage many types of spatial observations, to integrate several types of structural concepts and to explore structural uncertainties in an organized way.
-
-
-
Statistical inversion of seismic data for reservoir property estimation: analytical vs. numerical approaches
By D. Grana
The estimation of reservoir properties from seismic data is a mathematical inverse problem and can be solved by combining geophysical modelling and inverse theory. Successful results have been obtained using either deterministic or statistical methods. One of the advantages of statistical approaches is the uncertainty quantification of the model parameter predictions. Bayesian inverse methods have been applied to seismic inverse problems to predict the point-wise posterior distribution of elastic attributes or petrophysical properties. In this work, we present several examples of Bayesian inversion for reservoir characterization applications.
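In the simplest linear-Gaussian case the posterior mentioned above is analytic: with a linear forward model d = Gm + e, a Gaussian prior on m and Gaussian noise, the posterior mean and covariance follow in closed form. The operator and numbers below are illustrative, not from the paper.

```python
import numpy as np

# Linear-Gaussian Bayesian inversion sketch: d = G m + e, with prior
# m ~ N(mu0, C0) and noise e ~ N(0, Ce). All values are illustrative.
G = np.array([[1.0, 0.5],
              [0.2, 1.0],
              [0.8, 0.3]])
mu0 = np.zeros(2)
C0 = np.eye(2)
Ce = 0.1 * np.eye(3)

m_true = np.array([1.0, -0.5])
d = G @ m_true                       # noise-free synthetic data for the demo

# Analytic posterior:
#   C_post  = (G^T Ce^-1 G + C0^-1)^-1
#   mu_post = C_post (G^T Ce^-1 d + C0^-1 mu0)
Ce_inv = np.linalg.inv(Ce)
C_post = np.linalg.inv(G.T @ Ce_inv @ G + np.linalg.inv(C0))
mu_post = C_post @ (G.T @ Ce_inv @ d + np.linalg.inv(C0) @ mu0)
```

The posterior variances are smaller than the prior ones, which is exactly the uncertainty reduction that motivates the statistical approach.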
-
-
-
Conceptual geological models and modelling tools for assessing fault-related uncertainties in flow from conventional clastic reservoirs
By T. Manzocchi
Attempts to consider fault-related parameters during convergence of a flow model towards a history match will, at best, allow fault rock permeability to be altered systematically, often as a function of the local Shale Gouge Ratio (SGR). Several published examples now exist of history-matched, field-specific SGR-to-permeability relationships, but all have one thing in common: they derive from Brent-province reservoirs. Why? Is there something characteristic about post-depositional faults in deltaic sequences (like the Brent province) that makes the SGR approach more likely to succeed in this setting than in others (e.g. deep marine sequences containing syn-depositional faults)? Empirical evidence and geomechanical models suggest that different aspects of faults are likely to be significant for reservoir-scale flow in different geological settings, and there may be good reason to expect, a priori, different fault characteristics (e.g. sub-seismic fault segmentation and discrete shale smears) to dominate uncertainty on flow in different reservoir types. New tools are required to handle these characteristics. One important conceptual difference between the existing fault property modelling tools and the future tools needed to handle a wider range of fault-related uncertainties is the requirement to represent heterogeneity explicitly, rather than as an average property captured by an SGR-to-permeability relationship.
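The "average property" style of modelling criticised above can be sketched in a few lines: SGR is the net shale fraction of the beds that have slipped past a point on the fault, and a monotone calibration maps it to a single fault-rock permeability. The exponential form and its parameters below are hypothetical placeholders for a history-matched relationship, not a published calibration.

```python
import numpy as np

def shale_gouge_ratio(vshale, throw_window):
    """SGR at a point on the fault: mean shale fraction of the beds
    that have slipped past that point."""
    return float(np.mean(vshale[throw_window]))

def fault_permeability_md(sgr, k0=100.0, alpha=8.0):
    """Hypothetical calibration: permeability falls off exponentially
    with SGR. k0 (mD) and alpha stand in for history-match parameters."""
    return k0 * np.exp(-alpha * sgr)

vsh = np.array([0.1, 0.8, 0.2, 0.9, 0.3])   # shale fractions of slipped beds
sgr = shale_gouge_ratio(vsh, slice(0, 5))
kf = fault_permeability_md(sgr)
```

A single value of `kf` per fault face is precisely the averaged representation that the proposed future tools would replace with explicit heterogeneity (segmentation, discrete smears).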
-
-
-
Facies classification using machine learning: lessons from SEG-ML contest
By M. Blouin
New resource exploration and exploitation sites collect multiple sources of data at high resolution and rate, generating a considerable amount of information to process and, eventually, interpret. Newly developed machine learning algorithms can help overcome this challenge and provide better insight into data and resources. As machine learning is currently a hot topic in geoscience, Matt Hall, editor of The Leading Edge's Geophysical Tutorial, launched a facies-prediction contest in the October 2016 issue. For this purpose, wireline logs and geological facies data from nine wells in the Hugoton natural gas and helium field of southwest Kansas (Dubois et al., 2007) were made available, with the facies in two wells kept blind to contestants. With contributions from data scientists from all over the world, the best score achieved was a little over 63% for the automated prediction of 9 different facies. Submissions included a dozen different algorithms, with ensemble methods (Gradient Tree Boosting, Random Forest) proving to be the most successful. Feature engineering (the extraction of more information from the variables) turned out to be the key aspect of successful prediction. In addition, comprehensive interpretation of the contestants' results showed that prediction uncertainty was roughly 4%.
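Since feature engineering was the decisive step, here is a minimal sketch of the idea: augment each depth sample of the wireline logs with local context (vertical gradients and a rolling mean) before feeding any classifier. The log values are synthetic and the log names in the comment are only examples of typical inputs, not the contest's exact feature set.

```python
import numpy as np

def engineer_features(logs):
    """logs: (n_samples, n_logs) array ordered by depth.
    Returns the original logs plus depth gradients and a 3-sample
    rolling mean of each log, tripling the feature count."""
    grad = np.gradient(logs, axis=0)               # vertical trend of each log
    pad = np.vstack([logs[:1], logs, logs[-1:]])   # edge padding for the window
    roll = (pad[:-2] + pad[1:-1] + pad[2:]) / 3.0  # 3-sample rolling mean
    return np.hstack([logs, grad, roll])

rng = np.random.default_rng(2)
logs = rng.normal(size=(50, 4))        # synthetic stand-ins, e.g. GR, PHIND, ...
features = engineer_features(logs)     # shape (50, 12)
```

Contextual features like these let a sample-wise classifier see some of the vertical facies structure that a geologist would use implicitly.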
-
-
-
Accounting for diagenesis in the simulation of reservoir properties. How to manage uncertainties on the data?
Authors B. Doligez and H. Beucher
The objective of this paper is to address the problem of integrating data with different physics and different supports to construct a geological static model. Diagenesis and petrophysical properties are characterized and quantified from laboratory analyses, observations of thin sections and tests on cores, while values are needed to populate cells of the geological model (metric to decametric in size). At present, mean characteristics of these properties are generally attributed to the dominant facies in these cells, despite the more complex distributions evident from experimental measurements at a smaller scale. We illustrate, on different examples derived from real data, some ways to integrate this variability and obtain more realistic distributions of the reservoir properties. Moreover, hypotheses on the physical process of diagenesis and on its continuity through different facies can be tested with different spatial models in order to quantify the final uncertainty on the resulting model. These points are addressed using the Pluri-Gaussian Simulation method with parameters adapted to each type of uncertainty.
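The support problem mentioned above (plug-scale measurements versus metric cells) already shows up in how small-scale values are averaged into a cell. The sketch below contrasts the standard averaging bounds; the permeability values are illustrative, not from the paper.

```python
import numpy as np

# Hypothetical plug-scale permeabilities (mD) from mixed facies in one cell.
k_plugs = np.array([250.0, 180.0, 5.0, 300.0, 0.5, 220.0])

k_arith = k_plugs.mean()                          # upper bound (flow along layers)
k_harm = len(k_plugs) / np.sum(1.0 / k_plugs)     # lower bound (flow across layers)
k_geom = float(np.exp(np.log(k_plugs).mean()))    # common choice for random media
```

The spread between `k_harm` and `k_arith` is exactly the information lost when a single dominant-facies mean is attributed to the cell, which motivates carrying the small-scale variability into the simulation instead.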
-
-
-
Some geostatistical models for filling heterogeneous reservoir with petrophysical properties
Authors D. Renard and H. Beucher
One of the main challenges of geostatistics in reservoir characterization is to populate a portion of a 3D earth model with rock properties. Their spatial characteristics are analyzed in the stratigraphic reference system, where the variogram is modelled. A base-case estimation (kriging) may be sufficient, or it may be complemented by stochastic simulations for a sensitivity analysis. Sometimes properties present a multimodal distribution reflecting the heterogeneity within the studied stratigraphic domain. It therefore makes sense to look for explanatory co-variables, such as the lithotype. We then first simulate the spatial organization of the different lithotypes (using categorical simulation) and afterwards populate each lithotype with the rock property following the rock-dependent spatial characteristics. This workflow, which relies on the independence between lithotype and rock property, may not be relevant if the samples show a border effect when crossing a lithotype border. Some statistical tools, such as transition probabilities and contact analysis, are introduced to check the relevance of this assumption. The border behaviour is also checked after the initial simulation outcome has been upscaled. Some stochastic joint models of lithotype and rock property are conceived to reproduce the presence or absence of borders, conditioned to sample data.
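The two-step workflow (simulate lithotypes first, then fill each with its own property distribution) can be sketched in 1D with a truncated-Gaussian approach: threshold a correlated Gaussian field into lithotypes, then draw a property per lithotype. Thresholds, correlation length and porosity statistics below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Step 1: correlated Gaussian field, truncated into lithotypes.
n = 200
noise = rng.normal(size=n + 20)
kernel = np.ones(21) / 21.0
z = np.convolve(noise, kernel, mode="valid")      # moving-average correlation
z = (z - z.mean()) / z.std()
lithotype = np.digitize(z, bins=[-0.5, 0.5])      # 0=shale, 1=silt, 2=sand

# Step 2: populate each lithotype with its own porosity statistics
# (mean, std per lithotype; values hypothetical).
phi_params = {0: (0.05, 0.01), 1: (0.15, 0.02), 2: (0.25, 0.03)}
porosity = np.array([rng.normal(*phi_params[l]) for l in lithotype])
```

Because the property is drawn independently within each lithotype, this sketch reproduces sharp borders between lithotypes; the joint models discussed in the abstract are what one would need when the data show smooth transitions instead.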
-
-
-
Constraining the history match using 4D seismic data: how far can we go?
Authors C. MacBeth and R. Chassagne
A key direction for history matching (HM) or close-the-loop exercises is the incorporation of time-lapse (4D) seismic data into our workflows.
-
-
-
Rapid forecasting of uncertain reservoir responses using functional data analysis
By C. Scheidt
Uncertainty in future predictions of oil reservoirs is traditionally obtained via a history matching procedure which generates a set of reservoir models that match the available data to within a user-defined tolerance. The process of creating history-matched models can be a very expensive and difficult task, and often needs to be repeated when new data are obtained. In this presentation, we propose a diagnostic tool which rapidly indicates to what extent the posterior uncertainty on the forecast will be reduced, if at all, by the data. To do so, instead of generating models that match the data, we propose to estimate the relationship between the historical and forecast variables. This relationship is then used when new historical observations are available to statistically update the uncertainty on the forecast. Through this procedure, an estimate of Bayesian posterior forecast uncertainty can be obtained without requiring history matching, indicating whether the data are indeed informative on the prediction variables.
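The core move above, estimating the relationship between historical and forecast variables instead of history matching, can be sketched with a toy model: sample the prior, evaluate both responses per sample, fit the data-to-forecast relationship, then condition on the observation. The exponential-decline "simulator" and all numbers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Prior samples of an unknown decline-rate parameter (toy simulator).
m = rng.uniform(0.5, 2.0, 500)
d = 100.0 * np.exp(-m * 1.0)              # "historical" response at t = 1 yr
h = 100.0 * np.exp(-m * 5.0)              # "forecast" response at t = 5 yr

# Fit the d -> h relationship (here a simple linear fit with intercept).
A = np.vstack([np.ones_like(d), d]).T
coef, *_ = np.linalg.lstsq(A, h, rcond=None)

# Condition on a new observation from a hypothetical "true" m = 1.2.
d_obs = 100.0 * np.exp(-1.2)
h_pred = coef[0] + coef[1] * d_obs
```

The residual scatter of the fit, compared with the prior spread of `h`, is the diagnostic: the smaller it is, the more the historical data will reduce forecast uncertainty, all without a single history-matched model.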
-
-
-
Model aggregation for production forecasting
By G. Stoltz
In this talk I will review a machine-learning technique called robust online aggregation of predictors. This setting explains how to combine base forecasts provided by ensemble methods. No stochastic modelling is needed, and the performance achieved is comparable to that of the best (constant convex combination of) base forecast(s). These techniques have been applied to a variety of data sets (including electricity consumption and exchange rates), but in this talk I will focus on forecasting the production of oil and gas.
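A standard instance of online aggregation is the exponentially weighted average forecaster: combine the experts with convex weights, then reweight them after each round according to their losses. The sketch below, on toy data with a hypothetical learning rate, shows one such scheme; it is not the specific algorithm of the talk.

```python
import numpy as np

def ewa(expert_preds, truth, eta=0.5):
    """Exponentially weighted average aggregation.
    expert_preds: (n_experts, T) base forecasts; truth: (T,) observations."""
    n_experts, T = expert_preds.shape
    w = np.ones(n_experts) / n_experts        # start from uniform weights
    agg = np.empty(T)
    for t in range(T):
        agg[t] = w @ expert_preds[:, t]       # convex combination forecast
        losses = (expert_preds[:, t] - truth[t]) ** 2
        w = w * np.exp(-eta * losses)         # penalise recent errors
        w /= w.sum()                          # keep weights on the simplex
    return agg

truth = np.sin(np.linspace(0, 6, 100))
experts = np.vstack([truth + 0.05, truth - 1.0])   # one good, one poor expert
agg = ewa(experts, truth)
```

No model of the data is assumed anywhere; the guarantee is purely relative, the aggregate tracks the best expert (or convex combination) in hindsight.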
-
-
-
Identification of the key geological uncertainties by unconventional reservoirs features analysis
Authors A. Pishchuleva, O. Gorbovskaya and E. Zhukovskaya
Prospects of the study field are linked with the development of the low-permeability Lower Jurassic formations J15 and J14. Accurate prediction of geological parameters is essential for designing an optimal reservoir engineering system for such unconventional reservoirs. The main problems at the study field are the lack of direct methods for reservoir definition (poor well logging), the uncertainty of the quantitative interpretation and of the lateral variability, and the inability to forecast the reservoir properties from the current seismic data processing. An integrated study of core, log and seismic data established the wide areal spread of the sediments, both reservoirs and non-reservoirs. A certain reduction of the total thickness of the J15 formation can be noticed on the structural highs. The paper presents the prerequisites for the formation of reservoir properties and their differentiation in the geological section, the lithogenetic factors associated with tectonic processes, and the basic geological uncertainties.
-
-
-
Spectral Decomposition and Attribute Analysis of Gharif Formation, North Oman
Authors S. Narasimman and M. Al-Ghannami
The aim of this study is to map sand distributions and faults within the Upper Gharif by extracting attributes from seismic, acoustic impedance, spectral decomposition and RGB-blend volumes. Attributes were extracted from spectral decomposition, RGB blending and acoustic impedance volumes to distinguish between sands and shales. First, spectral decomposition is applied, in which the seismic trace is decomposed into mono-frequencies in order to observe the amplitudes related to each frequency in the seismic. Using the Trap Search Engine, attributes of the mono-frequencies are generated and compared with the well data. In addition, using RGB blending, the best mono-frequency volumes, selected on the basis of sand thickness and well calibrations, are combined for better analysis.
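The decomposition into mono-frequencies can be sketched as a short-time Fourier transform: slide a tapered window along the trace and record the FFT amplitude at the chosen frequencies. The windowing choices and the synthetic 30 Hz trace below are illustrative, not the workflow of the paper.

```python
import numpy as np

def mono_frequency_amplitude(trace, dt, freqs, win_len=64):
    """Amplitude of each mono-frequency along the trace, via a sliding
    tapered-window FFT (a simple short-time Fourier transform)."""
    half = win_len // 2
    taper = np.hanning(win_len)
    fft_freqs = np.fft.rfftfreq(win_len, dt)
    idx = [int(np.argmin(np.abs(fft_freqs - f))) for f in freqs]
    padded = np.pad(trace, half, mode="edge")
    out = np.empty((len(freqs), len(trace)))
    for t in range(len(trace)):
        seg = padded[t:t + win_len] * taper   # windowed segment at time t
        spec = np.abs(np.fft.rfft(seg))
        out[:, t] = spec[idx]                 # keep the requested frequencies
    return out

dt = 0.004                                    # 4 ms sampling
t = np.arange(0, 1, dt)
trace = np.sin(2 * np.pi * 30 * t)            # synthetic 30 Hz trace
amps = mono_frequency_amplitude(trace, dt, freqs=[10.0, 30.0, 60.0])
```

Assigning three such mono-frequency volumes to the red, green and blue channels is what produces the RGB blend used to separate sands from shales.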
-