79th EAGE Conference and Exhibition 2017 - Workshops
- Conference date: 12 Jun 2017 - 15 Jun 2017
- Location: Paris, France
- ISBN: 978-94-6282-219-1
- Published: 12 June 2017
Experimental Investigation of Thermal Marangoni Effect on Bypassed Oil Recovery
Authors M. Masoudi, B. Rostami, M. Khosravi and P. Abolhosseini
In this paper, the effect of the interfacial tension (IFT) gradient caused by temperature changes (the Bénard-Marangoni phenomenon) is investigated as a novel EOR method. To properly understand and visualize the mechanism, glass micromodels were used. Carbon dioxide and methane were injected separately into an n-decane-saturated micromodel. Gas injection experiments were conducted under different conditions, and the impacts of temperature and pressure were investigated. Cold gas injection was compared with isothermal gas injection as the zero level of Marangoni flow, and the impact of Marangoni flow was compared with other active production mechanisms. The results reveal that the IFT gradient due to the temperature change at the oil-gas interface induces a convective flow that improves oil recovery and compensates for the negative effects of other mechanisms by overcoming capillary forces. They show the significant impact of thermal Marangoni convection on the recovery of bypassed oil and introduce Bénard-Marangoni convection as an important mechanism of bypassed oil recovery, especially in low-pressure reservoirs.
IoT-based Wireless Networking for Geoscience Applications
Authors H. Jamali-Rad and X. Campman
Nowadays, sensors are everywhere in the oil and gas industry, creating "big data" that in many cases must be aggregated and coordinated in real time, sometimes in harsh environments. To address this, we have defined a unified wireless sensing framework comprising three modules: cheap, low-power, long-range wireless sensors inherently compatible with the Internet of Things (IoT); advanced scalable wireless networking protocols; and cloud-based data storage/analytics for analysis and decision making. As a showcase, we present our seismic field test results with low-power wide-area networks (LPWANs) in the Netherlands.
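The aggregation step has to respect LPWAN payload limits (LoRaWAN frames carry on the order of tens of bytes). Below is a minimal sketch of packing a batch of geophone samples into a compact binary payload; the field layout and node-id scheme are invented for illustration, not the authors' protocol.

```python
import struct

# Hypothetical LPWAN payload layout (not the paper's protocol):
# uint16 node id, uint32 epoch seconds, then 16-bit geophone counts.
def pack_readings(node_id, epoch_s, samples):
    """Pack a batch of signed 16-bit samples into a compact payload."""
    return struct.pack(f"<HI{len(samples)}h", node_id, epoch_s, *samples)

def unpack_readings(payload):
    """Recover node id, timestamp and samples from a payload."""
    node_id, epoch_s = struct.unpack_from("<HI", payload)
    n = (len(payload) - 6) // 2
    samples = list(struct.unpack_from(f"<{n}h", payload, 6))
    return node_id, epoch_s, samples

payload = pack_readings(42, 1497225600, [-120, 515, 87, -3])
# 2 + 4 + 4*2 = 14 bytes: small enough for a single LoRaWAN frame
```

The little-endian `struct` format keeps the on-air size predictable, which matters when a network imposes per-frame byte budgets and duty-cycle limits.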
Performing Successful Data Science in the Geoscience Domain
Authors D. Irving and J. McConnell
To perform data science with scientific data, we must represent the scientific problem space to allow analytics. This requires a blend of traditional physics-based algorithms with modern advanced analytics, performed on datasets large enough to yield statistically robust insights. These exposed insights in the data must be explained by scientists, driving creative thinking, in contrast to application-driven workflows where line-of-sight to the original data is typically absent. We show how an open approach to data parsing, storage and integration drives better understanding of data, and moreover, enables the deployment and development of open source tools for processing, analysing and visualising data and insights. Dealing with measurement data brings challenges of quality, sparsity and irregular sampling, in datasets that must be integrated in the spatial, time and frequency domains. This is time-consuming work, often taking 80% of the time of each analytical study, and so we recommend that data from the geoscience domain should be curated in a "load once, use many times" paradigm. Higher-level parameters can then be created to capture the scientific insights of multi-physics systems for use in one-off or operationalised descriptions of a system. After implementing this level of abstraction, the geoscientific world is ready for data science.
Fostering a high-impact machine learning ecosystem in subsurface science and engineering
By M. Hall
The field of machine learning is experiencing a boom. The International Energy Agency predicts the 2020 analytics market in upstream petroleum alone will more than double to $10 billion. Previous such hype cycles, especially the one at the end of the 1980s, ended in a mass extinction event: artificial intelligence companies died off, funding seas dried up, and 'expert systems' became dirty words. Meanwhile, however, research continued under codenames like 'informatics', 'machine learning', 'big data', and 'data analytics'. Today, as the AI spring gives way to an AI summer, how can we give our projects the best chance of having the impact we believe they can have? As the petroleum industry moves into its autumnal years, I propose eight strategies for the computational science and engineering community to bring about the profound changes to its safety and operating efficiency that we all believe we can achieve. These strategies are well tested in other fields, and many of them have at least been tried in subsurface science and engineering.
Deep Learning on Hyperspectral Data for Land Use and Vegetation Mapping
Authors N. Audebert, B. Le Saux, S. Lefevere, C. Taillandier and D. Dubucq
Remote sensing technology is a remarkable tool to explore and measure Earth's surface features. Total and ONERA have set up a collaborative partnership named New Advanced Observation Method Integration (NAOMI) that aims at adapting and developing new remote sensing techniques specifically targeted at hydrocarbon exploration and environmental protection. In this context, we apply deep learning to the classification of hyperspectral data. For detecting different land uses and materials in aerial hyperspectral images, neural networks prove to be very efficient tools, as they are able to learn discriminative features that improve classification performance.
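As a toy illustration of pixel-wise hyperspectral classification, the sketch below trains a single-layer network (logistic regression) in plain NumPy on synthetic spectra; the band count, class structure and "vegetation dip" are invented, and the paper's actual models are deep networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for hyperspectral pixels: 30-band spectra where
# class 1 ("vegetation") has an absorption dip in bands 10-14.
def make_pixels(n, cls):
    x = rng.normal(1.0, 0.05, size=(n, 30))
    if cls == 1:
        x[:, 10:15] -= 0.4
    return x

X = np.vstack([make_pixels(200, 0), make_pixels(200, 1)])
y = np.array([0] * 200 + [1] * 200)

# Minimal single-layer network trained by gradient descent on the
# cross-entropy loss; a toy stand-in for the paper's deep nets.
w = np.zeros(30)
b = 0.0
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid activation
    grad_w = X.T @ (p - y) / len(y)           # cross-entropy gradient
    grad_b = np.mean(p - y)
    w -= 1.0 * grad_w
    b -= 1.0 * grad_b

pred = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(int)
accuracy = np.mean(pred == y)
```

On real imagery the per-pixel classifier would be replaced by a convolutional network that also exploits spatial context, which is where the bulk of the reported gains come from.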
Machine Learning can extract the information needed for modelling and data analysis from unstructured documents
Authors H. Blondelle, A. Juneja, J. Micaelli and P. Neri
Since its early days, the exploration and production industry has handled large volumes of data, mainly measurements, to build the subsurface models used for strategic or technical decisions. More recently, data analytics technologies have emerged to complement the modelling tools, with notable successes in the domain of field monitoring. But the broader adoption of new analytical tools is hampered by limited access to the large percentage of relevant data that is stored in unstructured formats. This issue is not new: modelling tools faced the same difficulty, but at a lower order of magnitude, because each tool has a limited set of input data. Manual information extraction from unstructured documents by skilled technicians, to feed sophisticated enterprise data models and modelling tools, was acceptable even if it represented a poor use of a trained professional's time. Despite these efforts, it is estimated that only 20% of the information available in our industry is stored in structured, searchable databases. Analytical tools require much more than that to perform adequately. With the emergence of new analytics tools, our industry now has a much greater appetite for data than it has ever had before. Is machine learning the means to satisfy it?
Unsupervised identification of electrofacies employing machine learning
Authors I. Emelyanova, M. Pervukhina, M. Clennell and C. Dyt
Machine learning techniques are widely used in petrophysics and geophysics to solve complex and non-linear problems of practical importance. In particular, numerous applications for identifying electrofacies from well logs have been reported. However, there is no unique approach for reliable automatic classification of electrofacies, as the accuracy of the applied algorithms may vary depending on the data and initial conditions. To overcome instability in the outcomes of various algorithms, we suggest applying different clustering techniques to log data, in a way similar to the popular supervised classification method of ensemble learning. Such an ensemble of clustering outputs is then integrated into unique classes (electrofacies) for subsequent automated identification of lithofacies. Here we apply three different clustering algorithms, namely Spectral Clustering, Self-Organizing Maps and k-means, to reliably classify electrofacies at the Lauda-1 petroleum exploration well drilled in the Northern Carnarvon Basin (Western Australia). The clustering outputs integrated into electrofacies are validated against an expert facies classification. We show that some facies identified by the expert are not distinguished as separate classes, at least for the chosen well and selected logs. The established electrofacies can then be assigned to conventional lithofacies; this requires an expert system which is currently under development.
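The integration of several clustering outputs can be sketched with a co-association (evidence accumulation) consensus, a common way to combine clusterings; the toy label vectors below are invented, and the agreement threshold is one simple choice, not necessarily the authors' integration rule.

```python
# Consensus clustering sketch: merge several clusterings into one by
# pairwise agreement. The individual label vectors would come from
# e.g. k-means, a SOM and spectral clustering; here they are toy data.

def consensus(labelings, threshold=0.5):
    """Merge samples that co-cluster in > threshold of the labelings."""
    n = len(labelings[0])
    m = len(labelings)
    # co-association: fraction of clusterings placing i and j together
    together = [[sum(lab[i] == lab[j] for lab in labelings) / m
                 for j in range(n)] for i in range(n)]
    # connected components over the thresholded agreement graph
    out = [-1] * n
    current = 0
    for seed in range(n):
        if out[seed] != -1:
            continue
        out[seed] = current
        stack = [seed]
        while stack:
            i = stack.pop()
            for j in range(n):
                if out[j] == -1 and together[i][j] > threshold:
                    out[j] = current
                    stack.append(j)
        current += 1
    return out

# Three clusterings that agree on structure but use different label ids
runs = [
    [0, 0, 0, 1, 1, 1],
    [2, 2, 2, 0, 0, 0],
    [1, 1, 0, 0, 0, 0],  # one dissenting assignment for sample 2
]
labels = consensus(runs)
```

Note that the consensus is invariant to each run's arbitrary label ids, which is exactly why co-association is preferred over naive label voting when combining unsupervised outputs.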
Machine Learning Based Workflows in Exploration and Production
Authors J. Limbeck, M. Araya, G. Joosten, A. Eales, P. Gelderblom and D. Hohl
In this presentation we cover a mapping between existing E&P workflow components and their data-science-based counterparts, as we have developed or envision them. We present one example from the geophysics domain, where deep neural nets are used to accelerate the seismic interpretation process (GeoDNN), and one from the reservoir engineering domain (AutoSum), where machine learning is used to analyze a large ensemble of reservoir models.
Automated facies prediction in drillholes using machine learning
Authors M. Blouin, A. Caté, L. Perozzi and E. Gloaguen
Machine learning is a popular topic in geosciences at the moment. It allows the management and interpretation of data in quantities and varieties (numbers of variables) that a human being would not be able to handle. Rock physical properties acquired along drillholes can be used to generate predictions about the nature and characteristics of the rock while wireline logging is taking place. In this paper, we investigate the accuracy of facies prediction using machine learning algorithms that automatically interpret geological rock types along drillholes from rock physical properties. A data-processing workflow is proposed to enhance the predictive power of the geophysical measurements, a model calibration approach is outlined, and predictions on test data are presented. Results show more than 80% correspondence between the automated prediction and the geologist's interpretation.
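A minimal stand-in for such a prediction workflow, with invented rock types and physical properties, simple feature scaling as the data-processing step, and a hand-rolled k-NN classifier in place of the paper's algorithms:

```python
import random
from collections import Counter

random.seed(1)

# Synthetic drillhole measurements: (density, magnetic susceptibility)
# pairs labelled with a geologist's rock type. Values are invented.
def make_sample(rock):
    if rock == "basalt":
        return (random.gauss(2.9, 0.1), random.gauss(30.0, 5.0)), rock
    return (random.gauss(2.65, 0.1), random.gauss(5.0, 2.0)), rock

data = [make_sample(r) for r in ("basalt", "granite") * 100]
random.shuffle(data)
train, test = data[:150], data[150:]

# scale features to comparable ranges (the data-preparation step)
d_max = max(x[0] for x, _ in train)
s_max = max(x[1] for x, _ in train)

def predict(x, k=5):
    """k-nearest-neighbour vote in scaled feature space."""
    q = (x[0] / d_max, x[1] / s_max)
    dists = sorted(
        (((fx[0] / d_max - q[0]) ** 2 + (fx[1] / s_max - q[1]) ** 2, rock)
         for fx, rock in train))
    votes = Counter(rock for _, rock in dists[:k])
    return votes.most_common(1)[0][0]

accuracy = sum(predict(x) == rock for x, rock in test) / len(test)
```

Holding out a test split, as above, is what makes the reported "more than 80% correspondence" a meaningful number rather than a training-set artefact.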
Optimising storage for high-speed data access to large volumes of data – recent advances and future direction
Historically, seismic and other data have been stored on tape media of various types. The density of data on tape has increased over time, and access times have improved as well, but tape remains slower than disk. Disk storage comes in various guises, from cheap commodity to highly resilient. A cost-benefit analysis shows that high-speed intelligent disks are more cost-effective than multiple cheap disks. Additional work has demonstrated that using object storage on disk significantly improves access time to data compared with standard B-tree storage. The advent of cheap solid-state storage has changed the way we view storage, and more and more systems are appearing with solid-state-only storage (laptops, phones, tablets and high-end analytics machines). Has the time now come to ditch tapes and disks and switch to solid state?
ForM@Ter: a data and services centre for Solid Earth
Authors E. Ostanciaux, M. Mandea, M. Diament and O. Jamet
To ease the use of satellite and in situ Earth observation data, the French scientific community is developing four centres corresponding to the main physical compartments of the Earth: ERIS (atmosphere), ForM@Ter (solid Earth), ODATIS (ocean) and THEIA (land surfaces). These centres are developed in the framework of a single research infrastructure project included on the French "large" research infrastructures roadmap, to be implemented in the coming years. The first ForM@Ter target focuses on surface deformation from SAR and optical imagery data. The associated services are implemented according to the needs expressed by the French scientific community to support the use of huge data volumes like those provided by the Sentinel missions. Within this context, we present the Ground Deformation Monitoring service, developed for scientific and private users to facilitate the exploitation of radar and optical data for ground motion monitoring applications; it contributes to the implementation of the ESFRI EPOS research infrastructure. A massive radar data processing service is also being implemented with the objective of providing displacement map time series over large areas; it will be established using MUSCATE, a CNES computing infrastructure. These two services are based on PEPS, the CNES infrastructure hosting Sentinel products.
Technical Descriptions in Long-term 115°C Borehole Digital Micro-seismic Monitoring at the PTRC Aquistore CO2 Sequestration Project
Authors C. Nixon, D. Schmitt, R. Kofman, D. White et al.
A preliminary overview of digital downhole microseismic monitoring at the SaskPower Aquistore carbon capture and storage project in the Williston Basin is discussed. The digital downhole monitoring system is presented, which has been successfully deployed for up to six weeks at 2800 m and 115°C. Technical experiences are described, including difficulties and solutions in sustained operation at extreme conditions. 750 gigabytes of high-quality seismic monitoring data were obtained and reviewed for seismic events by moveout and signal-to-noise ratio. Teleseismic events, mine blasting and dynamite orientation shots were all easily identified. The present lack of induced seismicity looks promising for carbon capture and storage in the Williston Basin, but the monitoring data are still being reviewed more carefully with automated selection algorithms.
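A standard automated screen for picking events in continuous downhole records is the STA/LTA (short-term over long-term average) trigger; the abstract does not name the authors' selection algorithm, so the sketch below is generic and runs on a synthetic trace.

```python
import random

def sta_lta_trigger(trace, sta_len=5, lta_len=50, threshold=10.0):
    """Return the first sample index where STA/LTA exceeds threshold."""
    energy = [s * s for s in trace]
    for i in range(lta_len, len(trace) - sta_len):
        lta = sum(energy[i - lta_len:i]) / lta_len   # background level
        sta = sum(energy[i:i + sta_len]) / sta_len   # incoming energy
        if lta > 0 and sta / lta > threshold:
            return i
    return None

# Synthetic trace: Gaussian noise with an event burst at sample 200
random.seed(0)
trace = [random.gauss(0.0, 1.0) for _ in range(400)]
for i in range(200, 230):
    trace[i] += 20.0

onset = sta_lta_trigger(trace)
```

Window lengths and threshold would be tuned to the sample rate and noise conditions of the actual borehole array; the ratio form makes the trigger insensitive to slow changes in background level.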
Insights from applying Machine Learning techniques to Geosciences data from the Oil and Gas industry
By M. Tibbetts
Arundo is a USA- and Norway-based data science services and software solutions company applying machine learning techniques across a wide range of heavy-asset industries. Our data scientists typically come from academic backgrounds where machine learning is regularly utilized, such as particle physics. We have recently been working with partner organizations from the oil and gas industry, applying our data science expertise to geoscience data, including seismic, from operating fields. We will summarize our experience of using machine learning with such data and how our wider experience of those techniques has been utilized for geoscience use cases. We expect to be able to present results of analyses we have worked on with our partner organizations for the first time. Finally, we will discuss how subsurface geoscience data and machine learning can be used to optimize maintenance and operations in the oil and gas industry.
Sensitivity analysis of synthetic seismograms in a sedimentary basin with respect to uncertain seismological parameters
Authors F. De Martin, P. Thierry, D. Keyes and E. Chaljub
This study focuses on understanding the variability and sensitivity of synthetic earthquake ground motions at sedimentary-basin scale associated with the epistemic uncertainties of the seismic wave propagation model. The key question at stake is the following: what is the spatiotemporal variability and sensitivity of seismograms at the surface with respect to seismological parameters? To achieve this objective, we describe the whole concept, including pre- and post-processing around the main earthquake simulation engine. The different stages are as follows: (1) definition of the initial model perturbation to generate a given set of input parameters; (2) simulation, including runtime filtering plus post-simulation filtering and decimation to reduce the amount of output data; and (3) uncertainty quantification analysis on a parallel file system, including Hadoop evaluation and multi-level MPI communicators, to obtain the final results of the global big-data application.
Automatic similarity measures to manage geoscience databases
By A. Fuga
In geoscience data management, the volume and complexity of data flows, as well as historical mergers of companies and databases, have raised harmonization, reconciliation and geo-referencing issues. Data duplication in databases wastes storage space and casts doubt on the different versions of the same seismic exploration data. The harmonization issue also appears when loading newly acquired or purchased data into the reference database. To ensure data quality as well as good data access and storage savings, this integration needs to be done without creating duplicates and without lowering database quality. To meet the urgency of data integration requests, this research and development work has led to the design of a methodology and software based on automatically computed multiscale contextual similarity. A new workflow has been adopted at TOTAL for harmonizing 2D and 3D seismic navigation lines, reconciling well databases, geo-referencing technical documents, etc. This work has demonstrated the capability to save 75% of the time data loaders, data managers or geophysicists spend on classical harmonization methods. Moreover, database visualization algorithms have been created for database harmonization and characterization, opening perspectives towards a more global and visual data management approach.
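Automatic similarity scoring of candidate duplicates can be sketched as a weighted blend of name similarity and geographic proximity; the record fields, weights and example line names below are invented, not TOTAL's actual schema or method.

```python
from difflib import SequenceMatcher

# Score a candidate duplicate pair on (a) fuzzy name similarity and
# (b) coordinate proximity. Weights and field names are illustrative.
def similarity(rec_a, rec_b, max_dist_km=1.0):
    name_sim = SequenceMatcher(None, rec_a["name"].lower(),
                               rec_b["name"].lower()).ratio()
    # crude planar distance in km (adequate at this scale)
    dx = rec_a["x_km"] - rec_b["x_km"]
    dy = rec_a["y_km"] - rec_b["y_km"]
    dist = (dx * dx + dy * dy) ** 0.5
    geo_sim = max(0.0, 1.0 - dist / max_dist_km)
    return 0.6 * name_sim + 0.4 * geo_sim

a = {"name": "LINE-2D-1987-014", "x_km": 412.30, "y_km": 6120.75}
b = {"name": "Line 2D 1987 014", "x_km": 412.31, "y_km": 6120.74}
c = {"name": "LINE-2D-1992-201", "x_km": 488.90, "y_km": 6003.10}

dup_score = similarity(a, b)       # same line, different conventions
distinct_score = similarity(a, c)  # different vintage, far away
```

Blending name and position evidence is what lets the score separate true duplicates from lines whose names happen to look alike, which is the common failure mode of name-only matching.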
Channel Characterization Using Support Vector Machine
Authors A. Mardan, A. Javaherian and M. Mirzakhanian
Rapid growth in the size of seismic data and the number of attributes has increased the significance of pattern recognition techniques in interpreting seismic data. Unsupervised methods, including k-means, self-organizing maps (SOM) and generative topographic maps (GTM), let interpreters perform a preliminary interpretation and extract reasonably useful information with little prior knowledge of the studied area. On the other hand, supervised learning methods such as neural networks (NN) and support vector machines (SVM) require some prior information from the studied area to seed the existing facies and use these seeded samples as input to the algorithm. In this study, to detect channel facies in one of the southwest hydrocarbon fields of Iran, we used k-means and SVM, training the latter with the information extracted by the former. Results show that the channels in the studied area have two different facies that can be detected by the applied algorithms.
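The two-stage idea, unsupervised clustering seeding a supervised classifier, can be sketched as follows; a perceptron stands in for the SVM stage to keep the example dependency-free, and the attribute values are synthetic.

```python
import random

random.seed(3)

# Synthetic attribute pairs forming two facies clusters
pts = ([(random.gauss(0, 0.5), random.gauss(0, 0.5)) for _ in range(50)]
       + [(random.gauss(4, 0.5), random.gauss(4, 0.5)) for _ in range(50)])

def dist2(p, q):
    return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

def kmeans2(data, iters=20):
    """Lloyd's algorithm, k=2, deterministic spread initialisation."""
    c = [min(data), max(data)]
    labels = [0] * len(data)
    for _ in range(iters):
        labels = [0 if dist2(p, c[0]) <= dist2(p, c[1]) else 1
                  for p in data]
        for k in (0, 1):
            members = [p for p, l in zip(data, labels) if l == k]
            if members:
                c[k] = (sum(p[0] for p in members) / len(members),
                        sum(p[1] for p in members) / len(members))
    return labels

pseudo = kmeans2(pts)  # unsupervised stage provides the "seeds"

# Supervised stage: a perceptron trained on the k-means pseudo-labels
# (the paper uses an SVM here instead)
w, b = [0.0, 0.0], 0.0
for _ in range(25):
    for p, l in zip(pts, pseudo):
        t = 1 if l else -1
        if t * (w[0] * p[0] + w[1] * p[1] + b) <= 0:
            w[0] += t * p[0]
            w[1] += t * p[1]
            b += t

def predict(p):
    return 1 if w[0] * p[0] + w[1] * p[1] + b > 0 else 0
```

The supervised second stage earns its keep on new data: once trained on the seeded samples, it classifies unseen attribute vectors without re-clustering the whole volume.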
Advanced Machine Learning for Unconventional Plays
Improving capital efficiency in oil and gas exploration and production, particularly in unconventional (UNC) plays, has become vitally important for the industry. Since the existing geological and petrophysical methodologies and technologies that enjoy good success in conventional plays are evidently not as effective when applied to UNC plays, more effective approaches are in high demand. However, in oil and gas exploration the most critical phase is the early land appraisal and initial development of so-called green fields, where the available data is usually scarce. This poses a great challenge to both domain experts and machine learning practitioners. How can machine learning and its related techniques be applied to help in early land appraisal and sweetspotting to greatly improve capital efficiency? This paper describes our recent advances in developing a machine learning sweetspotting workflow and presents our results and findings in identifying higher-production-potential areas in an example. The workflow uses an imputation scheme and a more generic and powerful ensemble learning technique that combines the strengths of a set of different machine learning algorithms. Consequently, our new workflow has achieved very good results in terms of R² (0.83) from leave-one-out cross-validation.
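The three ingredients named in the abstract (imputation, an ensemble of learners, leave-one-out R²) can be sketched together on toy data; the models below are deliberately simple stand-ins, not the paper's workflow.

```python
import random

random.seed(7)

# Toy regression data with some missing values in the second feature
n = 40
x0 = [random.uniform(0, 10) for _ in range(n)]
x1 = [random.uniform(0, 10) for _ in range(n)]
y = [2 * a + 3 * b + random.gauss(0, 0.5) for a, b in zip(x0, x1)]
x1_obs = [v if i % 8 else None for i, v in enumerate(x1)]  # gaps

def impute(col):
    """Mean imputation for missing (None) entries."""
    seen = [v for v in col if v is not None]
    mean = sum(seen) / len(seen)
    return [mean if v is None else v for v in col]

def fit_linear(xs, ys):
    """Closed-form simple linear regression y ~ a*x + c."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    a = (sum((x - mx) * (v - my) for x, v in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

def knn1(train_x, train_y, q):
    """1-nearest-neighbour regression on (x0, x1) pairs."""
    i = min(range(len(train_x)),
            key=lambda j: (train_x[j][0] - q[0]) ** 2
                          + (train_x[j][1] - q[1]) ** 2)
    return train_y[i]

x1_full = impute(x1_obs)
preds = []
for i in range(n):                       # leave-one-out loop
    tr = [j for j in range(n) if j != i]
    a, c = fit_linear([x0[j] for j in tr], [y[j] for j in tr])
    lin = a * x0[i] + c
    nn = knn1([(x0[j], x1_full[j]) for j in tr],
              [y[j] for j in tr], (x0[i], x1_full[i]))
    preds.append(0.5 * (lin + nn))       # simple averaging ensemble

my = sum(y) / n
r2 = 1 - (sum((p - v) ** 2 for p, v in zip(preds, y))
          / sum((v - my) ** 2 for v in y))
```

Leave-one-out is attractive in exactly the green-field situation the abstract describes: with scarce data, every sample is used for both fitting and validation.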
Carbonate Reservoir Cementation Factor Modeling Using Wireline Logs and Artificial Intelligence Methodology
Authors F. Anifowose, C. Ayadiuno and F. Rashedian
An approach comprising statistical and artificial intelligence techniques to modeling the rock cementation factor in a Saudi Arabian carbonate reservoir using wireline logs is presented. The objective is to obtain a more accurate prediction of the rock cementation factor, denoted by the exponent m in Archie's equation, as a variable log using multivariate linear regression (MLR), artificial neural networks, and support vector machines. Published equations by Nugent, Lucia and Shell are empirical derivations based on porosity logs and assumptions that may not be applicable in other geological settings. Typically, log analysts use the average of m values obtained from special core analysis (SCAL) measurements. Such constant values do not account for formation heterogeneity, resulting in inaccurate water saturation and pore volume estimates with high operational and economic costs. In this study, six wireline logs from seven wells were combined with their corresponding core-measured m values to build and optimize the proposed models to predict m for new wells or uncored sections of existing wells. The m values predicted by the MLR model closely matched the available m data from SCAL measurements. This study fulfills the pressing need for a variable m as a more accurate input to water saturation models.
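A variable-m model of this kind can be sketched as ordinary least squares over synthetic "logs", with the fitted m then fed to Archie's equation; the log types, coefficients and fluid constants below are invented for illustration, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-ins for wireline logs (the study uses six real logs)
n = 200
logs = np.column_stack([
    rng.uniform(0.05, 0.30, n),   # porosity log
    rng.uniform(2.2, 2.9, n),     # bulk density log
    rng.uniform(40.0, 90.0, n),   # sonic slowness log
])
true_w = np.array([1.5, 0.3, 0.004])          # invented coefficients
m_core = logs @ true_w + 0.9 + rng.normal(0.0, 0.02, n)  # "SCAL" m

# Fit m = w . logs + b by least squares (the MLR model of the abstract)
A = np.column_stack([logs, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, m_core, rcond=None)
m_hat = A @ coef

# Archie's equation: Sw = ((a * Rw) / (phi**m * Rt)) ** (1 / n_exp)
def archie_sw(phi, rt, m, a=1.0, rw=0.03, n_exp=2.0):
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n_exp)

sw_const = archie_sw(0.2, 20.0, m=2.0)             # constant-m habit
sw_var = archie_sw(0.2, 20.0, m=float(m_hat[0]))   # depth-varying m
```

The point of the exercise is visible in the last two lines: because m sits in the exponent of porosity, even modest variations in m move the computed water saturation appreciably.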
Adjusting well plan trajectory through 3-D seismic litho-fluid classification: A Case Study
Authors K. Kazemi and M. Delnava
Adjusting geological well plan trajectories with geophysical methods can be an effective way to prepare more precise and accurate plans for drilling new wells. The main objective of this study was to check the proposed well path against geophysical sections and to position the well trajectory with respect to the target layers, reducing drilling risks and costs as much as possible. To this end, a pre-stack simultaneous seismic inversion was first conducted to generate acoustic impedance (AI) and Vp/Vs ratio cubes. Elastic and petrophysical well-log data were then evaluated to determine different litho-fluid classes, including hydrocarbon sand (HC Sst.), wet sand (Wet Sst.) and shale. Bayesian-derived probability density functions (PDFs) for each litho-fluid class were calculated from well-log computations of AI and Vp/Vs. Using the PDFs and the pre-stack seismic inversion results, probability cubes for the individual litho-fluids, in addition to a final litho-fluid cube, were calculated. Based on these results, the well plan trajectory was adjusted to pass through the well-defined HC Sst. (the target reservoir layer). The results of this study illustrate the usefulness of litho-fluid cubes derived from 3D seismic data in reducing drilling risks and costs.
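The Bayesian classification step can be sketched with independent Gaussian PDFs per class in (AI, Vp/Vs) space; the class means and spreads below are invented, not values from the case study, and real PDFs would typically be estimated non-parametrically from the logs.

```python
import math

# Invented class-conditional statistics in (AI, Vp/Vs) space:
# each class gets (mean, sigma) per attribute.
classes = {
    "HC Sst.":  {"ai": (6500.0, 300.0), "vpvs": (1.60, 0.05)},
    "Wet Sst.": {"ai": (7200.0, 300.0), "vpvs": (1.80, 0.05)},
    "Shale":    {"ai": (7800.0, 400.0), "vpvs": (2.10, 0.08)},
}

def gauss_pdf(x, mu, sigma):
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))

def posteriors(ai, vpvs, priors=None):
    """Bayes' rule over the litho-fluid classes (equal priors default)."""
    priors = priors or {k: 1.0 / len(classes) for k in classes}
    like = {k: priors[k]
               * gauss_pdf(ai, *p["ai"])
               * gauss_pdf(vpvs, *p["vpvs"])
            for k, p in classes.items()}
    z = sum(like.values())
    return {k: v / z for k, v in like.items()}

post = posteriors(6600.0, 1.62)   # one voxel from the inversion cubes
best = max(post, key=post.get)
```

Applied voxel by voxel to the AI and Vp/Vs cubes, the posterior for each class yields exactly the per-class probability cubes the abstract describes, with the most probable class forming the final litho-fluid cube.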
Subsurface Integrity Management of UGS
Authors F. Favret, R. Del Potro and M. Diez
Well integrity has been specifically documented, in particular in NORSOK D-010 and ISO/TS 16530, covering leak paths and subsequent risk assessment, well barrier elements (WBE), monitoring and maintenance. Several companies have developed their own methodology for well integrity management, including new software developed to help operators plan and optimize maintenance based on lessons learned from well equipment failure. However, to ensure the safety of a storage facility, well integrity management has to be complemented by storage integrity management (Bonnier et al., 2015). Getting information on storage status during operation is difficult, since these huge storage volumes cannot be accessed for direct in-situ controls. For example, in leached salt caverns, indirect observations are possible but the required logistics prevent continuous observation. Here we present two complementary indirect methods that contribute to the assessment of storage integrity: microseismic monitoring and PVT modelling. This joint approach is applicable to all storage types. A case study of a multi-salt-cavern gas storage facility is presented.
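The PVT side of the joint approach can be sketched as a material-balance check: estimate the gas in place from pressure and temperature via the real-gas law, and flag inventory drift that metered flows cannot explain. The cavern volume, Z-factor and readings below are invented for illustration.

```python
R = 8.314  # universal gas constant, J/(mol K)

def moles_in_place(p_pa, t_k, v_m3, z=0.9):
    """Real-gas (p V = Z n R T) estimate of stored gas, in moles."""
    return p_pa * v_m3 / (z * R * t_k)

def integrity_check(readings, metered_mol, v_m3=300_000.0, tol=0.01):
    """Flag inventory drift that metered flows cannot explain.

    readings: [(pressure Pa, temperature K), ...] over time
    metered_mol: net metered injection (+) / withdrawal (-) in moles
    """
    start = moles_in_place(*readings[0], v_m3)
    end = moles_in_place(*readings[-1], v_m3)
    drift = (end - start) - metered_mol
    return abs(drift) <= tol * start, drift

# Stable cavern: the PVT inventory change matches the metered amount
ok, _ = integrity_check([(10.0e6, 318.0), (10.5e6, 318.0)], 6.3e7)
# Suspect cavern: pressure fell with no metered withdrawal
leak_ok, _ = integrity_check([(10.5e6, 318.0), (10.0e6, 318.0)], 0.0)
```

In practice the cavern volume itself creeps over time in salt, and Z varies with pressure and composition, which is why PVT balances are combined with independent observations such as microseismic monitoring.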