First EAGE Digitalization Conference and Exhibition
- Conference date: November 30 - December 3, 2020
- Location: Vienna, Austria
- Published: 30 November 2020
Prediction of Hydrocarbon Using 1-D Numerical Modelling of Low Temperature Thermochronology for Various Basin Scenarios
Authors: J. Singh and D. M. Whipp
Summary: Thermochronometer ages record the time elapsed since a low-temperature thermochronometer cooled below its closure temperature. The degree of maturation of organic matter (kerogen) in a rock depends on the maximum temperature reached in its thermal history and on the time spent in that temperature range. Low-temperature thermochronology therefore relates the age of cooling to the oil and gas window and helps identify whether the kerogen has been overcooked or preserved in the rock. The effective closure temperatures of low-temperature thermochronometers overlap the oil and gas window: apatite fission track (AFT) has a closure temperature of ∼120°C, while the oil window lies between 50°C and 150°C. A 1-D numerical model for predicting AFT age under various scenarios of basin exhumation and subsidence has been developed in Python. The model requires material parameters and characteristics such as subsidence and exhumation rates, conductivity, thermal diffusivity, thermal gradient, transition time, and specific heat. It outputs the AFT age, fission track length distribution statistics in the mineral at the present time, and the depth history of the rock.
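The steady-state relationship between closure temperature, geothermal gradient, and exhumation rate that ties AFT ages to the oil window can be sketched in a few lines of Python (an illustrative calculation under assumed parameter values, not the authors' model, which also handles subsidence, transient geotherms, and track-length statistics):

```python
# Minimal steady-state sketch: time since a rock cooled through the AFT
# closure temperature (~120 C), given a constant geothermal gradient and a
# constant exhumation rate. Illustrative only -- the paper's 1-D model also
# treats subsidence, transient geotherms, and fission track length statistics.

def aft_cooling_age(closure_temp_c=120.0, surface_temp_c=10.0,
                    geothermal_gradient_c_per_km=25.0,
                    exhumation_rate_km_per_myr=0.5):
    """Return the apparent AFT age in Myr for steady exhumation."""
    # Depth at which the rock was last at the closure temperature.
    closure_depth_km = (closure_temp_c - surface_temp_c) / geothermal_gradient_c_per_km
    # Time needed to exhume the rock from that depth to the surface.
    return closure_depth_km / exhumation_rate_km_per_myr

age = aft_cooling_age()  # 110 C of cooling / 25 C/km = 4.4 km; 4.4 / 0.5 = 8.8 Myr
```

Faster exhumation or a hotter geotherm gives a younger apparent age, which is the qualitative link between cooling history and kerogen preservation described above.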
Geostatistic Recognition of Genetically Distinct Lacustrine Shale Facies Based on Big Data Technology
By S. Lin
Summary: With breakthroughs in shale oil and gas exploration, the strong heterogeneity of shale has received considerable attention. Traditional geological methods of shale classification can no longer meet the current needs of exploration and development. How to complete a genetic classification of shale based on the comprehensive consideration of many parameters, covering both genesis and characteristics, is a very important problem. This paper demonstrates that geostatistical recognition based on big data analysis can process large volumes of data and identify genetically distinct shale facies, improving understanding of variation within a single shale formation. The approach (1) assigns genetic affinities and (2) provides a confidence level in the classification of any additional shale samples.
Geological Synthesis and Analysis of Potential Petroleum Systems of the Bida and Sokoto Basins in Nigeria
Summary: The Bida and Sokoto Basins are two of Nigeria's inland basins that belong to a series of Cretaceous and later rift basins in Central and West Africa whose origin is related to the opening of the South Atlantic. Aeromagnetic interpretation has assisted in interpreting the geology of the basins. Organic geochemical studies show that the Kudu Shale in the northern Bida Basin (equivalent to the Ahoko Shale in the southern Bida Basin) and the Dukamaje Formation (dark shales and limestones) in the Sokoto Basin constitute the source rocks of the potential petroleum system. Using average values of 40 m source rock thickness, 45,000 km² basin area, 9.0 wt% TOC, and 220 mg HC/g TOC HI, charge modeling indicates 623 million barrels of oil equivalent of extractable hydrocarbons in the Bida Basin at the appropriate kitchens.
Deep Learning a Poro-Elastic Rock Physics Model for Pressure and Saturation Discrimination
Authors: W. Weinzierl and B. Wiese
Summary: Utilizing time-lapse seismic to determine pore pressure and saturation effects is relevant for hydrocarbon production as well as natural gas and CO2 storage. Its quantitative interpretation enables a detailed understanding of the 4D evolution of fluid/gas migration. We focus on the rock physics model to invert for rock physical parameters. A training dataset is generated with a forward modeling operator, with parameters adapted from a 65 m deep unconsolidated high-porosity reservoir at the Svelvik field laboratory, Norway. Two independent rock physics formulations are considered, and multiple deep fully connected neural networks are conditioned and trained to invert for different rock physics parameters. The networks can rapidly derive rock physical parameters such as pressure, saturation, and porosity from seismic attributes, thus acting as an inversion tool. Subsets of the input parameters can be preset based on prior knowledge of a site. Utilizing neural networks to discriminate pressure and saturation allows real-time field-site conformance verification during seismic campaigns targeting 4D effects in an operational scenario.
IoT and Openness as the Design Language for Changing the Landscape for Oilfield Workflows
Authors: R. Toolsi, A.A. Aqrawi and K. Jansa
Summary: Digital transformation has been a buzzword for several years; however, this transformation is both technological and hierarchical. If we look at IoT as simply new technology with edge gateways, sensors, and actuators, then we only drive innovation in technology. Instead, we can look at IoT as a design language and make changes to the organization, communication, processes, and people.
We propose making IoT part of the operational roadmap, which can liberate data, remove silos, and bring about new business opportunities. Management can view real-time dashboards of all operations and make better Opex and Capex decisions; it also becomes possible to automate several functions, giving better control and improved reaction times. IT would no longer be a vertical by itself but rather integrated as necessary to better support business objectives.
The advantage for employees is that better access to quality data improves decision making and gives visibility across several different aspects of the operations, thereby empowering employees to make decisions horizontally where traditionally they were only able to do so vertically.
Lifecycle Management in a Dynamic Open Ecosystem
Authors: K. Jansa, A.A. Aqrawi and R. Toolsi
Summary: The rise and growth of software ecosystems over the last decade has resulted in a significant transition in software engineering practices.
This presents a tremendous opportunity in the E&P sector to deliver tailored, instant solutions to the user. By transitioning to a cloud-based E&P environment, we can effectively manage the lifecycle of the software used in a monolithic desktop application, from development through validation and deployment and ultimately to retirement. In this shared environment the user retains complete control, and the extensibility of a monolithic desktop application can be maintained in a cloud setting without losing the value invested in the technology.
Lifecycle Management incorporates multiple disciplines: project management, requirements management, software development, software testing, quality assurance and customer support.
The ecosystem allows the platform owner to communicate the platform's release milestones to the software developer community and to clearly state the requirements. The developers can then adjust their development cycles to ensure that their updates are submitted for validation on time to keep pace.
Machine Algorithm for Predicting Shale and Sand Arrangement Using Seismic Attributes
Authors: S. Gabitova and M. Naugolnov
Summary: This work presents a tool for lithology prediction using seismic data inversion and the creation of a classifier based on machine learning algorithms. A hidden connection was found between the input seismic interpretation results (P-impedance, Vp/Vs, and NTG) that allows the probability of water- and oil-saturated layers to be predicted with high precision in understudied and unexplored field sectors.
Multiple express tests of machine learning classification algorithms trained on labeled examples were carried out: boosting-based methods (Gradient Boost, XGBoost, CatBoost), bagging (Random Forest), Support Vector Machines (SVM), and K-Nearest Neighbors (KNN). The best methods, Gradient Boost, XGBoost, and CatBoost, were studied more carefully. As a result, a boosting-based method was found that allows evaluating the data, building a model, and then forecasting sand and shale arrangement with high precision in undrilled and poorly studied zones. The method can be applied where shales and sands are hard to distinguish because of insufficient contrast in the elastic inversion data. We show that the method exhibits relatively high classification accuracy, classifies the lithotypes (sands and shales), and builds maps of sand probability.
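Of the classifiers compared above, K-Nearest Neighbors is the simplest to sketch; a minimal NumPy version on toy data (the feature values below are invented stand-ins for attributes such as P-impedance and Vp/Vs, not the study's data):

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Classify each test point by majority vote among its k nearest
    training points (Euclidean distance)."""
    preds = []
    for x in X_test:
        dists = np.linalg.norm(X_train - x, axis=1)
        nearest = y_train[np.argsort(dists)[:k]]
        # Majority vote over the k nearest labels.
        preds.append(np.bincount(nearest).argmax())
    return np.array(preds)

# Toy example: two clusters standing in for "shale" (0) and "sand" (1)
# attribute vectors.
X = np.array([[1.0, 1.0], [1.1, 0.9], [0.9, 1.1],
              [4.0, 4.0], [4.1, 3.9], [3.9, 4.1]])
y = np.array([0, 0, 0, 1, 1, 1])
labels = knn_predict(X, y, np.array([[1.05, 1.0], [4.05, 4.0]]))
```

The boosting methods favoured in the study follow the same fit/predict pattern but build an ensemble of weak learners sequentially instead of voting over neighbours.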
Super Resolution of Fault Plane Prediction by a Generative Adversarial Network
Authors: F. Jiang and P. Norlund
Summary: Interpreting seismic data enhances understanding of subsurface geological features, particularly for assisted fault interpretation. The results of assisted fault interpretation workflows can provide valuable information to optimize hydrocarbon production during drilling and stimulation treatments. However, given the complexity of seismic data, such workflows can generate incorrect or misleading interpretations, such as discontinuous fault segments and mispositioned fault planes, particularly when deep-learning convolutional neural networks are used. Fault extraction often struggles to locate the fault plane where reflectivity or signal-to-noise ratio is low. In this abstract, a novel approach is introduced that implements a super-resolution generative adversarial network to help improve the resolution of fault prediction results. Synthetic fault data were generated to train an adversarial model, which was then applied to different field data sets. This approach could serve as a standard post-processing workflow to decrease uncertainty as part of an assisted fault interpretation approach, and it provides an efficient way of improving the fidelity of fault prediction results.
Clastic Reservoir Rock Grain Size Estimation from Wireline Logs Using a Random Forest Model: Initial Results
Authors: F. Anifowose, S. Shahrani and M. Mezghani
Summary: Grain size is a key input to various reservoir models. The models require a continuous log of grain size, but core samples are usually not available over the entire reservoir section. The most accurate grain size measurements come from sieve and laser particle size analyses, but these methods are expensive. The conventional method, visual core description, is time-consuming, subjective, and nonreproducible. Alternative methods include empirical equations, nuclear magnetic resonance (NMR) relaxation times, and acoustic velocities, but these require inputs that are not sufficiently available, are not applicable to different geological settings, or are not available for all wells. This paper proposes a new methodology that estimates reservoir rock grain size for a new well or reservoir section from archival core description data and the corresponding wireline logs using machine learning. Nine wells from a clastic reservoir are used: seven are combined to build the training set, while the remaining two are used for model validation. Three machine learning methods are implemented and trained with optimized parameters. The results show that, despite the subjectivity and bias associated with the core description data, the machine learning methods are capable of estimating grain size for the validation wells.
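The bootstrap-aggregation idea behind a random forest can be illustrated with single-split "stump" regressors (a toy sketch with an invented one-feature log, not the paper's implementation or data):

```python
import numpy as np

def fit_stump(x, y):
    """Best single threshold split of a 1-D feature, minimizing squared error."""
    best = (np.inf, float(x.min()), float(y.mean()), float(y.mean()))
    for t in np.unique(x)[:-1]:  # the last value would leave an empty right side
        left, right = y[x <= t], y[x > t]
        err = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if err < best[0]:
            best = (err, float(t), float(left.mean()), float(right.mean()))
    return best[1:]  # (threshold, left prediction, right prediction)

def bagged_stumps(x, y, n_trees=25, seed=0):
    """Bootstrap-aggregated stumps: a toy stand-in for a random forest."""
    rng = np.random.default_rng(seed)
    stumps = [fit_stump(x[idx], y[idx])
              for idx in (rng.integers(0, len(x), len(x)) for _ in range(n_trees))]
    def predict(x_new):
        # Average the individual stump predictions (the "forest" vote).
        return np.mean([np.where(x_new <= t, lo, hi) for t, lo, hi in stumps], axis=0)
    return predict

# Toy data: one "log" feature (e.g. gamma ray) against a grain size value.
x = np.array([10., 20., 30., 40., 110., 120., 130., 140.])
y = np.array([0.5, 0.5, 0.5, 0.5, 2.0, 2.0, 2.0, 2.0])
predict = bagged_stumps(x, y)
```

A real random forest grows full depth trees over many features and also randomizes the features considered at each split; the bootstrap-and-average structure is the same.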
Pathways to Exploration Success – Orchestrating the Steppingstones
Authors: S. Roy, C. Castagnac, T. Levy and S. Nollet
Summary: Across the E&P industry, the definition of corporate exploration success differs from organization to organization. It can vary widely, from investing early in the right piece of acreage for an investment company, to drilling a discovery well for an exploration-only company, to matching the produced volume of hydrocarbon from the subsurface with the discovery volume claimed by the exploration department for a major oil and gas producer. Most organizations define their business processes according to their corporate goals and establish stage-gate processes for decision making.
The exploration process framework provides the guidance to identify and prioritize the work to be done according to the business goal, facilitates collaboration between domains, and helps each domain understand its roles and responsibilities in achieving the common goal and how its work will be used in the next step.
The digital solution embeds the exploration process framework into a cloud-native application; with the preservation of data and knowledge, it empowers teams to integrate analysis from multiple exploration domains and increases understanding of regional geology to find and mature opportunities. The digital technology successfully establishes the connection between business goals and technical project execution.
Inverting Elastic Model Properties Using ResNet
Summary: We develop a novel seismic data inversion method to estimate the properties of subsurface layered elastic models using a convolutional neural network (CNN). Specifically, we use ResNet (Residual Neural Network), chosen for its identity-block architecture, to predict the parameters of layered elastic models, including layer depth, layer density, and P-wave and S-wave velocities. The dataset consists of 10,000 layered elastic models and their corresponding single-shot records. We use 80% of the pairs for training, and the trained network then makes predictions on the remaining models in the dataset. The trained network gives satisfying predictions on both simple (few-layer) and complex (multi-layer) models, suggesting that the proposed approach could be a useful tool for data processing, especially when dealing with near-surface layered models.
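The identity-block property that motivates the choice of ResNet can be shown directly: with the residual path zeroed, the block reduces to the identity mapping. A hypothetical fully connected block is used below for brevity; the paper works with convolutional blocks on shot records:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, W1, W2):
    """y = x + F(x): the skip connection lets the block default to the
    identity mapping when the learned residual F(x) = W2 relu(W1 x) is
    near zero, which eases training of deep stacks."""
    return x + W2 @ relu(W1 @ x)

rng = np.random.default_rng(0)
x = rng.normal(size=4)
# With zero weights the residual path vanishes and the block is the identity.
W_zero = np.zeros((4, 4))
y = residual_block(x, W_zero, W_zero)
```

This is why very deep residual networks remain trainable: each block only has to learn a correction to the identity rather than a full transformation.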
Robust Evaluation of Fault Prediction Results: Machine Learning Using Synthetic Seismic
Authors: M. Sarajaervi, T. Hellem Bo, B. Goledowski and M. Nickel
Summary: Metrics to assess machine learning methods are necessary for evaluating results and comparing against existing methods. For the segmentation of faults in seismic data, we suggest the use of a robust Jaccard metric that allows for small lateral inaccuracies in fault positioning. This error tolerance is necessary because interpretations are often inaccurate or subjective as a result of low seismic resolution, noise, or other image deficiencies. The metric is used to evaluate results during the development of a 3D convolutional neural network. In practice, this is done by applying new versions of the network to field data and using the metric to compare machine learning results to manual interpretations.
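One way to build such a laterally tolerant Jaccard metric is to dilate each binary fault mask sideways before intersecting; a minimal sketch (the paper's exact formulation may differ):

```python
import numpy as np

def dilate(mask, tol):
    """Binary dilation along the last (lateral) axis by `tol` samples."""
    out = mask.copy()
    for s in range(1, tol + 1):
        out[..., s:] |= mask[..., :-s]
        out[..., :-s] |= mask[..., s:]
    return out

def robust_jaccard(pred, truth, tol=1):
    """Jaccard index that forgives lateral misplacement up to `tol` samples:
    a predicted fault sample counts as a hit if it lies within `tol` of a
    true fault sample, and vice versa."""
    hits = (pred & dilate(truth, tol)) | (truth & dilate(pred, tol))
    union = pred | truth
    return hits.sum() / union.sum()

truth   = np.array([[0, 0, 1, 0, 0, 0]], dtype=bool)
shifted = np.array([[0, 0, 0, 1, 0, 0]], dtype=bool)
strict   = (truth & shifted).sum() / (truth | shifted).sum()  # plain Jaccard: 0
tolerant = robust_jaccard(shifted, truth, tol=1)              # within tolerance: 1
```

A one-sample lateral shift scores zero under the plain Jaccard but perfectly under the tolerant version, which is exactly the forgiveness the abstract argues for.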
Harnessing Data Standards and Cloud Computing to Achieve Global Screening and Compare Exploration Potential
Authors: M. Treloar, T. Butt and K. Heyburn
Summary: Exploration for conventional hydrocarbons has become increasingly challenging. Exploration frontiers represent high-risk opportunities, and explorers require time- and cost-effective tools to consistently understand and rank these opportunities within an exploration portfolio. This is particularly true in the continuing environment of risk-aversion and budgetary pressure. In addition, despite the ongoing digitalization of the industry providing more data than ever before, exploration teams remain reduced in size and often lack the resources to make full use of the ever-increasing volumes of data at their disposal.
These challenges speak to a need for faster, better-integrated, and more rigorous screening of exploration opportunities. This article examines how the need can be addressed by combining the efficiency gains in data access, integration, and processing capabilities offered by cloud technologies with standardized geological interpretations. As a case study, all Cretaceous clastic exploration potential in offshore basins has been assessed globally. The technology, workflows and inputs used to achieve this are covered, alongside the importance of applying a consistent framework for data integration.
A Transformational Journey from Unstructured Geoscience Data to a Digital Analytical Experience
Authors: D. Slidel and I. Fletcher
Summary: The automated standardization of legacy datasets can facilitate the generation of valuable new insights when consumed within a business intelligence platform. Where digital transformation in the energy services sector has successfully led to the automation of data gathering and interpretation processes, a geoscientist's time can be freed up for the more valuable interpretive tasks that are so vital to understanding subsurface geology. A recent advance in data management that has significantly helped with this transition is the adoption of business intelligence and analytics platforms to produce advanced visualizations and automated analysis. This presentation and article focuses on a recent example of this process in action, in which play cross sections in an unstructured PDF format are processed for use within an analytical platform.
Real Time Well Engineering for Intelligent Rig State Identification: An Edge Computing Use Case
Authors: V. Kemajou, R. Samuel and M. Yasir
Summary: The internet of things has brought better connectivity among devices in various industries, including oil and gas. With this improved connectivity come improved applications, especially in real-time data processing and analytics. A major requirement for real-time applications is minimal latency. Cloud computing, despite its many benefits, is often limited in that area: a lag starts and worsens as real-time data travels back and forth between field locations and data centers. Edge computing, on the other hand, enables faster processing. Edge computing applied to real-time well engineering is presented and discussed. It has the potential to be an adequate approach for applications dependent on real-time processing, such as geosteering, which requires instantaneous processing to drastically enhance the economic profitability of a well. With edge computing, some sensor data can be cleaned and processed instantaneously to automatically identify the rig state, a key step in the real-time well engineering workflow during drilling.
Machine Learning for Wax Deposition Prediction
Authors: M. Nait Amar and A. Jahanbani Ghahfarokhi
Summary: Accurate prediction of wax deposition is of vital interest in digitalized systems to avoid issues that interrupt flow assurance during the production of hydrocarbon fluids. The present investigation aims to establish rigorous intelligent schemes for predicting wax deposition under a wide range of production conditions. To do so, multilayer perceptrons (MLP) optimized with the Levenberg-Marquardt algorithm (MLP-LMA) and the Bayesian Regularization algorithm (MLP-BR) were established using 88 experimental measurements. The results showed that MLP-LMA achieved the best performance, with an overall root mean square error of 0.2198 and a coefficient of determination (R²) of 0.9971. The performance comparison revealed that MLP-LMA outperforms prior approaches in the literature.
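A multilayer perceptron of the kind used here can be sketched in NumPy; for simplicity this toy version is trained with plain full-batch gradient descent rather than Levenberg-Marquardt or Bayesian regularization, on invented data rather than the 88 experimental measurements:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented smooth regression target standing in for wax-deposition data.
X = rng.uniform(-1, 1, size=(60, 2))
y = 0.5 * X[:, 0] - 0.3 * X[:, 1] ** 2

# One hidden layer of 8 tanh units, trained by full-batch gradient descent.
W1, b1 = rng.normal(0, 0.5, (8, 2)), np.zeros(8)
W2, b2 = rng.normal(0, 0.5, 8), 0.0
lr = 0.1
losses = []
for _ in range(500):
    H = np.tanh(X @ W1.T + b1)          # hidden activations
    pred = H @ W2 + b2                  # linear output layer
    err = pred - y
    losses.append((err ** 2).mean())
    # Backpropagation of the mean-squared-error gradient.
    g_pred = 2 * err / len(y)
    gW2, gb2 = H.T @ g_pred, g_pred.sum()
    gH = np.outer(g_pred, W2) * (1 - H ** 2)   # tanh' = 1 - tanh^2
    gW1, gb1 = gH.T @ X, gH.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
```

Levenberg-Marquardt replaces the plain gradient step with a damped Gauss-Newton step, which is what gives MLP-LMA its fast convergence on small datasets like the one used in the study.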
Geoscience Workflow Tracking by Means of RESQML Standard Format
Authors: M. Piantanida, M. Dalla Rosa and B. Volpi
Summary: The paper describes Eni's implementation of a RESQML database capable of tracking the models exchanged across different G&G applications along interpretation and modelling workflows, together with all the metadata that will allow the later identification of the right model to be re-used for future work.
The implementation is targeted at sustaining Eni’s vision of a future ecosystem of small geo-apps, each focused on performing at best a small piece of the workflow, and with the possibility of easily composing the apps into a full workflow by exchanging the RESQML models across each other. This vision must be coupled with a powerful model tracking database, capable of identifying which models were used at each step of the workflow. The paper describes the approach used by Eni for such implementation, including the capability of disaggregating RESQML models into the basic data components to avoid duplication or inconsistencies, as well as some examples of the metadata used to correctly label, store and retrieve the RESQML models within the database.
Principal Component Analysis and Deep Learning along Directional Image Gathers for High-Definition Classification of Subsurface Features
By B. De Ribet
Summary: Diffraction imaging has proven to be an attractive method for delivering high-resolution subsurface images containing different types and scales of continuous and discontinuous geometrical objects. For depth domain 3D subsurface models, Koren and Ravve (2011) described an imaging method which is based on the ability to decompose the full recorded seismic wavefield into continuous full-azimuth directivity components in situ at the subsurface image points. This method follows the concept of imaging and analysis in the "Local Angle Domain" and allows us to generate azimuthal directivity gathers, from which we can separate specular and diffracted energies.
As part of the ongoing effort to automatically enhance procedures for classifying directivity-driven image data into N geometrical features, such as continuous reflectors, faults, point diffractors, acquisition noise, and ambient noise, Itan et al. (2017) presented a Deep Learning (DL) approach to this challenging task. This work expands on that method: in addition to vertical section image patches, we also train the network with horizontal patches. This leads to further improvements in classifying different geometrical features, particularly in areas masked by ambient and coherency noise. We demonstrate our method on seismic data from the Eagle Ford and Barnett unconventional shale plays.
The Big Loop: an Innovative Solution for Automating Subsurface Workflows and Transforming Inter-Disciplinary Asset Team Collaboration
Authors: C. Cosson, A. Plougoulen, M. Morin and G. Maisonneuve
Summary: The efficient management of subsurface operations depends on the asset team's technical excellence. Skilled practitioners in various domains interpret, model and predict reservoir performance. This, however, leads to silos between domains, each with its own language, methods and technologies, resulting in: a) an overly long cycle time for passing through domains, analyzing data and delivering key information that helps management decisions; b) the challenge of creating consistency between specialists and building confidence in the results. This presentation demonstrates a workflow that improves collaboration by helping practitioners work synchronously, based on fully automated reservoir modeling technologies allied with agile collaborative methods. The technology driver is a workflow orchestrator, which runs atop optimized reservoir modeling and simulation software packages and scripts. A cognitive approach is used: initial models are built from available data, and then refined iteratively as new information arrives. Updates are automatically propagated through domains and deliver new results. Consistency across domains is preserved and models are evergreen. Continuous alignment is guaranteed, and results reflect the asset's needs. This solution, called Big Loop™, is software-agnostic, customizable to the needs of the individual organization, and has been shown to significantly improve asset development and management efficiency.
Kriging Season 3: From Geosciences to Geo_DATA_Sciences
Authors: L. Sandjivy, S. Valentin and L. Lacaille
Summary: Big geo data and cloud computing are a real E&P game changer, taking us from geosciences to geo data sciences. Probability theory offers a consistent mathematical framework for developing specific kriging-based machine learning algorithms for automating reservoir modelling and updating it in real time. This is season 3 of the kriging algorithm saga in the oil industry, after season 1, kriging as an interpolator, and season 2, kriging as a geophysical workflow optimizer. In season 3, kriging-based software packages give way to kriging-based software apps for operating digital E&P projects.
Automatic Method for Anomaly Detection while Drilling
Authors: M. Golitsyna, A. Semenikhin, I. Chebuniaev, V. Vasilyev, V. Koryabkin, V. Makarov, I. Simon, T. Baybolov and O. Osmonalieva
Summary: Many anomalies can occur and lead to failures during the drilling process. It is crucial to detect these deviations from the normal process as soon as possible, so that engineers can analyse them and decide what actions to take to prevent potential NPT.
In this work we propose a new machine-learning-based approach for detecting abnormal drilling behaviour in an online manner. The idea is to cluster drilling data that has been preprocessed in a specific way. Our approach allows using all available data for training, as it does not need labeled data, and it incorporates both raw drilling parameters and expert knowledge, thus enhancing prediction results.
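A generic version of the cluster-then-flag idea can be sketched with k-means and a distance-to-centroid threshold (the paper's preprocessing and clustering details are not specified above, so this is only an illustration on invented data):

```python
import numpy as np

def kmeans(X, k=2, iters=50):
    """Plain Lloyd's algorithm with deterministic initialization."""
    centers = X[:k].copy()
    for _ in range(iters):
        labels = np.argmin(np.linalg.norm(X[:, None] - centers[None], axis=2), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

def flag_anomalies(X, centers, labels, z=3.0):
    """Flag samples unusually far from their assigned cluster centre."""
    d = np.linalg.norm(X - centers[labels], axis=1)
    return d > d.mean() + z * d.std()

# Invented "drilling parameters" (e.g. WOB and torque): two normal operating
# regimes plus one obvious outlier appended at the end.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0.0, 0.1, (40, 2)),
               rng.normal(5.0, 0.1, (40, 2)),
               [[20.0, 20.0]]])
centers, labels = kmeans(X, k=2)
anom = flag_anomalies(X, centers, labels)
```

Because no labels are needed, every historical record can be used for training, which is the property the abstract highlights.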
Uncover 2% Advanced Production Optimization across Complex Operational Plants through Industry 4.0, AI and Digital Twin
Authors: D. Piotrowski and J. Kalagnnanam
Summary: This is a client case study illustrating an advanced implementation of Industry 4.0, AI, and digital twin technology to achieve material gains in production optimization across complex, interdependent plant processes. From working with leadership and selecting the right impactful business case to implementation and garnering support from operational stakeholders, we demonstrate how end-to-end value chain optimization is possible.
Production Optimization Under Constraints: Development and Application of Software Combining Data Science and Petroleum Engineering Knowledge
By G. Joffre
Summary: Gas production from OMV New Zealand's gas/condensate fields is limited by commercial demand, which also constrains production of the associated condensate. No test separators or individual-well multiphase flow meters are installed; there are only single-phase gas flow meters (V-cone and orifice-plate meters) for each individual well. To produce the maximum revenue from the fields, the wells with the highest condensate-gas ratio (CGR) need to be prioritized, while still ensuring that well and facility constraints are managed.
An agile crew of engineers, developers, and data scientists was mobilized to design and create reliable, easy-to-use, and easy-to-maintain software solutions for three different parts of the optimization problem: a live, dynamic visualization of the wells' operating envelopes, for monitoring the current status of individual wells against the constraints and for direct comparison with simulation model results; a software solution to automatically identify step changes in well gas, water, and condensate rates at the facility output level, using these changes to improve the CGR and WGR values allocated to each individual well; and a software application to calculate the best combination of individual well rates to meet gas export demand while maximizing condensate production, within facility limits and well operating envelopes.
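The well-prioritization principle behind the third application can be illustrated with a greedy allocation by CGR (a deliberate simplification of the real constrained problem; the well names and rates below are invented):

```python
def allocate_gas(wells, gas_demand):
    """Greedy allocation: fill the gas demand from the highest-CGR wells
    first, respecting each well's maximum gas rate. Returns ({well: gas
    rate}, total condensate rate). A simplification -- the real problem
    also honours operating envelopes and facility constraints."""
    rates = {}
    condensate = 0.0
    for name, cgr, max_gas in sorted(wells, key=lambda w: -w[1]):
        take = min(max_gas, gas_demand)
        rates[name] = take
        condensate += take * cgr
        gas_demand -= take
        if gas_demand <= 0:
            break
    return rates, condensate

# Hypothetical wells: (name, condensate-gas ratio, max gas rate).
wells = [("W1", 0.10, 50.0), ("W2", 0.30, 40.0), ("W3", 0.20, 60.0)]
rates, cond = allocate_gas(wells, gas_demand=80.0)
# W2 (CGR 0.30) is filled first (40), then 40 of W3's 60; W1 stays shut in.
```

With nonlinear operating envelopes the greedy ranking is no longer guaranteed optimal, which is why the actual application solves a constrained optimization instead.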
Data-Driven Detection of Well Events in Mature Gas Fields
Authors: J. Poort, P. Shoeibi Omrani and A.L. Vecchia
Summary: The production optimization of mature gas fields is severely complicated by the occurrence of undesired well events such as salt precipitation, liquid loading, or gas/water coning. Learning from production data covering periods in which such events have taken place could help operators improve process optimization. However, because production data are currently interpreted manually, many well events go unreported. Reanalyzing historic data could retrieve missed events, but this is a time-consuming and costly process. In this study, the dynamic time warping (DTW) algorithm was used in a workflow that automates the detection of well events and can operate both offline and in real time. The workflow supports operators in finding well events within production data based on the characteristics of target events provided by the operators. In a case study using field data for a gas well suffering from salt precipitation, the workflow proved accurate and computationally efficient, finding 8 new events that had not been detected by the operator. Additionally, the algorithm remained robust in detecting well events even after introducing up to 10% added noise.
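The core DTW computation behind such a workflow is a short dynamic program; a minimal sketch showing why a time-shifted event template still matches well:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-programming DTW between two 1-D series: D[i, j] is
    the cheapest warped alignment of a[:i] with b[:j]."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# A time-shifted copy of an event template aligns perfectly under DTW,
# while a flat series does not -- this is what makes DTW suitable for
# matching operator-supplied event signatures against production data.
template = np.array([0., 1., 2., 1., 0.])
shifted  = np.array([0., 0., 1., 2., 1., 0.])
flat     = np.array([0., 0., 0., 0., 0., 0.])
```

Real implementations usually add a warping-window constraint and a distance threshold to decide when a candidate segment counts as a detected event.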
Automated Surface Fault Block Delineation
By T. Brenna
Summary: We present a fully automated method for delineating potential compartments in faulted reservoirs based solely on the geometry of a single reservoir horizon interpretation. Such a technology has potential applications in, for instance, reservoir compartmentalization studies, where it is often advantageous to have an a priori delineation of the reservoir compartments as a credible starting point for the analysis.
In our solution we integrate methods from the geometric modeling discipline, for extracting high-quality curvature information, and novel extensions of existing image processing techniques for segmentation. The result is a fully transparent, deterministic and extensible workflow.
Getting automation right will create value in itself by freeing domain experts from manual laborious work to focus on more fulfilling, higher-value activities. Also, automation could be an enabler for entirely new intelligent, or even transformational, workflows by effectively letting us bypass processes requiring manual user interaction to ultimately leverage alternative applications of the technology stacks. The impact of automation in the emerging digital space will empower us with new capabilities enabling accelerated hydrocarbon discovery.
Deep Bayesian Neural Networks for Fault Identification and Uncertainty Quantification
Authors: L. Mosser, S. Purves and E.Z. Naeini
Summary: The interpretation of faults within a geological basin or reservoir from seismic data is a time-consuming and often manual task associated with high uncertainties. Recently, numerous approaches using machine learning, especially various types of convolutional neural networks, have been presented to automate the process of identifying fault planes within seismic images, and these have been shown to outperform traditional fault detection techniques. While the proposed methods show good performance, many of them do not allow investigation of the uncertainties that arise in the fault identification process. In this study, we present an application of Bayesian deep convolutional neural networks for identifying faults within seismic datasets. Using an approximate Bayesian inference method, a Bayesian deep neural network was trained on a large dataset of synthetic faulted seismic images. The model is then applied to a benchmark dataset and a real data case from the NW shelf of Australia to identify fault planes and to investigate the associated uncertainty in the predictive distribution.
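One common approximate Bayesian inference scheme for deep networks is Monte-Carlo dropout: keep dropout active at prediction time and read the spread of repeated stochastic passes as predictive uncertainty. A toy sketch with arbitrary weights (the paper's networks are convolutional, and its exact inference method may differ):

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy "trained" single-layer network; the weights are arbitrary stand-ins.
W = rng.normal(size=16)

def forward(x, drop=0.5):
    """One stochastic forward pass with dropout kept ON at test time."""
    h = np.tanh(x * np.ones(16))
    mask = rng.random(16) > drop            # randomly silence units
    return float(W @ (h * mask)) / (1.0 - drop)

# Monte-Carlo dropout: repeat stochastic passes and read the spread of the
# outputs as an approximation of the predictive uncertainty.
samples = np.array([forward(0.7) for _ in range(200)])
mean, std = samples.mean(), samples.std()
```

For fault segmentation, a per-pixel standard deviation map computed this way is what lets an interpreter see where the network's fault predictions are trustworthy.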
Processing Thin Section Photos with Neural Networks and Computer Vision
Authors: S. Polushkin, Y. Volokitin, I. Edelman, E. Sadykhov, O. Lokhanova, Y. Murzaev and S. Pastushkov
Summary: A neural network originally designed for diagnosing cardiovascular diseases was trained to identify and analyze grains in thin section photos. Pores and pore throats are identified with computer vision. About 150 thin section photos were processed in about 20 minutes. The output contains grain sizes and mineral composition for more than 10,000 grains, and pore and pore throat diameters for several thousand pores. A comparison with alternative methods of determining pore size distribution, such as Cap Curves and NMR, is presented.
-
-
-
CESI: A Numerical Approach for Oil Field Study Optimization
Authors O. Melnikova, B. Belozerov and I. PavelevaSummaryThe main goals of oil field study are risk reduction during the appraisal and exploration stage and uncertainty reduction during the exploration and development stages.
The software module (patented as KOGI; CESI is the English equivalent of this abbreviation), which is based on the methodology of complex exploration state estimation (CESI), is a digital tool for identifying zones of insufficient research. These are, consequently, zones of high risk and uncertainty in terms of STOIIP calculation and the planning of future investigations.
The methodology is still being developed: qualitative features of productive formations (such as complexity, architecture aspects, etc.) are being added, as well as the value of information (VOI) obtained from studies.
-
-
-
Digital Multiscale Flow Modeling for Fractured Carbonates with Hessian-Based Cracks Detection
Authors I. Varfolomeev, N. Evseev, O. Ridzel, V. Abashkin, A. Zozulya, S. Karpukhin and M. MiletskySummaryThe results of a pilot project studying the petrophysical and transport properties of core from a fractured carbonate gas-condensate reservoir are described. The studied whole core samples are characterized by low absolute matrix permeability and a highly heterogeneous multiscale network of cracks and fractures. Modern full-core 3D X-ray computed tomography was unable to resolve the geometry of the thinner cracks, which made it impossible to create the regular binary solid/void digital rock model typically used for pore-scale hydrodynamic modeling. Thus, a Hessian-based crack detection method, which differentiates voxels with different permeabilities, was employed to construct a model with effective properties. To calibrate the effective properties, smaller sub-plugs were scanned at substantially higher resolution and their images were spatially registered to the whole-core image. The density functional hydrodynamics + chemical potential drive method was used to carry out numerical simulation of three-phase water-gas-condensate flow on the constructed whole core digital rock model with effective properties.
-
-
-
Smarter Well Engineering Concepts Aid in Reducing Planning Time and Increasing ROP
Authors N. Islam, A. Rosener, W. Souza and M. YasirSummaryModern well engineers struggle with digital confusion; they have either too much data or not enough, and the quality is often questionable. Additionally, well engineers are usually operations focused and might not fully appreciate optimization through data-driven insight. This paper illustrates how to optimize the rate of penetration (ROP) in any given field through an automated, time-saving well design process based on machine-learning (ML) techniques.
By prescribing optimized ROPs through automated ML of offset well attributes, free from subjective human bias, engineers can push technical limits. Automated analysis, regression, and visualization of high-volume data can reduce planning time significantly and help establish optimized operational parameters to reduce drilling time and costs.
The next step is to build a real-time downhole advisory system to help achieve the predicted ROPs by predicting and prescribing drilling parameters ahead of the bit.
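As a rough illustration of the data-driven idea (not the authors' pipeline), offset-well records can be regressed against drilling parameters and the fitted model evaluated over a parameter grid to prescribe an operating point. All variables, units and coefficients below are synthetic assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic offset-well records (illustrative): weight on bit (klbf),
# rotary speed (RPM), and observed rate of penetration (ft/hr).
n = 200
wob = rng.uniform(10, 40, n)
rpm = rng.uniform(60, 180, n)
rop = 2.0 * wob + 0.5 * rpm + rng.normal(0, 5, n)   # assumed linear trend + noise

# Fit a plain linear model; a production ML pipeline would be richer.
X = np.column_stack([wob, rpm, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, rop, rcond=None)

# Prescribe parameters: evaluate the model on a grid and pick the best.
grid_wob, grid_rpm = np.meshgrid(np.linspace(10, 40, 31),
                                 np.linspace(60, 180, 61))
pred = coef[0] * grid_wob + coef[1] * grid_rpm + coef[2]
best = np.unravel_index(np.argmax(pred), pred.shape)
print(grid_wob[best], grid_rpm[best])   # parameters with highest predicted ROP
```

In a real field the response is not monotonic (founder points, vibration limits), which is exactly why a learned model over offset data is more useful than this linear stand-in.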
-
-
-
Solving Problems with the Discrete Smooth Interpolation Framework, from Geomodelling to Geophysics and Beyond
Authors A. Tertois and Z. KorenSummaryA number of algorithms developed in geomodelling software rely on the Discrete Smooth Interpolation (DSI) method, a mathematical framework which enables interpolation of sparse values with geological and geophysical constraints on any type of discrete models such as triangulated surfaces or volumetric grids. Leaning perhaps more towards data integration than machine learning, this powerful tool is also evolving as part of our digital transformation. Today’s dynamic environment is favourable to building upon DSI’s principles and ability to add geological or physical concepts as constraints in discrete models.
DSI already offers solutions to many geomodelling problems as part of a successful commercial software suite. The Fourth Industrial Revolution is an opportunity to rejuvenate DSI by lifting it out of the geomodelling toolkit and making it available as a separate entity for any scientist to use, as a seamless and invisible link between linear equations and elegant solutions.
In this paper, we first review the Discrete Smooth Interpolation theory, then show how we currently apply it to various geomodelling problems and finally, we look towards its future in helping us solve our digital challenges in different domains.
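The core DSI idea can be sketched in one dimension: interpolate sparse node values on a discrete model while a roughness (second-difference) term keeps the solution smooth, with both expressed as rows of a single least-squares system. This toy version omits DSI's general constraint machinery; grid size, constraint values and weights are illustrative:

```python
import numpy as np

n = 21                                  # nodes of a 1-D discrete model
known = {0: 0.0, 10: 1.0, 20: 0.0}      # sparse data constraints (illustrative)

# Roughness operator: squared second differences penalize curvature.
R = np.zeros((n - 2, n))
for i in range(n - 2):
    R[i, i:i + 3] = [1.0, -2.0, 1.0]

# Data constraints, heavily weighted so they act as soft equalities.
w = 100.0
D = np.zeros((len(known), n))
d = np.zeros(len(known))
for row, (idx, val) in enumerate(known.items()):
    D[row, idx] = w
    d[row] = w * val

# Stack smoothness and data rows and solve in the least-squares sense.
A = np.vstack([R, D])
b = np.concatenate([np.zeros(n - 2), d])
phi, *_ = np.linalg.lstsq(A, b, rcond=None)

print(phi[10])   # honours the constrained node, smooth in between
```

Geological constraints (e.g. fault throws, thickness bounds) would enter as additional weighted rows of the same system, which is what makes the framework so composable.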
-
-
-
Multi-Sensor Acoustic Parameter Analysis System for Monitoring and Performance Prediction of Deep Drilling and Stimulation Operations
SummaryAcoustic Emission (AE) based systems have been under development and used at Fraunhofer IEG to monitor, evaluate, and control conventional and novel drilling processes and the pertinent equipment used in geothermal and drilling applications. Moreover, novel jetting and drilling operations in deep geothermal reservoirs rely heavily on such new technologies in order to be controlled properly and thus to become a viable technical and economic option.
AE monitoring is based on the detection and conversion into electrical signals of elastic waves associated with a rapid release of localized stress energy propagating within a material. It is a passive testing, logging, and analysis method for evaluating changes in the properties and behavior of machines and mineral-type materials such as rocks. Such changes may be induced by drilling, jetting, or other methods and are recorded, characterized, and evaluated via an AE system, ultimately to be used for process performance prediction with machine learning methods. This is the core of the novel monitoring system development: the AE-based, so-called multi-sensor acoustic parameter analysis as the primary control and monitoring mechanism during rock breaking, drilling, jetting, and stimulation.
-
-
-
Standardized Direct Data Transfers Between Applications Accelerate Workflows and Improve Operational Adoption of Innovative Technologies
SummaryGeoscience and engineering workflows are applied to increasingly complex reservoirs. Collaborative teams require the use of different vendor solutions to apply the best technologies to solve problems and deliver the most accurate models and predictions. A new direct data transfer protocol based on existing mature industry standards simplifies and speeds up the data connection between applications. It also ensures better data integrity and complete flexibility in assembling and executing workflows. Based on the mature WebSockets protocol, this new standard has the necessary sub-protocols to reliably handle complex data relationships, very large data arrays as well as unique item identifiers. In addition to accelerating workflows and making them more reliable, this new protocol simplifies the addition of new innovative technologies alongside proven ones, for the best outcome within the tight resource and time limits imposed by the upstream industry.
-
-
-
A Digital Methodology for Large Scale Integrated Optimization of Production Planning and Operations
By M. ScottSummaryProducing assets and their gathering networks are multi-faceted, with multiple diverse data sets and modelling and analysis tools. Consolidating these into a single automated, operational environment can greatly streamline surveillance and management of these assets. However, this type of modelling does not always properly represent the asset as a whole, as individual elements have impacts on preceding and subsequent areas. The purpose of this presentation is to show a simple and effective methodology to generate this integrated model, to allow optimization of production planning and operation processes. By leveraging modern data integration, modelling and orchestration tools, up to date insight into all aspects of the operation can be provided across the business, enhancing planning, forecasting and decision making capabilities.
-
-
-
Accelerating Seismic Data Access, QC and Vendor-Independent Automated Workflows with Cloud-Based Seismic Datastore and API
Authors C. Caso, P. Aursand and T. StraySummarySeismic data discovery, quality assessment, and retrieval are often time-consuming and iterative processes between geoscientists and data managers.
In this paper, we describe the implementation of a seismic datastore in the Aker BP cloud environment, as part of the company’s digital program Eureka. The objectives of this implementation have been to get fast, tool-independent access to the Aker BP seismic data through an API allowing queries of the whole survey but also of subsets of the seismic data; overview of the actual data within each survey; preview of a seismic section (inline, crossline or arbitrary line); comparison of the same section from two different seismic cubes; and enabling 3rd party applications to run as services on top of the seismic. The implementation was carried out in a five-month project, involving software developers and the support of data managers, in an Agile setup with demos every two weeks and continuous feedback from the end-users. The solution has been delivered as a cloud-based API architecture to ingest, store, query, visualize and consume seismic data.
-
-
-
Facies Classification: Combining Domain Knowledge with Machine Learning Solutions
SummaryAutomated facies identification workflows that use Machine Learning (ML) are publicly available but perform sub-optimally (accuracy on the order of 60%) due to a lack of integration with geological domain knowledge. Existing tools consider well log values mostly on a depth-by-depth basis, using only very basic feature engineering. Our solution aims to integrate ML with well-established geoscience principles (also referred to as geo-rules) such as sequence stratigraphy, proximal-distal trends, and log-trend patterns. Geological knowledge is incorporated into ML to improve the quality and robustness of facies prediction and is captured as additional geologically-inspired features added to the dataset. These features include the mean value and other derived properties of intervals, density-neutron separation, segmentation and wavelet transform. All ML algorithms tested with this augmented set of features show significant improvement in performance metrics as compared to solutions with basic logs only.
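As a hedged sketch of what geologically-inspired features can look like in practice (the feature names and the neutron-to-density rescaling below are our assumptions, not the authors' exact definitions):

```python
import numpy as np

rng = np.random.default_rng(1)
rhob = 2.3 + 0.1 * rng.standard_normal(50)    # toy bulk density log (g/cc)
nphi = 0.25 + 0.05 * rng.standard_normal(50)  # toy neutron porosity log (v/v)

# 1) Density-neutron separation, a classic lithology/fluid indicator.
#    The rescaling of nphi to density units assumes a limestone matrix
#    (2.71 g/cc) and water-filled pores (illustrative choice).
dn_sep = rhob - (2.71 - 1.71 * nphi)

# 2) Interval mean: a rolling average injects the vertical context that
#    purely depth-by-depth classifiers lack.
win = 5
kernel = np.ones(win) / win
rhob_mean = np.convolve(rhob, kernel, mode="same")

# Augmented feature matrix handed to any downstream ML algorithm.
features = np.column_stack([rhob, nphi, dn_sep, rhob_mean])
print(features.shape)
```

Segmentation and wavelet-transform features mentioned in the abstract would be added as further columns in the same way.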
-
-
-
Deep Learning Applications to Unstructured Geological Data: From Rock Images Characterization to Scientific Literature Mining
Authors A. Bouziat, S. Desroziers, M. Feraille, J. Lecomte, R. Divies and F. CokelaerSummaryIn the last decade, Deep Learning applied to unstructured data, such as images and texts, has seen significant technical progress and democratization. However, successfully adapting these technologies to geological data and activities is far from straightforward. As a contribution to the digital transformation of the subsurface industries, in this study we present three promising Deep Learning applications to unstructured geological data. The first use case is automated classification of macroscopic rock sample pictures with convolutional neural networks. The second use case is accelerated delineation of foraminifera micro-fossils on thin-section scans using segmentation algorithms. The third use case is assisted mining of scientific texts to characterize hydrocarbon source rock formations, based on an entity extraction engine. From these use cases, we highlight the main challenges to expect in similar projects and share some good practices. Notably, we describe innovative methods to embed prior geological knowledge in the algorithms, to handle situations where only little training data is available, and to distribute the corresponding codes to geologists in user-friendly ways.
-
-
-
Analysis of Seismic Attributes to Assist in the Classification of Salt by Multi-channel Convolutional Neural Networks
Authors F. Jiang, P. Norlund and Z. WeiSummaryRecently, many deep-learning approaches have been applied to geophysical problems, such as seismic processing and interpretation, to aid in the exploration of hydrocarbon reservoirs. Convolutional neural networks (CNNs) are a popular new method to identify salt bodies in seismic data via image segmentation and feature extraction. In this study, four ensemble classifiers were trained to analyze the importance of various seismic attributes with respect to the predictability of a salt body. By choosing the seismic attributes with the highest importance as input data to a multi-channel CNN architecture, we successfully improved the accuracy of salt prediction. Both binary and multi-label salt classifications are shown, as well as comparisons of salt classification probability maps generated from models trained on seismic-only data versus models trained on seismic-plus-attributes data. The results demonstrate that the seismic-plus-attributes models significantly improved the continuity of salt boundaries and reduced unwanted artifacts, whilst also converging faster during training.
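The multi-channel input itself is simple to assemble: each selected attribute becomes one channel of the CNN input, just as an RGB image has three channels. A sketch with placeholder attributes (envelope and coherence are examples we chose; they are not necessarily the attributes ranked highest in the study):

```python
import numpy as np

rng = np.random.default_rng(3)
H, W = 64, 64                        # illustrative image-patch size

seismic = rng.standard_normal((H, W))      # amplitude patch
envelope = np.abs(seismic)                 # stand-in attribute channel 1
coherence = rng.random((H, W))             # stand-in attribute channel 2

# Multi-channel input: seismic plus attributes stacked along the last
# axis, ready for a channels-last CNN.
x = np.stack([seismic, envelope, coherence], axis=-1)
print(x.shape)
```

Feature-importance ranking from the ensemble classifiers decides which attributes earn a channel; the stacking step is the same regardless of the choice.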
-
-
-
Preliminary Assessment of Structural Controls in the Sokoto Basin, Northwestern Nigeria Using Non-Invasive Techniques
SummaryA preliminary assessment of the structural controls of the Nigerian sector of the Iullemmeden Basin, northwestern Nigeria, has been carried out using non-invasive techniques. The Sokoto Basin deepens towards the Niger Republic. Depth-to-basement interpretations from aeromagnetic data show eight major depressions in the basin, comprising the Yerimawa-SabonBirni-Isah trough, Wurno-Rabah trench, Sokoto-Bodinga-Tambulwa trench, Tureta-Bakura ditches, Lema-Tambo sinks, Koko-Giro sinks, Gada holes and Kiwon Allah-Sokwoi-Illela pits. Structural interpretations show that three major fault lines trending NW-SE modified the sagged basement over geologic time. Integrating the depth-to-basement and structural interpretations shows that the Sokoto-Bodinga-Tambulwa trench, Kiwon Allah-Sokwoi-Illela pits and Lema-Tambo sinks are possibly connected by parallel faults trending NW-SE. Evidence from field studies of surface tectonic structures, as well as the presence of a deep-seated fault below the Wurno hill, leads us to the conclusion that the Wurno hill is possibly tectonically controlled. Furthermore, the presence of a reverse fault and rollover anticline along the Goronyo-Taloka road indicates a possible convergent plate boundary and regional active faulting, respectively. This may play a significant role in the maturity of the organic-rich sediments of the Taloka and Dukamaje Formations, the flow of fluids, and mineralization in the basin.
-
-
-
Study on Geological Feature Extraction from FMI Logging Data by Using Deep Learning Neural Network
SummaryThis paper first studies the structure and algorithmic principles of deep neural networks, whose training is divided into the two stages of "pre-training" and "fine-tuning"; this helps avoid falling into local minima and improves learning speed. As an efficient feature extraction method, deep learning can capture the most essential description of the data.
-
-
-
Modelling Hydraulically Fractured Tight Gas Reservoirs with an Artificial Intelligence (AI)-Based Simulator, Deep Net Simulator (DNS)
Authors S. Ghassemzadeh, M. Gonzalez Perdomo, E. Abbasnejad and M. HaghighiSummaryHydraulic fracturing in tight gas reservoirs increases the connectivity of the well to further areal regions, thus boosting production as well as the net present value of the asset. This type of reservoir typically exhibits considerable uncertainty in rock and fracture properties, which, coupled with significant heterogeneity, makes history matching, uncertainty quantification, and optimisation time-consuming tasks. Therefore, engineers are always looking for ways to reduce simulation time. Artificial intelligence enables machines to learn from data, allowing time-consuming fluid flow equations to be formulated explicitly while keeping the accuracy found through the implicit approach. This is achieved through the use of deep learning. In this research, a fully standalone simulator is developed for a range of hydraulically fractured tight gas reservoirs in a 2-dimensional space. Considering the low value of the metrics (RMSE < 65 psi, MAPE < 0.99%, and R2 ≈ 1) for the training, validation and test sets, the results confirm that the developed model, Deep Net Simulator (DNS), is accurate and reliable when compared with numerical models. Furthermore, DNS shows remarkable reliability when comparing the results of 140 unseen complete reservoir models over a 4-year period against a numerical simulator. The average value of MAPE across all 140 cases is 10.55%.
-
-
-
Optimizing the Dynamic Behavior of Wells and Facilities with Machine Learning and Agent Negotiation Techniques
Authors M. Piantanida, A. Amendola, G. Esposito, P. Iorio, S. Carminati, D. Vanzan, F. Castiglione, D. Vergni, P. Stolfi and C.N. CoriaSummaryThe paper proposes an approach to deal with the day-to-day dynamic behaviour of Oil & Gas assets, providing support for optimized decisions on wells and facilities. The approach is based on:
• A set of software agents, trained with a machine learning approach to understand the health status of the components of the reservoir/well/plant system and capable of proposing optimization actions for the corresponding subsystem;
• An inter-agent negotiation approach, capable of evaluating the optimization actions of the single agents in the wider picture of the overall optimization of the producing asset.
The paper will describe how this approach has been implemented, as well as an example application.
-
-
-
Deep Learning for Seismic Data Reconstruction: Opportunities and Challenges
Authors O. Ovcharenko and S. HouSummaryNatural and instrumental conditions during field seismic surveys lead to noise and irregularities in the acquired seismic data. In this work, we explore challenges and opportunities related to the denoising and interpolation of seismic data by deep convolutional neural networks. In particular, we apply three network configurations to field data and match them with suitable applications. We show that U-Net is beneficial for denoising applications while generative adversarial networks (GANs) are superior in interpolation tasks. The enhanced interpolation capability of GANs, however, comes at the cost of increased uncertainty in the results, and we raise awareness of this observation. Finally, we consider the pitfalls of conventional metrics and outline the requirements for data-driven approaches to be suitable in production applications.
-
-
-
Novel Digital Rock Simulation Approach in Characterizing Gas Trapping by Modified Morphological Workflow
Authors F. Zekiri, J. Steckhan, S. Linden, P. Arnold and H. OttSummaryThe quantification of trapped non-wetting phase saturation and distribution in petroleum reservoirs is essential to understanding hydrocarbon recovery efficiency. Laboratory experiments on core samples are regarded as industry best practice to estimate hydrocarbon trapping. To implement entrapment characteristics in reservoir modeling, empirical correlations between initial saturation and the respective residual non-wetting phase saturation (trapping curves) are commonly used.
To overcome the long lead times for setting up reservoir models caused by time-consuming laboratory workflows, pore-scale simulations of fluid flow on digital representations of the pore space - so-called digital twins - imaged by micro computed tomography have been considered a viable alternative for estimating hydrocarbon entrapment. In this study, we compare simulation results for water/gas capillary-dominated imbibition in a sandstone reservoir. So far, digital rock simulations could not predict representative trapped phase saturation levels with the classical morphological approach. This was the motivation to adapt the simulation concepts by including sub-resolution wetting-phase layers in the pore structure. As a result, it was possible to simulate a representative spatial distribution of the trapped non-wetting phase in the pore space and to estimate realistic residual saturations. For verification purposes, the simulated results have been compared to the trapping model by Land (1968).
-
-
-
Using Blockchain and Smart Contracts for Marine Seismic Data Integrity and Contract Control
By L. FloodSummaryThere are opportunities to use blockchain combined with smart contracts to enhance data integrity and contract control within the marine seismic industry for individual contracts. Further, an implementation of blockchain and smart contracts at the industry level would redefine industry standards and create a payment platform for the industry and associated subcontractors, making contracts easier to administer and control.
-
-
-
How to Leverage Advanced TensorFlow and Cloud Computing for Efficient Deep Learning on Large Seismic Datasets
Authors C.E. Birnie and H. JarrayaSummarySeveral seismic applications benefit from using all available receivers and a long time-window, allowing greater representation of signal and noise. Neural networks have the ability to utilise spatio-temporal data and extract high level patterns thanks to their non-linear function compositions. However, the training of such networks is memory intensive, often resulting in the downsizing of data introducing constraints on the number of traces and/or the length of the recording. Through the example of developing a deep learning model for passive seismic event detection on a large array of ∼3500 sensors, we describe an end-to-end workflow from synthetic labelled data creation to distributed model training to model deployment. We demonstrate how to overcome the memory challenges of large input data by utilizing TensorFlow’s data generators for on-the-fly generation and loading of large seismic recordings during the training procedure. Furthermore, we illustrate how training time can be drastically reduced by distributing training across multiple machines with GPU capability. Kubernetes and cloud resources are leveraged for ease of orchestration of compute resources and scaling up horizontally. Finally, we highlight that whilst training is computationally expensive, the trained model can be deployed on a standard, non-GPU machine for real-time detection of passive seismic events.
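The on-the-fly pattern can be sketched without TensorFlow: a plain Python generator yields one training window at a time, and tf.data.Dataset.from_generator would wrap such a generator so the full recording never has to sit in memory. Window sizes and labels below are illustrative, not the paper's configuration:

```python
import numpy as np

def window_generator(n_windows=8, traces=64, samples=128, seed=42):
    """Yield (window, label) pairs one at a time instead of holding a
    full ~3500-sensor recording in memory. With TensorFlow, this exact
    generator could be handed to tf.data.Dataset.from_generator so that
    windows are created on the fly during training."""
    rng = np.random.default_rng(seed)
    for _ in range(n_windows):
        # Stand-in for slicing a (traces x samples) window at a random
        # (sensor, time) offset out of a large recording on disk:
        window = rng.standard_normal((traces, samples)).astype(np.float32)
        label = int(rng.random() < 0.5)   # synthetic event / no-event flag
        yield window, label

batch = list(window_generator())
print(len(batch), batch[0][0].shape)
```

Because only one window is materialized at a time, the trace count and record length no longer have to be downsized to fit GPU memory, which is the constraint the workflow set out to remove.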
-
-
-
Digital Analysis of Whole Core Photos
Authors V. Abashkin, I. Seleznev, A. Chertova, A. Samokhvalov, S. Istomin and D. RomanovSummaryIn this work, we present a technique for the automatic processing of digital images of whole-size slabbed core. The technique helps to identify areas of the photo that are close in properties and can correspond to different rock types, facies, etc. These distinguishing features can be used to predict petrophysical rock properties using available laboratory measurements. The obtained data can be used in complex log interpretation, in the construction and further validation of the reservoir hydrodynamic model, and in the refinement of well geomechanical models. Texture characteristics of whole core surfaces obtained from images of average and high resolution can considerably increase the accuracy of rock classification and enable a more reliable distribution of rock properties on larger scales. Image color clustering is carried out using a Gaussian approximation for the density of image pixels in the digital color-coding space. The technique complements available log and core data with specific property descriptors (porosity, permeability, natural gamma-ray emission, etc.), including rock mass color characteristics, texture, bedding plane angle and thickness, and the shape and size of clastic inclusions. The possibility of generating high-resolution curves for physical properties measured on core plugs was demonstrated.
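The color-clustering step can be illustrated with a toy one-dimensional version: fit a Gaussian mixture to pixel intensities by expectation-maximization and assign each pixel to its most probable component. The real workflow clusters in a color-coding space; all values here are synthetic:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy grayscale "core photo" pixels drawn from two rock types
# (illustrative intensities around 60 and 160).
pix = np.concatenate([rng.normal(60, 8, 500), rng.normal(160, 10, 500)])

# Two-component 1-D Gaussian mixture fitted by EM.
mu = np.array([50.0, 200.0])        # deliberately rough initial guesses
sigma = np.array([20.0, 20.0])
pi = np.array([0.5, 0.5])
for _ in range(50):
    # E-step: responsibility of each component for each pixel
    dens = pi * np.exp(-0.5 * ((pix[:, None] - mu) / sigma) ** 2) / sigma
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means and spreads
    nk = resp.sum(axis=0)
    pi = nk / len(pix)
    mu = (resp * pix[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((resp * (pix[:, None] - mu) ** 2).sum(axis=0) / nk)

labels = resp.argmax(axis=1)   # cluster id per pixel ~ candidate rock type
print(np.sort(mu).round(1))
```

Each recovered component mean is a candidate rock-type color; mapping clusters to facies and to core-plug measurements is the calibration step the abstract describes.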
-
-
-
Generating Custom Word Embeddings for Geoscientific Corpora
Authors C.E. Birnie and M. RavasiSummaryIn the field of natural language processing, word embeddings are a set of techniques that transform words from an input corpus into a low-dimensional space with the aim of capturing the relationships between words. It is well known that such relations are highly dependent on the context of the input corpus, which in science varies greatly from field to field. In this work we compare the performance of word embeddings pre-trained on generic text versus custom-made word embeddings trained on an extensive corpus of geoscientific papers. Numerous examples highlight the difference in meaning and closeness of words between geoscientific and generic contexts. A prime example is the term ghost, which has a specific definition in geophysics, different to that of its common usage in the English language. Moreover, domain-specific analogies, such as 'Compressional is to P-wave what shear is to… S-wave', are investigated to understand the extent to which the different word embeddings capture the relationship between terms. Finally, we anticipate some use cases of word embeddings aimed at extracting key information from documents and providing better indexing.
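The analogy test works by simple vector arithmetic: subtract one word vector, add another, and look up the nearest remaining word by cosine similarity. A toy sketch with hand-built vectors (real embeddings would be trained on the geoscientific corpus, e.g. with word2vec):

```python
import numpy as np

# Toy embedding table; the vectors are hand-built so the analogy works,
# purely for illustration of the arithmetic.
emb = {
    "compressional": np.array([1.0, 0.0, 1.0]),
    "p-wave":        np.array([1.0, 1.0, 1.0]),
    "shear":         np.array([0.0, 0.0, 1.0]),
    "s-wave":        np.array([0.0, 1.0, 1.0]),
    "ghost":         np.array([0.5, -1.0, 0.0]),
}

def most_similar(vec, exclude):
    """Nearest word by cosine similarity, skipping the query words."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return max((w for w in emb if w not in exclude),
               key=lambda w: cos(emb[w], vec))

# "Compressional is to P-wave what shear is to ...?"
query = emb["p-wave"] - emb["compressional"] + emb["shear"]
print(most_similar(query, {"compressional", "p-wave", "shear"}))
```

A generic-text embedding would place ghost near haunt rather than near deghosting, which is exactly the failure mode the custom corpus addresses.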
-
-
-
Architecting a Digitalization Platform to Deliver Transformative Business Results
Authors J. McConnell and P. QuinnSummaryDesigning and implementing a data platform to support both traditional applications and modern analytical techniques is difficult in any industry - and building an oil company solution for subsurface information brings additional requirements and potential pitfalls. Defining your digitalization strategy may provide direction, but that alone is not enough. How do you ensure the strategy remains valid, and guarantee its adoption and overall success?
Feedback, agility and de-coupling are required in order to build in fundamental flexibility and longer-term openness to change and innovation. To balance this against rock-solid operational concerns, it must be underpinned by robust data and architectural principles and governance at both a high and low level. Without this cooperative and coordinated effort, investments in digitalization programs are unlikely to see their expected value fully realised.
-
-
-
Tools for Automated Rock Description
Authors E. Baraboshkin, D. Orlov and D. KoroteevSummaryAlgorithms for image classification have become well developed in recent years. They work well when the classes are clearly separated from each other. Geological classes are not, because they can be observed at different scales and under different classification paradigms.
To tackle this problem, we compare different feature extraction techniques and classification (semi-supervised and supervised) algorithms. We present methods which help to increase the accuracy of rock type classification.
-