Third EAGE Digitalization Conference and Exhibition
- Conference date: March 20 - 22, 2023
- Location: London, United Kingdom
- Published: 20 March 2023
Exploring Convolutional Neural Networks and Machine Learning for Oil Sands Drill Core Image Analysis
Authors: F. Anzum, H. Hamdi, U. Alim and M. Costa Sousa
Summary: Permeability is one of the key reservoir rock properties that can substantially affect the performance of oil sands reservoirs. Standard methods for estimating permeability do not work well in oil sands. However, permeability can be estimated from the mean grain size (MGS) obtained from the particle size distribution (PSD). This paper investigates the use of convolutional neural networks (CNNs) and machine learning (ML) to estimate MGS from drill core photos from oil sands reservoirs. Three approaches are explored for classifying core photos based on facies analysis: (1) transfer learning on the pre-trained VGG-16 CNN model, (2) fine-tuning the top layers of VGG-16, and (3) combining VGG-16 with ML algorithms. These approaches are further utilized to estimate MGS from core photos to investigate their accuracy in predicting the facies. The results demonstrate that MGS can be accurately estimated from core photos using a random forest model trained on features extracted from the last convolutional block of the VGG-16 CNN model. This work is among the first studies on the application of ML and CNN techniques for characterizing drill cores using digital images.
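The regression step the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' code: the 512-dimensional feature vectors here are random stand-ins for features pooled from the last convolutional block of VGG-16, and the MGS labels are invented.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Stand-in for features from the last convolutional block of VGG-16
# (512 channels after pooling); real features would come from the
# pre-trained network applied to the core photos.
n_photos, n_features = 200, 512
features = rng.normal(size=(n_photos, n_features))

# Hypothetical mean-grain-size labels (mm) derived from PSD measurements.
mgs = 0.1 + 0.4 * rng.random(n_photos)

# Train on 150 photos, predict MGS for the 50 held-out photos.
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(features[:150], mgs[:150])
predicted_mgs = model.predict(features[150:])
print(predicted_mgs.shape)  # one MGS estimate per held-out photo
```

With real VGG-16 features, the same fit/predict pattern applies; only the feature extraction step changes.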
Indonesian Oil and Gas Showroom for Increasing the Oil and Gas Exploration and Production
Authors: S.E. Saputra, A.D. Wibisono, R.N.A.C.P. Pratama and F.T. Amir
Summary: The Indonesian government has launched a program to increase oil and gas production in Indonesia to 1 million BOPD and 12 BSCFD by 2030 to meet Indonesia's high demand for petroleum. One of the strategies in IOG 4.0 is developing a web-based oil and gas showroom platform. This platform follows the concept of an e-showroom, making it easy for potential investors to discover and learn about prospective areas in Indonesia.
Information Retrieval from Oil and Gas Unstructured Data with Contextualized Framework
Summary: In the oil and gas industry, risk identification, operation monitoring, and equipment assessment are critical to engineers' daily work. This information is captured as operators' comments in the Daily Production Report (DPR), Daily Drilling Report (DDR), or well completion report to communicate important events in field performance and well progress. Given the high frequency and huge volume of these reports, as well as the unstructured nature of the comments, it is impossible to manually extract and interpret all the key insights. Manual extraction is time-consuming, typically taking two to three days to analyze the comments. Without the underlying analysis, well monitoring may suffer, and action to mitigate risk and continually improve well performance may be delayed. This loss of valuable insight results in a revenue deficit because of late or uninformed decisions when an issue is reported. The proposed solution is a framework to contextualize the information retrieved from the unstructured data.
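A first step toward contextualizing such free-text comments is tagging each one with an event category. The abstract does not specify the method; this sketch uses simple keyword rules, and the categories and patterns are invented for illustration — a production framework would use trained NLP models.

```python
import re

# Illustrative event categories and keyword patterns for DDR/DPR comments.
EVENT_PATTERNS = {
    "equipment_failure": r"\b(pump|valve|compressor)\b.*\b(fail|trip|leak)",
    "well_integrity":    r"\b(casing|annulus|packer)\b",
    "production_loss":   r"\b(shut[- ]?in|deferred|downtime)\b",
}

def tag_comment(comment: str) -> list[str]:
    """Return the event categories whose keyword pattern matches the comment."""
    text = comment.lower()
    return [event for event, pat in EVENT_PATTERNS.items() if re.search(pat, text)]

print(tag_comment("ESP pump tripped, well shut-in for 6 hrs"))
# → ['equipment_failure', 'production_loss']
```

Rule-based tagging like this provides labelled seed data; the contextualized framework would then generalize beyond fixed keyword lists.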
Automating the Interpretation of 715 000 Drill Cuttings Samples with Active Learning
Summary: Over 715,000 cuttings images were produced through the Norwegian Released Well Initiative (RWI); this paper covers the process by which we hope to automate the interpretation of these images through machine learning. Our method is a multi-model approach, including both image classification and semantic segmentation models, which can be compared to each other and to XRF-based geochemistry models for active learning. Through this method we hope to produce accurate lithological descriptions for all cuttings, as well as further details such as sand grain size and cementation.
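The selection step of an active-learning loop like the one described can be sketched as uncertainty sampling: given per-class probabilities from any of the models, send the least confident images for human labelling. The probabilities below are synthetic stand-ins for real model output.

```python
import numpy as np

rng = np.random.default_rng(1)
n_images, n_classes = 1000, 5

# Synthetic per-image class probabilities (softmax over random logits).
logits = rng.normal(size=(n_images, n_classes))
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

def least_confident(probs: np.ndarray, budget: int) -> np.ndarray:
    """Indices of the `budget` samples with the lowest top-class probability."""
    confidence = probs.max(axis=1)
    return np.argsort(confidence)[:budget]

to_label = least_confident(probs, budget=50)
print(len(to_label))  # 50 cuttings images sent for manual interpretation
```

After labelling, the models are retrained and the loop repeats; disagreement between the image models and the geochemistry models could serve as an alternative selection signal.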
High-Efficient Cloud Seismic Data Solution for Real-Time Geosteering Optimization
Authors: A. Zaputlyaeva, A. Van Welden and I. Kuvaev
Summary: The high cost and complexity of lateral wells require advanced technological equipment, state-of-the-art IT solutions, and data interpretation practices to keep the well within the target zone. Nevertheless, data integration and communication within the operations team are among the most common challenges during drilling activities. Modern geosteering software allows users to integrate all types of geoscience data, logging-while-drilling data, and 3D seismic surveys within one cloud-based platform. This IT solution can compress 3D seismic surveys using VDS technology, which helps to optimize storage and accelerate the visualization and interpretation processes. Streaming seismic data from the cloud simplifies the geosteering workflow and helps the geosteerer make more confident decisions. Access to the free VDS APIs enables data liberation and innovative, cloud-native workflows for geosteering.
Extracting seismic outliers using GAN-like training of autoencoders
Authors: E.B. Myklebust and T. Stangeland
Summary: The detection of outliers eases the workload of analysts by providing a ranked list of possible events of interest. We have adopted a GAN-like training scheme for autoencoders to identify outliers in a seismic network. A scattering network is used for feature extraction, an autoencoder is trained with apparent and latent losses, and finally a discriminator is used to separate the input from the reconstruction of the autoencoder. This method acts as regularization of a vanilla autoencoder, and we are able to detect low-amplitude outliers thanks to the method's scale invariance. We show examples of outliers and inliers that demonstrate the power of the method. The method still requires some optimization, but we leave that to future work.
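The final ranking step — score each trace by how badly the model reconstructs it — can be illustrated without the adversarial training. In this sketch a linear "autoencoder" (PCA fitted on a mostly-inlier history) stands in for the trained network, and the trace features are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)
traces = rng.normal(size=(500, 64))   # synthetic feature vectors per trace
traces[450] += 5.0                    # plant one obvious outlier in the stream

# Fit a linear reconstruction model (PCA) on an assumed-inlier training set.
train = traces[:400]
mean = train.mean(axis=0)
_, _, vt = np.linalg.svd(train - mean, full_matrices=False)
basis = vt[:8]                        # 8-component latent space

# Reconstruct every trace and rank by reconstruction error.
centered = traces - mean
recon = centered @ basis.T @ basis + mean
errors = np.linalg.norm(traces - recon, axis=1)
ranked = np.argsort(errors)[::-1]     # most anomalous first
print(ranked[0])                      # → 450, the planted outlier
```

The GAN-like scheme in the paper replaces this linear model with an autoencoder whose reconstructions are additionally judged by a discriminator, but the outlier-ranking logic is the same.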
An Integrated Deep Learning Workflow for Geologically Sequestered CO2 Monitoring
Authors: W. Hu, S. Phan, C. Li, T. Shao and A. Abubakar
Summary: We propose a comprehensive deep learning workflow to substantially reduce the cost of CO2 monitoring using time-lapse seismic data. This integrated deep learning workflow covers various stages of CO2 monitoring, from data acquisition to long-term plume body evolution monitoring, including sparse monitoring data reconstruction and optimal sparse monitoring data acquisition survey design, deblending of simultaneous-source monitoring data, time-lapse data repeatability enforcement, and rapid estimation of subsurface CO2 plume body status without conventional seismic data processing procedures. The numerical experiments validated the efficiency and accuracy of the algorithms contained in this workflow. The promising results indicate its potential in large-scale CCS projects.
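The sparse-monitoring-data reconstruction stage can be illustrated with a toy example. As a minimal stand-in for the workflow's learned reconstruction network, this sketch recovers a densely sampled receiver line from a decimated survey by linear interpolation; the geometry and signal are synthetic.

```python
import numpy as np

x_dense = np.arange(201)                       # full receiver positions
signal = np.sin(2 * np.pi * x_dense / 50.0)    # "true" monitor response

keep = x_dense[::5]                            # sparse survey: every 5th receiver
sparse = signal[::5]

# Dense estimate from the sparse acquisition.
recovered = np.interp(x_dense, keep, sparse)
max_err = np.abs(recovered - signal).max()
print(round(max_err, 3))
```

A deep-learning reconstruction replaces the interpolator with a network trained on dense baseline surveys, which is what lets the sparse monitor surveys stay cheap without giving up resolution.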
Do Not Make the Cloud a Landfill for Your Subsurface Data
Author: P. Gibb
Summary: Migrating data to Cloud storage can be a cost-effective way of managing data in the long term and retaining it for all who may need it in the future. But it is important to deploy appropriate solutions that are capable of connecting to files and to application data on tape, disk, or cloud and, crucially, of recording extensive metadata to aid search, filtering, and analysis; only then can data managers make informed decisions and build the appropriate policies to effectively manage all data.
However, time constraints, financial pressure, resources, and strategic change might result in data being ‘zipped up and tarred off’ to the Cloud - much like the way we dispose of old/unrequired items using landfill – without first considering the appropriate policies that will help data managers to govern this valuable data in the longer term.
When done right, using the appropriate policies and solutions, cloud storage offers many benefits to O&G companies, with many moving applications and data to the Cloud to improve access to and the quality of active data, archive older inactive projects, and make significant cost savings.
Borehole Image Logs: New Approaches to Automated Surface, Breakout and Facies Interpretation
Summary: Borehole image logs (BHI) are a valuable dataset that provides a link between core-scale and well-scale datasets, allowing the geoscientist to understand subsurface lithologies and structure. However, interpreting BHIs is typically a slow and subjective process, with much of the interpretation time spent generating thousands of similarly oriented bedding picks and determining where certain facies occur, rather than focusing on the intervals and features of most importance. This paper focuses on image logs from 26 wells, predominantly in the Santos Basin, covering approximately 20,000 m. The interpretation of these image logs was initially done manually, but as part of the project, and in order to improve manual BHI workflows, we focused on automating parts of the workflow: the auto-picking of bedding surfaces, the classification and generation of facies intervals, and the identification and classification of breakout. We developed an algorithmic approach for auto-picking that uses low-level image features in image logs. To classify facies, we trained a convolutional neural network (CNN) as an image classifier. Finally, we interpolated the image log and used contour detection to identify breakout.
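The breakout step — interpolate the image log, then detect contours — can be sketched as thresholding the low-amplitude zones of a dense depth-by-azimuth array and keeping connected regions large enough to be candidate breakouts. The image below is synthetic; a real log would come from the BHI processing chain, and the thresholds are illustrative.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(3)
img = 0.2 + 0.8 * rng.random((60, 36))   # depth samples x azimuthal bins
img[20:30, 5:9] = 0.05                   # a dark, elongated breakout-like zone

mask = img < 0.1                         # low-amplitude threshold

def connected_regions(mask, min_size=10):
    """4-connected regions of True pixels with at least `min_size` pixels."""
    seen = np.zeros_like(mask, dtype=bool)
    regions = []
    for i, j in zip(*np.nonzero(mask)):
        if seen[i, j]:
            continue
        queue, region = deque([(i, j)]), []
        seen[i, j] = True
        while queue:
            a, b = queue.popleft()
            region.append((a, b))
            for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                na, nb = a + da, b + db
                if (0 <= na < mask.shape[0] and 0 <= nb < mask.shape[1]
                        and mask[na, nb] and not seen[na, nb]):
                    seen[na, nb] = True
                    queue.append((na, nb))
        if len(region) >= min_size:
            regions.append(region)
    return regions

breakouts = connected_regions(mask)
print(len(breakouts))  # → 1 candidate breakout zone
```

On real logs, the detected regions would additionally be filtered by geometry — breakouts appear as paired low-amplitude zones roughly 180° apart in azimuth.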
Challenges and Opportunities for Carbon Capture and Storage Monitoring Data Management
Summary: Monitoring, Measurement, and Verification (MMV) is crucial for successfully implementing a Carbon Dioxide Capture and Storage (CCS) project. However, effective management of CCS monitoring data presents both challenges and opportunities.
One challenge is the need to standardize technical requirements for data storage and ensure that data is easily accessible to all relevant parties. Another challenge involves the need to perform data analytics on both historical and real-time CCS monitoring data.
Opportunities arise for using advanced visualization technologies and data analytics to improve the effectiveness of CCS monitoring data management. We propose a solution with key strategies to tackle the issues associated with CCS monitoring data:
- A hybrid data storage system to preserve the massive geophysical survey data on near-line storage instead of pricier online storage.
- Metadata extraction to enrich the metrics dashboard and GIS information.
- A web GIS portal for users to access, visualize, and analyze trends and patterns of the monitoring data.

This suggested solution provides a highly effective and cost-efficient option for CCS monitoring data management.
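The hybrid-storage strategy above amounts to a tiering policy: route each survey file to online or near-line storage based on its size and access pattern. This is a minimal sketch of such a rule; the thresholds are illustrative, not values from the paper.

```python
from datetime import date, timedelta

# Illustrative policy thresholds (assumptions, not the paper's values).
ONLINE_MAX_GB = 50
STALE_AFTER = timedelta(days=180)

def storage_tier(size_gb: float, last_access: date, today: date) -> str:
    """Large or rarely accessed geophysical data goes to near-line storage."""
    if size_gb > ONLINE_MAX_GB or today - last_access > STALE_AFTER:
        return "near-line"
    return "online"

print(storage_tier(800, date(2022, 1, 5), date(2023, 3, 20)))  # big, stale 4D survey
print(storage_tier(2, date(2023, 3, 1), date(2023, 3, 20)))    # small, active dataset
```

In practice the same metadata extracted for the metrics dashboard (size, vintage, last access) is what feeds a policy like this, which is why the two strategies reinforce each other.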
Revisiting Induced Seismicity in Groningen Field Using 3D CNN
Author: R. Perez
Summary: In the Netherlands, the Groningen gas field was discovered in 1959 and has been in production since 1963. The goal of the study is to explain how AI techniques, such as fault segmentation, can help us learn more about the geology and tectonic history of the Groningen field and how to deal with the induced seismicity in the field.
From 1991 to 2021, there were 1,396 events with local magnitudes ranging from 0.5 to 3.6, reflecting the field's great structural complexity throughout its geological history. Artificial-intelligence-based fault segmentation works at the pixel level instead of the image level, which differs from traditional trace-based seismic attributes such as coherence. The quality of the faults detected by this approach is superior to what was previously achieved.
In particular, the faults that intersect the body of the Zechstein Group may be regarded as an impermeable rock seal, but these faults could also act as an escape route if the field is repurposed as a CO2 store. All these factors are crucial in the context of the global energy transition, and this sort of study contributes much to knowledge, especially when considering the consequences for hazard and risk assessments.
Accelerating Decision Making with Automated QC Workflows for Full-Waveform Inversion
Authors: D. Halliday, S. Roy, D. Kulakov, J. Wu, A. Billa, J. Xu, M. Elbadry and R. Bloor
Summary: Accelerating decision making can improve turnaround time for full-waveform inversion (FWI) projects. Quality control (QC) is a key but time-consuming part of the decision-making process.
We consider the integration of digital technologies to create a new end-to-end QC workflow for full-waveform inversion. QC datasets are generated automatically as part of a new cloud-native FWI solution. Those QC datasets are stored in a cloud seismic data management system, where they can be readily accessed from Dataiku, an end-to-end data science platform, and TIBCO Spotfire, a visual analytics package.
The generated QCs introduce a significant amount of additional data to be analysed. To address this, we use the data science platform to orchestrate an unsupervised learning workflow that allows for faster guided analysis of these QCs. The guided QCs are exposed to the user through the visual analytics package. This can lead to faster decision making in FWI.
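The unsupervised step can be sketched as clustering per-record QC attributes so analysts review one representative per group instead of every record. A plain k-means, written out here for self-containment, stands in for whatever the data science platform orchestrates; the QC attribute values are synthetic.

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic 2D QC attributes forming three behaviour groups.
qc = np.vstack([rng.normal(loc, 0.3, size=(100, 2)) for loc in (0.0, 3.0, 6.0)])

def kmeans(x, k, iters=50, seed=0):
    """Minimal k-means: assign to nearest center, recompute centers."""
    rng = np.random.default_rng(seed)
    centers = x[rng.choice(len(x), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((x[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        new = []
        for i in range(k):
            pts = x[labels == i]
            new.append(pts.mean(axis=0) if len(pts) else centers[i])
        centers = np.array(new)
    return labels, centers

labels, centers = kmeans(qc, k=3)
print(labels.shape)  # one cluster id per QC record
```

Analysts would then inspect the record nearest each center, and only drill into a cluster when its representative looks anomalous.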
Shell’s Road to a New Subsurface & Wells Digital Ecosystem – thoughts Halfway along the Journey
Summary: Subsurface and wells users in Shell have relied for decades on a partially connected, on-premise network of databases and tools that combined vendor products with in-house developed applications where innovation by the software engineering industry fell short. Patchy data governance has led to a true "spaghetti" of database connections that limits data access. For a globally connected business that derives value from end-to-end integration, this seriously limits value creation.
Technological progress and business imperatives such as cost and cycle-time improvements have combined to shape a new future – with the energy sector catching up in the race to adopt both “large” and “small” digital tooling and modern workflows. The industry has come together in new partnerships and attempts to standardise towards OSDU.
At the start of 2023, this digital revolution can be said to be in an adolescent state, roughly half-way to adulthood. The most complex and costly period is now: the movement from the old to the new has many change management, cultural and technical challenges.
We are not there yet - no single party understands all roadblocks ahead. Yet, the excitement of opportunity, and confidence that “digital” is not just a cost item, grows with early, credible use cases.
Benefits of Earth Intelligence® when modeling O&G reservoirs: an exploration case study
Authors: L. Sandjivy and M. Collet
Summary: Earth Intelligence® (EI) is artificial intelligence applied to the modelling of O&G reservoirs. It allows geomodelling workflows to be optimally parametrized and automated by minimizing a single "cost function": the estimation error.
EI probabilistic workflows make the best use of the full set of seismic and well data available at the time of making reservoir E&P decisions. They unify the deterministic building of alternative scenarios into one single consistent P10/P50/P90 stochastic P-scenario.
Building on a previous EAGE 2016 presentation, we show with quantified KPIs the benefits of EI for making more confident business decisions: The same North Sea exploration case study is revisited using an integrated EI software platform and positively illustrates the game changer that artificial intelligence brings to the oil and gas exploration and production sector.
Revisiting the 2016 EAGE case study using the Earth Intelligence software in 2023 illustrates the main operational KPIs that can be expected from artificial intelligence when modeling O&G reservoirs:
- reduced turnaround time;
- optimized workflow performance with quantified confidence intervals;
- generation of PDF reports including all input, process, and output descriptions and parameters;
- sharing of smart P-scenarios instead of "black box" numerical models.
Data Foundations: Unlocking the Potential of Subsurface Machine Learning Workflows
Authors: J. Tomlinson and S. Edmond
Summary: There is increased utilization of machine learning in subsurface workflows, with the objective of enabling interpreters to produce more accurate results in less time. Many authors have published results of machine learning workflows, and they frequently conclude that high-quality input data is required to deliver reliable results. This paper reviews various machine learning workflows that have been utilized by the subsurface community and analyses the data quality requirements to support those workflows. Based on multiple data foundations projects, the key components of a solution to these data quality challenges are presented. The outcome is an aggregated, conditioned dataset which allows both humans and machines to rapidly find relevant and quality-controlled data for the workflows they are looking to perform. The aggregated data, along with the data standards and business rules developed for these varied data types, can be utilised in future digital initiatives, including OSDU, where data standards for many data types are still being developed.
Analogue Identification and Evaluation for Field Development Planning
Authors: R. Vhanamane, G.S. Shergill, D. Lucas-Clements and P. Webber
Summary: The purpose of this digital solution is to create a machine learning workflow that can enhance the search for analogues for field development planning. The solution addresses existing challenges faced by field development teams when searching for similar fields in real-world datasets, and focuses on the most common customer use cases for field acquisition and development. Based on the analysis with examples, the workflow assists field development teams in finding the most accurate and mathematically validated analogues, with similarity scores that are consistent among team members.
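The similarity-scoring idea can be sketched as representing each field by a vector of normalized attributes and ranking candidates by similarity to the target field. The field names, attributes, and values here are invented for illustration, and cosine similarity is just one reasonable choice of score.

```python
import numpy as np

# Hypothetical fields with scaled attributes, e.g. porosity, GOR, net-to-gross.
fields = {
    "Field A": [0.9, 0.2, 0.7],
    "Field B": [0.1, 0.8, 0.3],
    "Field C": [0.85, 0.25, 0.6],
}
target = np.array([0.9, 0.3, 0.65])  # the field seeking analogues

def cosine(a, b):
    """Cosine similarity between two attribute vectors."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

scores = {name: cosine(vec, target) for name, vec in fields.items()}
best = max(scores, key=scores.get)
print(best)  # → Field C
```

Because the score is computed, not judged, two team members searching for analogues to the same target get the same ranked list — which is the consistency the abstract emphasizes.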
Application of Machine Learning in Integrated Modeling of the Oil and Gas Fields
Authors: K. Pechko, A. Afanasyev, N. Brovin, E. Belonogov and M. Simonov
Summary: Digitalization is an urgent task in the petroleum industry. Access to a huge amount of data and the development of methods for processing and analyzing it open new opportunities for petroleum companies. One important direction is the creation of a uniform digital model from three elements of the field: the reservoir, the wells, and the surface network. Production, technological, and economic constraints must be taken into account jointly. The classical approach to describing the elements of an integrated model is the creation of physical and mechanical models based on empirical data. These models must correspond to field data and also require a lot of computing power. This leads to increased calculation times for real business cases, which is a critical problem in oil and gas field development, and especially so for serial calculations on large and complex models. This article proposes a new approach: using machine learning models for each element of an integrated model. The new approach will increase the speed of model calculations and eventually make it possible to optimize field production. Furthermore, the validity and quality of decisions will also improve.
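The core idea — replace a slow physics-based element with a fast surrogate fitted to its outputs — can be illustrated in a few lines. Here a quadratic "choke pressure-drop" function is a hypothetical stand-in for an expensive network-element simulation, and a least-squares polynomial is the surrogate; real surrogates would be richer ML models.

```python
import numpy as np

def physics_model(rate):
    """Stand-in for an expensive element of the integrated model."""
    return 5.0 + 0.8 * rate + 0.02 * rate ** 2   # pressure drop vs. rate

# Sample the physics model once to build training data for the surrogate.
rates = np.linspace(0.0, 100.0, 50)
dp = physics_model(rates)

coeffs = np.polyfit(rates, dp, deg=2)   # fit the surrogate
surrogate = np.poly1d(coeffs)

# The surrogate now evaluates in microseconds wherever the slow model would run.
error = np.abs(surrogate(rates) - dp).max()
print(error < 1e-6)
```

In an integrated model, one such surrogate per element (reservoir, wells, surface network) is what makes serial optimization runs over the whole system tractable.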
Fully Automatic Procedure of Fault Surfaces Detection
Authors: A. Kozhevin and S. Tsimfer
Summary: Field modelling is necessary to understand oil migration routes and find potential deposits: this step is essential for the correct placement of producing wells. We propose a fast and accurate fault detection procedure consisting of two main stages. The first is the production of a fault probability attribute. The second is processing this attribute to obtain separate fault surfaces, which are approximated by fault sticks. The resulting solution makes it possible to detect fault surfaces in a matter of hours, even across a large field in its entirety, saving a geophysicist weeks or months of work.
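The second stage can be sketched on a 2D section: given a fault-probability attribute (depth x trace), take the most probable trace position per depth sample and keep only confident picks, yielding one fault stick. The attribute here is synthetic, imitating a single dipping fault; the real procedure separates multiple surfaces in 3D before extracting sticks.

```python
import numpy as np

depth, ntraces = 100, 80
prob = np.full((depth, ntraces), 0.05)      # background fault probability
for z in range(depth):
    prob[z, 20 + z // 4] = 0.95             # dipping high-probability anomaly

picks = prob.argmax(axis=1)                 # candidate stick position per depth
confident = prob.max(axis=1) > 0.5          # drop weak picks
stick = [(z, int(picks[z])) for z in range(depth) if confident[z]]
print(len(stick))  # → 100 (depth, trace) pairs forming one fault stick
```

A fitted curve through these (depth, trace) pairs is what gets exported as the fault stick for the interpretation software.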
Delivering business value through OSDU™ – accelerating adoption with Transitional Architecture
Authors: C. Hanton and M. Wiseman
Summary: This paper presents a blueprint for OSDU implementations that introduces 'Transitional Architecture' as an approach harnessing new and existing technology to accelerate delivery of the platform's value to the business community in operators.
The paper explores what OSDU is, its value proposition, and some of the commonly raised concerns about the platform, before presenting a solution built around the business requirements of subsurface departments.
Deep Learning Strategies for Seismic Demultiple
Authors: M. Fernandez, N. Ettrich, M. Delescluse, A. Rabaute and J. Keuper
Summary: An important step in seismic data processing to improve inversion and interpretation is multiple attenuation. Radon-based algorithms are often used to discriminate primaries from multiples. Recently, deep learning (DL) based on convolutional neural networks (CNNs) has shown promising results in demultiple that could mitigate the challenges of Radon-based methods. In this work, we investigate different strategies for training a CNN for multiple removal based on different loss functions. We propose combining primaries and multiples labels in the loss for training a CNN to predict primaries, multiples, or both simultaneously. We evaluate the performance of the CNNs trained with the different strategies on 400 clean and noisy synthetic datasets, considering three metrics. We found that training a CNN to predict the multiples and then subtracting them from the input image is the most effective strategy for demultiple. Furthermore, including the primaries labels as a constraint during the training of multiples prediction improves the results. Finally, we test the strategies on a field dataset. The CNNs trained with the different strategies report competitive results on real data compared with Radon demultiple. As a result, effectively trained CNN models can potentially replace Radon-based demultiple in existing workflows.