Third EAGE Digitalization Conference and Exhibition
- Conference date: March 20 - 22, 2023
- Location: London, United Kingdom
- Published: 20 March 2023
Transformers for Site Assessment for Carbon Capture and Sequestration using Legacy Well Data
Summary: Carbon Capture and Sequestration (CCS) is one of the vital components in keeping the global temperature rise within the 1.5 degrees Celsius target. Depleted oil and gas reservoirs are suitable locations for sequestering CO2 owing to their rock and structural properties and easy access to the required infrastructure. Abandoned wells in these reservoirs can be used to inject CO2, saving both time and cost. Understanding well integrity is important for CO2 containment and leakage prevention. Identifying well integrity information in legacy well documents requires significant effort from a subject matter expert (SME), which often results in lead times of up to a year for a CO2 sequestration site to mature. Transformer-based, pre-trained, state-of-the-art models are utilized to extract useful information from the well documents, which can be used instantly by SMEs to classify a well as potential high or low risk for CO2 injection. Fine-tuning these models on generic datasets and fewer than 50 domain-specific examples shows that we can infer relevant well information not provided by traditional rule-based approaches. Reducing the lead time in maturing a site for CO2 injection could contribute to faster CCS project delivery timelines and the broader goal of achieving net-zero targets.
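The few-shot classification idea in this abstract can be sketched as fitting a lightweight classification head on frozen transformer document embeddings. The sketch below is illustrative only: the embeddings are simulated as two-dimensional vectors and the risk labels are hypothetical, not the authors' actual model or data.

```python
import math
import random

def train_head(embeddings, labels, epochs=200, lr=0.5):
    """Fit a logistic-regression head on frozen document embeddings."""
    dim = len(embeddings[0])
    w = [0.0] * dim
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(embeddings, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))      # predicted P(high risk)
            g = p - y                           # gradient of the log-loss
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return "high risk" if z > 0 else "low risk"

# Toy "embeddings": documents of compromised wells cluster apart from intact ones.
random.seed(0)
high = [[1.0 + random.gauss(0, 0.1), 0.2] for _ in range(10)]
low = [[0.1 + random.gauss(0, 0.1), 0.8] for _ in range(10)]
w, b = train_head(high + low, [1] * 10 + [0] * 10)
print(predict(w, b, [1.1, 0.1]))   # a well resembling the high-risk cluster
```

With only 20 labelled examples the head separates the two clusters, mirroring the abstract's point that fewer than 50 domain-specific examples can suffice once a pre-trained model supplies the representations.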
OSDU Journey in TotalEnergies: from Concept to Operational Deployment
Authors: T. Akimova, B. Joudet, E. Katimbo, M. Nsamba, P. Mungujakisa and D. Lingo. Summary: After the success of the very ambitious OSDU SURINAME use case project delivered by TotalEnergies in 2021, it was decided to extend the pilot to an operational affiliate.
The Uganda affiliate was chosen as a pilot for the first OSDU deployment in operations, with the objective of delivery in Q2 2023. For the Uganda affiliate, the OSDU deployment is a great opportunity to do the “right thing” from the beginning by avoiding data duplication, leveraging existing applications, and identifying fit-for-purpose digital solutions within the TotalEnergies portfolio to help the affiliate face its operational challenges.
The Uganda affiliate has a very ambitious drilling program with more than 400 wells in 4 years. The OSDU Data Platform will focus on 3 main phases of the well data workflow: well preparation, well operations and the post-drilling phase.
The main applications supporting the workflow are connected to the platform by dedicated connectors. A major Data Management work stream on top of the OSDU Data Platform has been delivered to assess the maturity of the data management systems in OSDU.
With this first operational deployment, TotalEnergies is convinced that it is the right time to demonstrate all the advantages of OSDU across the Company.
A Geophysics Library of Trained Machine Learning Models
Authors: A. Huck, P. De Groot and H. Refayee. Summary: Machine learning models that were trained to solve generic problems can be reused on unseen datasets. Applying such models is easy, and valuable results can often be obtained much faster than through alternative workflows involving reprocessing and expert knowledge. Trained models therefore have the potential to save time and money in operational settings by changing the way we work. Here, we discuss which problems are suitable, which types of models are available, and how models can be added to a library of shared models. We show examples of seismic and well log models applied to blind test datasets. These models are released in a cloud-hosted library that is accessible to users of OpendTect Machine Learning software.
AWS Solutions for New Energy Wind Solar Farm Management
By N. Ma. Summary: Amazon is committed to building a sustainable business for our customers and the planet. In September 2019, Amazon co-founded The Climate Pledge, a commitment to be net-zero carbon across our business by 2040, 10 years ahead of the Paris Agreement. As part of this pledge, Amazon has made ambitious commitments toward reaching this goal. Amazon is the world’s largest corporate purchaser of renewable energy and is on a path to powering operations with 100% renewable energy by 2025.
Amazon Web Services (AWS) is known as a cloud innovator. The AWS energy transition wind solar farm data management and operational dashboard allows for the efficient and effective management of data and operations for wind and solar farms by adopting AWS cloud native services. The dashboard allows users to monitor and track energy production, manage maintenance and repairs, and optimize performance through data analysis and visualization.
AWS is sharing solutions developed internally and is enabling our customers and partners to build solutions on top of what we believe are the broadest and deepest set of cloud services available to accelerate customers’ energy transition.
An Integrated Workflow for Windfarm Site Characterization by Deep Learning
Authors: H. Di and A. Abubakar. Summary: With the target of net-zero carbon emissions by 2050, the demand for renewable energy is increasing exponentially, and offshore wind farms are identified as among the most promising investments. Developing an offshore wind farm requires robust characterization of the near-subsurface soils for turbine foundation design, construction, and monitoring, which faces many challenges related to seafloor topography mapping, shallow geohazard detection, structure interpretation and modeling, soil type analysis, geotechnical parameter estimation and so on. This work revisits these challenges from the perspective of pattern recognition, investigates the feasibility of deep learning in resolving them, and proposes an integrated workflow with great potential for accelerating the process of windfarm site characterization. Its value is demonstrated through applications to the public HKZ dataset within the Dutch wind farm zone, accelerating three major tasks: picking multiple major horizons, mapping the seafloor topography, and estimating the essential geotechnical parameters. Future efforts are expected on more components of the proposed workflow for further integration and automation.
AUTO Channel: AI-driven Automatic Channel Scalar Correction
Authors: C. Sutton, Y. Sang, W. Pan, P. Webster, J. Chen and M. Sidahmed. Summary: Seismic shot processing is critical for the quality of subsurface images, and incorrect shot processing parameterization can degrade an image. Within the realm of deepwater streamer acquisition, channel amplitude correction compensates for amplitude variation among different receivers. However, conventional processing can take days across multiple parameterization efforts, as it involves manually iterating processing parameters based on quality control (QC) of the processing outputs. Artificial Intelligence (AI) has proven able to perform human-level QC with reduced processing cycle time. We therefore propose a new workflow combining an AI-based QC agent with an automatic optimization method to automate channel amplitude correction, thus reducing the cycle time. The automatic workflow has been successfully applied to multiple Shell assets.
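The automated parameterization loop described in this abstract can be sketched as a search over candidate correction parameters scored by a QC model instead of a human reviewer. Everything here is a hypothetical stand-in: `process` and `qc_score` are toy functions, not the actual processing chain or QC agent.

```python
def autotune(candidate_params, process, qc_score, target=0.95):
    """Search correction parameters, letting a QC model replace the
    human-in-the-loop review of each processing output."""
    best = None
    for params in candidate_params:
        output = process(params)
        score = qc_score(output)
        if best is None or score > best[0]:
            best = (score, params)
        if score >= target:        # good enough - stop iterating
            break
    return best

# Toy stand-ins: a scalar amplitude "correction" and a QC model that
# prefers outputs near unit amplitude.
process = lambda scalar: scalar * 0.8          # pretend shot-gather amplitude
qc_score = lambda amp: 1.0 - abs(1.0 - amp)
score, params = autotune([0.5, 1.0, 1.25, 1.5], process, qc_score)
print(round(score, 2), params)
```

The early-exit threshold is what converts days of manual iterate-and-review into a single unattended loop.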
Data-Centric, Interactive Deep Learning for Complex Geological Features: a Groningen Case Study
Authors: S. Salamoff, J. Chenin, B. Lartigue and P. Endresen. Summary: Detailed interpretation of complex facies intervals within high-resolution 3D seismic data is a tedious and time-consuming process, even with the assistance of traditional deep learning methods. Traditional, windowed waveform classification algorithms can have a non-unique solution and are heavily impacted by interpreter bias and laterally varying data quality. This is the case in deformed facies intervals, such as post-depositional deformation of complex geological sequences, where tectonic reactivation and/or salt tectonism have re-worked sequences of post-salt siliciclastics into complicated packages that are difficult to interpret. These heavily reworked zones are prolific throughout the North Sea and can play an important role in fluid migration and containment. With manual interpretation methods, it is extraordinarily difficult to map these re-worked sediments. Their complexity usually means such sequences are under-interpreted, which introduces pre-drill uncertainties about the well path or target itself. We therefore propose a new, data-centric, interactive deep learning methodology that leverages neural networks to accurately predict separate deformed facies in the Groningen area. The results were obtained in a fraction of the time of traditional interpretation workflows and allow geoscientists to better characterize complex geologic units while also determining their impact on prospective petroleum systems or planned well paths.
Using Machine Learning Property Modeling with Assisted Forward Stratigraphic Modeling for Offshore Wind Farm Site Characterization
Authors: A. Ahmad, K. Eder and S. Courtade. Summary: The integrated workflow discussed in this abstract combines a two-step approach for creating predictive models to understand the shallow sediment distribution effectively and to reduce the related uncertainty in the design and feasibility of piling foundations for offshore wind farms. The workflow is driven by the generation of answer products that reduce uncertainty in foundation installation: predictive sediment models are created and then used to estimate capacity for foundation installations by deploying a machine learning property modeling workflow.
Give Geological Context to Seismic Attributes Through Artificial Intelligence, Using Neural Style Transfer
By R. Perez. Summary: In the oil and gas industry, seismic attributes are used to study and understand subsurface geology. However, their usefulness is limited by a lack of sufficient geological context. In this work, seismic attributes (spectral decomposition) are put into a geological context using the neural style transfer (NST) algorithm to visualize a paleoriver system.
To transfer the style from the reference image to the content image, the stylized image is initialized with the content image, and the total loss is optimized with respect to the pixels of the stylized image. The Adam optimizer is used, and the content weight and style weight can be adjusted to control the relative importance of content and style in the final stylized image.
The output image demonstrates how the stratigraphic feature highlighted by the spectral decomposition attribute would appear if it were captured by a satellite image today. This output image is easy to understand for anyone with little to no expertise in geoscience. Neural style transfer can be a valuable tool for analyzing and visualizing stratigraphic systems.
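The optimization scheme described above can be illustrated with a toy one-dimensional analogue: initialize the stylized signal with the content, then gradient-descend its "pixels" on a weighted sum of a content loss and a style loss. This sketch uses plain gradient descent rather than Adam, and a simple variance match as a crude stand-in for Gram-matrix style statistics; all numbers are hypothetical.

```python
def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def stylize(content, style, content_weight=1.0, style_weight=5.0,
            steps=500, lr=0.01):
    """Gradient-descend the pixels of the stylized signal, NST-style."""
    s = list(content)                 # initialize with the content image
    target_var = variance(style)      # crude stand-in for style statistics
    n = len(s)
    for _ in range(steps):
        m = sum(s) / n
        v = variance(s)
        for i in range(n):
            g_content = 2.0 * (s[i] - content[i])
            g_style = 2.0 * (v - target_var) * 2.0 * (s[i] - m) / n
            s[i] -= lr * (content_weight * g_content + style_weight * g_style)
    return s

content = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
style = [0.0, 3.0, 0.0, 3.0, 0.0, 3.0]   # "style" image: higher contrast
out = stylize(content, style)
print(round(variance(out), 2), ">", round(variance(content), 2))
```

Raising `style_weight` pulls the result toward the style statistics; raising `content_weight` pulls it back toward the original attribute, exactly the trade-off the abstract describes.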
Manual Active Learning for Salt Interpretation: an Empirical Study to Avoid Forgetting During Incremental Trainings
Authors: L. Evano and F. Cubizolle. Summary: In seismic applications, labelling is a challenging and tedious task due to the broad areas covered by seismic data, and it requires expert knowledge. Consequently, finding solutions to limit the labelling effort is a priority to accelerate workflows and optimize human resources. Active learning can help reach these goals. It consists of selecting the best data to label in order to improve model performance, based on an iterative approach in which, at each step, unlabeled data are chosen to be labelled and used to train the model. This process is repeated until the model reaches acceptable performance. The main challenge when incrementally training a neural network is the forgetting of patterns learned during previous training iterations. We showed that the choice of the old/new label ratio in the training and validation sets, as well as the choice of the learning rate and the patience, can help mitigate knowledge loss in incremental trainings.
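A minimal sketch of one ingredient of this incremental training scheme, the old/new label ratio: each iteration's training set replays a controlled share of previously labelled data alongside the fresh labels, to mitigate forgetting. The patch identifiers below are hypothetical.

```python
import random

def build_training_set(old_pool, new_labels, old_ratio=0.5, seed=42):
    """Mix replayed old labels with freshly labelled data.

    old_ratio controls the fraction of the batch drawn from previously
    seen labels; replaying them alongside the new ones helps mitigate
    catastrophic forgetting during incremental trainings.
    """
    rng = random.Random(seed)
    n_old = int(len(new_labels) * old_ratio / (1.0 - old_ratio))
    replay = rng.sample(old_pool, min(n_old, len(old_pool)))
    batch = replay + list(new_labels)
    rng.shuffle(batch)
    return batch

old_pool = [f"old_patch_{i}" for i in range(100)]
new_labels = [f"new_patch_{i}" for i in range(20)]
batch = build_training_set(old_pool, new_labels, old_ratio=0.5)
print(len(batch))   # 20 new labels + 20 replayed old labels
```

With `old_ratio=0.5`, half of each incremental batch is replayed history; sweeping this ratio (together with the learning rate and patience) is the empirical study the abstract describes.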
Futureproofing Rich Metadata File Ingestion with OSDU
Authors: T. Hewitt and R. Gadrbouh. Summary: Acting as a technology-agnostic, standards-based data platform, OSDU has reduced energy data silos and given application developers the capability to build new solutions and data ingestion services.
The current OSDU schemas are primarily created to store file metadata, allowing users to query common business content extracted from the files. We used a machine-learning and subject-matter-expert classification process to auto-generate detailed file metadata for millions of files and ingest them, together with the source files, directly into the user’s OSDU instance.
The file classification process generates a graph database representation of files and rich metadata labels at the data-object level. The classification results, alongside data lineage and quality, are stored in OSDU work product components and datasets, ready to migrate to the OSDU platform.
The process prevents users from having to manually fill in or supply file manifests during file ingestion to their OSDU implementation. With over 700 distinct data types and 250,000 entities of subsurface terminology, millions of ingested files can be enriched with highly granular metadata manifests that guarantee rapid data search and access to high-quality data.
Automated Extraction of Images of Interest in Document Collections: End-to-End Workflow and Operational Case-Study
Authors: M.T. Nguyen, C. Cornet, L. Mattioni, A. Bouziat and G. Rumbach. Summary: Data extraction is the process of analyzing and transforming unstructured information into structured data. Structured data can then generate meaningful insights for reporting and analytics in companies. Automating such tasks can improve the efficiency of operational workflows and help professionals save time for more advanced, higher-value activities in their daily work. Recently, Machine Learning, Computer Vision and Natural Language Processing have been intensively developed and widely employed to automate information extraction. However, few practical case studies on operational geoscience data have been documented. In this paper, we develop an integrated workflow to automate the extraction of images of interest, and the associated information, from geoscience documents. The workflow relies on a combination of free Python packages for Natural Language Processing, Computer Vision, Optical Character Recognition and Machine Learning. It was applied to a case study using data from the LUGOS Oil Field, with the objective of automatically extracting and documenting the evolving interpretation of the principal structural maps over several decades of field development. The proposed workflow produced very positive results: the fully automated process achieved a success rate above 90% on the case study, taking only 5 hours instead of several weeks of manual work.
Seismic Tiles, a Data Format to Enable Analytics on Seismic by Digitalization of G&G Logic
Authors: Ø.M. Skjæveland, S. Torset and C.C. Nilsen. Summary: Seismic Tiles is a data structure for representing seismic reflectors in tabular form. A tile is a small surface (reflector segment) that aligns with a seismic reflector. Much as the roads in Google Maps are represented by connected road segments, connected tiles represent seismic reflector surfaces. The value of Seismic Tiles is similar to the value of the Google Maps data structure, in that logic can now be applied to seismic reflectors in a straightforward way. We can automate interpretation tasks (such as prospect identification, 4D anomaly hunting, and fault and horizon interpretation) with explicit, and thus explainable, logic. In contrast to the machine learning (ML) approach, where the machine learns by example, Seismic Tiles allows interpretation logic in “digitalized” form to be applied directly: the process requires no training data and is fully transparent in its workings. We believe this can be a game changer in the automation of seismic analysis, as Google Maps-style technology has been in road navigation.
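The road-segment analogy suggests a simple operation on tiles: growing a reflector surface from a seed tile by following tile-to-tile connections. A minimal sketch, with hypothetical tile identifiers and adjacency (the actual Seismic Tiles format is tabular and richer than this):

```python
from collections import deque

def assemble_surface(adjacency, seed_tile):
    """Grow one reflector surface from a seed tile by following
    tile-to-tile connections, the way connected road segments
    form a road in Google Maps."""
    surface = {seed_tile}
    queue = deque([seed_tile])
    while queue:
        tile = queue.popleft()
        for neighbour in adjacency.get(tile, ()):
            if neighbour not in surface:
                surface.add(neighbour)
                queue.append(neighbour)
    return surface

# Hypothetical tile ids and their connectivity: two separate reflectors.
adjacency = {
    "t1": ["t2"], "t2": ["t1", "t3"], "t3": ["t2"],
    "t7": ["t8"], "t8": ["t7"],
}
print(sorted(assemble_surface(adjacency, "t1")))
```

Because the traversal is explicit, every tile in the assembled surface can be traced back to the rule that added it, which is the explainability argument the abstract makes against learn-by-example approaches.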
Well Control Optimization Using Smart Proxy Models
Authors: A. Jahanbani Ghahfarokhi and A. Chaturvedi. Summary: Optimal well controls that maximize net present value (NPV) are usually obtained by coupling a numerical reservoir simulator with optimization algorithms. This approach requires a significant number of computationally expensive simulations. Proxy models are highly capable of identifying complex dynamic reservoir behavior in a short time.
This study proposes a methodology that develops smart proxy models (SPMs) using an Artificial Neural Network (ANN) for a synthetic field model to predict field production profiles. The method then integrates the established proxy models with a Genetic Algorithm (GA) to solve the well control optimization problem. Through SPM-GA coupling, the optimum well control parameters, namely the bottom-hole pressures of the injectors and producers, are sought to maximize NPV.
The developed SPMs produce outputs within seconds, while the numerical simulator takes an average of 30 minutes for the case study. SPM-GA coupling works well for well control optimization, finding a BHP configuration that increases NPV by over 30% while requiring fewer simulations than the traditional approach. The results show that the established proxy models are robust and efficient tools for mimicking numerical simulator performance in well control optimization, with a significant reduction in computational time and resources.
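The SPM-GA coupling can be sketched as a genetic algorithm searching per-well BHP values against a fast proxy. In this illustration the "proxy" is a toy analytic function standing in for the trained ANN, and the pressure ranges are hypothetical.

```python
import random

def genetic_optimize(npv_proxy, n_wells, bounds, pop_size=30,
                     generations=40, mutation=0.1, seed=1):
    """Maximize a fast proxy NPV over per-well BHP controls with a GA."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(n_wells)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=npv_proxy, reverse=True)     # proxy call is cheap
        elite = pop[: pop_size // 2]              # keep the best half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_wells) if n_wells > 1 else 0
            child = a[:cut] + b[cut:]             # one-point crossover
            child = [min(hi, max(lo, g + rng.gauss(0, mutation * (hi - lo))))
                     for g in child]              # bounded Gaussian mutation
            children.append(child)
        pop = elite + children
    return max(pop, key=npv_proxy)

# Toy proxy standing in for the trained ANN: NPV peaks at 200 bar per well.
def npv_proxy(bhps):
    return -sum((p - 200.0) ** 2 for p in bhps)

best = genetic_optimize(npv_proxy, n_wells=3, bounds=(100.0, 300.0))
print([round(p) for p in best])
```

Every `npv_proxy` call here replaces a roughly 30-minute simulator run in the paper's setting, which is why the coupling evaluates thousands of candidate controls in seconds.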
A Simple Machine-Learning Approach for the Discovery of Digital Subsurface Geoscience Analog Data
Authors: S. Sheyh Husein, R. Vhanamane, A. Laake, F. Stabell and M. D Souza. Summary: The need to accelerate and improve the quality of opportunities in the asset maturation life cycle encouraged us to develop a digital solution that helps geoscientists extract hidden value from their structured datasets. The focus was on creating an unsupervised machine-learning (ML) algorithm that can be trained on a structured dataset to systematically present the geoscientist with a ranked list of analogs that meet a predefined set of weighted criteria. This has time-saving and quality-improving implications for prospect risk and volume screening, benchmarking, quality assurance and subsurface insights. The ML-assisted analytics workflow results in more confident estimates of volumes and risk, and a list of similar reservoirs that can provide insights and new interpretation scenarios.
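The ranked-analog idea can be sketched as scoring candidates by a weighted distance to the target across normalized screening criteria. This is a deliberate simplification of the unsupervised ML algorithm the abstract describes, and all field names, criteria and weights are hypothetical.

```python
def rank_analogs(target, candidates, weights):
    """Rank candidate reservoirs by weighted distance to the target
    across normalized screening criteria (closest analog first)."""
    def distance(candidate):
        row = candidate["criteria"]
        return sum(w * abs(row[k] - target[k]) for k, w in weights.items())
    return sorted(candidates, key=distance)

# Hypothetical normalized criteria (0-1): depth, porosity, net-to-gross.
target = {"depth": 0.6, "poro": 0.4, "ntg": 0.7}
candidates = [
    {"name": "Field A", "criteria": {"depth": 0.9, "poro": 0.2, "ntg": 0.3}},
    {"name": "Field B", "criteria": {"depth": 0.55, "poro": 0.45, "ntg": 0.65}},
    {"name": "Field C", "criteria": {"depth": 0.3, "poro": 0.8, "ntg": 0.9}},
]
weights = {"depth": 0.5, "poro": 0.3, "ntg": 0.2}
ranked = rank_analogs(target, candidates, weights)
print([c["name"] for c in ranked])  # best analog first
```

The weights encode the "predefined set of weighted criteria": raising the depth weight, for example, penalizes depth mismatch more and reorders the list accordingly.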
Reuse and Recycle Knowledge, Not Only Data
Authors: C. Warren, N. Masurek, A. Laake and C. Wray. Summary: Exploration, development, and investment decisions in any energy project require consistent, trusted, and auditable technical and economic support material, which often takes too long to create and delivers results that are not comparable. In particular, valuable time is wasted trying to find adequate knowledge and data, or duplicating work. If users could easily access, consume, and recycle corporate knowledge digitally, with a direct link back to the raw data, considerable time and cost would be saved and results would be more consistent, helping to support corporate decisions.
One technology for efficiently searching corporate digital knowledge is the knowledge graph, which uses defined ontologies to enable deep and efficient utilization of corporate knowledge. The knowledge graph accommodates the relationships inherent to the knowledge and associated data. By using a graph database, users can find the knowledge and data they require, saving valuable time and cost.
The Required Value of Open Digital Platforms: an Example of Connection to Third Party Applications
Summary: New digital journeys should integrate easily with already well-proven technology, adding value to open digital platforms that expose Application Programming Interfaces (APIs) allowing users to connect to third-party applications. The digital footprint of all this data becomes essential, with new ways of analysing the results and new workflows that can utilise existing cloud and non-cloud solutions to create new insights and value for decision makers. Interconnecting applications by exporting results, exchanging tokens and validating users, as well as seamlessly ingesting results from other applications, becomes key to maximising technology investments. In that way we can expand the capabilities of new workflows without having to develop technology that already exists, or having to duplicate data. Once dynamic simulation has been completed, the results of the external engines are stored in the original contextualised cloud service for further analysis and evaluation. Extracting value from digital data should therefore not be a scattered search for relationships in the data, but a deliberate approach to querying the data for the information the energy industry can use for specific decision making.
Visualize OSDU™ Data with Geospatial Consumption Zone and No Code Maps
Authors: B. Boulmay, Y. Gubanov and D. Tishechkin. Summary: Geospatial data integration remains a key challenge across the energy industry. Join AWS and Esri as they explore the latest subsystem developed for the OSDU Data Platform. A large cross-industry working group of operators, Independent Software Vendors (ISVs) and cloud providers came together to build the Geospatial Consumption Zone (GCZ) to enable easy access to map-based Application Programming Interfaces (APIs) representing all of your data in OSDU. The presentation will touch on OSDU, how the GCZ capability was started, and some of the technical architecture, and will finally demo how you can use the map services today in a no-code environment to access OSDU content for search, analytics and visualization. This platform approach and the openness of OSDU are helping to accelerate digitalization across the industry.
Rock Property Prediction Ahead of the Drilling Bit Using Dynamic Time Warping and Machine Learning Regression
Authors: A. Christ, A. Bouziat, C. Cornet, Y. Djemame, J. Fortun, J. Lecomte, A. Meunier and P.N.J. Rasolofosaon. Summary: Identifying the lithology while drilling is a crucial part of geosteering a new well. Conventional geosteering uses extensive seismic data, geological models and borehole images, which are not necessarily available in an exploration context. In such a challenging context, where only scarce data are available (e.g., a gamma ray (GR) log), we propose a new method for predicting logging responses ahead of the drill bit, upstream of the geosteering workflow. The method performs machine learning regression and dynamic time warping on available well log data from neighboring wells as well as from the well currently being drilled. Combining both techniques allows formation rock properties to be reliably predicted ahead of the drill bit, and therefore enables the geosteering to anticipate future lithology changes. The prediction can be done in near real time while drilling, because the computational time of only a few minutes is far shorter than the drilling time over such a distance, which is typically more than 6 hours. We successfully applied this method to well log data from offshore Western Australia and could predict the GR response up to 100 m ahead of the drill bit. The proposed workflow is easily transposable to any other well log data.
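The dynamic time warping component can be illustrated with the classic dynamic-programming formulation, here comparing a toy GR log against two offset-well logs. The log values are hypothetical, and the paper additionally combines this alignment with ML regression, which is not shown.

```python
def dtw_distance(log_a, log_b):
    """Classic dynamic-programming DTW between two log curves."""
    inf = float("inf")
    n, m = len(log_a), len(log_b)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(log_a[i - 1] - log_b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

# Toy GR logs: one offset well shows the same shale spike, slightly stretched.
current_well = [20, 25, 90, 95, 30, 22]
offset_a = [21, 24, 88, 92, 94, 28, 23]     # same lithology sequence
offset_b = [60, 61, 59, 62, 60, 61, 60]     # featureless log, poor match
print(dtw_distance(current_well, offset_a) < dtw_distance(current_well, offset_b))
```

Because DTW tolerates stretching and squeezing along depth, the already-drilled section of the current well can be matched to the analogous section of a neighboring well even when formation thicknesses differ, which is what makes the ahead-of-bit extrapolation possible.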
Predicted Stratigraphy: A Case Study from the Sureste Basin (Gulf of Mexico)
Authors: P. Kiss, J. Clayton, J. Cullum, W. Lee and U. Akram. Summary: Biostratigraphy is one of the key disciplines of geology, allowing geological formations to be arranged in space and time based on fossil assemblages. Given its significance to the oil and gas industry and the fast pace of technological innovation in the geosciences, interpreted biostratigraphic data is prone to becoming quickly outdated, preventing its use in future interpretations. However, the large amount of available data provides an excellent opportunity for novel studies that update species taxonomies, detect reworked specimens, train machine learning models and test prediction models in order to digitalize biostratigraphic approaches. In this study, we therefore use various data science and machine learning techniques to demonstrate the potential of an automated biostratigraphic approach. We take advantage of legacy data collected in the Sureste Basin, Gulf of Mexico, where we transformed the original dataset into a final stratigraphic framework. Our inferences indicate that we can get an accurate first insight into the stratigraphy at the studied location within a very short timeframe. Although inconsistencies were found, our approach proved its potential for future work, which could be improved by increasing the prediction accuracy of biostratigraphic events.
Exploring Convolutional Neural Networks and Machine Learning for Oil Sands Drill Core Image Analysis
Authors: F. Anzum, H. Hamdi, U. Alim and M. Costa Sousa. Summary: Permeability is one of the key reservoir rock properties that can substantially affect the performance of oil sand reservoirs. Standard methods for estimating permeability do not work well in oil sands; however, permeability can be estimated from the mean grain size (MGS) obtained from the particle size distribution (PSD). This paper investigates the use of convolutional neural networks (CNNs) and machine learning (ML) to estimate MGS from drill core photos from oil sand reservoirs. Three approaches are explored for classifying core photos based on facies analysis: (1) transfer learning on the pre-trained VGG-16 CNN model, (2) fine-tuning the top layers of VGG-16, and (3) combining VGG-16 with ML algorithms. These approaches are then used to estimate MGS from core photos and to assess their accuracy in predicting facies. The results demonstrate that MGS can be accurately estimated from core photos using a random forest model trained on features extracted from the last convolutional block of the VGG-16 CNN model. This work is among the first research on applying ML and CNN techniques to characterize drill cores from digital images.
Indonesian Oil and Gas Showroom for Increasing the Oil and Gas Exploration and Production
Authors: S.E. Saputra, A.D. Wibisono, R.N.A.C.P. Pratama and F.T. Amir. Summary: The Indonesian government has launched a program to increase oil and gas production in Indonesia to 1 million BOPD and 12 BSCFD by 2030 to meet Indonesia’s high demand for petroleum. One of the strategies in IOG 4.0 is developing a web-based oil and gas showroom platform. The platform follows an e-showroom concept, making it easy for potential investors to discover and learn about prospective areas in Indonesia.
Information Retrieval from Oil and Gas Unstructured Data with Contextualized Framework
Summary: In the Oil and Gas industry, risk identification, operation monitoring and equipment assessment are critical in helping engineers in their daily job. This information is captured as operators’ comments in the Daily Production Report (DPR), Daily Drilling Report (DDR), or well completion report to communicate important events in field performance and well progress. Given the high frequency and huge volume of these reports, as well as the unstructured nature of the comments, it is impossible to manually extract and interpret all the key insights. The manual extraction process is time-consuming, usually taking 2 to 3 days to analyze the comments. Without the underlying analysis, wells may be poorly monitored, and actions to mitigate risk and continually improve well performance may be delayed. This loss of valuable insight results in a revenue deficit because of late or uninformed decisions when an issue is reported. The solution presented here is a framework for contextualizing information retrieved from unstructured data.
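One building block of such a framework can be sketched as pattern-based event tagging of free-text report comments. The event names, patterns and comments below are hypothetical illustrations, not the paper's actual framework.

```python
import re

# Hypothetical event patterns an engineer might flag in DPR/DDR comments.
EVENT_PATTERNS = {
    "shut_in": re.compile(r"\bshut[- ]?in\b", re.IGNORECASE),
    "leak": re.compile(r"\bleak(?:ing|age)?\b", re.IGNORECASE),
    "stuck_pipe": re.compile(r"\bstuck pipe\b", re.IGNORECASE),
}

def tag_comment(comment):
    """Return the risk events mentioned in one free-text report comment."""
    return sorted(tag for tag, pattern in EVENT_PATTERNS.items()
                  if pattern.search(comment))

comments = [
    "Well A-12 shut-in due to suspected leak at wellhead.",
    "Routine choke adjustment, production stable.",
    "Stuck pipe at 3450 m, jarring in progress.",
]
print([tag_comment(c) for c in comments])
```

Tags like these, attached back to the well, date and report that produced them, are what turns days of manual comment review into a queryable, contextualized record.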
Automating the Interpretation of 715 000 Drill Cuttings Samples with Active Learning
Summary: Over 715,000 cuttings images were produced through the Norwegian released well initiative (RWI); this paper covers the process by which we hope to automate the interpretation of these images through machine learning. Our method is a multi-model approach, including both image classification and semantic segmentation models, which can be compared to each other and to XRF-based geochemistry models for active learning. Through this method we hope to produce accurate lithological descriptions for all cuttings, as well as further details such as sand grain size and cementation.
High-Efficient Cloud Seismic Data Solution for Real-Time Geosteering Optimization
Authors: A. Zaputlyaeva, A. Van Welden and I. Kuvaev. Summary: The high cost and complexity of lateral wells require advanced technological equipment, state-of-the-art IT solutions, and data interpretation practices to keep the well within the target zone. Nevertheless, data integration and communication within the operations team are among the most common challenges during drilling activities. Modern geosteering software allows users to integrate all types of geoscience data, Logging While Drilling data, and 3D seismic surveys within one cloud-based platform. This IT solution can compress 3D seismic surveys using VDS technology, which optimizes storage and accelerates visualization and interpretation. Streaming seismic data from the cloud simplifies the geosteering workflow and helps the geosteerer make more confident decisions. Access to the free VDS APIs enables data liberation and innovative, cloud-native workflows for geosteering.
Extracting seismic outliers using GAN-like training of autoencoders
Authors: E.B. Myklebust and T. Stangeland. Summary: The detection of outliers eases the workload of analysts by providing a ranked list of possible events of interest. We have adopted a GAN-like training scheme for autoencoders to identify outliers in a seismic network. A scattering network is used for feature extraction, an autoencoder is trained with apparent and latent losses, and finally a discriminator is used to separate the input from the reconstruction of the autoencoder. This method acts as a regularization of a vanilla autoencoder, and thanks to the method’s scale invariance we are able to detect low-amplitude outliers. We show examples of outliers and inliers that demonstrate the power of the method. The method still requires some optimization, but we leave that to future work.
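The core ranking step, ordering events by how poorly the autoencoder reconstructs them, can be sketched without the scattering network, the GAN-style discriminator, or a real autoencoder. Here a simple moving-average "reconstruction" stands in for the trained model, so impulsive, unusual events reconstruct poorly; the signals are toy data.

```python
def reconstruction_error(signal, reconstruct):
    """Squared error between a signal and its reconstruction."""
    recon = reconstruct(signal)
    return sum((a - b) ** 2 for a, b in zip(signal, recon))

def rank_outliers(signals, reconstruct):
    """Rank events by how poorly the (stand-in) autoencoder rebuilds them;
    the worst reconstructions are the candidate outliers."""
    return sorted(signals,
                  key=lambda s: reconstruction_error(s, reconstruct),
                  reverse=True)

# Stand-in "autoencoder": rebuilds any input as a smooth moving average,
# so smooth inliers reconstruct well and impulsive events do not.
def reconstruct(signal):
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - 1): i + 2]
        out.append(sum(window) / len(window))
    return out

inliers = [[0, 1, 2, 3, 2, 1, 0] for _ in range(3)]
outlier = [0, 0, 9, -9, 0, 0, 0]          # impulsive event
ranked = rank_outliers(inliers + [outlier], reconstruct)
print(ranked[0] == outlier)
```

The paper's contribution is in making this reconstruction model better behaved (via the GAN-like training and scale invariance); the ranked-list output consumed by analysts has this shape.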
An Integrated Deep Learning Workflow for Geologically Sequestered CO2 Monitoring
Authors: W. Hu, S. Phan, C. Li, T. Shao and A. Abubakar. Summary: We propose a comprehensive deep learning workflow to substantially reduce the cost of CO2 monitoring using time-lapse seismic data. This integrated workflow covers various stages of CO2 monitoring, from data acquisition to long-term plume evolution monitoring, including sparse monitoring data reconstruction, optimal sparse-acquisition survey design, deblending of simultaneous-source monitoring data, time-lapse data repeatability enforcement, and rapid estimation of subsurface CO2 plume status without conventional seismic data processing procedures. Numerical experiments validated the efficiency and accuracy of the algorithms in this workflow, and the promising results indicate its potential in large-scale CCS projects.
Do Not Make the Cloud a Landfill for Your Subsurface Data
By P. Gibb
Summary: Migrating data to cloud storage can be a cost-effective way of managing data in the long term and retaining it for all who may need it in the future. But it is important to deploy solutions capable of connecting to files and to application data on tape, disk, or cloud and, crucially, of recording extensive metadata to aid search, filtering, and analysis; only then can data managers make informed decisions and build the appropriate policies to effectively manage all data.
However, time constraints, financial pressure, limited resources, and strategic change might result in data being 'zipped up and tarred off' to the cloud - much like the way we dispose of old or unwanted items in landfill - without first putting in place the policies that will help data managers govern this valuable data in the longer term.
When done right, with the appropriate policies and solutions, cloud storage offers many benefits to O&G companies. Many are moving applications and data to the cloud to improve access to and the quality of active data, to archive older inactive projects, and to make significant savings on costs.
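The "appropriate policies" the author argues for amount to a gate: data does not move to long-term cloud storage until the metadata needed for later search and filtration is recorded. A minimal sketch of such a gate; the field names and functions are hypothetical illustrations, not a real product API:

```python
# Metadata a record must carry before it may be archived -- an assumed,
# illustrative minimum, not a standard.
REQUIRED_METADATA = {"asset", "data_type", "acquisition_date", "owner"}

def ready_to_archive(item):
    """An item may move to long-term cloud storage only if the metadata
    needed for later search and filtering is present."""
    missing = REQUIRED_METADATA - set(item.get("metadata", {}))
    return not missing

def triage(items):
    """Split a backlog into archivable items and items needing enrichment."""
    ok = [i for i in items if ready_to_archive(i)]
    needs_work = [i for i in items if not ready_to_archive(i)]
    return ok, needs_work
```

Running a backlog through such a triage step is the opposite of 'zipping up and tarring off': under-described data is surfaced for enrichment instead of being buried.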
Borehole Image Logs: New Approaches to Automated Surface, Breakout and Facies Interpretation
Summary: Borehole image logs (BHIs) are a valuable dataset that provides a link between core- and well-scale datasets, allowing the geoscientist to understand subsurface lithologies and structure. However, interpreting BHIs is typically a slow and subjective process: a significant amount of interpretation time goes into generating thousands of similarly oriented bedding picks and determining where certain facies occur, rather than into the intervals and features of most importance. This paper focuses on image logs from 26 wells, predominantly in the Santos Basin, covering approximately 20,000 m. The interpretation of these image logs was initially done manually, but as part of the project, and in order to improve manual BHI workflows, we focused on automating parts of the workflow: the auto-picking of bedding surfaces, the classification and generation of facies intervals, and the identification and classification of breakout. We developed an algorithmic approach for auto-picking that uses low-level image features in the logs. To classify facies, we trained a convolutional neural network (CNN) as an image classifier. Finally, we interpolated the image log and used contour detection to identify breakout.
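The auto-picking of bedding surfaces rests on a well-known geometric fact: a planar bed crossing a cylindrical borehole traces a sinusoid on the unwrapped image, whose amplitude and phase encode apparent dip and dip azimuth. A minimal sketch of that inversion, assuming picks uniformly sampled over a full 360-degree sweep (a production auto-picker uses a more robust fit):

```python
import math

def fit_bedding_plane(azimuths_deg, depths, radius):
    """Fit z(theta) = z0 + A*cos(theta - theta0) by Fourier projection.
    Assumes picks uniformly sampled over a full 360-degree sweep.
    Returns (mean depth, sinusoid amplitude, dip azimuth, apparent dip)."""
    n = len(depths)
    z0 = sum(depths) / n
    a = 2.0 / n * sum(z * math.cos(math.radians(t))
                      for t, z in zip(azimuths_deg, depths))
    b = 2.0 / n * sum(z * math.sin(math.radians(t))
                      for t, z in zip(azimuths_deg, depths))
    amplitude = math.hypot(a, b)
    theta0 = math.degrees(math.atan2(b, a)) % 360.0   # dip azimuth, degrees
    dip = math.degrees(math.atan2(amplitude, radius))  # tan(dip) = A / r
    return z0, amplitude, theta0, dip
```

Classifying a pick as bedding versus breakout then becomes a question of how well the sinusoid model explains the picked points.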
Challenges and Opportunities for Carbon Capture and Storage Monitoring Data Management
Summary: Monitoring, Measurement, and Verification (MMV) is crucial for the successful implementation of a Carbon Dioxide Capture and Storage (CCS) project. However, effective management of CCS monitoring data presents both challenges and opportunities.
One challenge is the need to standardize technical requirements for data storage and ensure that data is easily accessible to all relevant parties. Another challenge involves the need to perform data analytics on both historical and real-time CCS monitoring data.
Opportunities arise for using advanced visualization technologies and data analytics to improve the effectiveness of CCS monitoring data management. We propose a solution with key strategies to tackle the issues associated with CCS monitoring data:
- A hybrid data storage system to preserve the massive geophysical survey data on near-line storage instead of pricier online storage.
- Metadata extraction to enrich the metrics dashboard and GIS information.
- A web GIS portal for users to access, visualize, and analyze trends and patterns of the monitoring data. This suggested solution provides a highly effective and cost-efficient option for CCS monitoring data management.
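The hybrid storage strategy in the first bullet can be reduced to a simple cost rule: recently accessed survey data stays online, cold data moves near-line. The per-GB prices and the 90-day window below are illustrative assumptions, not quoted cloud rates:

```python
# Illustrative monthly per-GB prices -- assumptions, not real cloud rates.
ONLINE_PER_GB = 0.020
NEARLINE_PER_GB = 0.004

def choose_tier(size_gb, days_since_access, hot_window_days=90):
    """Keep recently used survey data online; push cold data near-line."""
    return "online" if days_since_access <= hot_window_days else "nearline"

def monthly_cost(datasets, hot_window_days=90):
    """Estimated monthly bill for a set of (size_gb, days_since_access) pairs."""
    total = 0.0
    for size_gb, idle in datasets:
        tier = choose_tier(size_gb, idle, hot_window_days)
        rate = ONLINE_PER_GB if tier == "online" else NEARLINE_PER_GB
        total += size_gb * rate
    return total
```

With these assumed rates, every terabyte of geophysical survey data that can be demoted to near-line storage cuts its monthly cost by a factor of five.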
Revisiting Induced Seismicity in Groningen Field Using 3D CNN
By R. Perez
Summary: In the Netherlands, the Groningen gas field was discovered in 1959 and has been in production since 1963. The goal of this study is to show how AI techniques such as fault segmentation can help us learn more about the geology and tectonic history of the Groningen field and how to deal with its induced seismicity.
From 1991 to 2021, there were 1,396 events with local magnitudes ranging from 0.5 to 3.6, reflecting the field's great structural complexity throughout its geological history. AI-based fault segmentation works at the pixel level rather than the image level, which differs from traditional trace-based seismic attributes such as coherence. The quality of the faults detected by this approach is superior to what was previously achieved.
In particular, the faults that intersect the Zechstein Group may be regarded as an impermeable seal, but they could also act as escape routes if the field is repurposed as a CO2 store. All these factors are crucial in the context of the global energy transition. This type of study contributes much to our knowledge, especially when considering the consequences for hazard and risk assessments.
Accelerating Decision Making with Automated QC Workflows for Full-Waveform Inversion
Authors: D. Halliday, S. Roy, D. Kulakov, J. Wu, A. Billa, J. Xu, M. Elbadry and R. Bloor
Summary: Accelerating decision making can improve turnaround time for full-waveform inversion (FWI) projects. Quality control (QC) is a key but time-consuming part of the decision-making process.
We consider the integration of digital technologies to create a new end-to-end QC workflow for full-waveform inversion. QC datasets are generated automatically as part of a new cloud-native FWI solution. Those QC datasets are stored in a cloud seismic data management system, where they can be readily accessed from Dataiku, an end-to-end data science platform, and TIBCO Spotfire, a visual analytics package.
The generated QCs introduce a significant amount of additional data to be analysed. To address this, we use the data science platform to orchestrate an unsupervised learning workflow that allows for faster, guided analysis of these QCs. The guided QCs are exposed to the user through the visual analytics package. This can lead to faster decision making in FWI.
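The unsupervised step can be as simple as clustering QC attributes so an analyst reviews one representative per group instead of every dataset. A minimal k-means sketch in plain Python (the abstract does not name the algorithm used; a real deployment would use a library implementation, and the 2-D QC attributes here are hypothetical):

```python
import random

def _dist2(p, q):
    """Squared Euclidean distance between two attribute vectors."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def _mean(points):
    n = len(points)
    return tuple(sum(c) / n for c in zip(*points))

def kmeans(points, k, iters=20, seed=0):
    """Lloyd's algorithm; returns a cluster label per point."""
    centroids = random.Random(seed).sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        labels = [min(range(k), key=lambda c: _dist2(p, centroids[c]))
                  for p in points]
        for c in range(k):
            members = [p for p, lab in zip(points, labels) if lab == c]
            if members:  # keep the old centroid if a cluster empties out
                centroids[c] = _mean(members)
    return labels
```

Grouping thousands of QC panels this way turns an exhaustive review into a guided one: inspect one member of each cluster, then drill into only the anomalous groups.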
Shell’s Road to a New Subsurface & Wells Digital Ecosystem – Thoughts Halfway along the Journey
Summary: Subsurface and wells users in Shell have for decades relied on a partially connected, on-premises network of databases and tools that combined vendor products with in-house applications developed where innovation by the software engineering industry fell short. Patchy data governance has led to a true "spaghetti" of database connections that limits data access. For a globally connected business that derives value from end-to-end integration, this seriously limits value creation.
Technological progress and business imperatives such as cost and cycle-time improvements have combined to shape a new future – with the energy sector catching up in the race to adopt both “large” and “small” digital tooling and modern workflows. The industry has come together in new partnerships and attempts to standardise towards OSDU.
At the start of 2023, this digital revolution can be said to be in an adolescent state, roughly half-way to adulthood. The most complex and costly period is now: the movement from the old to the new has many change management, cultural and technical challenges.
We are not there yet - no single party understands all roadblocks ahead. Yet, the excitement of opportunity, and confidence that “digital” is not just a cost item, grows with early, credible use cases.
Benefits of Earth Intelligence® when modeling O&G reservoirs: an exploration case study
Authors: L. Sandjivy and M. Collet
Summary: Earth Intelligence® (EI) is artificial intelligence applied to the modelling of O&G reservoirs. It optimizes the parametrization of geomodelling workflows and automates them by minimizing a unique "cost function": the estimation error.
EI probabilistic workflows make the best use of the full set of seismic and well data available at the time reservoir E&P decisions are made, unifying the deterministic building of alternative scenarios into a single consistent P10/P50/P90 stochastic P-scenario.
Building on a previous EAGE 2016 presentation, we show with quantified KPIs the benefits of EI for making more confident business decisions: the same North Sea exploration case study is revisited using an integrated EI software platform and positively illustrates the game changer that artificial intelligence brings to the oil and gas exploration and production sector.
Revisiting the 2016 EAGE case study with EI software in 2023 illustrates the main operational KPIs to be expected from artificial intelligence when modeling O&G reservoirs:
- reduced turnaround time;
- optimized workflow performance with quantified confidence intervals;
- generation of PDF reports including all input, process, and output descriptions and parameters;
- sharing of smart P-scenarios instead of "black box" numerical models.
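The P10/P50/P90 vocabulary used above can be made concrete: given an ensemble of equally likely stochastic realizations of, say, recoverable volume, each P-value is an exceedance percentile (oil-industry convention: P10 is the optimistic case, exceeded by only 10% of outcomes). A nearest-rank sketch, using integer percentages to keep the arithmetic exact:

```python
def p_value(realizations, p_percent):
    """Value exceeded with probability p_percent (P10 = high case,
    P90 = low case), via the nearest-rank rule on a descending sort."""
    ordered = sorted(realizations, reverse=True)
    n = len(ordered)
    rank = max(1, -(-p_percent * n // 100))  # integer ceil, 1-based rank
    return ordered[rank - 1]
```

On an ensemble of 100 equally likely volumes 1..100, this returns 91 for P10, 51 for P50, and 11 for P90, i.e. 10% of outcomes exceed the P10 value.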
Data Foundations: Unlocking the Potential of Subsurface Machine Learning Workflows
Authors: J. Tomlinson and S. Edmond
Summary: There is increased utilization of machine learning in subsurface workflows, with the objective of enabling interpreters to produce more accurate results in less time. Many authors have published results of machine learning workflows, and they frequently conclude that high-quality input data are required to deliver reliable results. This paper reviews various machine learning workflows that have been used by the subsurface community and analyses the data quality requirements needed to support them. Based on multiple data foundations projects, the key components of a solution to these data quality challenges are presented. The outcome is an aggregated, conditioned dataset which allows both humans and machines to rapidly find relevant, quality-controlled data for the workflows they wish to perform. The aggregated data, along with the data standards and business rules developed for these varied data types, can be utilised in future digital initiatives, including OSDU, where data standards for many data types are still being developed.
Analogue Identification and Evaluation for Field Development Planning
Authors: R. Vhanamane, G.S. Shergill, D. Lucas-Clements and P. Webber
Summary: The purpose of this digital solution is to create a machine learning workflow that can enhance the search for analogues in field development planning. The solution addresses challenges faced by field development teams when searching real-world datasets for similar fields, and focuses on the most common customer use cases in field acquisition and development. Based on the analysis and examples, the workflow helps field development teams find the most accurate, mathematically validated analogues, with similarity scores that are consistent among team members.
Application of Machine Learning in Integrated Modeling of the Oil and Gas Fields
Authors: K. Pechko, A. Afanasyev, N. Brovin, E. Belonogov and M. Simonov
Summary: Digitalization is an urgent task in the petroleum industry. Access to a huge amount of data and the development of methods for processing and analyzing it open new opportunities for petroleum companies. One important direction is to create a uniform digital model from three elements of the field: the reservoir, the wells, and the surface network, accounting jointly for production, technological, and economic constraints. The classical approach to describing the elements of an integrated model is to build physical and mechanical models based on empirical data. These models must match field data and require a lot of computing power, which increases calculation time for real business cases - a critical problem in oil and gas field development, and especially important for serial calculations on large, complex models. This article proposes a new approach: using machine learning models for each element of an integrated model. The new approach increases the speed of model calculations and ultimately makes it possible to optimize field production. Furthermore, the validity and quality of decisions will also improve.
Fully Automatic Procedure of Fault Surfaces Detection
Authors: A. Kozhevin and S. Tsimfer
Summary: Field modelling is necessary to understand oil migration routes and find potential deposits: this step is essential for the correct placement of producing wells. We propose a fast and accurate fault detection procedure consisting of two main stages. The first produces a fault probability attribute. The second processes this attribute to obtain separate fault surfaces, which are approximated by fault sticks. The resulting solution makes it possible to detect fault surfaces in a matter of hours, even across a large field in its entirety, saving weeks or months of work for a geophysicist.
Delivering business value through OSDU™ – accelerating adoption with Transitional Architecture
Authors: C. Hanton and M. Wiseman
Summary: This paper presents a blueprint for OSDU implementations that introduces 'Transitional Architecture', an approach that harnesses new and existing technology to accelerate delivery of the platform's value to the business community in operators.
The paper explores what OSDU is, its value proposition, and some commonly raised concerns about the platform, before presenting a solution built around the business requirements of subsurface departments.
Deep Learning Strategies for Seismic Demultiple
Authors: M. Fernandez, N. Ettrich, M. Delescluse, A. Rabaute and J. Keuper
Summary: An important step in seismic data processing to improve inversion and interpretation is multiple attenuation. Radon-based algorithms are often used to discriminate between primaries and multiples. Recently, deep learning (DL) based on convolutional neural networks (CNNs) has shown promising results in demultiple that could mitigate the challenges of Radon-based methods. In this work, we investigate different strategies for training a CNN for multiple removal based on different loss functions. We propose combining primaries and multiples labels in the loss to train a CNN to predict primaries, multiples, or both simultaneously. We evaluate the performance of the CNNs trained with the different strategies on 400 clean and noisy synthetic datasets, considering three metrics. We found that training a CNN to predict the multiples and then subtracting them from the input image is the most effective strategy for demultiple. Furthermore, including the primaries labels as a constraint during the training of multiples prediction improves the results. Finally, we test the strategies on a field dataset. The CNNs trained with the different strategies report competitive results on real data compared with Radon demultiple. Effectively trained CNN models can therefore potentially replace Radon-based demultiple in existing workflows.
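The shape of a combined-label loss like the one described above can be written down directly. The sketch below uses a plain L2 penalty on flattened traces plus a consistency term tying the two predictions back to the recorded data; the weights and the consistency term are assumptions for illustration, not the paper's exact loss:

```python
def _l2(pred, true):
    """Mean squared mismatch between prediction and label."""
    return sum((p - t) ** 2 for p, t in zip(pred, true)) / len(true)

def demultiple_loss(pred_prim, pred_mult, true_prim, true_mult,
                    data, w_prim=1.0, w_mult=1.0, w_consist=0.5):
    """Combined loss: match primaries, match multiples, and keep their
    sum consistent with the recorded data (primaries + multiples)."""
    consist = sum((pp + pm - d) ** 2
                  for pp, pm, d in zip(pred_prim, pred_mult, data)) / len(data)
    return (w_prim * _l2(pred_prim, true_prim)
            + w_mult * _l2(pred_mult, true_mult)
            + w_consist * consist)
```

Setting `w_prim` to zero recovers a multiples-only objective, while a non-zero `w_prim` adds the primaries labels as a constraint, which is the comparison the authors report on.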
A machine learning methodology to forecast well production: illustration with the Volve dataset
Authors: B. Auffray, T. Duval, G. Suzanne and M. Feraille
Summary: This abstract presents a workflow built to forecast well production parameters and its application to the Volve field dataset. It consists of a sequence of four neural networks trained to forecast well downhole pressure, total liquid rate, water cut, and wellhead pressure. This workflow allows engineers to rapidly evaluate scenarios for their day-to-day optimization operations.
Exploring Volume Data Store format to speed up seismic image segmentation in the Cloud
Authors: L. Boillot and G. Fuss
Summary: Image segmentation is key in seismic interpretation, aiming to detect geological objects at pixel scale in 3D cubes. Techniques based on deep learning now accelerate this tedious task. In practice, the cubes are split into 2D images along different directions and the results are then reassembled into cubes. These directions include the orthogonal axes but also surfaces that are not parallel to any axis. Legacy data storage formats flatten the 3D cube into a 1D array, offering straightforward memory access only for aligned or strided patterns. Seismic slices and horizons are typically the worst cases and lead to long access times. Volume Data Store (VDS) is a storage format designed by Bluware Inc. to tackle this issue, with a contribution to the Open Group OSDU Data Platform. In this paper we propose a benchmark of the VDS format against legacy formats such as SEG-Y and SEP. Different parameters are taken into account, especially the computing capability of the machine. Preliminary results show a significant speedup for the VDS format in the slice extraction direction. Further comparisons are ongoing, involving special VDS capabilities such as compression and decimation.
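The access-pattern argument can be quantified with a toy model: count how many storage blocks one depth slice of a cube touches under a flat, sample-ordered layout versus a bricked (VDS-style) layout. The page size and brick edge below are illustrative assumptions, not VDS internals:

```python
def pages_for_depth_slice(nx, ny, nz, k, page=1024):
    """Distinct contiguous pages read for depth slice k under a flat
    layout where the linear index is (i*ny + j)*nz + k (k fastest)."""
    pages = {((i * ny + j) * nz + k) // page
             for i in range(nx) for j in range(ny)}
    return len(pages)

def bricks_for_depth_slice(nx, ny, nz, k, brick=64):
    """Distinct bricks read for the same slice under a bricked layout
    where each brick holds a brick**3 sub-cube."""
    bricks = {(i // brick, j // brick, k // brick)
              for i in range(nx) for j in range(ny)}
    return len(bricks)
```

For a 256-cubed cube, one depth slice touches 16,384 pages in the flat layout but only 16 bricks in the bricked one, which is the intuition behind the slice-direction speedup the paper benchmarks.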
Improving Reservoir Property Prediction Using Synthetic Data Catalog and Deep Neural Network in Poseidon field, Australia
By P. Didenko
Summary: The ultimate goal of reservoir characterization is to predict the distribution of elastic properties, porosity, and fluids in the target area. For many years, machine learning techniques have been used in geophysics for different applications, including reservoir property prediction.
In these supervised learning approaches, the relationship used for prediction is derived from the data. One of the major limiting factors is the lack of labelled data covering the expected geology, which makes it challenging to train the neural network sufficiently. To overcome this, a hybrid theory-guided, data-science-based method was applied.
The workflow is divided into two main steps. First, many pseudo wells are generated based on the statistics of the real well data in the project area. The reservoir properties, such as porosity, thickness, water saturation, and mineralogy, are varied to cover different geological situations. Elastic properties and synthetic seismic gathers are then generated using rock physics and seismic theory.
The resulting set of synthetic data is used to train the neural network. The operator, derived during neural network training, is then applied to the real seismic data to predict properties throughout the seismic volume.
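The first step of this workflow - drawing reservoir properties at random and converting them to elastic properties with a rock physics relation - can be sketched with the classic Wyllie time-average equation. The end-member velocities and porosity range below are illustrative assumptions, not values from the study:

```python
import random

V_MATRIX = 5500.0  # assumed matrix (grain) velocity, m/s
V_FLUID = 1500.0   # assumed brine velocity, m/s

def wyllie_vp(porosity):
    """Wyllie time-average equation: 1/Vp = phi/Vf + (1 - phi)/Vm."""
    return 1.0 / (porosity / V_FLUID + (1.0 - porosity) / V_MATRIX)

def generate_pseudo_wells(n, seed=0):
    """Draw porosity from an assumed range and attach a Vp per pseudo well;
    a real catalog would also vary thickness, saturation, and mineralogy."""
    rng = random.Random(seed)
    wells = []
    for _ in range(n):
        phi = rng.uniform(0.05, 0.35)
        wells.append({"porosity": phi, "vp": wyllie_vp(phi)})
    return wells
```

Each pseudo well would then feed a synthetic-seismic forward model, producing labelled training pairs the real data alone cannot supply.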
Digital Levers to Cut Cost and Risk of Gas Supply and Carbon Capture Storage in Transition
Authors: K. Armitage, A. Hardwick, T. Brierley, P. Mewett and G. Roberts
Summary: We present a workflow (Rejuvenate) that reduces cost by using big data with rule-based and expert systems (ES) to integrate geology and geophysics datasets for the energy transition, including Carbon Capture and Storage (CCS). The implications are a substantial reduction in risk and cost and greater confidence in reservoir properties.
The ES derives geology from the seismic data itself. The workflow provides resources and intelligence to clients so that green gas with CCS can bridge the gap to sustainable renewable energy on the way to a net-zero target. By way of background, a major oil company drilled 17 exploration wells spread over several sub-basins; all were dry. Using our approach, we found that information existed in the seismic and ancillary data that could have avoided this expense. The ES is based on decades of patented research into dry wells and the associated seismic, well data, and geology. Proven to increase efficiency by more than fifty percent, it identifies anomalies in geology and links them directly to seismic patterns. This learning can now be migrated to Machine Learning (ML) using risk matrices for the wells of today and the future - in essence, an adaptive knowledge base of seismic responses that did not fit the real geology.
MPI-free FWI for Cloud Spot Markets: Faster and Cheaper Results
Authors: C. Mavropoulos and A. Umpleby
Summary: As the industry shifts to more computationally intensive, data-driven applications, so grows the need for more scalable and efficient processing power. Running such applications in the cloud is the obvious solution, as resources can scale with the requirements and stage of the project. We propose an Infrastructure as Code (IaC) environment, S-Cube Cloud (SCC), to launch and control the large volumes of computational resources needed for new seismic processing applications. To leverage the cloud effectively, spot instances must be utilised; these are offered at a large discount but may be interrupted at any time. A key limitation we address is the absence of an efficient, fault-tolerant, cloud-native parallelisation scheme - without one, the use of discounted spot instances is unachievable. We propose RIPS(SCI) - Robust Inter Process Simple Socket Communication Interface - which enables the use of spot instances through its fault tolerance. Applied in real-world conditions, RIPS communicates between thousands of instances and handles spot instance interruptions. Furthermore, RIPS relieves major bottlenecks in the master process by avoiding the processing of terabytes of data per iteration, compared with MPI. Savings of 70%-80% are observed in customer processing workloads using spot instances enabled by RIPS.
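The fault tolerance that makes spot instances usable can be illustrated with the basic bookkeeping any spot-tolerant master must do: track which tasks each instance holds and requeue them when the instance is reclaimed. This is a schematic of the pattern, not the RIPS implementation:

```python
from collections import deque

class SpotScheduler:
    """Minimal master-side bookkeeping for interruptible workers."""

    def __init__(self, tasks):
        self.pending = deque(tasks)
        self.in_flight = {}   # worker id -> set of task ids it holds
        self.done = set()

    def assign(self, worker):
        """Hand the next task to a worker, or None if nothing is pending."""
        if not self.pending:
            return None
        task = self.pending.popleft()
        self.in_flight.setdefault(worker, set()).add(task)
        return task

    def complete(self, worker, task):
        self.in_flight[worker].discard(task)
        self.done.add(task)

    def interrupt(self, worker):
        """Spot instance reclaimed: its unfinished tasks go back in the queue."""
        self.pending.extend(self.in_flight.pop(worker, set()))
```

Because no task is lost on interruption, the job always runs to completion; the price of an interruption is only the recomputation of the requeued work.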
Citizen Data Scientist Toolbox for petrophysicist domain experts: case study Petroleum Industry of Serbia
Authors: T. Micić Ponjiger, S. Šešum, M. Naugolnov, S. Perunić and V. Mihajlović
Summary: The main scope of this paper is to present a tool created for petrophysicists in the Petroleum Industry of Serbia, enabling them to perform advanced analytics and build machine learning (ML) models as citizen data scientists. A petrophysicist acting as a citizen data scientist creates ML models that use advanced diagnostic analytics or predictive and prescriptive capabilities, while having a primary job function outside the fields of statistics, analytics, and computer science. By implementing the Citizen Data Scientist Toolbox within the standard software platform for petrophysicists, we minimized the negative acceptance outcomes typical of new digital tools and applications in industrial companies.
Predicting Geologic CO2 Storage and Plume Evolution from Sparsely Available Well Data Using Barlow Twins
Authors: C.A.S. Ferreira, T. Kadeethum and H.M. Nick
Summary: Carbon Capture and Storage (CCS) is an important practice for reducing greenhouse gas emissions and combating climate change. However, accurately monitoring carbon storage operations using simulations can be challenging due to data availability, subsurface complexity, uncertainty, and computational cost. Machine learning can help address these challenges by providing cheaper, data-driven approaches. For instance, Continuous Conditional Generative Adversarial Networks (CCGAN) can be used to predict CO2 plume propagation with sparsely available data, enabling fast prediction with reasonable accuracy and a substantial reduction in computational cost compared to numerical simulations. Another approach, Barlow Twins (BT), provides better results than other deep-learning-based approaches and results comparable to traditional methods for linear subspace and nonlinear manifold problems. In this work, we compare the accuracy of predictions of CO2 plume propagation based on data from three well locations using a BT-based approach to those obtained with the CCGAN. Our findings suggest that BT-based approaches could be a viable option for data-driven simulation of CO2 plume propagation in the subsurface when data are limited.
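The Barlow Twins objective itself is compact enough to state: standardize each embedding dimension across the batch, form the cross-correlation matrix between the two views, and push its diagonal toward one and its off-diagonal toward zero. A plain-Python sketch (a real model computes this on GPU tensors; the lambda value follows the original BT paper's default):

```python
import math

def _standardize(batch):
    """Zero-mean, unit-std each embedding dimension across the batch."""
    n, d = len(batch), len(batch[0])
    means = [sum(row[j] for row in batch) / n for j in range(d)]
    stds = [math.sqrt(sum((row[j] - means[j]) ** 2 for row in batch) / n)
            or 1.0 for j in range(d)]
    return [[(row[j] - means[j]) / stds[j] for j in range(d)] for row in batch]

def barlow_twins_loss(z1, z2, lam=5e-3):
    """On-diagonal terms pull the two views' dimensions into agreement;
    off-diagonal terms (weighted by lam) penalize redundancy."""
    a, b = _standardize(z1), _standardize(z2)
    n, d = len(a), len(a[0])
    c = [[sum(a[r][i] * b[r][j] for r in range(n)) / n for j in range(d)]
         for i in range(d)]
    on_diag = sum((1.0 - c[i][i]) ** 2 for i in range(d))
    off_diag = sum(c[i][j] ** 2 for i in range(d) for j in range(d) if i != j)
    return on_diag + lam * off_diag
```

When the two views are identical, the diagonal of the correlation matrix is exactly one and only the redundancy (off-diagonal) term contributes to the loss.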
Productization of Digital Transformation in the Subsurface
By C. Hanton
Summary: Digital transformation has become a familiar term for those in the subsurface departments of energy operators over the past decade. But while the term is industry-wide, its effects and benefits are not so evenly distributed. Success stories in the media are dominated by large NOCs and super-majors, while surveys show that the industry as a whole lags behind its peers in the effectiveness and success of digital transformation efforts.
In 2022, our industry was given clear goals: provide cheap, reliable, and low-carbon energy to the globe, solving the ‘Energy Trilemma’ as it became known. To achieve this, low-cost, highly efficient operations are a fundamental requirement, ensuring rapid development of existing assets and assisting in both near-field and exploration appraisal. Geoscientists are key to enabling this by developing insightful views of the subsurface that inform the decision-making process, but they face the additional challenge of restricted resources due to high levels of turnover during the Covid-19 pandemic.
In this paper, the author explores common flaws in the implementation of digital technology and examines how productization of digital transformation initiatives could yield increased success rates while reducing delivery times for operators of all sizes.
Machine Learning Application for Joint Rock-Physics Model Optimization, Facies Classification and Compaction Modeling: North Sea Example
Authors: R. Filograsso, A. Mur, R. Beloborodov and M. Pervukhina
Summary: We present the results of a rock-physics-guided machine learning method that improves the efficiency of geoscience workflows by automating petrophysical facies log interpretation, petro-elastic depth trend production, and rock physics model parameterization. We present a case study on an inversion of seven wells in the Central North Sea, within the Forties field. The application of the new rock-physics-guided machine learning toolkit demonstrates the versatility of the application, its agreement with manual facies interpretation, and the importance of cross-disciplinary integration.
Exploring the Potential of Denoising Diffusion Probabilistic Models for Generating Realistic Geological Rock Thin Section Images
By R. Perez
Summary: This study examines the use of Denoising Diffusion Probabilistic Models (DDPMs) for generating realistic geological rock thin section images. The accuracy and realism of DDPM-generated images are evaluated and compared to real-world photographs. The results indicate that DDPMs can produce high-quality samples that closely resemble real-world samples and could offer a solution to the challenges associated with obtaining and maintaining geological rock thin section photographs. Suggestions for future research include exploring how DDPMs could be used in other areas of geoscience, finding ways around their limitations, and using other machine learning techniques to improve the accuracy of the generated images.