Sixth EAGE Digitalization Conference & Exhibition
- Conference date: March 9-12, 2026
- Location: Stavanger, Norway
- Published: 09 March 2026
Segmentation-Based Automated Depth Attribution and Standardization of Core Sample Images using Large Vision Models (LVMs)
Authors: G. Chirila, M. Sadah, S. Khan, M. Amarni and J. Estevez Gonzalez
Summary: Acquisition of rock core samples during well drilling is a crucial step that yields substantial subsurface information, enabling comprehensive characterization of geological formations and identification of potential petroleum system elements. Core samples are cylindrical sections of rock extracted from subsurface formations during drilling operations. Following cutting, retrieval, measurement, and cleaning, core samples are typically marked to document their orientation and the depth interval from which they were extracted. Photographs of the samples are taken at the core lab to maintain a digital record. An innovative workflow is proposed to address the challenge of extracting core data (core depth, well and plug number) from these images, improving both the efficiency and the accuracy of the process. This paper presents a workflow for depth attribution and standardization of core sample images using segmentation and Large Vision Models (LVMs). It provides an effective way of labelling the images with the correct depth and standardizing them in a common template.
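To make the depth-attribution step concrete, here is a minimal sketch in Python. It assumes the segmentation and LVM stages have already run, yielding core-segment bounding boxes plus the marked top and bottom depths; the `attribute_depths` helper and its inputs are hypothetical illustrations, not the authors' implementation.

```python
def attribute_depths(segment_boxes, top_depth, bottom_depth):
    """Assign a (top, bottom) depth interval to each core segment.

    segment_boxes: list of (y_top, y_bottom) pixel coordinates, any order.
    top_depth/bottom_depth: depths read from the core markings (e.g. by an LVM).
    Depth is distributed in proportion to each segment's pixel height.
    """
    boxes = sorted(segment_boxes)                   # top of image first
    total_px = sum(y1 - y0 for y0, y1 in boxes)
    span = bottom_depth - top_depth
    intervals, depth = [], top_depth
    for y0, y1 in boxes:
        thickness = span * (y1 - y0) / total_px     # depth covered by segment
        intervals.append((round(depth, 3), round(depth + thickness, 3)))
        depth += thickness
    return intervals
```

For two equal-height segments marked 2450-2452 m, each receives a one-metre interval.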
Evaluating Fouled Ballast Using Digitized GPR Reflected Patterns
Authors: C. Kuo
Summary: Using ground-penetrating radar (GPR) to locate the trackbed-foundation interface is a highly efficient method; however, interpreting the data can be challenging when the dataset is extensive. Digitalization, grid computing, and automated data processing have made identifying the target object and the interface more objective and faster. The scope may subsequently be expanded to machine learning and artificial intelligence.
Seismic Data Migration from On-Prem Data Management Systems to OSDU
Authors: R. Krishnamachari, K. Sharma and S. Parkhi
Summary: The proposed workflow for seismic data migration from an on-prem data management system (DMS) to OSDU is a well-structured and innovative approach to the challenges E&P companies face during their digital transformation journey. Below is a concise analysis and summary of the key aspects of the workflow:
During the transition from corporate data management systems to OSDU, the challenges posed by manual, error-prone, and resource-intensive seismic data migration workflows make automation a necessity.
The methodology introduces a systematic, automated approach to seismic data migration, encompassing Metadata Migration, Seismic Data Migration, and Data Validation.
- Enhanced Visualization: Metadata ingested into OSDU is visualized via OSDU-native applications.
- Incremental Migration: Supports on-demand, incremental migration of seismic data, offering clients flexibility based on their business priorities.
- Automation-Driven Reliability: The automated workflow reduces human intervention, minimizes errors, and ensures a scalable, efficient migration process.
Conclusion: This automated solution provides an effective framework for migrating petabytes of seismic data from a DMS to OSDU, addressing industry pain points such as data integrity, scalability, and compatibility. By leveraging open standards, integration with existing interpretation platforms, and real-time monitoring (BI dashboards), the workflow empowers E&P companies to unlock the full potential of their seismic data in a cloud-native ecosystem.
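The Data Validation stage can be illustrated with a hedged checksum sketch. The abstract does not specify the validation mechanism, so content hashing is an assumption, and `validate_migration` with its byte-map inputs is purely illustrative:

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Content fingerprint used to compare source and migrated copies."""
    return hashlib.sha256(data).hexdigest()

def validate_migration(source, target):
    """Compare source and migrated datasets by content checksum.

    source/target: mapping of dataset id -> raw bytes (stand-ins for
    SEG-Y files or metadata records). Returns ids that are missing from
    the target or whose checksums differ, flagging them for re-migration.
    """
    issues = []
    for ds_id, payload in source.items():
        if ds_id not in target:
            issues.append((ds_id, "missing"))
        elif sha256_of(target[ds_id]) != sha256_of(payload):
            issues.append((ds_id, "checksum mismatch"))
    return issues
```

Checksumming supports the incremental-migration pattern as well: only datasets reported by `validate_migration` need to be re-sent.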
A Data-Driven Alarming and Control Advisory System to Improve Well Performance
Authors: N. Zhang, A. Shchipanov and C. Rong
Summary: Modern wells provide vast amounts of pressure and rate data from permanent downhole gauges and flow meters. However, transforming these datasets into actionable insights that improve well performance remains challenging. A recently developed automated well monitoring workflow employs time-lapse pressure transient analysis to derive proxy PTA-metrics that quantify well performance profiles, separating well and reservoir contributions.
Building on these performance profiles, this paper proposes a data-driven alarming and control advisory system designed to transform well monitoring data into actionable insights, facilitating timely well intervention and control decisions that enhance well performance. The system comprises three components: 1) performance anomaly detection with suggested possible causes and recommended reactive remedial actions; 2) diagnostic support, which links performance indicators to operational parameters; and 3) a control advisory that provides optimal operating envelopes for operational parameters, proactively helping to prevent potential performance issues.
A prototype of the alarming and control advisory system has been developed and was used to demonstrate the concept on a horizontal water injector on the Norwegian Continental Shelf. The case study highlights how integrating the automated well performance monitoring with alarming and advisory capabilities can empower engineers to make timely, actionable decisions that safeguard and improve well performance.
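As an illustration of the anomaly-detection component, a minimal rolling-baseline detector is sketched below. The paper's proxy PTA-metrics are not reproduced here; `detect_anomalies`, the z-score rule, and its parameters are hypothetical stand-ins:

```python
from statistics import mean, stdev

def detect_anomalies(series, window=10, k=3.0):
    """Flag indices where a well-performance metric deviates from its
    recent baseline by more than k standard deviations.

    series: time-ordered metric values (e.g. a proxy PTA-metric).
    window: number of preceding samples forming the baseline.
    Returns a list of (index, value) pairs to raise alarms on.
    """
    alarms = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(series[i] - mu) > k * sigma:
            alarms.append((i, series[i]))
    return alarms
```

In a real advisory system each alarm would then be mapped to suggested causes and remedial actions, as the abstract describes.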
Leveraging the Value of Legacy Sidewall Core Data
Authors: A. Baksh
Summary: This study addresses the critical industry challenge of leveraging valuable but fragmented legacy sidewall core (SWC) data. With over 100,000 pages of disparate reports, this geological resource has been historically underutilized. We present a systematic digital transformation workflow utilizing Optical Character Recognition (OCR) and Natural Language Processing (NLP) to automatically extract, standardize, and validate data from multi-format documents, including handwritten records.
The results demonstrate a high-accuracy (98.4%) and efficient process, achieving a 60% reduction in data retrieval time and a 40% reduction in processing time. This successfully converts legacy data into a structured, unified database. The key innovation is making this historical information instantly accessible and reliable, enabling its integration with other subsurface datasets and advanced applications like machine learning. This work unlocks the full potential of legacy SWC data, transforming it into a vital asset for enhanced reservoir characterization, informed operational decision-making, and future exploration activities.
Combining Data-Driven Physics-Informed Methods to Automate Permanent Monitoring of Well Performance
Authors: A. Shchipanov, B. Cui, V. Starikov, K. Muradov, N. Zhang, V. Demyanov and R. Berenblyum
Summary: Most of the wells drilled on the Norwegian Continental Shelf during the last two decades are equipped with permanent gauges installed both at the wellhead and downhole. The gauges enable both on-the-fly and long-term monitoring of well performance, crucial for short-term optimization at the well level and for long-term reservoir management. Today's industry needs automation of gauge data analysis and interpretation. Scalable data-driven solutions are being explored, while preserving the physics of well and reservoir flows remains crucial. This paper presents a recently developed workflow combining pure data-driven and physics-informed methods to automate the analysis and interpretation of big datasets from permanent gauges. The workflow focuses on the combination of pressure and rate data, commonly used in monitoring well performance with time-lapse pressure transient analysis, although many of the methods may be applied to other datasets, such as temperature measurements, or even to other industries where transient behavior is observed in measurements. We then concentrate on the value of integrating data-driven approaches with physics-informed proxy-metrics for monitoring well and reservoir performance. The paper concludes with current and potential application areas of the methods developed.
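Time-lapse PTA rests on the Bourdet pressure derivative, dΔp/d ln Δt. A minimal sketch of its standard weighted central-difference form follows; smoothing windows and gauge-noise handling, which production workflows need, are omitted:

```python
import math

def bourdet_derivative(times, dp):
    """Pressure derivative dΔp/d ln(Δt) via Bourdet's weighted
    central difference (interior points only, no smoothing window).

    times: elapsed times since the transient started (must be > 0).
    dp:    pressure changes Δp at those times.
    Returns a list aligned with times[1:-1].
    """
    deriv = []
    for i in range(1, len(times) - 1):
        dl = math.log(times[i]) - math.log(times[i - 1])   # left ln-spacing
        dr = math.log(times[i + 1]) - math.log(times[i])   # right ln-spacing
        sl = (dp[i] - dp[i - 1]) / dl                      # left slope
        sr = (dp[i + 1] - dp[i]) / dr                      # right slope
        deriv.append((sl * dr + sr * dl) / (dl + dr))      # Bourdet weighting
    return deriv
```

For infinite-acting radial flow, Δp = m ln Δt, the derivative stabilises at the constant m, the flat stem engineers look for on a log-log diagnostic plot.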
GWater: An Automated Toolbox for Water Wells Requirements Estimation Ahead of Drilling
Authors: F. Anifowose and H. Ur Rehman
Summary: Water wells are the primary source of water used in various petroleum engineering operations and hence play an important role in the oil and gas industry. In the spirit of digital transformation, GWater, an automated platform developed in-house, fulfills the need for a streamlined workflow that improves efficiency, reduces heavy human involvement, and removes the subjectivity of the traditional approach to estimating water well drilling requirements. The platform removes human subjectivity from the estimation of geological, hydrological, geochemical, and completion parameters of water wells ahead of drilling, and guides the drilling crew on possible locations for coring, casing, and target aquifers. The substantial time savings can be channeled to more intensive tasks, such as providing more support to field and laboratory personnel. The novel methodology employed in the development of this product has generated a couple of intellectual properties and has strong potential for commercialization.
Integration Between Seismic Interpretation and Data Management through Agentic Workflows
Authors: E. Sørensen
Summary: Despite advances in both AI chatbots and data management systems, geoscientists still struggle with disconnected workflows, spending significant time switching between platforms and managing data context. This paper presents an integrated solution combining a data management platform (DMP), real-time collaborative AI interpretation software, and a multi-modal LLM system. The proposed system enables geoscientists to seamlessly access, interpret, and collaborate on subsurface data while maintaining complete data lineage and project context.
Legacy Fossil Records Extraction, Analysis and Contextualization using Generative AI
Authors: M. Abdrabu, M. Sadah, A. Mahdi, G. Chirila and S. Khan
Summary: Fossils are crucial in geology, providing evidence for stratigraphic correlation and paleoenvironmental reconstruction. However, traditional fossil records often suffer from disconnection between specimen details and collection points, limiting their scientific utility. Large amounts of valuable fossil data remain locked in legacy reports, handwritten notes, and scanned charts. To address this, a novel AI-driven framework integrates Large Vision Models (LVMs), Large Language Models (LLMs), and GIS to extract and process fossil data from over 7,000 legacy documents. The methodology employs YOLO-based object detection to identify key terms, such as formation names and faunal data, followed by Handwritten Text Recognition (HTR) and LLM-based refinement. This approach achieved significant results, resolving 2,410 fossil localities across Saudi Arabia. Quality control measures, including OCR error correction and validation against benchmarks, ensured accuracy, with a Word Error Rate of just 5%. Further, LVMs like Qwen2-VL and TrOCR enabled extraction from both handwritten and printed records, while duplication analysis reduced redundancy. A major breakthrough was achieved through AI-assisted geographic reasoning, which integrated textual descriptions with quadrangle maps, expanding locality data by 1,474 points. Ultimately, 98.5% of localities were geographically verified, demonstrating the framework's potential to transform inaccessible fossil archives into structured, standardized datasets for future scientific research.
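The quoted Word Error Rate follows the standard word-level Levenshtein definition; the sketch below is generic, not the authors' QC code:

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = word-level Levenshtein distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j]: minimum edits to turn ref[:i] into hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i                       # delete all remaining ref words
    for j in range(len(hyp) + 1):
        dp[0][j] = j                       # insert all remaining hyp words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = dp[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            dp[i][j] = min(sub, dp[i - 1][j] + 1, dp[i][j - 1] + 1)
    return dp[len(ref)][len(hyp)] / len(ref)
```

A single misread word in a four-word phrase, for example, gives a WER of 25%.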
LangGraph-Orchestrated AI Pipelines for Geological Data Interpretation: A Comparative Analysis
Authors: M. Alakkas and B. Hungund
Summary: Digitizing legacy geological reports is important for enabling modern analytics, yet most existing Retrieval-Augmented Generation (RAG) pipelines struggle with accuracy, often producing hallucinations or inconsistent answers. In this work, we explore how LangGraph can be used to make these workflows more reliable by adding correction loops and structured state handling. We tested three large language models (Meta LLaMA-3-90B, Anthropic Claude Sonnet, and DeepSeek R1) on geological well data, and evaluated them from three perspectives: expert scoring (LLM-as-a-Judge), lexical alignment (TF-IDF with embeddings), and semantic similarity (Word2Vec with embeddings). Our results show that DeepSeek provides the strongest semantic understanding, Claude Sonnet aligns best with expert phrasing, while LLaMA-3 delivers competitive but more variable outcomes. Overall, the LangGraph approach reduced errors, improved consistency, and provided a clearer evaluation of how different models perform in scientific Q&A. This study shows that LangGraph is not just a research framework but a practical method for making generative AI more dependable in specialized fields like geoscience, and the methodology can be extended to other industries facing similar digitization challenges.
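The lexical-alignment perspective (TF-IDF) can be sketched in a few lines of pure Python. The smoothed-IDF convention below is a common one and an assumption on our part, not the paper's exact setup:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Smoothed TF-IDF vectors for tokenised documents.

    docs: list of token lists. Returns one sparse vector (dict) per doc.
    idf uses the common smoothed form log((1+N)/(1+df)) + 1.
    """
    n = len(docs)
    df = Counter(t for d in docs for t in set(d))
    idf = {t: math.log((1 + n) / (1 + df[t])) + 1 for t in df}
    return [{t: c * idf[t] for t, c in Counter(d).items()} for d in docs]

def cosine(u, v):
    """Cosine similarity between two sparse vectors (dicts)."""
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0
```

Scoring a model answer against an expert reference then reduces to `cosine` over their TF-IDF vectors: 1.0 for identical wording, 0.0 for no shared terms.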
E & P Data Cataloguing - An Innovative Approach
Authors: R. Krishnamachari, K. Sharma and J. Titus Jasper
Summary: In the digital transformation of Exploration and Production (E&P) companies, data serves as a key enabler of operational efficiency, informed decision-making, and innovation. As E&P companies continue to drill more wells, the volume and complexity of data have grown exponentially. This has introduced several challenges related to data management, accessibility, and usability.
A few key challenges are:
- Vast data volumes: Over the years, E&P companies have amassed enormous datasets, often reaching petabyte scales. Seismic data alone accounts for 85–90% of this data, underscoring its critical importance in exploration workflows.
- Siloed data storage: Data is often stored in disparate systems such as shared drives, SharePoint, OneDrive, and other storage solutions. These systems are typically not integrated, leading to data silos that hinder collaboration and accessibility.
- Unindexed and Unstructured data: A significant portion of the data remains unindexed and unstructured, making it difficult to search, retrieve, and utilize efficiently.
To address these challenges, the paper focuses on indexing, metadata extraction, and seamless integration with advanced data discovery and analysis tools.
The workflow supports automated ingestion of indexed data into the OSDU platform. This integration ensures interoperability and facilitates data sharing across the organization.
Automated Well Injection-Rate Control Based on On-the-fly Interpretation of Pressure Data
Authors: A. Ambrus, U.J.F. Aarsnes, A. Holsaeter, A. Shchipanov and J. Mugisha
Summary: On-the-fly well testing and control is a new possibility for production engineers, made available by the wide installation of permanent gauges in new wells. Automated interpretation of well measurements and real-time follow-up of well performance changes have become feasible with recent developments combining data-driven and physics-informed methods. In this paper, on-the-fly well testing and control are explored based on new methods for detecting induced fracture opening during Step-Rate Tests (SRT), combined with a rate-advisor approach that prevents the opening. It has been shown that interpreting SRT data with time-lapse pressure-transient analysis (PTA) can reveal incipient fracture opening before injectivity losses become visible on conventional p-Q plots. We extend this capability into a rate advisor that (i) builds a no-fracture Safe Operating Envelope (SOE) using the Bourdet derivative, (ii) detects envelope violations within subsequent rate steps, and (iii) automatically reduces the rate to the previous safe level. The rate advisor requires only bottom-hole pressure and rate as input to advise rate changes. The entire loop, including the PTA, decision logic, and rate update, is implemented in Python. Finally, we test the rate advisor in a simulated live-data-channel environment, where synthetic well responses are coupled with the rate advisor within a commercial automated well monitoring workflow.
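The three-part advisor loop can be sketched as follows. The `step_metric` callable stands in for the paper's Bourdet-derivative-based fracture indicator, and the whole function is a hypothetical simplification of the Python loop the authors describe:

```python
def advise_rates(planned_rates, step_metric, soe_limit):
    """Step-rate advisor sketch: accept each planned rate only while the
    per-step diagnostic stays inside the Safe Operating Envelope (SOE).

    planned_rates: increasing injection rates for the SRT steps.
    step_metric:   callable rate -> diagnostic value (in the paper, a
                   Bourdet-derivative-based fracture-opening indicator).
    soe_limit:     maximum allowed diagnostic value (the SOE boundary).
    Returns the advised rate schedule.
    """
    advised, last_safe = [], 0.0
    for rate in planned_rates:
        if step_metric(rate) <= soe_limit:
            last_safe = rate           # step confirmed inside the envelope
            advised.append(rate)
        else:
            advised.append(last_safe)  # violation: revert to last safe rate
    return advised
```

With a purely illustrative quadratic metric and a limit of 100, a planned schedule of 5, 8, 12, 15 would be capped at 8 from the first violation onward.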
Agentic AI for Seismic Data Processing: Automating Outcomes, Not Workflows
Authors: G. Hennenfent, A. Sekar, T. Nemeth, I.L. Chen Ning, V. Selotkin, I. Stets, Y. Solohub and H. Nasr
Summary: Seismic data processing is challenged by limited domain expertise and manual, labor-intensive workflows, which restrict scalability and consistency. This work introduces an agentic artificial intelligence framework that automates seismic noise attenuation outcomes by embedding expert judgment within three autonomous agents: one for diagnostics, one for remediation, and one for quality control. Unlike previous approaches that focus on scripting workflows or enhancing tool usage, this solution operationalizes domain expertise, enabling agents to collaborate and deliver noise-attenuated seismic data. The system employs image similarity and feature embeddings to diagnose, remediate, and validate seismic data in a self-regulating quality loop. This proof-of-concept demonstrates practical decision-making and establishes the feasibility of fully agentic seismic processing pipelines. By automating outcomes rather than tasks, this approach addresses critical bottlenecks and provides a foundation for future development of transformative subsurface workflows.
From Pages to Graphs: Intelligent Retrieval of Geoscience Knowledge using Graph RAG
Authors: B. Hungund and R. Alsubaie
Summary: This study presents a Graph-based Retrieval-Augmented Generation (Graph RAG) framework for intelligent retrieval of information from legacy geoscience and drilling reports, which are often unstructured or scanned. Using the 1995 Statoil geological summary report as a case study, text was extracted via GPT-4o OCR, cleaned, and divided into page-wise chunks. Each chunk was represented as a node in a knowledge graph built using LightRAG, with edges capturing semantic and conceptual relationships between related geological entities such as formations, lithology, and drilling operations. Unlike conventional RAG systems that rely solely on vector similarity, Graph RAG performs graph-based retrieval and traversal, enabling the system to capture contextual and relational dependencies across document sections. Experimental results show that Graph RAG achieves higher retrieval accuracy and completeness, successfully extracting formation-level information that naïve RAG models missed. The approach demonstrates the value of relationship-aware retrieval in geoscience applications, offering a scalable framework for interpreting and querying legacy petroleum reports. Future work will focus on integrating geological ontologies and expanding the pipeline to multi-document collections for enhanced exploration data management.
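The traversal step that distinguishes Graph RAG from vector-only retrieval can be sketched as a bounded breadth-first expansion. The adjacency structure and `graph_retrieve` below are illustrative, not LightRAG's API:

```python
from collections import deque

def graph_retrieve(adjacency, seed_nodes, hops=1):
    """Expand an initial vector-similarity hit set by graph traversal.

    adjacency:  node -> list of related nodes (e.g. page chunks linked by
                shared formations, lithologies, or drilling operations).
    seed_nodes: chunks returned by plain similarity search.
    hops:       how many relationship hops to follow outward.
    Returns the full set of chunks to place in the LLM context.
    """
    seen = set(seed_nodes)
    frontier = deque((n, 0) for n in seed_nodes)
    while frontier:
        node, depth = frontier.popleft()
        if depth == hops:
            continue                    # reached the traversal radius
        for nbr in adjacency.get(node, []):
            if nbr not in seen:
                seen.add(nbr)
                frontier.append((nbr, depth + 1))
    return seen
```

This is how a formation mentioned on one page can pull in related pages that a pure vector search would miss.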
Evaluating Retrieval-Augmented Generation for Subsurface Applications: Lessons learned
Authors: V. Escobar and D. Dura
Summary: This abstract discusses the implementation and evaluation of Retrieval-Augmented Generation (RAG) systems in the subsurface domain, where professionals face challenges with large volumes of unstructured data and the need for effective knowledge transfer. The RAG tool, developed by a cross-functional team, leverages text-based experiences from subsurface professionals, enriched with metadata, to enable efficient information retrieval and support faster decision-making.
Three key lessons emerged from the evaluation phase:
- Metrics Alone Are Insufficient: Quantitative metrics like retrieval recall and faithfulness provide valuable diagnostics but do not fully capture system performance. Qualitative user feedback is essential to interpret results and guide improvements.
- User Feedback Is Critical: The system incorporates a feedback loop where users evaluate queries, retrieved documents, and generated answers. This feedback refines both the content and structure of responses, ensuring the tool meets real-world needs.
- Curated Evaluation Datasets Are Crucial: Domain-specific datasets, informed by user interactions, are necessary for robust evaluation. Building such datasets is challenging but vital for meaningful assessment and continuous improvement.
The abstract concludes that RAG systems, when combined with expert feedback and tailored evaluation data, can enhance knowledge retrieval and decision-making in safety-critical subsurface applications.
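Of the quantitative metrics mentioned, retrieval recall is the easiest to pin down. A minimal recall-at-k sketch (generic, not the team's evaluation harness):

```python
def recall_at_k(relevant, retrieved, k):
    """Fraction of relevant documents that appear in the top-k retrieved.

    relevant:  ids judged relevant for a query (from a curated dataset).
    retrieved: ids returned by the retriever, best first.
    """
    if not relevant:
        return 0.0
    top_k = set(retrieved[:k])
    return len(set(relevant) & top_k) / len(set(relevant))
```

As the lessons above stress, a number like this is only a diagnostic; it says nothing about whether the generated answer was faithful or useful, which is why the user-feedback loop matters.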
Automated Decline Curve Analysis - DCA from a Data Science Perspective
Authors: T. Odland and K.U. Hollund
Summary: Decline Curve Analysis (DCA) is arguably the simplest method for forecasting production. For wells on steady decline, and particularly for fast and efficient forecasting at scale, its simplicity is also its strength: DCA is robust, fast, fully automated and easy to interpret. Still, simple does not mean easy: to get good performance, the least-squares curve-fitting routine should be modified, and care must be taken in data preprocessing, model construction and hyperparameter tuning. Done right, DCA offers great promise in automating and streamlining forecasting for the many wells on the NCS.
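For the simplest (exponential) Arps decline, q(t) = qi * exp(-D * t), the least-squares fit has a closed form after taking logarithms. The sketch below shows that baseline only; the preprocessing, robustification and hyperparameter tuning the abstract stresses are deliberately omitted:

```python
import math

def fit_exponential_decline(t, q):
    """Fit q(t) = qi * exp(-D * t) by least squares on ln q.

    t: production times; q: positive production rates.
    Returns (qi, D): initial rate and nominal decline rate.
    Ordinary least squares on (t, ln q) gives slope -D and
    intercept ln qi in closed form.
    """
    n = len(t)
    y = [math.log(v) for v in q]
    tbar = sum(t) / n
    ybar = sum(y) / n
    slope = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y)) / \
            sum((ti - tbar) ** 2 for ti in t)
    return math.exp(ybar - slope * tbar), -slope   # (qi, D)
```

On clean synthetic data this recovers the true parameters exactly; on real rate data, outliers and shut-ins are precisely why the plain routine needs the modifications the authors describe.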
Scampi - Exploring The Fossil Frontier
Authors: D. Wade, S. Stefanowicz and A. Cullum
Summary: We present the Scampi system, a digital solution designed to automate and accelerate species identification in palynology. Traditional biostratigraphy is hindered by slow, manual fossil identification and a shrinking pool of specialists. Scampi addresses these challenges by leveraging content-based image retrieval (CBIR) using vision transformers, enabling efficient and accurate search of microfossil images.
The software allows experts to search large image databases, refine results through active learning, and visualize depth distributions of identified specimens. A case study on 23 slide scans from a North Sea well showed that Scampi achieved a 96% match with traditional methods, delivering interpretations ten times faster and halving the overall workflow time. The approach is expected to scale well to larger studies.
Scampi’s integration of advanced machine learning and user-friendly interfaces promises to transform biostratigraphy, making analyses faster and more cost-effective.
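The CBIR core of such a system reduces to ranking stored embeddings by similarity to a query embedding. This sketch assumes embeddings already exist (e.g. from a vision transformer); `cbir_rank` is a hypothetical helper, not Scampi's implementation:

```python
import math

def cbir_rank(query, database, top_k=3):
    """Rank microfossil image embeddings by cosine similarity to a query.

    query:    embedding of the analyst's example image.
    database: mapping of image id -> embedding (same dimension, non-zero).
    Returns the top_k (id, similarity) pairs, best first.
    """
    def cos(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        return dot / (math.sqrt(sum(a * a for a in u)) *
                      math.sqrt(sum(b * b for b in v)))
    scored = [(img_id, cos(query, emb)) for img_id, emb in database.items()]
    return sorted(scored, key=lambda p: p[1], reverse=True)[:top_k]
```

Active learning then amounts to re-querying with embeddings of the matches the expert confirms, tightening the result set each round.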
Virtual Core: Bridging the Gap Between Core Data Management, Advanced Visualization and Virtual Image Generation
Authors: D. Baldini, L. Raimondi, D. Mezzapesa, S. Camici, S. Cacciatore and P. Paradisi
Summary: Information from core samples is vital for accurately characterizing subsurface formations during well evaluation. Conventional core analysis yields strategic insights into fluid presence, distribution, and deliverability. We therefore present a novel digital platform that consolidates core and log data, enabling integrated 3D visualization and analysis within a unified IT ecosystem. Key features include manipulation of core and log data and seamless integration and visualization of laboratory measurements and core description tools, all within a collaborative virtual workspace. Centralized data access and integration are streamlined via the company geoscience data platform, ensuring efficient data management and retrieval. Beyond visualization, the system aims to create 3D digital twins of actual rock samples, capturing their geological and physical properties. AI models are trained to reconstruct virtual cores over intervals lacking physical samples, extending interpretive capabilities. This platform represents a transformative approach to core data analysis, enhancing both subsurface characterization and collaborative geoscience workflows.
Behind the Big-Box Black Box: The Hidden Cost of Enterprise AI
Authors: S. Salamoff
Summary: Artificial intelligence has rapidly entered geoscience interpretation, but most commercial implementations remain automation-first: closed, opaque, and detached from human reasoning. These systems accelerate processing but erode scientific accountability by producing untraceable results that interpreters must validate without understanding.
This presentation argues for a paradigm shift toward interactive deep learning (IDL), in which interpreters remain within the learning loop through continuous, real-time feedback. Such systems promote transparency, reproducibility, and data-centric iteration, allowing experts to guide model behavior instead of merely assessing its output. Quantitative comparisons demonstrate that interactive workflows achieve faster network convergence, drastically lower discard rates, and significantly reduce post-processing workload relative to automated “black-box” methods.
Theoretical foundations draw from human-computer symbiosis, emphasizing that cognition and decision-making should remain distributed between human and machine. Responsible adoption of interactive deep learning in seismic interpretation rests on three principles (transparency, accountability, and synchronous guidance), which together re-establish the interpreter as a scientific partner rather than a passive consumer of algorithmic output. Re-centering AI on interactivity transforms it from a tool of automation into an engine of collaboration that enhances, rather than replaces, human expertise in subsurface interpretation.
Enhancing AI Litho-Fluid and Virtual Sonic Models for Geological and Geophysical Well Real-Time Characterization
Authors: A. Di Palo, S.M. Cacciatore, F. Gusmini, M.V. Beduschi, P. Paradisi and A. Crottini
Summary: This paper presents advancements in AI-driven Litho-Fluid Interpretation and Virtual Sonic Log Reconstruction models for real-time geological and geophysical well characterization. Integrated within a real-time digital ecosystem, these models enhance operational decision-making during drilling by automating lithology and fluid identification and sonic log generation. The Litho-Fluid model uses a facies-based expert system with six one-class classifiers, improving accuracy over previous approaches. It operates in three tiers (Drilling, Simple, and Complete) based on data availability. The Virtual Sonic model reconstructs sonic logs using while-drilling data, addressing compaction effects through depth-based inputs such as TVDBML and DT-Vel. Case studies show improved carbonate interpretation and accurate sonic log reconstruction, aligning closely with acquired data. These innovations reduce interpretation latency, improve safety, and optimize drilling performance. Future work aims to simplify the models by removing depth-based inputs and inferring compaction directly.