Fifth EAGE Conference on Petroleum Geostatistics
- Conference date: November 27-30, 2023
- Location: Porto, Portugal
- Published: 27 November 2023
1 - 20 of 69 results
Evaluation of CO2 Cross-Flow in Compartmentalized Reservoirs in Gas Field Using Material Balance Modelling
Authors: L. Malencic, M. Dragosavac and N. Nemanjic
Summary: The material balance study was conducted using Petroleum Experts' MBAL software and determined that significant volumes of CO2-rich gas were cross-flowing between reservoirs through a fault system that was activated with the start of production and pressure depletion. The magnitude of the cross-flow volumes significantly impacted well and reservoir performance and modelling. Quantification of CO2 gas volumes in production is crucial for future field redevelopment.
The novelty of using an MBAL multi-tank model in this scenario lies in the ability to history match the model in a reasonable time while effectively managing reservoir uncertainties, which is critical for key business decisions, business planning, general reservoir management and production. This has provided high confidence in the model's robustness and validates the adopted methodology, which has broader applications in material balance modelling of reservoir cross-flow. The purpose of this article is to present the methods and practices and to show how they can be applied to other fields and reservoirs.
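To illustrate the idea behind a multi-tank material balance with cross-flow, here is a minimal sketch — not MBAL itself, and all tank volumes, rates, and the transmissibility coefficient are illustrative assumptions. Two gas tanks follow ideal-gas p/z depletion, and cross-flow between them is driven by their pressure difference:

```python
import numpy as np

# Two connected gas tanks (a toy stand-in for an MBAL multi-tank model):
# tank pressures follow ideal-gas p/z material balance, and cross-flow
# between the tanks is proportional to their pressure difference.
G = np.array([100e9, 50e9])      # initial gas in place per tank, sm3 (assumed)
p = np.array([300.0, 300.0])     # initial pressures, bar (assumed)
Gp = np.zeros(2)                 # cumulative gas removed from each tank, sm3
rate = np.array([2e6, 0.0])      # only tank 0 produces, sm3/day (assumed)
T_coef = 1e4                     # cross-flow transmissibility, sm3/day/bar (assumed)
dt, nsteps = 30.0, 120           # 30-day steps over 10 years

for _ in range(nsteps):
    xflow = T_coef * (p[1] - p[0])            # flow from tank 1 into tank 0
    Gp += (rate - np.array([xflow, -xflow])) * dt
    p = 300.0 * (1.0 - Gp / G)                # ideal-gas p/z depletion
```

Even though only tank 0 produces, its depletion pulls gas across the connection, so the unproduced tank's pressure also declines — the behaviour the study quantifies via history matching.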
Application of RAKSVD Method Based on Adaptive Dynamic Particle Swarm Optimization in Seismic Data Denoising
Summary: Among the many seismic data denoising methods, KSVD is effective, but its ill-conditioned nature requires the introduction of regularization terms, and the automatic optimization of the regularization parameters is therefore extremely important. Particle swarm optimization algorithms are widely used for parameter optimization, but the inertia parameter of the conventional algorithm is generally fixed, which degrades search efficiency in later stages. In this paper, an adaptive dynamic particle swarm optimization algorithm is used to improve the setting of the regularization approximation parameters, and, exploiting the advantages of the AKSVD method for weak-signal identification, we propose the RAKSVD denoising method, whose regularization parameters are optimized by the adaptive dynamic particle swarm algorithm. Model tests and practical applications show that this method not only achieves the expected denoising effect but also better protects weak seismic signals: after denoising, the weak signals are essentially undistorted, which aids their extraction and recognition. The computational efficiency of the method is also improved.
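The role of a decaying ("dynamic") inertia weight can be sketched on a toy 1-D problem. The quadratic objective below merely stands in for a denoising-error curve over the regularization parameter; the coefficients are generic textbook choices, not those of the paper:

```python
import numpy as np

def pso_minimize(f, lo, hi, n_particles=20, n_iters=50, seed=0):
    """Minimize a scalar objective with a particle swarm whose inertia
    decays linearly from 0.9 to 0.4 -- a common dynamic scheme that
    favours exploration early and exploitation late."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, n_particles)            # particle positions
    v = np.zeros(n_particles)                       # particle velocities
    pbest = x.copy()                                # personal bests
    pbest_f = np.array([f(xi) for xi in x])
    g = pbest[np.argmin(pbest_f)]                   # global best
    for t in range(n_iters):
        w = 0.9 - 0.5 * t / (n_iters - 1)           # inertia: 0.9 -> 0.4
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        v = w * v + 2.0 * r1 * (pbest - x) + 2.0 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)                  # keep particles in bounds
        fx = np.array([f(xi) for xi in x])
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        g = pbest[np.argmin(pbest_f)]
    return g

# Toy objective standing in for denoising error vs. the regularization
# parameter lambda; its minimum sits at lambda = 0.3.
best = pso_minimize(lambda lam: (lam - 0.3) ** 2, 0.0, 1.0)
```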
Automated Prediction of Geometrical, Zone, and Petrophysical Parameters for a Gas-Bearing Reservoir
Authors: M. Abdelrahman, R. Valadez and N. Szabo
Summary: Borehole geophysics provides insightful data on a variety of factors, such as geometrical properties and storage capacity, which are crucial for assessing subsurface reservoirs close to the drilled well. Optimizing reservoir exploration, however, depends on the accurate assessment of these characteristics, and the classic sequential interpretation procedures for well-logging data in use today are time-consuming and vulnerable to interpretation bias. Borehole geophysical datasets from synthetic and actual field boreholes were used to test the suggested procedure. A synthetic dataset was employed to verify and validate the prediction of various lithological units and their petrophysical properties in the presence of 5% Gaussian noise. The procedure was further expanded to incorporate other zone factors, including shale parameters and Archie's coefficients. A gas-bearing reservoir in Egypt provides a suitable case study to evaluate and validate the suggested workflow in a challenging, deep reservoir with significant variability. Our automated procedure captured the interaction patterns and hidden linkages among several lithological units with various petrophysical and zone characteristics.
Impact of Ensemble Size and the Need for Localization in Ensemble History Matching
By G. Evensen
Summary: This paper investigates the implications of ensemble size and the importance of localization in ensemble history matching. Increasing the ensemble size is known to reduce the impact of sampling errors, but determining the appropriate size, N, to ensure a strong signal-to-noise ratio remains challenging. Ensemble history-matching methods use the correlation between uncertain parameters and predicted measurements to compute linear updates to the input parameters; however, nonlinearity in the parameter-measurement relationship weakens these correlations.
Practical limitations in computational resources and simulation time restrict the ensemble size, often leading to spurious updates in the posterior ensemble when computing the global analysis. To address this issue, we introduce a consistent correlation-based localization method. Instead of using physical distances, this method selects measurements based on their correlation strength with the updated parameter.
The paper presents examples using the REEK model, demonstrating how increasing the ensemble size improves the correlation functions and how localization reduces spurious updates and avoids underestimating the posterior variance.
The paper also indicates the need to include uncertainties in historical-rate controls and account for measurement error correlations when computing update steps. An ensemble size of around 200 is suggested for the current example to ensure physically significant correlations.
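A minimal sketch of the correlation-based (distance-free) localization idea: measurements are selected by the strength of their sample correlation with the parameter being updated, using an illustrative 3/sqrt(N) significance threshold rather than the paper's exact criterion:

```python
import numpy as np

rng = np.random.default_rng(1)
N, m = 200, 50                     # ensemble size, number of measurements
theta = rng.normal(size=N)         # one uncertain parameter, N realizations

# Predicted measurements: only the first 10 truly depend on theta;
# the remaining 40 are pure noise (illustrative setup).
Y = rng.normal(size=(m, N))
Y[:10] += 2.0 * theta

# Sample correlation between the parameter and each predicted measurement.
r = np.array([np.corrcoef(theta, Y[j])[0, 1] for j in range(m)])

# Correlation-based localization: keep only measurements whose |r|
# exceeds what pure sampling noise would explain (~ 1/sqrt(N)).
threshold = 3.0 / np.sqrt(N)
active = np.abs(r) > threshold
```

Measurements failing the test are excluded from the update of this parameter, which is what suppresses spurious updates when N is small.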
Multiscale Regionalized Direct Sampling for Conditioning Process-Based Geological Models to Well Data
Summary: Process-based geological modeling mimics the physical laws that govern depositional and diagenetic processes and can therefore generate geologically realistic models. However, these models do not necessarily honor well data and other field measurements, because the data are not directly integrated into the process-based modeling but rather used to infer a set of appropriate input parameters for the model.
Conditioning process-based geological models to well data is critical and has been challenging for decades. We adopt a multiple-point statistics (MPS) approach where a process-based geological model is used as a training image for statistical pattern recognition. Then, we propose a multiscale regionalized direct sampling (MS-RDS) method to achieve the well data conditioning.
The conditional model is defined on the same grid as the process-based training image. The sampling is contained in a region around each grid cell instead of the entire training image, which makes the conditional model reproduce the spatial trend of the process-based model while tremendously improving computational efficiency. Furthermore, unlike the traditional multigrid implementation of MPS, we use multiscale templates to avoid data relocation, thus accurately honoring well data while representing geological patterns at various scales.
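The regionalized-sampling idea can be sketched in 1-D. This is a toy stand-in for the MS-RDS method: template size, region radius, and tolerance are illustrative, and real direct sampling works on 2-D/3-D grids with multiscale templates:

```python
import numpy as np

def regionalized_direct_sampling(ti, hard, radius, half_tpl=3, tol=0.05, seed=0):
    """Fill the unknown cells (np.nan) of a 1-D grid by direct sampling:
    for each cell, scan only training-image locations within `radius`
    of the cell (the 'region') for a neighbourhood matching the
    already-informed neighbours, and paste the best match found."""
    rng = np.random.default_rng(seed)
    sim = hard.copy()
    n = len(sim)
    for i in rng.permutation(np.flatnonzero(np.isnan(sim))):
        # offsets of neighbours that are already informed (hard or simulated)
        offs = [k - i for k in range(max(0, i - half_tpl), min(n, i + half_tpl + 1))
                if k != i and not np.isnan(sim[k])]
        lo, hi = max(0, i - radius), min(n, i + radius)   # regional search window
        best_j, best_d = i, np.inf
        for j in rng.permutation(np.arange(lo, hi)):
            if any(j + o < 0 or j + o >= n for o in offs):
                continue
            d = np.mean([abs(ti[j + o] - sim[i + o]) for o in offs]) if offs else 0.0
            if d < best_d:
                best_j, best_d = j, d
            if best_d <= tol:                             # early acceptance
                break
        sim[i] = ti[best_j]
    return sim

ti = np.sin(np.linspace(0.0, 6.0 * np.pi, 200))   # training image
hard = np.full(200, np.nan)
hard[::20] = ti[::20]                              # sparse "well" observations
sim = regionalized_direct_sampling(ti, hard, radius=15)
```

Restricting the scan to a region around each cell is what lets the simulation follow the spatial trend of the training image while cutting the search cost.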
A Functional Surrogate Modelling Approach for Fault Stability Studies
Authors: A. Forello, L. Dovera, A. Cominelli, S. Monaco, A. Corradi, S. Petroselli and S. Mantica
Summary: Carbon capture and sequestration in subsurface porous formations is a key technology for reducing CO2 emissions and reaching worldwide net-zero emissions by 2050. Modelling studies that identify injection conditions which do not imply fault reactivation and induced seismicity are mandatory to support project implementation. Fault stability analysis requires coupling full-physics reservoir and geomechanical simulations, which limits the possibility of running multiple sensitivities due to the high computational cost. In this work, a cost-efficient workflow suitable for optimization and uncertainty quantification, based on Functional Kriging (FK) surrogate models, is proposed. Two different FK techniques are implemented and applied to two synthetic models, simulating CO2 sequestration in a depleted sweet gas reservoir using a compositional model with CH4 and CO2 as components. Field gas injection rate, fault modelling parameters, and geomechanical properties are considered as key variables in studying fault reactivation. The proxies are trained on output from a few simulations to estimate the time-dependent Coulomb Failure Function on each element of the fault's triangular mesh, monitoring fault behavior during fluid injection. The results show that the FK proxies accurately replicate fault stability indicators, with low average errors on a set of blind simulations.
Improving TOC Estimation Accuracy Using a Simulated Annealing Algorithm: A Data Science Approach
Authors: R. Valadez Vergara, M.M. Gomaa Abdelrahman and N.P. Szabó
Summary: In this study, a new approach was introduced to enhance the accuracy of Total Organic Carbon (TOC) prediction. The modification involved applying Simulated Annealing (SA), an optimization technique inspired by physical annealing processes, to the non-linear estimation method known as Icl-BS. The Icl-BS technique, initially proposed by Bibor and Szabo in 2016, aimed to establish a linear regression relationship between TOC content and the parameter Δd, computed using the Clay Indicator Method (Icl) developed by Zhao et al. in 2016.
The modified method, referred to as Icl-BS-SA, demonstrated remarkable improvements over traditional approaches. Comparisons were made against the conventional linear regression-based Icl method and the nonlinear regression-based Icl-BS technique using the Levenberg-Marquardt (LM) algorithm. The results indicated that the Icl-BS-SA method outperformed these approaches in terms of both accuracy and parameter estimation.
By incorporating the SA technique, the Icl-BS-SA method achieved significantly enhanced outcomes, showcased through reduced relative data distance and Root Mean Square Error (RMSE). This innovative approach holds promise for advancing TOC prediction accuracy, making it a noteworthy contribution to the field.
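The core of the approach — simulated annealing driving a regression fit toward minimum RMSE — can be sketched as follows. The data and regression below are synthetic stand-ins for the TOC-vs-Δd relationship, and the annealing schedule is a generic choice, not the paper's tuned settings:

```python
import numpy as np

def anneal(f, x0, step=0.1, T0=1.0, cooling=0.995, n_iters=4000, seed=0):
    """Generic simulated annealing: a random perturbation is accepted
    when it improves the objective, or with probability exp(-dE/T)
    when it does not; the temperature T is cooled geometrically."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx, T = f(x), T0
    best, best_f = x.copy(), fx
    for _ in range(n_iters):
        cand = x + rng.normal(scale=step, size=x.shape)
        fc = f(cand)
        if fc < fx or rng.random() < np.exp(-(fc - fx) / T):
            x, fx = cand, fc
            if fc < best_f:                 # track the best point ever seen
                best, best_f = cand.copy(), fc
        T *= cooling
    return best

# Synthetic stand-in for the TOC regression: recover the slope and
# intercept of y = 2x + 0.5 by minimizing the RMSE of the fit.
rng = np.random.default_rng(42)
x_obs = np.linspace(0.0, 1.0, 50)
y_obs = 2.0 * x_obs + 0.5 + rng.normal(scale=0.01, size=50)
rmse = lambda p: float(np.sqrt(np.mean((p[0] * x_obs + p[1] - y_obs) ** 2)))
slope, intercept = anneal(rmse, [0.0, 0.0])
```

The occasional acceptance of worse candidates at high temperature is what lets SA escape local minima that a gradient-based fit (e.g., Levenberg-Marquardt) can get stuck in.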
Improvements to Multifidelity Functional Surrogates Construction for Fast Uncertainty Quantification in Reservoir Simulation
Authors: F. Ravanetti, E. Abbate, A. Menafoglio and E.L. Della Rossa
Summary: In this work we propose a novel procedure for covariance identification, based on cross-validation of the correlation between primary and secondary responses, within the context of multi-fidelity functional surrogates for uncertainty quantification. The method is validated on a synthetic reservoir model of realistic size and complexity. The outputs are obtained by simulating two different levels of fidelity, referred to as the FINE (high-fidelity) and COARSE (low-fidelity) levels. These outputs are treated as realizations of a stochastic process and used for the surrogate construction by means of the Universal Trace-Cokriging approach, which incorporates the information embedded in both the primary (FINE) and the secondary (COARSE) data. We show that the surrogate accurately predicts outputs at unknown points of the uncertainty space, while its construction requires considerably less computational effort than the single-fidelity Trace-Kriging predictor. As a further contribution, we introduce two possible transformation strategies for the analysis of simulator outputs characterized by complex functional shapes.
Big Loop Uncertainty for Industrial Use
Authors: T. Chugunova and B. Decroux
Summary: The subject of integrated uncertainty quantification of reservoirs is not new; all majors and service companies propose their own big loops. Sampling static and dynamic parameters together is more robust and coherent, whether for uncertainty quantification or for optimization purposes: it avoids working with only a few static models and thus avoids bias in decision making. In practice, however, handling structural, geological and dynamic uncertainties in a big loop often results in a complex workflow and many lines of script, so a specialist or an experienced engineer must be present on the project team.
TotalEnergies E&P has developed a Big Loop Uncertainty tool allowing easy and intuitive integration of reservoir uncertainties in a single workflow. The objective was to provide a tool that any engineer can manage with only an understanding of the discipline. Built on the solid basis of experimental design and response surfaces, the tool "hides" the complexity and routine links. A single parametrization allows exploring global and local uncertainties and their impacts in different ways, or performing optimization.
Here we present an application of this tool to a rather complex case with 9 geological and 24 dynamic parameters, mainly associated with several levels of heterogeneity, an active aquifer and fault transmissivities.
Blending Stationary Gaussian Random Fields for Locally Varying Anisotropy
By J. Skauvold
Summary: Nonstationary Gaussian random fields are more challenging to work with than their stationary counterparts, as standard methods are not generally applicable to them. One type of nonstationarity of interest to geomodellers is locally varying anisotropy (LVA), whereby the dominant directions of correlation are location-dependent. We describe a practical way to generate realisations of a nonstationary GRF with locally varying anisotropy. Our approach is to generate realisations of stationary Gaussian random fields first, and then blend them together in a way that achieves the desired effect. We briefly demonstrate the use of the method on a truncated Gaussian model of facies. The approach can be extended beyond locally varying anisotropy to more general types of nonstationarity.
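A minimal sketch of the blending idea, assuming FFT-smoothed white noise as the stationary GRF generator and a simple variance-preserving square-root weighting — the paper's actual blending construction may differ:

```python
import numpy as np

def stationary_grf(shape, sigma, seed):
    """Stationary Gaussian random field: white noise smoothed in the
    Fourier domain by an anisotropic Gaussian kernel, sigma = (sig_y, sig_x),
    then standardized to zero mean and unit variance."""
    rng = np.random.default_rng(seed)
    noise = rng.normal(size=shape)
    ky = np.fft.fftfreq(shape[0])[:, None]              # cycles per sample
    kx = np.fft.fftfreq(shape[1])[None, :]
    kernel = np.exp(-2.0 * np.pi**2 * ((sigma[0] * ky)**2 + (sigma[1] * kx)**2))
    z = np.real(np.fft.ifft2(np.fft.fft2(noise) * kernel))
    return (z - z.mean()) / z.std()

shape = (100, 100)
z_ew = stationary_grf(shape, (1, 10), seed=0)   # east-west elongated correlation
z_ns = stationary_grf(shape, (10, 1), seed=1)   # north-south elongated correlation

# Blending weight varies smoothly from 0 (west) to 1 (east), so the
# dominant correlation direction rotates across the domain. Square-root
# weights keep the pointwise variance of the blend at one.
w = np.tile(np.linspace(0.0, 1.0, shape[1]), (shape[0], 1))
z = np.sqrt(1.0 - w) * z_ew + np.sqrt(w) * z_ns
```

Because the two input fields are independent with unit variance, the blended field stays (approximately) standard Gaussian everywhere while its local anisotropy follows the weight field.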
Statistics in Vector Scanning for Microseismic
Summary: Microseismic monitoring can provide important parameters for the evaluation of fracturing, water/gas injection production, and gas storage in the oil and gas energy sector. Owing to their tiny magnitudes and huge numbers, microseismic signals are generally drowned in the background noise, so traditional seismic location, which relies on large arrival amplitudes, fails. Methods of probability and statistics therefore have to be used to identify and analyze microseismicity. In the principle and denoising stages of our vector scanning, we apply correlation analysis to find microseismicity and periodic disturbances in a large number of records. In the real-time automated process of data collation, denoising, and interpretation, we determine the range and/or threshold of the characteristic parameters of coherent noise based on data statistics. We also statistically integrate the random spatiotemporal distributions of released energy to suppress residual and occasional noise coherence, and obtain the space-time geometry of the slit network with high-probability connections.
Transdimensional Sampling of Two-Dimensional Layered Geological Models with Variable Slope: A Proof of Concept
Authors: J. Herrero, G. Caumon, T. Bodin and M. Lacheux
Summary: This study presents a novel approach to handling structural uncertainties in the interpretation of stratigraphic structures using a transdimensional sampler, the reversible jump Markov chain Monte Carlo (RJMCMC) algorithm. Moving beyond conventional reservoir simulation models, which use a fixed number of layers, the layering size is treated as variable; hence, the number and position of model layers are unknowns to be determined by the inverse problem. The proposed method extends the application of one-dimensional RJMCMC by introducing a dip parameter, enabling the simulation of sedimentary wedges and erosional features around a well, which is not typically accounted for in the standard layer-cake models used, for instance, in well test interpretation. To keep the model dimension low, a two-dimensional, piecewise constant permeability field is determined for each layering configuration. This inverse method, tested on a synthetic well log, facilitates robust interpretations of geological reservoir geometries and could improve the prediction of permeability spatial distribution in sedimentary settings. Demonstrating successful recovery of target models, the study opens interesting perspectives both for near-well subsurface models and full-field reservoir models, and identifies the need for future work on allowing interface intersections to simulate more complex stratigraphic structures.
A Fast Stochastic Inversion to Leverage the Full Power of the Stochastic Process
Authors: C. Msika, R. Findlay and B. Leflon
Summary: While the benefits of geostatistical inversion are widely recognized for geo-energy studies, current implementations make it difficult to incorporate into routine workflows. The Fast Stochastic Inversion (FSI) presented in this paper is a new methodology leveraging different methods to speed up the stochastic process while preserving the quality and high-resolution content of the result. It is based on the spectral merging method, which integrates, in the frequency domain, results from deterministic inversion with stochastic simulations from well data, avoiding the need to run a multitude of simulation iterations for each location. The simulations in FSI are run using the turning bands method, which is faster than sequential Gaussian simulation (SGS), outputting unconditional simulations that are then conditioned by kriging. The result quality is similar to SGS; however, the computation times are significantly reduced, as presented in an example. As a consequence, FSI can be used more easily than standard geostatistical inversions, enabling geoscientists to perform comprehensive uncertainty assessment.
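The "condition an unconditional simulation by kriging" step can be sketched in 1-D. A Cholesky draw stands in for the turning-bands simulator, and the exponential covariance with a 10-cell range is an illustrative choice:

```python
import numpy as np

def expcov(h, length=10.0):
    """Exponential covariance model with unit sill (illustrative)."""
    return np.exp(-np.abs(h) / length)

def simple_krige(x_data, v_data, x_grid):
    """Simple-kriging estimate (known zero mean) on a 1-D grid."""
    C = expcov(x_data[:, None] - x_data[None, :])    # data-to-data covariances
    c0 = expcov(x_grid[:, None] - x_data[None, :])   # grid-to-data covariances
    return c0 @ np.linalg.solve(C, v_data)

x_grid = np.arange(100.0)
x_data = np.array([10.0, 40.0, 70.0])      # "well" locations
v_data = np.array([1.0, -0.5, 0.8])        # observed values

# Unconditional realization: a cheap Cholesky draw stands in here for
# the turning-bands output used in FSI.
rng = np.random.default_rng(0)
C_grid = expcov(x_grid[:, None] - x_grid[None, :]) + 1e-9 * np.eye(100)
z_u = np.linalg.cholesky(C_grid) @ rng.normal(size=100)

# Conditioning by kriging: krige the data residual (data minus the
# unconditional field at the data points) and add it back.
idx = x_data.astype(int)
z_c = z_u + simple_krige(x_data, v_data - z_u[idx], x_grid)
```

Because simple kriging interpolates exactly at the data points, the conditioned field honors the well values while keeping the unconditional field's texture away from the wells.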
Unsupervised Classification of Flow Regime Features in Pressure Transient Responses
Authors: V. Starikov, V. Demyanov, K. Muradov and A. Shchipanov
Summary: Pressure transient analysis (PTA) is a standard tool for interpreting the well-pressure response of a reservoir to production and injection operations and for characterizing reservoirs in subsurface engineering. It involves studying flow regimes and the associated pressure changes to extract information about well and reservoir parameters. New opportunities to support reservoir decision-making have opened with the more abundant availability of PTA data. The wide deployment of permanent downhole gauges (PDG) in wells has provided a wealth of high-frequency data for PTA and has encouraged researchers to explore the integration of PTA into current well monitoring and reservoir evaluation and modelling techniques.
This paper describes an automated method designed and developed to detect and classify reservoir flow regime features using pressure transient data obtained from production and injection wells. The proposed approach focuses on a feature-recognition model and on solving the optimization problem for pattern classification. The method is applied and validated on real field data, where it demonstrated a high level of accuracy, successfully identifying flow regime features with an 89% success rate based on a single pressure transient within the chosen real well dataset.
Ensemble History-Matching Workflow Using an Interpretable SPADE-GAN Geomodel
Authors: K. Fossum, S. Alyaev and A.H. Elsheikh
Summary: This study introduces a pioneering approach at the intersection of generative artificial intelligence and geostatistics for reservoir history matching. Leveraging the latest conditional Generative Adversarial Networks (GANs) with spatially adaptive denormalization (SPADE), we establish a novel ensemble-based workflow that effectively captures complex geological patterns. The Ensemble Randomized Maximum Likelihood method assimilates data into an ensemble of coarse-scale maps, interpreted as channel proportions, that serve as the SPADE-GAN input. We demonstrate this Bayesian data assimilation on a combination of "hard" well data and "soft" flow data, thus extending the usability of pre-trained SPADE-GANs in subsurface applications. Our numerical experiments convincingly demonstrate the method's capacity to replicate previously unseen geological configurations beyond its training data. This proficiency proves particularly valuable in data-scarce scenarios typical of renewable geo-energy, where the GAN captures realistic geology but its output geomodels need adjustment to match observed data.
Furthermore, our fully open-source developments lay the foundation and provide the starting point for future multiscale enhancements.
Mudstone Compaction in the Vienna Basin: Insights from Lithology-Filtered Sonic and Density Logs
Authors: L. Skerbisch, D. Misch, M. Drews, K. Arnberger, V. Schuller, A. Zamolyi and R.F. Sachsenhofer
Summary: This contribution presents a new workflow for compaction analysis and top seal evaluation based on wireline logging data. Sonic and density logs from nine key wells in the Vienna Basin (Austria), providing a continuous depth record from 0 to 3500 m true vertical depth (TVD), were filtered by natural gamma ray, resistivity, delta rho, and caliper-to-bit size to identify mudstone intervals with reliable sonic and density signals. The calculated sonic- and density-porosity depth trends were then quality-checked against core petrophysical data, e.g., helium porosity and hydrocarbon column heights (HCHs) calculated from true displacement radii measured by mercury intrusion capillary porosimetry (MICP). Furthermore, both porosity logs were corrected by their regressions with corresponding helium-porosity values from equal depths to obtain reliable log-based compaction models. Based on these models and the relationship between helium porosity and HCH from MICP, log-based HCH-vs-TVD curves were generated. In conclusion, both sonic- and density-porosity trends represent the core petrophysical data well. The HCH model based on density porosity changed more significantly upon correction, after which both sonic- and density-based HCH trends plot similarly and allow upscaling of the core petrophysical data to all wells with available wireline logs.
Sustainable Geo-Energy Exploration in Oman: Unveiling Reservoir Complexity for a Greener Future
Authors: H. Kiyumi, L. Bellmann, M. Glukstad and H. Al Sulaimani
Summary: The study focuses on a heavy oil field located in the southern region of the Sultanate of Oman. The field lies in a hydrocarbon-rich province characterized by numerous discoveries in both structural and stratigraphic traps. The complexity of the reservoir has made its characteristics challenging to understand. The targeted reservoir is the Al Khlata Formation, where the depositional setting is governed by a complex arrangement of the highly faulted Ara Formation.
Because of the reservoir's shallow nature and the concentration on an area with meticulous well control, a polymer or steam injection development is currently under evaluation. This paper provides a summary of the analysis and interpretation of this complex field. It employs reprocessed seismic cubes with diverse seismic attributes and integrates various geological concepts along with well information serving as calibration.
The workflow of the new static model is shown, encompassing observations, the conceptual geological model, and geostatistical considerations. This has led to new insights into the distribution of facies, which serves as a crucial foundation for depicting reservoir architecture, heterogeneity, and connectivity.
History Matching of Three Facies Channelized Reservoirs Using Ensemble Smoothers with a Convolutional Autoencoder Based Parameterization
Authors: C. Moldovan, B. Sebacher and R. Hanea
Summary: The estimation and uncertainty quantification of channelized reservoirs in a data assimilation framework is very hard to achieve due to the geometrical and topological characteristics of the facies fields. The channelized structure is broken after the update step of the data assimilation process, which produces unrealistic updated reservoir models. Geological realism can be achieved in two ways: by learning, or by conditional sampling from the prior distribution. In this study, we train a convolutional autoencoder (CAE) to reconstruct the channelized structure of a reservoir with three facies types, namely channel, levee, and shale, between which there is a topological transition. The training set is composed of pairs of images, of which one is perturbed and the other has the correct geometrical and topological structure. The CAE is linked with a parameterization of the facies fields and becomes part of it. The enhanced parameterization is further coupled with ES-MDA for history matching. The results show that the updated facies fields keep the prior channelized structure and that the indicators of a good history match are fulfilled, meaning good results in terms of facies estimation, uncertainty quantification, data match, and prediction capabilities of the updated ensemble.
A Sparse Matrix Formulation of Model-Based Ensemble Kalman Filter
Authors: H. Gryvill and H. Tjelmeland
Summary: The ensemble Kalman filter (EnKF) introduced in Evensen (1994) has been used successfully in many applications to perform data assimilation in state-space models. The EnKF represents the knowledge about the latent process in the state-space model through a set of realizations. The filter alternates between a prediction step and an update step; the prediction step can be performed analytically, while the update step must be approximated.
In Loe and Tjelmeland (2021) a Bayesian model is used to perform the update step. Specifically, the model parameters are assigned priors and simulated. Moreover, a set of valid update procedures is derived, and the ensemble is updated according to an update procedure that satisfies a specified criterion. However, the procedure introduced in Loe and Tjelmeland (2021) is computationally demanding.
In our work we introduce a computationally efficient alternative to the approach in Loe and Tjelmeland (2021). We formulate a prior for the model parameters which enables efficient sampling, and we partition the update of each ensemble realization into smaller blocks. Simulation studies suggest that the reduction in computational demands is considerable and that the results provided by our approach are essentially similar to those obtained with the approach from Loe and Tjelmeland (2021).
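For orientation, the standard stochastic EnKF update step with a sparse observation operator looks like the sketch below. This is the classical formulation, not the Bayesian model-based update of Loe and Tjelmeland; all dimensions and values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
N, n, m = 100, 5, 3                      # ensemble size, state dim, obs dim
X = rng.normal(size=(n, N))              # prior ensemble of state realizations
H = np.zeros((m, n))                     # sparse observation operator:
H[0, 0] = H[1, 2] = H[2, 4] = 1.0        # observe states 0, 2, and 4 directly
R = 0.1 * np.eye(m)                      # observation-error covariance
d = np.array([1.0, -0.5, 0.3])           # observed values

# Stochastic EnKF analysis step: build the Kalman gain from the ensemble
# covariance and update each realization against perturbed observations.
A = X - X.mean(axis=1, keepdims=True)    # ensemble anomalies
C = A @ A.T / (N - 1)                    # sample covariance
K = C @ H.T @ np.linalg.inv(H @ C @ H.T + R)               # Kalman gain
D = d[:, None] + rng.multivariate_normal(np.zeros(m), R, size=N).T
Xa = X + K @ (D - H @ X)                 # analysis (updated) ensemble
```

The observed components of the analysis ensemble are pulled toward the data with reduced spread; exploiting sparsity in H and blocking the update are the kinds of structure the paper's efficient formulation targets.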
Time Series Interpolation Using Geostatistics: Lessons Learnt from a Real Case-Study on a Reservoir in Asia
Summary: This paper presents a comparison of two methods for quantifying uncertainties in production profiles for unconventional reservoirs: the traditional method, based on type-curve fitting and statistical analysis, and the geostatistical method, based on time-series decomposition and geostatistical interpolation. Both are applied to a real case study with a database of several hundred wells. The results are in favor of the geostatistical approach, which provides very promising validations. Some ways forward are suggested.