Petroleum Geostatistics 2015
- Conference date: 07 Sep 2015 - 11 Sep 2015
- Location: Biarritz, France
- ISBN: 978-94-6282-158-3
- Published: 07 September 2015
Keynote - From Big Data with no Geology, to Geology without Data, Quo Vadis?
By F. Alabert*
The author will present a personal perspective on past, current and future trends in petroleum geostatistics, from the double angle of his experience as a geostatistics specialist in the early 1980s and of his various later assignments as a reservoir and exploration engineer and manager. Early petroleum applications of geostatistics mostly related to specific tasks around filtering and interpolation of geophysical data for mapping.
Multiple-point Statistics Simulations Accounting for Block Data
Authors J. Straubhaar*, P. Renard and G. Mariethoz
Multiple-point statistics methods make it possible to generate highly heterogeneous fields reproducing the spatial features of a given training image. Whereas punctual conditioning data can be handled straightforwardly, dealing with information defined at larger scales is challenging. Among multiple-point statistics techniques, the direct sampling method consists in successively simulating each node of the simulation domain by randomly searching the training image for a pattern compatible with the one retrieved in the simulation grid. In this work, we exploit the basic principle of direct sampling to propose an extension of the method able to deal with block data, i.e. target mean values for given subsets of the simulation grid. The proposed method can account for overlapping block data of any geometry and of different sizes. The approach can be used in a range of applications, including, for example, downscaling. Examples illustrating the simulation of log-permeability fields are presented.
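The direct sampling principle that the extension builds on can be sketched in a few lines: visit the nodes in random order and, for each one, scan random locations of the training image until a pattern compatible with the already-simulated neighbours is found. The following is a minimal illustrative sketch, not the authors' implementation; the toy training image and all parameter values are assumptions.

```python
import numpy as np

# Toy binary training image (assumption: any 2-D categorical array works).
ti = np.array([[0, 0, 1, 1, 0],
               [0, 1, 1, 1, 0],
               [0, 1, 1, 0, 0],
               [0, 0, 1, 0, 0],
               [0, 0, 1, 1, 0]])

def direct_sampling(ti, shape, n_neighbors=4, max_scan=200, seed=0):
    """Simulate a grid by, for each node in random order, scanning random
    locations of the training image for the first pattern compatible with
    the already-simulated neighbours (the direct sampling principle)."""
    rng = np.random.default_rng(seed)
    sim = np.full(shape, -1)          # -1 marks unsimulated nodes
    path = rng.permutation(shape[0] * shape[1])
    for p in path:
        i, j = divmod(p, shape[1])
        # Collect the closest already-simulated neighbours.
        known = [(a, b) for a in range(shape[0]) for b in range(shape[1])
                 if sim[a, b] >= 0]
        known.sort(key=lambda ab: (ab[0] - i) ** 2 + (ab[1] - j) ** 2)
        neigh = known[:n_neighbors]
        # Fallback value if no compatible pattern is found within max_scan.
        best = ti[rng.integers(ti.shape[0]), rng.integers(ti.shape[1])]
        for _ in range(max_scan):
            x = rng.integers(ti.shape[0])
            y = rng.integers(ti.shape[1])
            ok = True
            for (a, b) in neigh:
                u, v = x + (a - i), y + (b - j)
                if not (0 <= u < ti.shape[0] and 0 <= v < ti.shape[1]) \
                        or ti[u, v] != sim[a, b]:
                    ok = False
                    break
            if ok:
                best = ti[x, y]
                break
        sim[i, j] = best
    return sim

real = direct_sampling(ti, (5, 5))
```

The block-data extension described in the abstract would add, on top of this compatibility test, a check against the target mean of each block overlapping the node being simulated.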
Facies Simulation of Channel Bodies Using Multiple Point Geostatistics and Probability Perturbation - A Case Study
Authors S. Hashemi*, A. Javaherian, M. Ataee-pour and H. Khoshdel
In recent years, multiple-point geostatistics (MPS) has been widely used in reservoir simulation workflows owing to its excellent ability to reproduce complex geobodies in simulation results. In this study, one of the MPS algorithms was applied to a highly channelized carbonate reservoir. Different sources of data were integrated into this simulation workflow through the creation of the training image, soft conditioning data and hard conditioning data. At the final stage of the simulation workflow, an optimization procedure was used to fully preserve the continuity and geometry of the channel bodies in the simulation results. This optimization process, which is based on probability perturbation, treats the optimization problem as an integration process. The Tau model is used to integrate the probability function from which the initial facies model is sampled with additional data in probability form. The geometry and continuity of channels in the final optimized facies models were fully preserved, as expected from the available seismic data.
A Flexible Markov Mesh Model for Facies Modelling
Authors X. Luo* and H. Tjelmeland
In this presentation we consider the problem of estimating a prior model for spatial discrete variables from a training image. To be able to combine the estimated prior model with a likelihood for observed data, we argue that one needs a prior formulation for which an explicit formula for the prior distribution is available. Moreover, we argue that to avoid overfitting it is essential to limit the number of parameters in the prior. We propose to formulate a prior within the class of Markov mesh models, for which formulas for the point mass function are available. We define a flexible prior model within this class, where we are able to limit the number of model parameters even with a reasonably large sequential neighbourhood by restricting interactions of very high orders to be zero. To fit the Markov mesh prior to a training image we adopt a Bayesian approach in which we consider the training image as observed data. We fit the model parameters to the training image by simulating from the resulting posterior distribution, using a Gibbs sampler. We demonstrate the qualities of our approach in simulation examples.
Simultaneous Prediction of Geological Surfaces and Well Paths
Authors P. Dahle*, A. Almendral-Vasquez and P. Abrahamsen
We present a novel model for the true vertical depth (TVD) positioning error in horizontal wells. The model utilizes a non-stationary Gaussian process known as the integrated Ornstein-Uhlenbeck process. This process is continuous and exhibits the known systematic accumulation of vertical errors with increasing measured depth. The well corrections produced are smooth and maintain the underlying shape of the well path. The smoothness can be adjusted through a correlation range parameter. Using the geological constraints contained in the zonation, we can predict or simulate surface depths and well depths simultaneously. The size of the surface and well displacements is governed by the relative magnitude of the surface and well path TVD uncertainties. The resulting well paths and surfaces stay within their uncertainty envelopes and are consistent with the zonation. The surface/well relationships can be expressed as a high-dimensional truncated multivariate Gaussian distribution. We draw samples from this distribution using an efficient rejection sampling strategy that allows fields with hundreds of horizontal wells to be handled.
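The integrated Ornstein-Uhlenbeck process at the heart of the positioning-error model is straightforward to simulate by Euler stepping: the OU component is the error rate, and its running integral is the accumulated TVD error, which is smooth and grows in variance with measured depth. A hedged sketch; all parameter values are illustrative assumptions, not the authors' calibration.

```python
import numpy as np

def integrated_ou(n_steps, dt=1.0, theta=0.05, sigma=0.1, seed=0):
    """Euler simulation of an integrated Ornstein-Uhlenbeck process:
    dX = -theta * X dt + sigma dW, with Y(t) the running integral of X.
    Y is smooth and its variance grows with 'measured depth', mimicking
    the systematic accumulation of TVD error along a well."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n_steps + 1)   # OU component: instantaneous error rate
    y = np.zeros(n_steps + 1)   # integrated process: accumulated TVD error
    for k in range(n_steps):
        x[k + 1] = x[k] - theta * x[k] * dt \
                   + sigma * np.sqrt(dt) * rng.standard_normal()
        y[k + 1] = y[k] + x[k] * dt
    return y

err = integrated_ou(500)
```

Running an ensemble of such paths shows the accumulation effect the abstract describes: the spread of the error is larger deep in the well than near the tie-in point.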
Modeling the Complex Stress Field - Incorporating Stochastic Models of Natural Fractures to Improve Completion Design in Unconventional Reservoirs
Authors Y. Aimene*, J. Yarus, R.M. Srivastava and Y. Pandey
Unlike conventional reservoirs, in which dual porosity/permeability is the primary rock property needed for fluid flow, unconventional reservoir permeability is created through hydraulic fracturing, for which other key reservoir properties are needed. The success of a hydraulic fracturing job depends on the stress field around the wellbore. Unfortunately, the presence of faults and natural fractures creates a variability that can enhance the fracturing or prevent its success. Consequently, stress variability needs to be quantified to optimize the position of the fracture stages during hydraulic fracturing. This paper describes the combined use of geostatistical methods to simulate the distribution of the natural fractures and a geomechanical meshless material point method (MPM) to account explicitly for their interaction with the regional stress. The geostatistical natural fracture network used as input to the MPM geomechanical tool enables the estimation of the complex stress field map. In resource plays such as the Eagle Ford, this can directly affect stimulation design around high-density fractures. The combination of the new geostatistical natural fracture simulation method with the explicit representation of the simulated natural fractures in the MPM-based geomechanical approach is a powerful tool that could improve completion strategies in unconventional reservoirs.
Explicit Fracture Network Modelling - From Multiple Point Statistics to Dynamic Simulation
Authors T. Chugunova*, V. Corpel and J.P. Gomez
This paper addresses the problem of explicit fractured-media modelling on an operational case. On the one hand, realistic fracture models are mainly used for research purposes, in order to better investigate the flow behaviour impacted by a complex multi-scale fracture network; often a very fine grid and substantial computational time are needed. On the other hand, an operational fractured reservoir is still generally modelled using an implicit fracture-media representation: upscaled petrophysical properties and a dual medium are defined on a coarse grid to limit the computational time of dynamic simulation. The challenge of this work is to demonstrate that explicit fracture modelling is not reserved for the research domain but can be applied to an operational case study. The static model is constructed using a Multiple Point Statistics (MPS) approach in order to represent complex patterns of fracture and fault interconnection while managing uncertainty in fracture locations. Dynamic behaviour is simulated based on this realistic fracture reservoir representation. To remain parsimonious with regard to computational time, we use a volume displacement technique which allows us to keep a simple-medium assumption and optimal grid refinement.
Bayesian Generalized Gaussian Inversion of Seismic Data
Authors K. Rimstad and H. Omre*
Bayesian generalized Gaussian inversion is defined and demonstrated to have great flexibility. The model is successfully used to invert seismic AVO data, and an improvement of approximately 30% in MSE over Bayesian Gaussian inversion is obtained. The model extends easily to 3D, although the computational demands increase considerably.
Multiple-point Statistics and Bayesian Inverse Theory - Some Inspirations from Mean-field and Curie-Weiss Theories
Authors J.S. Gunning and A. Gunning
For petroleum geostatistics, modelling of rock facies is of leading-order importance: facies are the dominant predictor of flow and the dominant control on remotely sensed data. Stronger geological control is desirable, and this is effectively introduced via multipoint statistics (MPS). The obscure character of MPS algorithms has hitherto prevented their clean integration with Bayesian inverse theory. We show that expressing MPS priors in terms of Gibbs energies makes this possible, and meets the dual requirements of modelling low-entropy images well and allowing rapid probability recomputations under local perturbations. Through their close relation to "standard" Markov random field models via mean-field theory, albeit with a complex graph, their parameter inference problems are rendered easier by analogies with classical Curie-Weiss type models. The data assimilation problem leads to an NP-hard problem equivalent to constrained binary quadratic programming, even for simple priors. This gives access to newer discrete optimisation methods such as semidefinite programming (SDP). These relaxations provide remarkably good lower bounds on the optimisation and serve as helpful validation of direct heuristic methods such as annealing. Demonstration problems on seismic AVO inversion illustrate the Gibbsian MPS formulation and its successful optimisation via both SDP and annealing.
Bayesian Gaussian Mixture Linear Inversion in Geophysical Inverse Problems
Authors D. Grana*, T. Fjeldstad and H. Omre
We present a Bayesian linear inversion based on Gaussian mixture models and its application to geophysical inverse problems. The proposed inverse method is based on a Bayesian approach where we assume a Gaussian mixture random field for the prior model and a Gaussian linear likelihood function. The model for the latent discrete variable is defined to be a stationary first-order Markov chain. Here, we propose an analytical solution of the posterior distribution of the inverse problem. A sampling algorithm can be used to simulate realizations from the posterior model. Two examples of applications using real data are presented. The first example is a rock physics inversion for the estimation of facies and porosity; the second example is a seismic inversion for the estimation of facies and P-impedance. For each example, we show a set of conditional simulations, the corresponding maximum a posteriori solution and prediction intervals.
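The Gaussian linear backbone of such a mixture posterior is the standard conjugate update: in the mixture case the same formulas are applied per component, and the component weights are re-scaled by each component's marginal likelihood of the data. A minimal sketch of the single-component update follows; the two-parameter toy model and the numbers are assumptions for illustration only.

```python
import numpy as np

def gaussian_linear_posterior(mu, Sm, G, Se, d):
    """Posterior of m | d for the linear Gaussian model d = G m + e,
    with prior m ~ N(mu, Sm) and noise e ~ N(0, Se).  In the
    Gaussian-mixture setting each component is updated this way and the
    weights are re-scaled by the marginal likelihood
    N(d; G mu, G Sm G' + Se)."""
    S = G @ Sm @ G.T + Se                    # marginal data covariance
    K = Sm @ G.T @ np.linalg.inv(S)          # "Kalman gain"
    mu_post = mu + K @ (d - G @ mu)
    Sm_post = Sm - K @ G @ Sm
    return mu_post, Sm_post

# Tiny example: two model parameters, one noisy observation of their sum.
mu = np.array([0.0, 0.0])
Sm = np.eye(2)
G = np.array([[1.0, 1.0]])
Se = np.array([[0.1]])
mu_post, Sm_post = gaussian_linear_posterior(mu, Sm, G, Se, np.array([2.0]))
```

Because the observation constrains only the sum, the update splits it symmetrically between the two parameters while shrinking their posterior variances below the prior values.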
M-Factorial Kriging - An Efficient Aid to Noisy Seismic Data Interpretation
Authors J.L. Piazza*, C. Magneron, T. Demongin and N.A. Müller
The interpretation of 3D seismic data sets is often made difficult by the presence of various types of residual noise and amplitude attenuation effects. When subtle amplitude variations related to reservoir heterogeneities, fractures or fluid effects are investigated, these flaws can hamper reservoir geophysics interpretation. Several geostatistical tools have been proposed by different authors to complement seismic processing, with the advantage of being optimized for and applied to a focused subset of the 3D seismic data set. Among them, the M-Factorial Kriging technique is found to be very efficient in terms of both quality of results and turnaround times. In the case study presented in this paper, various M-Factorial Kriging models are combined in order to attenuate different components of noise and amplitude artefacts, in the interest of a better structural and stratigraphic interpretation of a deeply buried clastic reservoir.
Resolution of Reflection Seismic Data Revisited
Authors T.M. Hansen*, K. Mosegaard and A. Zunino
It is commonly accepted that layers thinner than about 1/8 of the dominant wavelength cannot be resolved from normal-incidence reflection seismic data. We demonstrate that there is, in theory, no limit to the resolution of normal-incidence reflection seismic data. The resolution of reflection seismic data is instead linked to the noise level, the parameterization and the a priori information.
Convolved Hidden Markov Models for Well-log Inversion
Authors T. Fjeldstad* and H. Omre
Bayesian inversion of convolved data from discrete profiles, for example well-log data from lithofacies well profiles, is studied and found to be feasible. The projection approximation of the likelihood model provides reliable approximate posterior models, which can be used as proposals in an independent-proposal MCMC Metropolis-Hastings algorithm to generate realizations of lithofacies profiles from the correct posterior model.
A Unified Framework for a Class of Ensemble Data Assimilation Algorithms in Reservoir History Matching Problems
By X. Luo*
In recent years, ensemble data assimilation (EnDA) algorithms such as the ensemble Kalman filter, the ensemble smoother and their iterative counterparts have received considerable attention from researchers and practitioners in petroleum engineering, due to their relative simplicity of implementation, reasonable computational costs and reliable performance. The main goal of this paper is to extract some common structure among a class of EnDA algorithms and establish a mathematical framework that can not only be used to analyze these existing methods in a unified way, but also enables new algorithm developments in the future. For illustration, in an example demonstrated in the paper, we transplant a deterministic inversion algorithm into the proposed framework and derive from it an EnDA algorithm that has been applied to the Brugge field case study. The new EnDA algorithm tends to converge faster than the original inversion algorithm. In addition, instead of obtaining a single solution as the original inversion algorithm does, the new EnDA algorithm provides an ensemble of solutions that lays the ground for uncertainty quantification. Beyond this example, we believe that one may also incorporate other deterministic inversion/optimization methods into the proposed framework and gain similar benefits.
Consistent Joint Updates of Facies and Petrophysical Heterogeneities Using an Ensemble Based Assisted History Matching
Authors R.G. Hanea*, T. Ek and B. Sebacher
In our opinion, there are two main approaches to parameterizing the uncertainties in the subsurface characterization that impact the flow behavior of the reservoir (disregarding structural and fault uncertainties). The first is to update the petrophysical properties, permeability and porosity, directly. The second is to consider the underlying trends (the rock itself) as uncertain; consequently, the facies distribution is treated as an uncertain parameter. The next logical step is to update both the facies distribution and the petrophysical properties simultaneously, without losing consistency. This paper introduces a new methodology with which we are able to consistently update the facies distributions and the petrophysical properties, whilst honoring the facies information from both well logs and seismic, without the need for an extra iteration process. The results presented use a synthetic case, a replica of a real field case in the North Sea. We compare the new methodology against the results obtained when only the facies distributions are updated using the APS methodology. We show that the new approach captures the general trend of the facies distributions, is closer to the true permeability distribution and has the same predictive power.
Simulation of Conditional Spatial Fields Using Random Mixing
Authors S. Hörning*, A. Bardossy and S. Tyson
A new method for simulating conditional spatial fields is presented which improves on the linear geostatistics currently used in commercial petroleum software. It preserves the continuity of extreme high- and low-value regions and provides an assessment of uncertainty based on the dependence of the magnitude of adjacent values. Moreover, the method is flexible enough to handle a variety of conditioning constraints, including non-linear constraints, integral equalities and inequalities.
Interactive Earth Modeling in Unconventional Reservoirs - Principles, Methods, and a Case Study from the Mississippian, Barnett Shale
Authors J. Yarus*, C. Rodriguez, J. Dahl, C. Davila and J. Spaid
Despite the decreased activity in shale resource exploitation over the last six months, long-term projections remain optimistic for a variety of reasons, including the geometrically increasing demand for energy, the need for energy independence, and the global environmental pressure for a greener energy-based economy. This presentation focuses on increasing drilling efficiency through geostatistical modeling technology and cites practical workflows and methods that have proven successful in shale development. As an example, recent drilling success in the Barnett shale is shown to be the result of stochastic modeling and the integration of key reservoir properties into a predictive "super-variable" or quality index. The result is a continuing reduction in the price per barrel of oil equivalent (BOE) toward sustainable economic levels, even in the current market. The case study presented can be extended to other shale plays and serves as an example of the practicality and effectiveness of using stochastic modeling methods to more precisely design well plans that intersect the top of the objective target early and remain in the zone throughout, avoid geohazards, identify optimal drilling targets (sweet spots), and assist in economical completion practices.
Kriging Unconventional Production Decline Rate from Geological and Completion Parameters
Authors O. Grujic* and J. Caers
In this extended abstract we propose a novel kriging-based technique for forecasting and uncertainty quantification in unconventional shale reservoirs. Our technique is data driven: we start from all available reservoir data, including high-dimensional sets of hydraulic fracturing and geological parameters, along with hydrocarbon production time series. We use functional data analysis to decompose the production time series into a low-dimensional space of functional principal component scores. This transforms the forecasting problem from a complex rate-versus-time problem into a simple regression problem of predicting functional principal component scores at new well locations. Prediction of the functional principal component scores is accomplished with the recently developed multivariate dice-kriging method. The entire technique is demonstrated on a real reservoir dataset containing 180 horizontal wells with 28 geological and fracturing parameters.
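The dimension-reduction step can be illustrated with ordinary principal component analysis as a stand-in for functional PCA: decline curves are decomposed into a mean curve plus a few components, and the resulting scores are the low-dimensional quantities that would then be kriged at new well locations. The synthetic exponential decline curves below are assumptions for illustration, not the authors' data.

```python
import numpy as np

def fpca_scores(curves, n_components=3):
    """Decompose production time series (rows = wells) into a mean curve
    plus principal components; the low-dimensional scores are what would
    be kriged at new well locations (plain PCA as a stand-in for
    functional PCA)."""
    mean = curves.mean(axis=0)
    centred = curves - mean
    # SVD of the centred data: rows of Vt are components, U*s are scores.
    U, s, Vt = np.linalg.svd(centred, full_matrices=False)
    scores = U[:, :n_components] * s[:n_components]
    comps = Vt[:n_components]
    return mean, comps, scores

# Toy decline curves q(t) = q0 * exp(-D t) for a few synthetic wells.
t = np.linspace(0, 5, 60)
rng = np.random.default_rng(1)
curves = np.array([q0 * np.exp(-D * t)
                   for q0, D in zip(rng.uniform(50, 150, 20),
                                    rng.uniform(0.2, 1.0, 20))])
mean, comps, scores = fpca_scores(curves)
recon = mean + scores @ comps
```

For smooth curve families like these, a handful of components reconstructs the original rate-versus-time data almost exactly, which is what makes the regression-on-scores formulation attractive.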
Gridless Simulation for Assessment of Volumetric Uncertainty
It is possible to assess volumetric uncertainty using a gridless geostatistical simulation method that directly simulates the shape of contour lines of continuous variables such as thickness, net-to-gross or porosity. This method is not as fast as global Monte Carlo methods, which can easily produce millions of realizations of global averages, but the number of realizations it can produce in a reasonable amount of time is considerably greater than for conventional geostatistical conditional simulation, which is typically limited to hundreds of realizations. It is an improvement on global Monte Carlo methods because it honours well data and can incorporate soft or indirect information. It is an improvement on conventional conditional simulation because it more easily honours complex information on the geometry of the oil-water contact.
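The global Monte Carlo baseline mentioned above is worth making concrete: each volumetric input is drawn from an assumed distribution and the volumetric product is evaluated many times, yielding percentile estimates but no spatial realizations and no well conditioning. A sketch with purely illustrative distributions and numbers:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10000
# Global Monte Carlo volumetrics: STOIIP = GRV * NTG * phi * (1 - Sw) / Bo,
# each factor drawn from an assumed distribution (illustrative numbers only).
grv = rng.lognormal(np.log(5e8), 0.2, n)    # gross rock volume, m^3
ntg = rng.uniform(0.5, 0.8, n)              # net-to-gross
phi = rng.normal(0.22, 0.02, n)             # porosity
sw = rng.uniform(0.2, 0.4, n)               # water saturation
bo = 1.2                                    # formation volume factor
stoiip = grv * ntg * phi * (1.0 - sw) / bo  # stock-tank oil in place, m^3
p10, p50, p90 = np.percentile(stoiip, [10, 50, 90])
```

This shows why the method is fast (one multiplication per realization) and also why it cannot honour well data or contact geometry: there is no spatial field, only global averages.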
Local Geostatistical Filtering Using Seismic Attributes
Authors R. Meunier*, H. Binet and L. Peignard
Factorial kriging, or kriging with filtering (Matheron, 1982), is used on post-stack or pre-stack seismic datasets to filter out unwanted components from the seismic signal. To account for the non-stationarity often encountered within seismic datasets, kriging parameters can be set locally using Local Geo-Statistics (LGS) or M-GS (Moving GeoStatistics) (Magneron, 2009). There are several approaches to computing the optimized parameters: local variogram parameters in adjacent areas, automatic cross-validation techniques and morphological analysis. This paper focuses on the latter approach. The idea is to determine interesting characteristics of a seismic image that can then be transformed into local kriging parameters for the variogram model and the neighbourhood extension. Mathematical morphology techniques provide a set of tools for analysing the image; however, they are not well known to geophysicists, who are more familiar with seismic attributes. A seismic attribute is a quantity extracted or derived from seismic data that can be analysed in order to enhance the information in a seismic image. The advantage of using seismic attributes is that they are available in common geophysical interpretation software packages. Surprisingly, there are few references to the application of such attributes for deriving the local parameters of geostatistical filters.
Channel Simulation Using L-system, Potential Fields and NURBS
Authors G. Rongier*, P. Collon, P. Renard and J. Ruiu
Channelized environments have huge implications in many fields, from hydrogeology to mineral resources and geotechnics. Their modeling is therefore of prime importance. However, some of their characteristics make this a difficult task. This is especially the case for their high continuity, which is arduous to preserve while ensuring data conditioning. We propose to rely on a formal grammar system, the Lindenmayer system or L-system, to stochastically generate the channel morphologies resulting from the deposition processes. The L-system considers a channel as a succession of channel elements and puts those elements together based on user-defined rules and parameters, such as the element size or the angle between two consecutive elements. The succession of elements is then interpreted to generate non-uniform rational B-splines (NURBS) representing straight to highly meandering channels. Conditioning to hard and soft data is done through the use of potential fields that define attractive or repulsive forces toward the data. L-systems appear to be highly flexible in the generation of various channel morphologies. Preliminary results show that the method manages to simultaneously honor conditioning data and best preserve the channel variations defined by the parameters.
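The rewriting mechanics of an L-system fit in a few lines: a set of production rules is applied repeatedly to an axiom string, and the resulting symbols are then interpreted geometrically (here, as channel elements). The toy alphabet and rules below are assumptions for illustration, not the authors' grammar; a stochastic variant would pick randomly among several productions per symbol.

```python
def lsystem(axiom, rules, n):
    """Apply Lindenmayer rewriting rules n times to the axiom string.
    Symbols are later interpreted geometrically, e.g. F = one channel
    element stepping forward, +/- = turn left/right (toy alphabet)."""
    s = axiom
    for _ in range(n):
        # Rewrite every symbol that has a production rule; keep the rest.
        s = "".join(rules.get(c, c) for c in s)
    return s

# Two rewriting passes of a single zig-zag production.
channel = lsystem("F", {"F": "F+F-F"}, 2)
```

In the paper's setting the final string would be interpreted as a succession of channel elements and fitted with a NURBS curve; the grammar controls how straight or meandering the result is.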
Taking into Account Uncertainty on Fairway Borders for Lateral Offset Stacked Channels (LOSCs) Simulations
Authors D. D'Or*, E. Braccini and P. Biver
Among process-based methods, the Lateral Offset Stacked Channels (LOSCs) simulation method has been specifically designed to simulate logical chronological sequences of channels. The last (most recent) channel observed on spectral maps is fitted with a B-spline. To simulate the older channels in the sequence, the control points of the B-spline are moved along a parabolic path towards the fairway centerline, thus modifying the original B-spline in a logical way. In this paper, we place particular emphasis on the uncertainty in the fairway borders and show how actual borders corresponding to various scenarios can be drawn and how the channel simulation is adapted accordingly. In particular, we show on a field case that a larger fairway results in more meandering channels than a narrower one. All channels in a sequence are consistent with each other and can be conditioned to well data.
Hierarchical Parameterisation and Modelling of Deep-water Lobes
Authors L. Zhang, T. Manzocchi* and A. Pontén
Recent studies characterizing outcrop analogues of deep-water lobe reservoirs have demonstrated that these deposits are stacked in a hierarchical manner, with characteristic element scales that vary between turbidite systems. Static connectivity and dynamic numerical modelling studies of idealized, non-hierarchical lobe systems have identified the presence of discrete net-to-gross or amalgamation-ratio thresholds that control connectivity and flow. The objectives of the current work are to compile a geostatistical description of the hierarchy; to develop a hierarchical modelling scheme able to apply these conceptual and quantitative constraints; and, from these, to assess the importance of honoring the hierarchy in reservoir modelling.
Preserving Geological Realism of Facies Estimation on the Brugge Field
Authors Y. Chang*, A.S. Stordal and R. Valestrand
Facies estimation using ensemble-based methods is a popular and challenging problem in reservoir history matching. The challenges come from the difficulty of handling the discrete facies variables and of preserving geological realism in the facies models updated by ensemble-based methods. This work proposes the use of a normal score transformation as the facies parameterization, coupled with the Iterative Adaptive Gaussian Mixture (IAGM) filter, to estimate the facies and non-facies variables simultaneously in complex reservoirs. We present the novel idea of using dummy wells to condition the facies modeling process for continuous channel regeneration. The overall workflow is an interaction between the data assimilation and the facies property modeling process. The proposed workflow is demonstrated on the Brugge field case, and the data assimilation results provide geologically realistic facies models with a better match of historical production data.
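The normal score transformation used as the parameterization here is a rank-based mapping: each value is replaced by the standard normal quantile of its empirical CDF position, so that Gaussian machinery (kriging, ensemble updates) can operate on the transformed variable. A minimal sketch, assuming the common (rank - 0.5)/n plotting position; the authors' exact convention may differ.

```python
import numpy as np
from statistics import NormalDist

def normal_score_transform(x):
    """Rank-based normal score transform: map values to standard normal
    quantiles through their empirical CDF.  The transform is monotone,
    so the ordering of the original values is preserved."""
    x = np.asarray(x, dtype=float)
    ranks = x.argsort().argsort() + 1      # ranks 1..n
    p = (ranks - 0.5) / len(x)             # plotting positions in (0, 1)
    nd = NormalDist()
    return np.array([nd.inv_cdf(pi) for pi in p])

y = normal_score_transform([3.0, 1.0, 4.0, 1.5, 9.0])
```

The inverse mapping (back-transform) sends updated Gaussian values back to the original facies-indicator or property scale after the ensemble update.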
Seismic Wavelet Estimation and Uncertainty Quantification Using a Parametric Model
Authors J. Skauvold*, J. Eidsvik and U. Theune
Estimation of seismic wavelets from seismic amplitude data and reflection coefficient series is a difficult problem with numerous solutions. Here, the wavelet is represented in a parametric form, and the parameters are estimated by Bayesian inversion. Parametrising gives increased parsimony and interpretability, but reduces flexibility compared with more generic estimation approaches.
Multiscale Geostatistical History Matching Using Block Direct Sequential Simulation
Authors C. Marques*, L. Azevedo, V. Demyanov, A. Soares and M. Christie
This paper proposes a multiscale geostatistical history matching methodology using a new algorithm. The challenge of this work is the development and application of an algorithm that couples different geological scales by means of Block Direct Sequential Simulation. In order to speed up the history matching procedure, we first optimize the reservoir model on a very coarse grid, which is then used as an auxiliary model to perform the history matching at a very fine scale. This workflow is expected to deliver better coarse and fine solutions: an improved and more reliable reservoir model, with a significant reduction in computing time and good integration of dynamic data into the static model. We demonstrate this novel approach on a challenging synthetic case study based on a fluvial environment.
Flexible Objects - A Way to Generate more Realistic Object-based Simulations
Authors A. Walgenwitz*, D. Allard and P. Biver
Object-based simulation is revisited by handling objects with complex geometry, to overcome the lack of realism of standard object-based simulation. GPGPU capabilities (General-Purpose Computing on Graphics Processing Units) are used to perform the intensive numerical computations and geometric queries. The case study presented to illustrate the technique is the modeling of the internal architecture of a lobe complex in a reservoir in Western Africa.
Acceleration of Stochastic Seismic Inversion in OpenCL-based Heterogeneous Platforms
Authors T. Ferreirinha, R. Nunes, L. Azevedo*, A. Soares, F. Pratas, P. Tomas and N. Roma
The recently proposed geostatistical seismic AVO inversion algorithm uses direct sequential simulation and co-simulation as the model parameter space perturbation technique. To reduce the execution time of this iterative geostatistical inversion procedure, a simplified version of the sequential simulation algorithm was previously parallelized to exploit multi-core Central Processing Units (CPUs). By applying a straightforward functional decomposition of the algorithm, an acceleration of up to 3.5x was observed on a quad-core CPU. That solution is limited not only in scalability but also in its capacity to exploit modern heterogeneous computing systems composed of multiple processors. An efficient parallelization of the geostatistical seismic AVO inversion algorithm is proposed here, targeting highly heterogeneous platforms composed of several devices with different computational capabilities. Such a flexible solution is achieved by using the OpenCL API, allowing each part of the algorithm to be easily migrated among the several coexisting CPUs and GPUs.
Simulation of Facies Uncertainty in Field Development
Authors C. Dubreuil-Boisclair*, T. Ek, R. Hanea, C. Otterlei, E. Zachariassen and B. Massart
The aim of this paper is to assess the use of a recent facies modelling method (Adaptive Plurigaussian Simulation, or APS) for facies-uncertainty analysis and sensitivity studies in a complex field-development project. This method provides multiple facies realizations that honour all prior data (well, seismic and geological concept) and can be updated by conditioning to dynamic data, using ensemble methods, when production starts. The method is set up as part of an integrated and automated workflow (Fast Model Update, or FMU) for reservoir modelling and characterization implemented in Statoil. We demonstrate that the use of APS within an FMU workflow allows fast simulation of facies sensitivities that are reflected in the production profiles.
Testing Alternative Geological Heterogeneity Representations for Enhanced Oil Recovery Techniques
Authors E. Tamayo-Mas*, H. Mustapha and R. Dimitrakopoulos
This paper analyzes the effects of the geological heterogeneity representation in a producing reservoir when different stochastic simulation methods are used, assessing the consequent effects on flow responses for different EOR techniques. First, the spatial heterogeneity of a fluvial reservoir is simulated using three different stochastic methods: (1) the well-known two-point sequential Gaussian simulation (SGS), (2) a multiple-point filter-based algorithm (FILTERSIM), and (3) a new high-order simulation method that uses high-order spatial statistics (HOSIM). Numerical results show that SGS is unable to describe the highly permeable channel network, whereas FILTERSIM better reproduces this connectivity. With the recent HOSIM, a more appropriate description of the curvilinear high-permeability channels is obtained. Second, the realizations generated above are used as permeability fields in EOR numerical simulations. In particular, four different methods are considered: (1) surfactant, (2) polymer, (3) alkaline-surfactant-polymer and (4) foam flooding processes. The numerical results show that properly reproducing the main geological features of the reference images has a higher impact when surfactant or alkaline chemicals are injected than when polymer, which acts by reducing the reservoir permeability, or foam, which mitigates the heterogeneities caused by higher-permeability layers, is introduced.
-
-
-
Indicator Variogram Models - Do we Have much Choice?
Authors O. Dubrule* and P. Sung
With indicator simulation techniques that do not specify the underlying random set model, there is a risk of using an indicator variogram model that does not correspond to any known random set model. Such a model may be an invalid indicator model even though it is a valid model for a multigaussian variable. Surprisingly, while there are known necessary conditions for an indicator variogram model to be valid, there is no proven sufficient condition. If we make an inventory of known variogram models that remain valid as 3D indicator variogram models, the only one to emerge unchallenged seems to be the exponential model. We take this opportunity to review the properties of the exponential model in the context of indicator co-simulation, building on the work of Carle (1996). We see that interpreting variograms through the transiogram model allows a much easier geological interpretation of these models, and that there is some flexibility in using exponential models thanks to the addition of dampened hole-effect models in 1D.
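The exponential indicator variogram and its transiogram reading (in the spirit of Carle 1996) can be sketched numerically; the facies proportion and range values below are illustrative assumptions, not figures from the paper:

```python
import math

def exp_indicator_variogram(h, p, a):
    """Indicator variogram for a facies with proportion p and
    effective range a, using the exponential model (sill = p(1-p))."""
    return p * (1.0 - p) * (1.0 - math.exp(-3.0 * h / a))

def auto_transiogram(h, p, a):
    """Auto-transiogram t_AA(h): probability of remaining in facies A
    over lag h; it decays from 1 at h=0 towards the proportion p."""
    return p + (1.0 - p) * math.exp(-3.0 * h / a)
```

The identity gamma_AA(h) = p * (1 - t_AA(h)) links the two views, which is what makes the transiogram reading of a fitted exponential indicator variogram geologically straightforward.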
-
-
-
Recent Advances for Facies Modelling in Pluri-gaussian Formalism
Authors P.Y.A. Biver*, D. Allard, F. Pivot and P. Ruelland
The Pluri-Gaussian simulation (PGSim) formalism was initiated in the nineties (see Le Loch and Galli 1997). This methodology has been used in numerous mining and petroleum industry applications. In this paper, we revisit the technique and present some recent extensions. The first extension concerns the truncation (or assignation) diagram: using the technique developed in Allard et al. (2012), it is possible to build complex diagrams (not just composed of rectangles) and to adjust them to target proportions. This provides new modelling possibilities with fit-for-purpose diagrams, and applications reproducing complex shapes similar to those observed in nature are presented. The second extension enables the use of the Pluri-Gaussian formalism in an estimation context. The idea of truncating an existing continuous variable (generally a geophysical attribute) to obtain a facies model was proposed in Biver et al. (2012). We present here the generalization of this formalism to the Pluri-Gaussian context, which allows us to perform a facies estimation consistent with two geophysical attributes.
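A minimal sketch of the truncation idea behind PGSim, using a simple rectangular diagram; the facies names and thresholds are hypothetical, and a real application would use correlated Gaussian fields and a diagram adjusted to target proportions:

```python
import random

def truncate(z1, z2, t1=0.0, t2=0.5):
    """Map a pair of Gaussian values to a facies code via a simple
    rectangular truncation diagram (thresholds are illustrative)."""
    if z1 < t1:
        return "shale"
    return "sand" if z2 < t2 else "carbonate"

# Uncorrelated Gaussian pairs stand in for two simulated Gaussian fields.
random.seed(0)
facies = [truncate(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(20000)]
```

With these thresholds, the implied proportions are P(shale) = 0.5 and P(sand) roughly 0.35; adjusting the diagram to honour target proportions is exactly the fitting problem the first extension addresses.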
-
-
-
Modeling Three Ways from Electro-facies - Categorical, E-facies Probabilities, and Petrophysics with Assignment
Authors D.L. Garner*, J. Yarus and M. Srivastava
Geomodeling for petroleum reservoirs is conventionally done hierarchically by facies to establish regions within which rock and fluid properties can be considered “stationary.” Many reservoir models do not use depositional facies descriptions, but use “electro-facies” created by clustering petrophysical log curves. This paper compares three approaches to the development of e-facies geomodels: 3D models of categorical codes to be used as stationary domains within which rock and fluid properties can be simulated. The first approach uses e-facies codes developed through cluster analysis as conditioning data and uses a method for simulating categorical variables, plurigaussian simulation, to directly build a 3D model of the e-facies. The second approach uses petrophysical logs at the wells as conditioning data and uses a standard method for co-simulating continuous variables to build 3D models of the log responses; these are then converted to e-facies using the rules developed through cluster analysis. The third method works directly with the e-facies probabilities that most cluster analysis techniques can provide. These probabilities are co-simulated as continuous variables in 3D, ensuring they are bounded and sum to 1, and a unique e-facies code is assigned by taking the e-facies with the maximum probability at each location.
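The assignment step of the third approach can be sketched as follows; the clip-and-renormalize scheme is one simple way to enforce the bounds and unit sum, not necessarily the authors' exact procedure:

```python
def assign_efacies(probs):
    """Take co-simulated e-facies probabilities at one location,
    force them to be bounded and sum to 1, then assign the code
    with the maximum probability (ties resolved arbitrarily)."""
    clipped = {code: max(0.0, min(1.0, p)) for code, p in probs.items()}
    total = sum(clipped.values())
    if total == 0.0:
        raise ValueError("all probabilities vanish at this location")
    normalized = {code: p / total for code, p in clipped.items()}
    return max(normalized, key=normalized.get), normalized
```

Applied cell by cell over the 3D grid, this converts the co-simulated continuous probability fields into a single categorical e-facies model.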
-
-
-
Integrating Spatio-temporal Rules for Surface-based Reservoir Modeling
Authors Y. Wang and T. Mukerji*
A rule induction methodology is demonstrated to generate surface-based models of channel-network systems by transforming numerical geological process outputs into statistical rules controlling the channel network simulation. The methodology is applicable not only to outputs from numerical simulators but also to satellite images of channel network systems. We use a multi-scale line tracking algorithm (MSLTA) to extract the channel network. Based on the distribution of bifurcation points, realizations of network skeletons are generated using a space colonization algorithm. Finally, geobodies and surfaces are added around the skeleton. Surface-based modeling and rule-based algorithms show promise in capturing realistic geometric evolution of facies geobodies without the need for computationally expensive numerical solutions of process-based models, provided good rules are available to guide the algorithm. Process-based model results and physical tank experiments serve as good 'warehouses' of statistics and rules for rule-based modeling.
-
-
-
Geostatistical Simulations on Irregular Reservoir Models Using Methods of Nonlinear Geostatistics
Authors V.N. Zaytsev*, P. Biver, H. Wackernagel and D. Allard
Classical approaches to geostatistical simulation are not directly applicable to irregular reservoir models (such as Voronoi polygon and tetrahedron meshed models). One of the main difficulties is that the block marginal distributions are unique for every block due to the volume support effect. We propose a methodology for geostatistical simulations which overcomes this difficulty in an analytical manner and provides a robust utilization of the small-support petrophysical property distribution and the covariance model for irregular reservoir models. The proposed solution is based on the discrete Gaussian model (DGM) and operates directly on blocks of the target grid. This solution is also capable of improving the quality of classical reservoir models, such as tartan meshes, by taking the volume support effect into consideration and thus providing geologically more realistic results. Applications to a Voronoi polygon grid with local grid refinements and to a tartan-meshed offshore gas reservoir model are demonstrated.
-
-
-
Geostatistics on Unstructured Grid - Coordinate Systems, Connections and Volumes
Authors H. Gross* and A.F. Boucher
The current trend in reservoir simulators is to solve transport of energy and flow equations on unstructured grids. This work addresses three challenges for geomodelers wanting to perform geostatistics directly on unstructured grids: (1) the large range of cell volumes and their arbitrary shapes, (2) the complex cell topology and its need for explicit connection specifications and (3) the use of alternate coordinate systems to better conform to geological features. This work uses triangular, hexagonal, and tetrahedral 3D grids to demonstrate the adaptation of geostatistical algorithms to unstructured grids. Hard data conditioning and locally varying trends are also addressed. Finally, we present three approaches to simulating on unstructured grids: (1) simulation on cell centroids, (2) simulation on a finely discretized grid followed by upscaling and (3) direct cell geostatistical algorithms.
-
-
-
An Unstructured Depositional Grid for Property Modeling
Authors A. Benabbou*, C. Daly, L. Macé, A. Levannier and C. Buchholz
The presentation first develops a transform from geological space to depositional space based on large-deformation elasticity. A depositional grid is constructed using this mapping. It is shown that this grid, even though it is unstructured and capable of modeling complex geology, can be used directly in geostatistical modeling without the upscaling step usually required for performant unstructured grids.
-
-
-
Value of Information Analysis of Geophysical Data for Drilling Decisions
Authors J. Eidsvik*, T. Mukerji, D. Bhattacharjya and G. Dutta
Value of Information (VOI) analysis is conducted for different geophysical data (seismic and electromagnetic data). VOI is a concept in decision theory for analysing the value of obtaining additional information before purchasing and revealing the data. Gathering the right kind and amount of geophysical information is crucial for resolving difficult reservoir decision situations. We focus on drilling decisions, and structure situations according to the spatial decision alternatives and value function complexity. Geostatistical modeling plays an integral part in the prior decision making, and in the pre-posterior evaluation of the various data gathering schemes.
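The core VOI computation can be sketched for the simplest case of perfect information about a binary drilling outcome; all figures are hypothetical, and the paper treats far richer spatial alternatives and imperfect geophysical data:

```python
def value_of_information(p_success, value_success, cost):
    """VOI of perfect information for a binary drill decision.
    Without data, drill only if the expected value is positive;
    with perfect data, drill only on a predicted success."""
    prior_value = max(0.0, p_success * value_success - cost)
    posterior_value = p_success * max(0.0, value_success - cost)
    return posterior_value - prior_value
```

Comparing this VOI against the price of acquiring the survey is what decides whether gathering the data is worthwhile before revealing it.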
-
-
-
Uncertainty Assessment Driven Exploration Program Modification - A Case Study
Authors O.A. Gorbovskaia* and B.V. Belozerov
Over the last decade, multiple-realization geological modelling has become widely used practice in the petroleum industry. Integrated analysis of the available geological and geophysical data via multivariate modelling may be applied not only to probabilistic oil-in-place (OIP) estimation but may also help to better understand the reservoir and associated risks, and therefore affect exploration and development decisions. The work deals with a case study of exploration planning focused on correcting the OIP assessment. It shows an example of applying multivariate 3D geological modelling to forecast the results of alternative exploration options. The approach discussed allows quantitative assessment of the effect of drilling a well at alternative locations on the reduction in uncertainty. Proactive analysis of expected outcomes is thought to be a powerful tool for improving an exploration program from the standpoint of both expense optimization and risk reduction.
-
-
-
Efficient Optimization of Exploration Drilling Campaigns with Convergent Information Bounds
Authors M. Lilleborge*, J. Eidsvik and R. Hauge
We discuss a Bayesian Network (BN) with 42 nodes, of which 25 are leaf nodes that represent actual petroleum prospects in the North Sea where we could choose to collect data. We look at the case where the data gathering is carried out as a seasonal campaign with m exploration wells, and the question is: where should they be drilled? The complexity of the problem is such that for large m or for larger networks, the optimal observation set problem is not computationally feasible through exact calculations. We introduce a method for computing upper and lower bounds, inspired by the Junction Tree Algorithm, to do a more efficient search for the optimal drilling campaign. Our algorithm improves the bounds until they are sufficiently tight, and our construction of the upper and lower bounds results in an extremely efficient search for the optimal observation set for exploration wells in the North Sea petroleum prospect case.
-
-
-
Can Geostatistical Models Represent Nature’s Variability? An Analysis Using Flume Experiments
Authors C. Scheidt*, A. Fernandes, C. Paola and J. Caers
One of the difficulties in multi-point geostatistics (MPS) is the definition of the training image (TI). In the context of uncertainty modeling, the construction of a set of TIs is desirable, but the number of TIs and the characteristics that should be varied in them are not well understood. In this research, we explore the question of the definition of TIs using tank experiments. A set of snapshots of delta deposits seen in the tank is used to explore the variability of the system over time and to see if MPS can reproduce the variability of the set of images using only a few, well-selected images taken as TIs. Preliminary methodologies are explored to select representative images, where the variation of the deposits over time is studied. Our results show that MPS was able to reproduce the variability in the full set of images, and hence the variability of the studied system. Analyzing the characteristics of the selected images is a first step in the attempt to define TIs. This study presents only preliminary investigations, and more general answers will be explored.
-
-
-
Role of Geostatistical Techniques on the Evolution of 3D Reservoir Interpretation and Modelling Outcomes - An Example
Authors N. Orellana*, I. Yemez, J. Cavero, V. Singh and E. Izaguirre
One of the key challenges in 3D reservoir modeling is distributing the identified facies and their properties in a defined 3D framework while honoring geology and the available data. Different geostatistical techniques, with different inputs and assumptions, are used to distribute reservoir facies and properties in 3D reservoir models. These techniques constrain 3D reservoir models to local data, which should represent the geological knowledge and help in creating appropriate flow behaviours through dynamic simulation. However, the simulation results are highly dependent on the available input data and on the geomodeler’s knowledge and experience. To capture the geological model evolution from discovery to development phases, and to assess the influence of different simulation techniques (SIS, TGS, Object Based, Multipoint) on the reservoir facies and properties distribution, a geological model was built for a clastic reservoir and updated as new data/information became available. The modeling results (original hydrocarbon in-place and recoverable resources/reserves and production forecasts) show significant variations across the different phases of the project. If sufficient data, appropriate data QC, geological rules, mapping principles and geostatistics are not properly applied to capture the full possible range of parameters and geological scenarios in the 3D modeling process, the results will be highly uncertain and will affect decisions.
-
-
-
MPS Application in Carbonate Field and Joint Probability Updating of Trend and Depositional Scenario
Authors C. Corradi*, M. Bazzana, A. Da Pra, M. Pontiggia, J. Caers and C. Scheidt
In the late exploration to early appraisal phases of a reservoir, dynamic data are usually not available; only information from seismic and a few wells is on hand for characterizing the reservoir. As a result, a high degree of uncertainty is present. A new workflow has been developed for rapid model updating with well data, to produce more realistic uncertainty quantification. In this work, we present a specific methodology for updating the probabilities of key uncertain parameters involved in reservoir modelling when new data become available. The main concept is to use a Bayesian framework to update prior probabilities of key geological parameters, in this case depositional scenario and spatial trend, when new data from drilling are obtained. The application of the workflow to a real carbonate field, characterized by a high degree of uncertainty, is presented. This method could give a substantial improvement, especially for green fields with considerable uncertainty in the depositional system, proportions and trend. Furthermore, it could increase overall efficiency by speeding up the process from geological modelling to reservoir dynamic modelling without losing accuracy, especially regarding uncertainty management.
-
-
-
Keynote - Geologically Consistent History Matching Using the Ensemble based Methods
By Y. Chen*
Ensemble-based data assimilation methods have shown great potential for automatic history matching since being introduced to the petroleum industry a decade ago. Over that decade, this family of methods has evolved from the initial ensemble Kalman filter to the iterative ensemble smoother (perhaps the most promising one to date for history matching). The updating scheme of the ensemble-based methods relies on the covariance, which is ideal for model parameters that are typically assumed Gaussian, e.g. log-transformed horizontal permeability (log-permx) and porosity for a given facies type. The geological models for most reservoirs now typically have multiple facies types, each with distinct petrophysical properties. In this case, the joint distribution of log-permx over the entire grid is no longer multivariate Gaussian. If the gridblock log-permx is directly updated in the ensemble-based methods, facies boundaries are smeared out, and the updated log-permx often shows values outside the normal range. In order to maintain the geological realism of updated models and increase their predictability, different transformation and parameterization methods have been used with ensemble-based data assimilation to account for the presence of multiple facies types in the model. In this talk, I will review these methods with examples and discuss the advantages and limitations of each.
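The covariance-based updating scheme at the heart of these methods can be sketched for a single scalar observation; this is a toy version, whereas real implementations update full gridblock fields and use iterative smoothers:

```python
import random

def ensemble_update(states, forecasts, obs, obs_var):
    """Ensemble Kalman-type update for one scalar observation: each
    member is shifted by gain * (perturbed obs - forecast), with the
    gain built from ensemble (cross-)covariances -- the linear step
    that can smear facies boundaries when states are gridblock log-permx."""
    n = len(states)
    ms, mf = sum(states) / n, sum(forecasts) / n
    cov_sf = sum((s - ms) * (f - mf) for s, f in zip(states, forecasts)) / (n - 1)
    var_f = sum((f - mf) ** 2 for f in forecasts) / (n - 1)
    gain = cov_sf / (var_f + obs_var)
    return [s + gain * (obs + random.gauss(0.0, obs_var ** 0.5) - f)
            for s, f in zip(states, forecasts)]
```

Because the update is a linear shift driven by covariances, a bimodal (multi-facies) parameter field is pulled towards intermediate values, which is exactly the motivation for the transformation and parameterization methods reviewed in the talk.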
-
-
-
Bayesian Inversion of Time-lapse Seismic Data Using Bimodal Prior Models
Authors I. Amaliksen* and H. Omre
The objective is to make inference about reservoir properties from seismic reflection data. The inversion problem is cast in a Bayesian framework, and bimodal prior models are defined in order to honor the bimodal behavior of the saturation variable. By using a Gauss-linear likelihood model, explicit expressions for the posterior models are obtained through the convenient properties of the family of Gaussian distributions. The posterior models define computationally efficient inversion methods that can be used to predict the reservoir variables while providing an uncertainty assessment. The inversion methodologies are tested on synthetic seismic data with respect to porosity and water saturation at two time steps. Encouraging results are obtained under realistic signal-to-noise ratios.
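The Gauss-linear conjugacy exploited here is easiest to see in the scalar case; the paper's bimodal priors would mix such Gaussian components, so this is only the single-component building block:

```python
def gauss_linear_posterior(mu0, var0, g, d, var_e):
    """Explicit posterior for a scalar Gaussian prior m ~ N(mu0, var0)
    under a Gauss-linear likelihood d = g*m + e, e ~ N(0, var_e).
    Returns the posterior mean and variance in closed form."""
    k = var0 * g / (g * g * var0 + var_e)
    return mu0 + k * (d - g * mu0), var0 - k * g * var0
```

Because the posterior is available in closed form, the inversion needs no iterative sampling, which is what makes the methodology computationally efficient while still delivering an uncertainty assessment.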
-
-
-
Fast Model Update Coupled to an Ensemble based Closed Loop Reservoir Management
Authors J.A. Skjervheim, R.G. Hanea* and G. Evensen
An integrated approach to reservoir modeling is required if the geomodel is included in the conditioning process, and a fast, consistent and automated model chain workflow from structural modeling to flow simulation needs to be established. In this paper we demonstrate the integrated workflow, named Fast Model Update (FMU), on a real field application, and show how FMU can be coupled to a closed-loop reservoir management process. An automated modeling process allows for working with multiple realizations and performing combined static and dynamic uncertainty studies, where geological uncertainties are consistently propagated all the way to simulation. The use of multiple realizations allows for statistical “ensemble methods” for big-loop model conditioning, where any uncertain parameter that is input to the model workflow can be updated (e.g., channel direction, facies probability, seismic velocity model, structural surfaces). Working with multiple model realizations in FMU allows for robust reservoir management and well planning in which the geological uncertainty is taken into account. Decisions can be made and wells drilled at reduced risk by using a better representation of all uncertainties.
-
-
-
Quantifying the Uncertainty in the Facies Probability Cubes Using an Ensemble Kalman Filter Methodology
Authors B. Sebacher*, R.G. Hanea and T. Ek
The work presented here introduces a new framework in the plurigaussian simulation context which takes into account the uncertainty in the prior probability cubes. At the end of the assimilation process, we are able to offer, besides an update of the facies fields, an update of the facies probability cubes. To achieve this, we extend the adaptive plurigaussian simulation methodology to work with multiple realizations of facies probability cubes and afterwards condition the facies fields to the production data. We generate an ensemble of facies fields by means of an ensemble of pairs of Gaussian fields, each pair simulating a facies field from a different family of probability cubes. In the plurigaussian methodology, the Gaussian fields represent the parameterization of the facies fields and are the parameters updated by the assisted history matching (AHM) process. The updated facies fields are generated with the updated values of the Gaussian fields. In addition, we use the updated values of the Gaussian fields to reconstruct the updated facies probability cubes. In our example, the ensemble of prior facies probability cubes is created by perturbing a single realization with noise correlated with a given prior uncertainty.
-
-
-
Framework for Seismic Inversion of Full Waveform Data Using Sequential Filtering
Authors M. Gineste* and J. Eidsvik
Subsurface velocity inversion using full waveform modelling continues to be a challenging problem. Instead of approaching it as a deterministic optimization problem, it is here formulated probabilistically as a filtering problem, using shot data sequentially to update the estimation procedure. Such an approach has the potential to be more robust to, e.g., noise and the starting guess, but comes at the cost of more forward model evaluations. We present a small synthetic example of using a sequential filtering method, the Ensemble Kalman Filter, for seismic velocity inversion, and conclude with an outline of directions for further investigation.
-
-
-
Integration of Seismic and Well Data to Characterize Facies Variation in a Carbonate Reservoir - The Tau Model Revisited
Authors M. Elahi Naraghi and S. Srinivasan*
In this paper, we present a novel method of data integration based on the permanence of ratios hypothesis. To model the conditional probability, it is convenient if the information from each data source can be assessed independently in order to find P(A|B) and P(A|C); these probabilities are then merged to calculate P(A|B,C), accounting for the redundancy between the data sources. We propose a methodology for calculating the redundancy between different sources of information. Our formulation models the information from each data source using a Gaussian mixture, indicative of the multiple facies or categories of rock properties observed in the reservoir. We applied the proposed methodology to characterize a carbonate reservoir in the Gulf of Mexico. The available data sets were drill cutting data, core data, well log measurements and a 3D seismic volume. We used core data to calibrate log measurements to lithofacies. Then, we merged the probability maps of lithofacies using the permanence of ratios hypothesis and generated multiple realizations by Monte Carlo sampling from the probability maps. The modeling resulted in the identification of reservoir regions with a higher proportion of dolomitized grainstones that might be suitable drilling targets.
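The permanence of ratios combination (the tau model with tau = 1, i.e. no redundancy correction) can be sketched in odds form:

```python
def permanence_of_ratios(p_a, p_a_b, p_a_c):
    """Merge P(A|B) and P(A|C) into P(A|B,C) under the permanence of
    ratios hypothesis (tau model with tau = 1); the redundancy between
    B and C would be handled by departing from tau = 1."""
    a = (1.0 - p_a) / p_a          # prior odds against A
    b = (1.0 - p_a_b) / p_a_b      # odds against A given B
    c = (1.0 - p_a_c) / p_a_c      # odds against A given C
    x = b * c / a                  # combined odds against A
    return 1.0 / (1.0 + x)
```

A data source that matches the prior (P(A|B) = P(A)) leaves the other source's information untouched, which is the hypothesis's defining property; the paper's contribution is estimating the redundancy between sources, i.e. moving away from the tau = 1 default.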
-
-
-
An Approximate Bayesian Inversion Framework based on Local-gaussian Likelihoods
Authors M. Jullum* and O. Kolbjørnsen
We derive a Bayesian statistical procedure for inversion of geophysical data to rock properties. For simplicity, the procedure is presented in the seismic AVO setting, where rock properties influence the data through elastic parameters; the framework may, however, easily be extended. The procedure combines sampling-based techniques and a compound Gaussian approximation to assess local approximations to marginal posterior distributions of rock properties, on which the inversion is based. The framework offers a range of approximations in which inversion speed and accuracy may be balanced. The approach is also well suited for parallelisation, making it attractive for large inversion problems. We apply the procedure to a 4D CO2 monitoring case with a focus on predicting saturation content. Promising results are obtained for both synthetic and real data. Finally, we compare our method with regular linear Gaussian inversion for density prediction, where our method gives an improved fit.
-
-
-
Forecasting Production Decline Rate in Unconventional Resources by Kriging of Functional Data
Authors A. Menafoglio, O. Grujic* and J. Caers
The world, and in particular the USA, has seen an explosion in the development of unconventional shale resources. In these reservoirs, drilling and production occur at development times orders of magnitude shorter than in conventional resources. As a result, decisions about where to drill and how to complete wells (hydraulic fracturing) need to be made in almost real time, rendering the more traditional modelling approaches of geostatistics and flow modelling impractical. In this abstract, we present a novel approach that uses the existing production data in a shale play to interpolate production decline rates for newly proposed wells. We develop these methods using novel techniques in statistical modelling based on the kriging of functional data, and compare a variety of methods applied to the Barnett shale reservoir. Our Barnett dataset comes from publicly available databases (such as drillinginfo.com), and we considered the first 60 months of production in our study. Production profiles and well locations from 456 wells in our dataset were used for training, with the aim of forecasting the remaining 456 wells that were not used for training (the test set).
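In its simplest form, kriging of functional data computes ordinary kriging weights from the well locations and applies the same weights to entire decline curves; the exponential covariance and range below are illustrative assumptions, not the paper's fitted model:

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting for small systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def ok_weights(locs, target, a=2000.0):
    """Ordinary kriging weights (with Lagrange multiplier) under an
    exponential covariance; range a is an illustrative assumption."""
    cov = lambda p, q: math.exp(-math.dist(p, q) / a)
    n = len(locs)
    A = [[cov(locs[i], locs[j]) for j in range(n)] + [1.0] for i in range(n)]
    A.append([1.0] * n + [0.0])
    b = [cov(p, target) for p in locs] + [1.0]
    return solve(A, b)[:n]

def krige_curve(curves, weights):
    """Interpolated decline curve: weighted sum of neighbour curves,
    applied time step by time step."""
    return [sum(w * c[t] for w, c in zip(weights, curves))
            for t in range(len(curves[0]))]
```

The weights sum to one and honour exactness at data locations, so a proposed well colocated with an existing well recovers that well's decline curve; the functional-data machinery in the paper generalizes this scalar-weight scheme.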
-
-
-
A Functional Data Analysis Approach to Surrogate Modelling in Reservoir and Geomechanics
Authors E. Della Rossa* and F. Bottazzi
The computational costs of reservoir and geomechanical models can generally be particularly high, making uncertainty evaluation and risk assessment difficult to perform. To overcome this problem, several approximation methodologies based on surrogate modelling have been developed and are commonly adopted. On the other hand, Functional Data Analysis is a well-established technique in statistics, but its application in reservoir uncertainty evaluation is less common. We present here a functional data analysis technique for reservoir and geomechanical models. The proposed approach combines surrogate modelling and Functional Data Analysis to build, for a definite set of input values in the uncertainty space, a functional interpolation whose objects are functions representing the output variables over a full range of times or over a given time-space domain of interest. The methodology is particularly suited to geomechanical uncertainty assessment, where the output variables are characterized by relatively smooth behaviour and the computational cost of a direct Monte Carlo approach is very high. The methodology is first illustrated with a geomechanical uncertainty characterization problem and then through a real reservoir application. In low-dimensional uncertainty characterization studies, the proposed method makes it possible to perform reliable time-space dependent risk assessment at a very limited computational cost.
-
-
-
Gradient Pore Pressure Modelling with Uncertain Well Data
Authors R. Nunes, P. Correia, A. Soares*, J.F.C.L. Costa, L.E.S. Varella, G. S. Neto, M. B. Silka, B.V. Barreto, T.C.F. Ramos and M. Domingues
Abnormal pore pressures can result in drilling problems such as borehole instability, stuck pipe, circulation loss, kicks, and blowouts. Gradient pore pressure prediction is of great importance for risk evaluation and for planning new wells in the early stages of development and production of oil reservoirs. In this paper, a stochastic simulation with point distributions is presented to integrate uncertain data in pore pressure cube characterization. The method consists of using direct sequential simulation with point distributions. Well data are, in this case, considered “soft” data whose uncertainty is quantified by local probability distribution functions or a set of values. A case study using a real dataset is also presented to illustrate the results.
-
-
-
Quantifying Uncertainty in Pore Pressure Estimation Using Bayesian Networks, with Application to Use of an Offset Well
Authors R.H. Oughton*, D.A. Wooff, R.W. Hobbs, S.A. O'Connor and R.E. Swarbrick
Pore pressure estimation is a crucial yet difficult problem in the oil industry. If unexpected overpressure is encountered while drilling, it can result in costly challenges and leaked hydrocarbons. Prediction methods often use empirical porosity-based methods such as the Eaton ratio method, requiring an idealised normal compaction trend and using a single wireline log as a proxy for porosity. Such methods do not account for the complex and multivariate nature of the system, or for the many sources of uncertainty. We propose a Bayesian network approach for modelling pore pressure, using conditional probability distributions to capture the joint behaviour of the quantities in the system (such as pressures, porosity, lithology and wireline logs). These distributions allow the inclusion of expert scientific information; for example, a compaction model relating porosity to vertical effective stress and lithology is central to the model. The probability distribution for each quantity is updated in light of data, producing a prediction with an uncertainty that takes into account the whole system, knowledge and data. Our method can be applied to a setting where an offset well is used to learn about the compaction behaviour of the planned well, and we demonstrate this with two wells from the Magnolia field.
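For context, the Eaton ratio method the authors contrast with can be sketched in a few lines; the exponent of 3 is the conventional choice for sonic data, and the Bayesian network replaces this single point estimate with a full distribution:

```python
def eaton_pore_pressure(obg, pn, dt_normal, dt_observed, exponent=3.0):
    """Eaton's sonic ratio method (gradients in, gradient out): pore
    pressure from the overburden gradient obg, the normal (hydrostatic)
    pore pressure gradient pn, and the ratio of the normal compaction
    trend transit time to the observed transit time."""
    return obg - (obg - pn) * (dt_normal / dt_observed) ** exponent
```

When the observed transit time exceeds the normal trend (undercompacted, overpressured shale), the ratio falls below one and the estimate rises above the normal gradient; any error in the idealised trend propagates straight into the answer, with no uncertainty budget.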
-
-
-
Lateral Continuity of Stochastic Shale Barriers
Authors S. Lajevardi* and C.V. Deutsch
Characterizing shale barriers in oilsands reservoirs is of critical importance for recovery predictions. High net-to-gross reservoirs contain small shale intervals that impede vertical drainage and have a large impact on recovery. The information on reservoir and shale interval thicknesses collected from vertical delineation wells provides only limited information about the horizontal extent and connectivity of these intervals. The main challenge is that such flow barriers often cannot be correctly characterized due to the large spacing of delineation wells; the shales are laterally too small to be correlated between wells. The characteristics of these stochastic shales are a strong function of their depositional environment (Harris, 1975). Over the past few decades, a significant amount of stratigraphic literature on the nature and character of mud beds in fluvial and tidal settings has been published (Galloway and Hobday, 1996; Miall, 1996). However, the detailed geometry and structure of remnant shales in dominantly sandy sediments is not well documented, which motivates studies such as this. This paper proposes a novel methodology based on an inverse modeling scheme to estimate the lateral extent of shales independently of the gridding system. The paper includes a description of the methodology and a case study with implementation details and validation steps.
-
-
-
Stochastic Modelling of Patterns Using Graph Cuts
Authors X. Li* and G. Mariethoz
Multiple-point geostatistics (MPS) algorithms are very computationally demanding, which can limit their use in some applications. Texture synthesis methods used in computer graphics involve training-image concepts similar to those in MPS. Some very computationally efficient texture synthesis algorithms are able to produce geostatistical models comparable in quality with state-of-the-art MPS methods, while presenting computational advantages. In this paper we introduce a patch-based method based on graph cuts. It is a general tool that allows optimally cutting patches, incorporating information from previous patches, using a max-flow algorithm. The cutting algorithm is based on the representation of a patch as a graph, and guarantees that the cut is optimal between more than two patches. By recording the errors of previous cuts and iteratively replacing patch areas of high error, the simulation is continuously improved and the training image texture is reproduced without noise or artifacts.
-
-
-
Geological Metric Space Description by SVM Classification - Turbidite Reservoir Case Study with Multiple Training Images
Authors A. Kuznetsova*, V. Demyanov and M. Christie
This paper shows the challenges related to handling multiple training images for reservoir prediction. We have identified two main challenges in handling multiple geological scenarios by creating a lower-dimensional representation of the ensemble of model realizations: (i) how to relate geological knowledge to the metric space; and (ii) how to navigate in the metric space to facilitate model updating. In this work we demonstrate how to solve the classification problem in the metric space, accounting for geological knowledge from a variety of prior geological concepts. We establish geological relations in the metric space by making links to the space of geologically interpretable parameters. These results allow us to enhance the geological realism of the new models obtained through the update process in the metric space.
-
-
-
Integration of Multi-scale Uncertainty Assessment into Geostatistical Seismic Inversion
Authors L. Azevedo*, V. Demyanov and A. Soares
Traditional geostatistical seismic inversion approaches are able to account for the uncertainty related to the stochastic simulation algorithms used within the inverse methodology for model perturbation. However, they assume stationarity and no uncertainty in the large-scale geological parameters represented, for example, by the spatial continuity pattern and the prior probability distribution of the property to invert, as estimated from well-log data. We propose a multi-scale uncertainty assessment for traditional iterative geostatistical seismic methodologies by integrating stochastic adaptive sampling and Bayesian inference to tune the variogram ranges and the prior probability distribution of the property to invert within the inverse workflow. The application of the proposed methodology to a challenging synthetic dataset showed good convergence of the inverted seismic towards the recorded one, while the local and global uncertainty were jointly assessed.
-
-
-
Geobodies Stochastic Analysis for Geological Model Parameter Inference
Authors J.M. Chautru, R. Meunier*, H. Binet and M. Bourges
It is sometimes difficult to infer the input parameters of a geological model. For example, when variogram-based simulation methods such as SIS, truncated Gaussian or truncated pluri-Gaussian facies simulation are used, inferring the facies horizontal variogram range may be very difficult in heterogeneous contexts, due to interwell spacing. This paper presents an indirect method for inferring input parameters, based on the analysis of geobody characteristics, which can be used in such difficult cases. The method also allows selecting realizations of a geological model whose properties are critical in flow simulations. The method uses dynamic synthesis results and production data to help validate the model parameters.
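The geobody statistics the authors rely on are not specified in the abstract. As a minimal sketch of the first step — extracting geobodies from a facies realization so their characteristics can be analyzed — connected regions of a target facies can be labeled with a flood fill (all names and the tiny grid below are illustrative, not from the paper):

```python
from collections import deque

def label_geobodies(facies, target=1):
    """Label 4-connected components ('geobodies') of a target facies
    code in a 2-D grid given as a list of lists. Returns a label grid
    and a list of geobody sizes (cell counts)."""
    nrow, ncol = len(facies), len(facies[0])
    labels = [[0] * ncol for _ in range(nrow)]
    sizes = []
    current = 0
    for i in range(nrow):
        for j in range(ncol):
            if facies[i][j] == target and labels[i][j] == 0:
                current += 1           # start a new geobody
                size = 0
                queue = deque([(i, j)])
                labels[i][j] = current
                while queue:           # breadth-first flood fill
                    r, c = queue.popleft()
                    size += 1
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if (0 <= rr < nrow and 0 <= cc < ncol
                                and facies[rr][cc] == target
                                and labels[rr][cc] == 0):
                            labels[rr][cc] = current
                            queue.append((rr, cc))
                sizes.append(size)
    return labels, sizes

# Two sand bodies (facies code 1) in a shale background (0):
grid = [[1, 1, 0, 0],
        [0, 1, 0, 1],
        [0, 0, 0, 1]]
labels, sizes = label_geobodies(grid)
print(sizes)  # → [3, 2]
```

Size, shape and count statistics of such labeled bodies are the kind of indirect observables that can then be compared against realizations to constrain variogram ranges.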
-
-
-
First Arrival Travel Time Tomography - Bayesian Approach
Authors J. Belhadj*, T. Romary, A. Gesret and M. Noble
First arrival time tomography aims at determining the propagation velocity of seismic waves from experimental measurements of their first arrival time. This problem is usually ill-posed and is classically tackled by various iterative linearised approaches. However, these methods can yield a wrong seismic velocity in highly nonlinear cases, and they fail to estimate the uncertainties associated with the model. In our study, we rely on a Bayesian approach coupled with an interacting Markov chain Monte Carlo (MCMC) algorithm to estimate the wave velocity and the associated uncertainties. The main difficulty with this approach is that traditional MCMC algorithms can be inefficient when multimodal probability distributions or complex velocity models involving a great number of parameters come into play. Therefore, a first step toward an efficient implementation of the Bayesian approach is to properly parametrize the model to reduce its dimension and to select adequate prior distributions for the parameters. In this paper, we present a ten-layer probabilistic model for the velocity, which we illustrate on tomography results.
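The authors' ten-layer parametrization and interacting MCMC sampler are not detailed in the abstract. The sketch below shows only the basic ingredient — a random-walk Metropolis sampler over layer velocities under a uniform prior — with a deliberately simplified forward model (a single vertical ray through three horizontal layers; all numbers are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a vertical ray through 3 horizontal layers of known
# thickness; the data are noisy first-arrival times from several shots.
thickness = np.array([100.0, 200.0, 150.0])      # m
v_true = np.array([1500.0, 2500.0, 3500.0])      # m/s
sigma = 1e-3                                     # noise std (s)
n_obs = 20
t_obs = np.sum(thickness / v_true) + sigma * rng.standard_normal(n_obs)

def log_posterior(v):
    # Uniform prior on [1000, 5000] m/s per layer, Gaussian likelihood.
    if np.any(v < 1000.0) or np.any(v > 5000.0):
        return -np.inf
    t_pred = np.sum(thickness / v)
    return -0.5 * np.sum((t_obs - t_pred) ** 2) / sigma**2

# Random-walk Metropolis over the layer velocities.
v = np.array([2000.0, 2000.0, 2000.0])
logp = log_posterior(v)
samples = []
for _ in range(20000):
    prop = v + 50.0 * rng.standard_normal(3)
    logp_prop = log_posterior(prop)
    if np.log(rng.random()) < logp_prop - logp:
        v, logp = prop, logp_prop
    samples.append(v.copy())
samples = np.array(samples[5000:])               # discard burn-in

# The total travel time is well constrained even though the individual
# layer velocities trade off against one another -- a toy illustration
# of the non-uniqueness that motivates the Bayesian treatment.
print(np.mean(np.sum(thickness / samples, axis=1)))
```

The posterior ensemble directly quantifies the uncertainty that linearised inversions fail to provide.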
-
-
-
MPS Facies Modelling of a Submarine Fan Reservoir in Southeast Brazil with SNESIM
Authors P.R.M. Carvalho*, L.G. Rasera, J.F.C.L. Costa and L.E.S. Varella
Flow in a reservoir is controlled predominantly by the connectivity of permeability extremes, such as those associated with clean sand channels and shale layers. These elements usually feature complex spatial patterns which are difficult to describe with two-point statistics. Furthermore, specific relationships between the facies are often an important factor in reservoir geology, requiring the use of simulation methods capable of reproducing these associations in order to generate reliable reservoir models. Multiple-point geostatistics (MPS) relies on training images to model the spatial structure of variables. In this work, we were able to impart physical realism to the geostatistical realizations of a reservoir composed of submarine fans. The MPS simulations were conditioned to seismic and geological data and yielded realistic maps of distributary channels within sand lobes interleaved with shale layers. We conclude that MPS enhances data conditioning and uncertainty assessment through the reproduction of specific geometries and facies relationships, making it suitable for geometry-sensitive applications such as flow simulation.
-
-
-
Mapping with Auxiliary Data of Varying Accuracy
There are several ways of integrating different sources of data in mapping processes: • multivariate estimation (cokriging, collocated cokriging), which requires the fitting of a multivariate model (variograms and cross-variograms) and a stationary context; • kriging with external drift or kriging with Bayesian drift, which can be applied in non-stationary contexts and requires a univariate model. This paper proposes another approach, based on the definition of additional data, well distributed over the area of interest, which define upper and lower envelopes for the map to be drawn. These envelopes are built from auxiliary data and are considered as soft data of lower accuracy than hard data. The approach is based on the combination of two geostatistical methods that are quite rarely used: conditional expectation with inequalities and kriging with measurement error. After a brief reminder of the methods, some applications in geological modelling are proposed: • control of extrapolation; • mapping of geological horizons using the full trace of horizontal wells; • integration of geophysical data of varying accuracy in mapping at regional scale; • mapping layer tops in layer-cake models.
-
-
-
Geostatistical Analysis of Different Faults Attributes and Relations between them, Taking into Account the Sampling Bias
Authors D. Kolyukhin*, A. Torabi and I. Silvestrov
A statistical analysis of different fault attributes and the relations between them is presented. A new method is developed to correct statistical estimates of the probability distributions of fault and fracture lengths, and of their intensity, for data sampled under truncation and censoring effects. A series of test calculations confirms the accuracy and computational efficiency of the method.
-
-
-
Droplet Size Distribution of Crude Oil Emulsions - Stochastic Differential Equations and Bayesian Modelling
Authors A. Svalova*, G.D. Abbott, N.G. Parker and C.H. Vane
Water-in-oil emulsions (WOE) are two-phase colloidal systems formed during crude oil production and spills. The high viscosity and stability of WOEs imply challenges during their clean-up and removal. Such emulsions are difficult to disaggregate due to a combination of chemical and physical factors. Ultrasound spectrometry can be used to characterise the physical properties of a WOE, providing access to, e.g., the droplet size distribution (DSD), density and viscosity. The DSD has been identified as a significant property affecting emulsion stability. This study focuses on the post-acquisition stage, modelling droplet size growth as a stochastic process. Geometric Brownian motion (GBM) and Itô stochastic differential equations (SDEs) are used. Bayesian inference is introduced as a tool to aid in conditions of poor sample quality. The obtained model can predict emulsion separation, indicated by a sufficiently large mean and standard deviation of the droplet growth process. It can be used for emulsions of different chemical compositions, including those with added dispersants, making it possible to characterise their impact on WOE stability over time.
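As a sketch of the model class the abstract names, geometric Brownian motion for a droplet diameter can be integrated with the Euler–Maruyama scheme; the parameter values below are invented for illustration and are not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical parameters for the droplet-growth process
mu, sigma = 0.05, 0.20        # drift and volatility (per unit time)
d0 = 5.0                      # initial droplet diameter (micrometres)
T, n_steps, n_paths = 10.0, 1000, 5000
dt = T / n_steps

# Euler-Maruyama integration of the Ito SDE  dD = mu*D dt + sigma*D dW
d = np.full(n_paths, d0)
for _ in range(n_steps):
    dw = np.sqrt(dt) * rng.standard_normal(n_paths)
    d = d + mu * d * dt + sigma * d * dw

# GBM has a known lognormal solution, so the simulation can be checked
# against the analytical mean  E[D_T] = d0 * exp(mu * T).
print(d.mean(), d0 * np.exp(mu * T))
```

A growing mean and widening spread of the simulated diameters is exactly the signature the abstract associates with emulsion separation.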
-
-
-
Global Stochastic Inversion Using "Analogs-wells" and Zonal Distributions - Application to an Unexplored Area
Authors A. Pereira*, R. Nunes, L. Azevedo, L. Guerreiro and M.J. Pereira
High demand for hydrocarbons incentivizes the industry to look for new exploration opportunities in unexplored, high-risk areas where important new discoveries might be located. Frontier locations are unexplored or underexplored basins in which geological and geophysical information may be unavailable or sparse. In this paper we present a new methodology based on the Global Stochastic Inversion (GSI) algorithm (Soares et al., 2007; Caetano, 2009), which uses Direct Sequential Simulation (Soares, 2001) as a global perturbation method to generate equiprobable models of acoustic impedance, and follows an iterative process to optimize a previously defined objective function. The convergence of the inverse process is evaluated by the local and global correlation coefficients between the real seismic and the synthetic seismograms. This new method can be useful for preliminary assessment of different scenarios in unexplored areas, where no well-log information exists to be used as a constraint on the inverse problem. The method follows the GSI workflow, but without using any well data in the study area. Instead, we propose to use analog information (outcrops, modern analogs and well-log data from nearby fields) to condition the generation of acoustic impedance models. In this procedure, the geometry and position of the “analog well” are ignored, and the analog information is only used in the form of the spatial dispersion and spatial patterns of acoustic impedances (histograms and variograms) for each lithology/facies expected in the geological model, as defined by an expert.
-
-
-
Simulation of Surface Petrophysical Heterogeneities on Sedimentary Objects
Authors M. Parquer*, J. Ruiu and G. Caumon
Accurate modelling of all scales of heterogeneity is necessary for precise flow modelling inside clastic reservoirs. We propose to simulate small-scale heterogeneities that can be deposited at the interface between sedimentary structures (e.g. between accretion figures in channel point bars). The geometries of the considered sedimentary structures are modelled as boundary representations using the Non-Uniform Rational B-Spline (NURBS) mathematical formulation. This representation of various sedimentary objects (channel, point bar, clinoform, lobe and levee) provides a curvilinear framework for petrophysical property simulations. The spatial distribution of petrophysical properties such as permeability is simulated using unconditional Sequential Gaussian Simulation (SGS). Upscaling is then performed in order to integrate the surface properties into the flow simulation grid. Nevertheless, these features are often very thin and local and cannot be upscaled to a complete reservoir grid cell. Thus, these small-scale interface disparities are treated as transmissibility multipliers applied to the faces of the grid cells. Moreover, erosion due to the deposition of other geobodies can modify the distribution of surface heterogeneities, so the order of deposition and erosion between objects is taken into account throughout these procedures.
-
-
-
New Approach in Geological Modelling of the Reservoir Based on the Spectral Theory
Authors M.M. Khasanov, B.V. Belozerov, A.S. Bochkov and O.M. Fuks*
The paper considers a problem that occurs in geological modelling of oil fields: the interpolation of well data and the subsequent reconstruction of the rock properties of the reservoir in the space between the wells. We introduce a new method, based on spectral theory, for the analysis and modelling of reservoir rock properties. The method represents well-log data by a Fourier decomposition, and each harmonic is then interpolated independently. The method makes it possible to obtain more realistic geological models for complex and low-permeability reservoirs in comparison with conventional geostatistical methods. An application of this methodology to real field data is presented, showing encouraging results. We expect this approach to bring a real improvement in the quality and predictive value of geological models.
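The paper's per-harmonic interpolation scheme is not specified in the abstract. A minimal sketch of the idea, assuming simple linear blending of the complex Fourier coefficients of two equal-length well logs (synthetic logs, invented for illustration), could look like:

```python
import numpy as np

# Two synthetic well logs (e.g. porosity) sampled at the same depths
depth = np.linspace(0.0, 100.0, 128, endpoint=False)
log_a = 0.20 + 0.05 * np.sin(2 * np.pi * depth / 25.0)
log_b = 0.15 + 0.05 * np.sin(2 * np.pi * depth / 25.0 + 1.0)

# Fourier decomposition of each log
coef_a = np.fft.rfft(log_a)
coef_b = np.fft.rfft(log_b)

def log_between(frac):
    """Interpolate each harmonic independently between the two wells:
    frac=0 returns well A, frac=1 returns well B."""
    coef = (1.0 - frac) * coef_a + frac * coef_b
    return np.fft.irfft(coef, n=depth.size)

# A pseudo-log halfway between the wells keeps the shared periodicity
# while its amplitude and phase blend between the two logs.
mid = log_between(0.5)
print(mid.mean())
```

Interpolating amplitude and phase per harmonic, rather than the sample values directly, is what lets such a scheme preserve cyclic log character between wells.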
-
-
-
Fuzzy Geological Model - Stochastic Realizations Preserving Deterministic Features of Data
The problem considered is the stochastic interpolation of quantitative properties (e.g. porosity) in the space between wells. It is shown that the method of normal score transformation can lead to serious errors. The fuzzy model is described as an approach to the categorical interpolation of quantitative data, which is quite natural in a geological environment. The number of categories in the fuzzy model can be about 20. The stochastic realizations of the fuzzy model preserve the deterministic features present in the well data. It is shown that under certain conditions the realizations of the fuzzy model are similar to the realizations of sequential indicator simulation.
-
-
-
Multi-scale Reservoir Modeling for CO2 Storage and Enhanced Oil Recovery Using Multiple Point Statistics
Authors N.W. Bosshart*, J.R. Braunberger, M. Burton-Kelly, N.W. Dotzenrod and C.D. Gorecki
The Energy & Environmental Research Center (EERC) and the Plains CO₂ Reduction Partnership Program, in collaboration with the U.S. Department of Energy, have constructed 3-D geocellular models for the purpose of studying CO₂ storage and CO₂ enhanced oil recovery (EOR). These efforts are gaining importance as we continue to investigate methods of climate change mitigation and greenhouse gas reduction. The models created in these efforts range in size from small-scale pinnacle reefs up to formation and basin scale, spanning various reservoir types and lithologies, and many have utilized the multiple-point statistics (MPS) method in the facies modeling process. This method allows the incorporation of geologic understanding, in the form of a training image, to better capture reservoir heterogeneity. Some complex reservoirs may be divided into multiple ‘geobody’ regions for the MPS process, with each region having a unique training image and facies distribution. The various facies models constructed in these CO₂ storage and CO₂ EOR investigations are used to constrain petrophysical property distributions, which are then used to analyze total and effective pore volumes and viability for CO₂ injection. Dynamic simulations are run to assess CO₂ storage capacity, efficiency and utilization, and to consider each site's potential as a CO₂ storage resource.
-
-
-
Quality Analysis of Geostatistical Simulations through their Connected Structures
Authors G. Rongier*, P. Collon, P. Renard, J. Straubhaar and J. Sausse
Various methods have been developed to perform geostatistical simulations. Depending on the case studied, each of them can claim to obtain the best results, but this is rarely supported by an objective and quantitative analysis, especially concerning the reproduction of geological structures. In this work we propose to go deeper into the assessment of realization quality through connected geobodies, focusing on the capacity to reproduce connected geological structures. This quality assessment relies on quality indicators computed on each realization. The realizations are then compared based on a dissimilarity calculated from the indicators. The dissimilarity analysis is facilitated using multidimensional scaling (MDS) complemented with heat maps. The application of this methodology to a synthetic case and its associated realizations gives rise to some practical considerations. While MDS is a powerful tool to facilitate the analysis, it does not represent the dissimilarities exactly, which can lead to misinterpretations. Details concerning the relationships between the realizations should preferably be analyzed on the heat map, as it represents the dissimilarities directly. This connected-geobody-based method appears to complement pattern-histogram-based methods, which take less account of pattern relationships.
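The dissimilarities in the paper come from its quality indicators; the MDS projection step itself can be sketched with classical (Torgerson) scaling on a small hypothetical dissimilarity matrix — two groups of two similar realizations, the groups far apart:

```python
import numpy as np

def classical_mds(d, k=2):
    """Classical (Torgerson) MDS: embed points in k dimensions from a
    symmetric dissimilarity matrix d, so that Euclidean distances in
    the embedding approximate d."""
    n = d.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    b = -0.5 * j @ (d ** 2) @ j              # double-centered Gram matrix
    w, v = np.linalg.eigh(b)
    idx = np.argsort(w)[::-1][:k]            # keep the largest eigenvalues
    return v[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# Dissimilarities between four hypothetical realizations
d = np.array([[0.0, 1.0, 9.0, 9.0],
              [1.0, 0.0, 9.0, 9.0],
              [9.0, 9.0, 0.0, 2.0],
              [9.0, 9.0, 2.0, 0.0]])
xy = classical_mds(d)
print(np.linalg.norm(xy[2] - xy[3]))  # ≈ 2.0: within-group dissimilarity kept
```

As the abstract cautions, the low-dimensional map only approximates the dissimilarities (here the smallest contrast, between the first two realizations, is flattened away), which is why the heat map of raw dissimilarities remains the authoritative view.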
-
-
-
Importance of Facies Connectivity in History Matching for 3D Channelized Reservoir Characterization
Channelized reservoirs consist of sinuous sandstone patterns. Characterizing their connectivity is of prime importance for decision making because connectivity significantly affects reservoir performance. The ensemble Kalman filter (EnKF) and ensemble smoother (ES) modify reservoir models using dynamic data. However, they are difficult to apply to channelized reservoirs because they assume that model parameters follow a Gaussian distribution. The purpose of this research is to characterize 3D channel connectivity using ensemble-based methods. We use the concept of multiple Kalman gains for improved inverse modeling and apply it to ES for fast history matching. The multiple Kalman gains are calculated by distance-based methods such as the Hausdorff distance and kernel k-means clustering. The proposed method, ES with multiple Kalman gains, is compared to EnKF and ES for a 3D synthetic case. It solves the overshooting problem in ES and describes channel connectivity and the bimodal facies distribution better than EnKF. Furthermore, it requires only 6.4% of the simulation time of EnKF. When oil production rates and water cuts are predicted by the updated ensembles, only the proposed method gives reliable uncertainty quantification, while EnKF and ES deviate from the true production. Therefore, the proposed method can be utilized as a fast decision-making tool for 3D channelized reservoirs.
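The multiple-Kalman-gain scheme itself is not reproduced here, but the single-gain ensemble smoother update it builds on can be sketched on a toy linear-Gaussian problem (all matrices and numbers below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy problem: estimate a 2-parameter model m from data d = G m + noise.
G = np.array([[1.0, 0.5],
              [0.2, 1.0],
              [1.0, 1.0]])
m_true = np.array([1.0, -0.5])
sigma_d = 0.1
d_obs = G @ m_true + sigma_d * rng.standard_normal(3)

# Prior ensemble of models and their forward-simulated data
n_ens = 200
M = rng.standard_normal((2, n_ens))              # prior: N(0, I)
D = G @ M

# Ensemble smoother: one global update with the ensemble Kalman gain
#   K = C_md (C_dd + C_e)^-1, covariances estimated from the ensemble
Mc = M - M.mean(axis=1, keepdims=True)
Dc = D - D.mean(axis=1, keepdims=True)
C_md = Mc @ Dc.T / (n_ens - 1)
C_dd = Dc @ Dc.T / (n_ens - 1)
C_e = sigma_d**2 * np.eye(3)
K = C_md @ np.linalg.inv(C_dd + C_e)

# Perturbed observations, one set per ensemble member
D_pert = d_obs[:, None] + sigma_d * rng.standard_normal((3, n_ens))
M_post = M + K @ (D_pert - D)

print(M_post.mean(axis=1))  # posterior mean, close to m_true
```

A single gain of this form implicitly assumes Gaussian parameters; computing several gains on distance-based clusters of the ensemble, as the abstract describes, is one way to cope with the bimodal facies distributions of channelized reservoirs.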
-
-
-
Development Scenario Optimization under Geological Description Uncertainty
Authors N.V. Bukhanov, I.I. Alekhin, V.V. Demyanov* and V.E. Baranov
Choosing an optimal development scenario is one of the most important tasks for a fluvial reservoir. Well configuration factors that can influence reservoir connectivity are well density, well orientation and the length of completion zones. Therefore, the length of the deviated well path and its main direction are the principal parameters in the search for the most robust development scenario. Optimization is based on three geological descriptions, each representing one of three algorithms (SIS, OBJ, MPS) and having minimum absolute error for the test wells. The parameters of each development scenario are optimized to reach maximum field oil production through 1000 realizations of differential evolution. The ranges of the parameter distributions give useful trends showing robustness across the different geological descriptions.
-
-
-
Forecasting Reservoir Performance with Production Data without History Matching - Libyan Reservoir Study
History matching has traditionally been an important element of forecasting but, due to computational as well as modeling complexities in real reservoir systems, many ideas have remained academic. In this work we propose a new paradigm to side-step the iterative history matching process and instead aim to directly establish a statistical relationship between forecast variables (future water and oil rates) and historical production data variables (past water and oil rates). A novel statistical method is developed for this purpose: canonical function principal component analysis (CFCA). A real field case study is presented with complex model variability that includes depositional, geostatistical, structural geological as well as fluid property uncertainty. Forecasts are shown to be the same as with full history matching at a fraction of the computational cost.
-
-
-
Inverse Sequential Simulation - Inversion by Conditioning
Authors J.J. Gomez-Hernandez* and T. Xu
Inverse sequential simulation combines features of the ensemble Kalman filter and of multivariate sequential simulation, resulting in an algorithm that generates history-matched permeability realisations by conditioning on pressure observations.
-
-
-
Ensemble Kalman Methods in Reservoir History Matching - Why Do they sometimes Fail and how Can you Fix it?
Authors J. Saetrom*, T.F. Munck and E. Morell
Ensemble Kalman based methods have seen numerous successful applications over the past 10 years in fields such as numerical weather prediction, oceanography and reservoir history matching. However, it is well known that the standard implementation of the ensemble Kalman update equation can lead to unphysical model updates and to the problem known as filter divergence (or ensemble collapse). In this paper, we revisit recent theoretical results which highlight the issues with the standard ensemble Kalman update equations and identify how they can potentially be fixed. We use data from the Brugge field to demonstrate the link between the theoretical results and practical applications in reservoir history matching.
-
-
-
Efficient Neighborhoods for Kriging with Numerous Data
Authors M. Vigsnes*, P. Abrahamsen, V.L. Hauge and O. Kolbjornsen
Kriging is a data interpolation method that can be used to populate regular grids from data scattered in space; it requires the solution of a linear equation system the size of the number of data. When the data are numerous, the calculation is slow. In this paper we propose to divide the regular grid into rectangular sub-segments and let all the grid cells in each sub-segment share a common data neighborhood. The advantage of this approach is that the number of data in each neighborhood can be small compared to the complete dataset, and some of the computations can be reused for all grid cells in a sub-segment. We show that the precision can be controlled through the choice of neighborhood size, and that the speed of the calculations can be optimized through the choice of sub-segment size. We show that this is an efficient method for kriging when the number of data is huge, giving a significant speed-up even for high data densities and precisions.
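The core idea — one shared neighborhood, and one factorized data-data system, per sub-segment — can be sketched with simple kriging (zero-mean assumption) on random data; the covariance model and all parameter values below are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(3)

def cov(h, sill=1.0, range_a=20.0):
    """Isotropic exponential covariance (illustrative model)."""
    return sill * np.exp(-3.0 * h / range_a)

# Scattered data: 500 points with (assumed zero-mean) values
n_data = 500
xy = 100.0 * rng.random((n_data, 2))
val = rng.standard_normal(n_data)

def krige_segment(cells, n_neigh=30):
    """Simple kriging of all cells in one sub-segment using a common
    neighborhood: the n_neigh data closest to the segment centre.
    The data-data system is solved once and reused for every cell."""
    centre = cells.mean(axis=0)
    idx = np.argsort(np.linalg.norm(xy - centre, axis=1))[:n_neigh]
    pts, z = xy[idx], val[idx]
    C = cov(np.linalg.norm(pts[:, None] - pts[None, :], axis=2))
    C_inv = np.linalg.inv(C)                 # shared for the whole segment
    c0 = cov(np.linalg.norm(pts[None, :, :] - cells[:, None, :], axis=2))
    return c0 @ C_inv @ z                    # one estimate per cell

# One 10x10-cell sub-segment of a regular grid
gx, gy = np.meshgrid(np.arange(40.0, 50.0), np.arange(40.0, 50.0))
cells = np.column_stack([gx.ravel(), gy.ravel()])
est = krige_segment(cells)
print(est.shape)  # → (100,)
```

The speed-up comes from inverting one 30×30 system per segment instead of one system per grid cell (or one full 500×500 system), at the cost of the precision/segment-size trade-off the abstract describes.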
-
-
-
Invariant Formulations of Inverse Problems
Authors K. Mosegaard* and T.M. Hansen
Mathematical physics is based on the fundamental assumption that physical predictions must be the same, independently of the parameterization of the system. This principle even constitutes the very foundation of certain physical theories, of which the theory of relativity is perhaps the most notable. The importance of the principle is that it seeks to maintain objectivity: when two different analysts predict the evolution of the same physical system, but use different parameterizations (reference systems), their predictions must agree physically. Otherwise the theory would give results that depend on the individual analysts' preferences, and hence be subjective. Our question here is the following: what would happen if we imposed the same constraints on modeling and data inversion? What if we required that our procedures for modeling and inversion be designed such that no conflicts between analysts would appear? We look into this problem by focusing on three different problem areas where possible inconsistencies may occur.
-