ECMOR XII - 12th European Conference on the Mathematics of Oil Recovery
- Conference date: 06 Sep 2010 - 09 Sep 2010
- Location: Oxford, UK
- ISBN: 978-90-73781-89-4
- Published: 06 September 2010
A Stochastic Sharpening Method for the Propagation of Phase Boundaries in Multiphase Lattice Boltzmann Simulations
Authors: T. Reis and P.J. Dellar

Existing lattice Boltzmann models designed to recover a macroscopic description of immiscible liquids can make quantitatively correct predictions only when the interface between the fluids is smeared over several nodal points. Attempts to minimise the thickness of this interface generally lead to a phenomenon known as lattice pinning, the precise cause of which is not well understood. This spurious behaviour is remarkably similar to that associated with the numerical simulation of hyperbolic partial differential equations coupled with a stiff source term. Inspired by the seminal work in this field, we derive a lattice Boltzmann implementation of a model equation used to investigate such peculiarities. This implementation is extended to different spatial discretisations in one and two dimensions. We show that the inclusion of a quasi-random threshold dramatically delays the onset of pinning and facetting.
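The pinning phenomenon and the randomised-threshold remedy can be illustrated with a few lines of NumPy. This is a minimal 1-D sketch, not the authors' lattice Boltzmann scheme: the advected step profile, the snap-to-equilibrium source term, and the threshold range are all illustrative assumptions.

```python
import numpy as np

def propagate(n_steps, threshold, nx=100, rng=None):
    """Upwind advection of a sharp front; the stiff source snaps every
    cell to the nearest equilibrium (0 or 1) relative to a threshold."""
    u = np.where(np.arange(nx) < 10, 1.0, 0.0)   # front initially at cell 10
    cfl = 0.3                                    # advection CFL number
    for _ in range(n_steps):
        u[1:] = u[1:] - cfl * (u[1:] - u[:-1])   # first-order upwind step
        t = threshold if rng is None else rng.uniform(0.05, 0.95)
        u = np.where(u >= t, 1.0, 0.0)           # stiff source: snap to 0 or 1
    return u

def front_position(u):
    return int((u > 0.5).sum())                  # number of cells behind the front

u_det = propagate(200, threshold=0.5)            # fixed threshold: the front pins
u_rnd = propagate(200, threshold=None, rng=np.random.default_rng(0))
print(front_position(u_det), front_position(u_rnd))
```

With a fixed threshold the partially filled cell ahead of the front is snapped back to zero every step, so the front never moves; drawing the threshold quasi-randomly lets it advance at roughly the correct average speed.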
Parallel Sparsified Solvers for Reservoir Simulation
By H.M. Klie

The premise of the present work is that a large number of system coefficients may be disregarded without compromising the robustness of the overall solution process. That is, given a linear system, it is possible to construct a preconditioner by dropping a large number of off-diagonal nonzeros and using the result as a proxy for the original matrix. This proxy system can in turn be factored to generate a new class of block ILU preconditioners, approximate inverses and algebraic multigrid implementations. We propose two parallel algorithms to sparsify a given linear system: (a) random sampling sparsification (RSS), and (b) percolation-based sparsification (PBS). The former relies on the idea that coefficients are included in the sparsified system with a probability proportional to their effective transmissibility. The latter relies on capturing highly connected flow paths described by the whole set of transmissibility coefficients. Depending on the case, the RSS and PBS algorithms have the potential to reduce by orders of magnitude the number of floating point operations associated with the preconditioner. Results confirming the benefits of sparsified solvers are illustrated on a wide set of field cases arising in black-oil and compositional simulations.
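The RSS idea can be sketched in a few lines with SciPy. The keep-probability rule (proportional to coefficient magnitude, standing in here for effective transmissibility), the toy tridiagonal system, and the `keep_scale` parameter are assumptions for illustration, not the paper's implementation:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import LinearOperator, gmres, spilu

def rss_sparsify(A, keep_scale=1.0, rng=None):
    """Keep each off-diagonal coefficient with probability proportional to
    its magnitude; keep all diagonal entries so the proxy stays nonsingular."""
    if rng is None:
        rng = np.random.default_rng(1)
    A = A.tocoo()
    off = A.row != A.col
    mags = np.abs(A.data)
    p = np.minimum(1.0, keep_scale * mags / mags.max())
    keep = ~off | (rng.random(A.nnz) < p)
    return sp.csc_matrix((A.data[keep], (A.row[keep], A.col[keep])), shape=A.shape)

n = 200
A = sp.diags([-1.0, 2.5, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

proxy = rss_sparsify(A, keep_scale=0.8)      # drops most off-diagonal nonzeros
ilu = spilu(proxy)                           # factor the cheap proxy, not A itself
M = LinearOperator(A.shape, matvec=ilu.solve)
x, info = gmres(A, b, M=M)                   # proxy ILU preconditions the full system
print(info, proxy.nnz, A.nnz)
```

The factorization cost scales with the nonzeros of the proxy rather than of the original matrix, which is where the claimed savings come from.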
High Performance Manycore Solvers for Reservoir Simulation
The forthcoming generation of many-core architectures compels a paradigm shift in algorithmic design to unlock its full potential for maximum performance. In this paper, we discuss a novel approach for solving large sparse linear systems arising in realistic black-oil and compositional flow simulations. A flexible variant of GMRES (FGMRES) is implemented on the GPU using the CUDA programming model and its Single Instruction Multiple Threads (SIMT) paradigm, taking advantage of thousands of threads simultaneously executing instructions. The GPU implementation is optimized to reduce memory overhead per floating point operation, given the sparsity of the linear system. FGMRES relies on a suite of preconditioners such as BILU, BILUT and multicoloring SSOR. Additionally, the solver strategy relies on reordering/partitioning algorithms to exploit further performance. Computational experiments on a wide range of realistic reservoir cases show a competitive edge over conventional CPU implementations. The encouraging results demonstrate the potential that many-core solvers offer for improving the performance of near-future reservoir simulations.
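The multicolouring idea behind a parallel SSOR preconditioner can be sketched as a greedy graph colouring: unknowns of the same colour share no matrix coupling, so they can all be relaxed concurrently by independent threads. The greedy ordering and the toy 5-point stencil below are illustrative assumptions, not the paper's GPU algorithm.

```python
import numpy as np
import scipy.sparse as sp

def greedy_color(A):
    """Assign each unknown the smallest colour not already used by a
    coupled unknown in the matrix adjacency graph."""
    A = A.tocsr()
    colors = -np.ones(A.shape[0], dtype=int)
    for i in range(A.shape[0]):
        nbrs = A.indices[A.indptr[i]:A.indptr[i + 1]]
        used = {colors[j] for j in nbrs if j != i and colors[j] >= 0}
        c = 0
        while c in used:
            c += 1
        colors[i] = c
    return colors

# 2-D 5-point stencil on an 8 x 8 grid: the matrix graph is bipartite,
# so the classic red/black two-colouring is recovered.
nx = 8
Id = sp.identity(nx)
T = sp.diags([-1.0, 4.0, -1.0], [-1, 0, 1], shape=(nx, nx))
A = sp.kron(Id, T) + sp.kron(sp.diags([-1.0, -1.0], [-1, 1], shape=(nx, nx)), Id)
colors = greedy_color(A)
print(len(set(colors)))   # all same-colour unknowns can be updated in parallel
```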
Multiple Kernel Learning Approach for Reservoir Modelling
Authors: V. Demyanov, L. Foresti, M. Kanevski and M. Christie

The paper proposes a novel data-driven approach for modelling petrophysical properties in oil reservoirs. We aim to improve the realism of reservoir models with a more intelligent way of integrating raw data and geological knowledge. Multiple Kernel Learning (MKL) provides enhanced interpretability by using separate kernels for the input variables and performs kernel/feature selection to solve a regression problem in a high-dimensional feature space. MKL has the advantage of rigorous control over model complexity to achieve the right balance between data fit and prediction accuracy. The MKL reservoir model was designed to integrate data and prior knowledge that describe geological structure at multiple scales. Geological structures can be detected by applying convolution filters to noisy seismic data to capture changes in the gradients, orientations and sizes of meandering channels. Such "geo-features" are added as input variables to the MKL model, which optimises the weighted combination of kernels to fit the available data. MKL application to a synthetic meandering channel reservoir demonstrated its capacity to select the relevant input information for detecting the channel structure. Experiments with noisy seismic inputs highlighted the feature selection skills of MKL, which was able to filter the noisy inputs out.
MP Simulations Without Computing MP Statistics
Authors: G. Mariethoz, P. Renard, J. Straubhaar and J. Caers

In recent years, multiple-point simulation has become an invaluable tool for integrating geological concepts in subsurface models. However, due to its high CPU and RAM demands, its use is restricted to relatively small problems with limited structural complexity. Moreover, it only allows the simulation of univariate fields. We present an alternative method that produces conditional realizations honoring the high-order statistics of uni- or multivariate training images. It is based on a sampling method introduced by Shannon (1948), strictly equivalent to the original method of Guardiano and Srivastava (1993), but that does not need to compute conditional probabilities or store them. In the sampling process, we use a distance between data configurations that allows both discrete and continuous variables to be simulated. As a result, the simulation algorithm is drastically simplified and has more possibilities. Since nothing is stored, neighborhoods can have virtually any size. Moreover, the neighborhoods are not restricted to a template, making multiple grids unnecessary. Multivariate data configurations can be considered, allowing the generation of realizations presenting a given multivariate multiple-point dependence. In addition to having virtually no RAM requirement, the method is straightforward to parallelize. Hence it can produce very large and complex realizations.
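The storage-free sampling idea can be sketched in one dimension: instead of tabulating conditional probabilities, scan the training image at random locations and paste the value that follows the first pattern whose data event matches closely enough. The window size, the zero threshold, and the Hamming distance below are illustrative assumptions.

```python
import numpy as np

# Minimal 1-D sketch of direct sampling from a training image: nothing is
# stored, so neighbourhood size and the training image can be arbitrary.

rng = np.random.default_rng(2)
ti = np.array([0, 0, 0, 1, 1, 1] * 30)         # binary 1-D training image
w, thr = 3, 0.0                                # pattern window and distance threshold

sim = [int(v) for v in ti[:w]]                 # seed with the first pattern
for _ in range(60):
    pattern = np.array(sim[-w:])               # conditioning data event
    for start in rng.permutation(len(ti) - w):
        event = ti[start:start + w]
        if np.mean(event != pattern) <= thr:   # Hamming distance between events
            sim.append(int(ti[start + w]))     # paste the next value; no CPDF stored
            break
print(sim[:12])                                # first two periods of the pattern
```

Because this training image is strictly periodic, the sampled sequence reproduces the period exactly; with a nonzero threshold the same loop handles continuous variables.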
Application of Stochastic Partial Differential Equations to Reservoir Property Modelling
Authors: R. Potsepaev and C.L. Farmer

Existing geostatistical algorithms for stochastic modelling of reservoir parameters require a mapping (the 'uvt-transform') into parametric space and reconstruction of a stratigraphic coordinate system. The parametric space can be considered to represent a pre-deformed and pre-faulted depositional environment. Existing approximations of this mapping in many cases cause significant distortions of the correlation distances. In this work we propose a coordinate-free approach for modelling stochastic textures through the application of stochastic partial differential equations. By avoiding the construction of a uvt-transform and stratigraphic coordinates, one can generate realizations directly in physical space in the presence of deformations and faults. In particular, the solution of the modified Helmholtz equation driven by Gaussian white noise is a zero-mean stationary Gaussian random field with an exponential correlation function (in 3D). This equation can be used to generate realizations in parametric space. In order to sample in physical space we introduce a stochastic elliptic PDE with tensor coefficients, where the tensor is related to the correlation anisotropy and its variation in physical space.
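A minimal finite-difference sketch of the modified Helmholtz route to a realization, here on a regular 2-D grid with unit spacing and Dirichlet boundaries (the 'parametric space' case without the tensor coefficients); the grid size and `kappa` value are illustrative assumptions:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def spde_realization(n, kappa, rng):
    """Solve (kappa^2 - Laplacian) u = white noise on an n x n grid;
    the correlation length of the resulting field scales like 1/kappa."""
    T = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n))
    L = sp.kron(sp.identity(n), T) + sp.kron(T, sp.identity(n))  # 5-point Laplacian
    A = (kappa**2) * sp.identity(n * n) + L      # modified Helmholtz operator
    w = rng.standard_normal(n * n)               # discretised Gaussian white noise
    return spsolve(A.tocsc(), w).reshape(n, n)

rng = np.random.default_rng(3)
field = spde_realization(64, kappa=0.5, rng=rng)
print(field.shape, field.std())
```

Replacing the identity in the Laplacian stencil with a spatially varying tensor is what lets the same construction run directly on deformed, faulted physical grids.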
Spartan Random Fields and Applications in Spatial Interpolation and Conditional Simulation
Geostatistics plays an important role in the estimation and simulation of reservoir parameters, and in forecasting reservoir production. This presentation focuses on Spartan spatial random fields (SSRFs), which provide a new framework for geostatistical applications. The idea motivating SSRF development is that spatial correlations can be represented by local interactions. A brief overview of SSRF mathematical properties will be given. SSRF variogram models, which in the isotropic case contain 3-4 parameters and thus offer more flexibility than standard models, will be presented. It will be argued that empirical variogram estimation is not necessary to infer SSRF parameters. SSRF spatial interpolation of scattered data will be discussed. It will be shown that approximate but closed-form, efficiently calculable expressions become possible. Hence, the numerical solution of kriging linear systems is avoided, leading to improved numerical complexity. Connections will be drawn to Markov random fields, statistical physics, minimum curvature estimation, and local random fields. An SSRF application to 2D conditional simulation will be presented. Finally, directions for the future development of SSRFs will be discussed, including extensions to non-Gaussian data distributions using discretized random field models with "spin" interactions.
Isometric Unfolding of Stratigraphic Grid Units for Accurate Property Populating – Mathematical Concepts
Authors: S. Horna, C. Bennis, H. Borouchaki, C. Delage and J.F. Rainaud

In traditional methods used to populate stratigraphic units, the distortions can be very significant and affect the setting of the static and dynamic parameters necessary for reservoir simulation, and hence the simulation results. These distortions result from the mapping between the original curvilinear stratigraphic grid and the intermediate Cartesian grid in which the property population is processed. To minimize the deformation and improve the populating process, we propose a new isometric unfolding process based on the minimization of the elastic deformation tensor. This method can be applied to every type of deposit: horizontal, parallel to top, parallel to bottom, or proportional. Starting from a structural model defined on a coordinate-line grid, the user chooses a reference iso-chronological level represented by a triangulated surface. The contacts between this surface and fault surfaces are explicitly extracted as coincident edges. These coincident edges are used to constrain an unfolding process that, while respecting the above constraints, joins the fault lips opened by tectonic events. In this paper we focus on the mathematical concepts of the algorithms used in the whole unfolding process and present some results through actual case studies.
History Matching of a Stochastic Multifractal Subseismic Fault Model
Authors: M. Verscheure, A. Fourno and J.P. Chilès

Many geostatistical methods have been developed to generate realistic models of subseismic fault networks. The problem is that these models are difficult to history match. In this work, we present an original methodology in which history matching is performed through a modification of fault positions. We first propose a multifractal methodology to generate the faults; the method has been specifically developed to allow history matching. The model parameters are derived from the analysis of the seismic faults. A stochastic algorithm is used to generate 3D subseismic fault realizations that are constrained to the seismic faults. The fracture network is then discretized on a dual-porosity simulation grid. Equivalent flow parameters are computed using an analytical method. Finally, full-field simulations are performed using a multiphase flow simulator. We then introduce a method to gradually change the locations of faults while preserving multifractal properties. Changes in locations are driven by a reduced number of parameters. These parameters are gradually modified to optimize the geometry of the realization and compel the initial fault model to reproduce the hydrodynamic behaviour observed in the field. The potential of the methodology is demonstrated on a 3D case study.
Fault Displacement Modelling Using 3D Vector Fields
Authors: P. Røe, F. Georgsen, A.R. Syversveen and O. Lia

In history matching and sensitivity analysis it is useful to study the effect of changing the position of horizons near faults, and the resulting facies juxtaposition due to changes in fault displacement. A new algorithm for calculating a 3D displacement field has been developed. It is applicable to a wide range of faults owing to a flexible representation, and makes it possible to apply this field to change the displacement and thereby move horizons. The fault is represented as a surface in a local coordinate system defined by a centre point and dip and strike angles. The displacement of points associated with the fault outside the fault surface is described by a 3D vector field. The displacement on the fault surface can be found from the intersection lines between horizons and the fault surface (fault lines). Away from the fault surface the displacement field is defined by a monotonically decreasing function. The displacement of the fault can be changed by a user-defined factor; the whole displacement field is then changed, and points on horizons around the fault are moved by applying the modified displacement field. The interaction between faults influencing the same points is handled by truncation rules.
Quantifying the Impact of Reservoir Heterogeneity on Recovery Using Shear and Vorticity
Authors: B. Rashid, A.L. Bal, G.J.J. Williams and A.H. Muggeridge

We have used the vorticity of the displacement velocity, as defined by Heller (1966), to derive dimensionless numbers that quantify the relative impact of viscosity ratio, gravity, diffusion and dispersion, and permeability heterogeneity on reservoir flow behaviour. We have used this approach to introduce a new objective measure of the impact of permeability and porosity heterogeneity on reservoir flow behaviour. Buoyancy forces are quantified using a gravity-to-viscous ratio (G), and diffusion/dispersion using the transverse dispersion number. Detailed simulation of first-contact-miscible gas/solvent displacements through a range of geologically realistic reservoir models is used to show that the new heterogeneity number, in conjunction with the dimensionless numbers, provides meaningful results for real, nonlinear reservoir flows. This study goes some way towards developing a unified mathematical framework to determine under which flow conditions reservoir heterogeneity becomes more important than other physical processes.
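The underlying vorticity diagnostic is straightforward to compute on a gridded velocity field; for 2-D flow it reduces to the scalar omega = dv/dx - du/dy. The uniform-shear example below is a toy illustration with unit grid spacing, not a reservoir displacement.

```python
import numpy as np

def vorticity(u, v, dx=1.0, dy=1.0):
    """Scalar vorticity of a 2-D velocity field via central differences."""
    dvdx = np.gradient(v, dx, axis=1)
    dudy = np.gradient(u, dy, axis=0)
    return dvdx - dudy

y, x = np.mgrid[0:32, 0:32].astype(float)
u, v = y.copy(), np.zeros_like(y)          # uniform shear flow: u = y, v = 0
omega = vorticity(u, v)
print(float(omega.mean()))                  # → -1.0 (constant shear vorticity)
```

Heterogeneity-driven flow channelling shows up as localised vorticity in exactly this quantity, which is what the derived dimensionless numbers compare against gravity and dispersion effects.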
A New Parameterization Technique for the Calibration of Facies Proportions in History-matching Processes
Authors: G. Enchery, F. Roggero, M. Le Ravalec, E. Tillier and V. Gervais

In this paper, we propose a parameterization technique to automatically adjust facies proportions in a history-matching process. Since facies proportions are usually non-stationary, they involve a number of parameters proportional to the number of grid blocks, which cannot be handled in practice. Our technique allows facies proportions to be varied from a reduced number of parameters. The method depends on the ratio of average proportions between facies groups whose a priori proportions are poorly known. Changes in the ratio values are driven by the optimization process and induce variations in facies proportions over the target area. Two different algorithms are introduced, depending on the geological environment. The simplest and most efficient one generates a discontinuity between the target area and the embedding environment, while the second ensures continuity. One advantage of both techniques is that they do not depend on the stochastic method used to simulate the facies realizations. To stress the potential of this approach, we recap the results obtained for a faulted turbidite field located offshore Angola and show how this technique helped improve the calibration of 4D seismic data.
A New Method for Updating and Assessing Validity of Prior Reservoir Models
Authors: G.M. Mittermeir, M.T. Amiry and Z.E. Heinemann

The paper presents a new approach for updating reservoir models that have previously been history matched. The main concern regarding any reservoir model is maintaining its predictive capability. Practical experience shows that this capability is lost very quickly, so updating prior simulation models goes along with significant changes to the model itself. Often this requires not only tuning of the aquifers but also modifications of the reservoir properties. Based on the results, it is decided whether the model is still valid or an entire update is necessary. A newly developed method called the Target Pressure and Phase Method (TPPM) provides a means to significantly improve this process. TPPM not only speeds up the update procedure but also reliably assesses the validity of the model. In the conventional history-matching workflow, pressures, water cut and GOR are calculated and the reservoir model is changed until it fits the history. TPPM instead takes the pressure measurements and the oil/gas/water production rates as input and determines the aquifer parameters and well conditions needed to match the given historical measurements. The paper presents the basic idea as well as its applicability to a field case.
Mathematical Reformulation of Highly Nonlinear Large-scale Inverse Problem in Metric Space
Authors: K. Park, C. Scheidt and J. Caers

Solving a highly nonlinear, and often time-dependent, large-scale inverse problem remains challenging due to the computing costs, the various types and scales of data, and the nonlinearity between the model and the data. Based on the observation that in most flow inverse problems the (geological) model is very complex while the response under investigation (the data) is of much lower dimension, we propose to reformulate the inverse problem in metric space. Once the distance between any two (geological) model realizations is known, any such model (whether a structure or a property) can be mapped into a metric space, which is non-dimensional but can be represented by its projection onto a low-dimensional (typically 2D-5D) space through multidimensional scaling. Knowing the differences between the responses of the models and the data (the response from the true Earth), the location of this truth can be identified. The inverse problem is therefore reformulated as finding the ensemble of models that map onto the location of the truth. We propose a methodology to solve this reformulated inverse problem by a series of mathematical techniques. We apply the proposed method to a realistic reservoir history-matching problem, which contains various types of constraints and requires large computational costs.
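The mapping step, from pairwise distances between realizations to a low-dimensional representation, can be sketched with classical multidimensional scaling; the random 2-D point cloud below is a stand-in for model responses, not a reservoir data set.

```python
import numpy as np

def classical_mds(D, dim=2):
    """Embed points in `dim` dimensions from a distance matrix via the
    eigendecomposition of the double-centred squared-distance matrix."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centring matrix
    B = -0.5 * J @ (D**2) @ J                    # double-centred Gram matrix
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:dim]           # keep the largest eigenvalues
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

rng = np.random.default_rng(4)
pts = rng.standard_normal((30, 2))               # "true" 2-D configuration
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
emb = classical_mds(D, dim=2)
D_emb = np.linalg.norm(emb[:, None] - emb[None, :], axis=-1)
print(np.allclose(D, D_emb))                     # Euclidean distances are reproduced
```

For genuinely Euclidean distances the embedding is exact; for the flow-response distances of the abstract it is the best low-rank approximation, which is all the reformulated search needs.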
Monitoring 3D Reservoirs from CSEM Data Using a Level Set Technique
Authors: O. Dorn and R. Villegas

A new shape reconstruction technique is presented for Controlled Source ElectroMagnetic (CSEM) imaging of subsurface structures such as reservoirs. The technique can in principle be used both for exploration and for monitoring of active reservoirs. The main characteristic of our algorithm is that it incorporates certain types of a priori information into the estimation, which has the potential to increase resolution. In particular, we assume that only the shape and topology of the structures need to be determined from the data. The background is assumed to be known from alternative tools such as seismics, well logs or geological information. The full system of Maxwell's equations in 3D is employed for the inversion and combined with a level set technique for representing shapes. Numerical examples are presented which show that very good reconstructions can be obtained with this novel reconstruction technique from CSEM data for realistic simulated situations.
Structural Uncertainty Modelling and Updating Using the Ensemble Kalman Filter – Parameters and State Consistency
Authors: A. Seiler, S.I. Aanonsen and G. Evensen

In previous work, the authors presented an elastic grid approach to handle horizon and fault geometric uncertainties in the reservoir model and established an assisted history-matching workflow for updating the structural model using the EnKF. In this paper we consider a gas-flooding experiment in which the flow path is controlled by the reservoir top structure. Uncertainties in the top horizon are accounted for and updated by sequential assimilation of production data using the Ensemble Kalman Filter (EnKF). The result is an ensemble of history-matched models with reduced and quantified uncertainty in the top horizon. The updated estimate of the top horizon captures the main features of the reference structure. We focus on the consistency between the reservoir top horizon and the gas saturation, as it has been shown that when the assumption of Gaussian priors in the EnKF is violated, the EnKF update scheme may lead to inconsistencies between the state and the model parameters. We study in detail the sequential updating of the gas saturation and investigate whether the updated state is a better description of the reservoir condition, given the uncertainties in the reservoir description and modelling errors, or whether a reinitialization with the updated parameters is needed to resolve the inconsistency issues before predictions.
Assessing the Impact of Different Types of Time-lapse Seismic Data on Permeability Estimation
Authors: T. Feng and T. Mannseth

We consider the impact of using time-lapse seismic data in addition to production data for permeability estimation in a porous medium with multiphase fluid flow, such as a petroleum reservoir under water-assisted production. Since modeling seismic wave propagation in addition to modeling fluid flow in the reservoir is quite involved, it is assumed that the time-lapse seismic data have already been inverted into differences in elastic properties, or even fluid-saturation and pressure differences (pseudo-seismic data). Because an inversion process often leads to considerable error growth, we consider pseudo-seismic data with large uncertainties. The impact of pseudo-seismic data is assessed through permeability estimation with and without such data, and through application of some uncertainty measures for the estimated parameters. A predictor-corrector technique is used for the parameter estimations. The predictor produces a coarse-scale permeability estimate, using only dynamic data. The corrector downscales the predictor estimate into a more smoothly varying field, using also the prior model. A successful final corrector result will thus be consistent with the prior model and reconcile the dynamic data. The predictor-corrector approach can reveal what parameter resolution can be achieved in a stable manner with different data types, since it can be applied without an explicit regularization term in the objective function. In this work, the impact of pseudo-seismic data is investigated based on both coarse-scale predictor solutions and fine-scale predictor-corrector solutions. The numerical examples clearly indicate that the permeability estimation problem is stabilized at a higher level of resolution when pseudo-seismic data are applied in addition to production data, even if the pseudo-seismic data have large associated uncertainties. Several types of pseudo-seismic data, such as acoustic-impedance differences, fluid-saturation differences, and pressure differences, are tested, and the resulting estimates are compared.
Comparing the Adaptive Gaussian Mixture Filter with the Ensemble Kalman Filter
Authors: A.S. Stordal, H.A. Karlsen, G. Nævdal, H.J. Skaug and B. Vallès

In recent years the ensemble Kalman filter (EnKF) and related versions have become a very popular tool for reservoir characterization. The EnKF presents the history-matching result and its uncertainty in terms of an ensemble of models generated from a prior model and updated sequentially in time to account for the measurements. From a statistical point of view, the optimal solution to the history-matching problem is the posterior distribution of the reservoir parameters given all the measurements. However, since the EnKF update is linear, it has severe limitations when the posterior distribution to be estimated is multimodal and/or strongly skewed due to nonlinearity of the system. As standard sequential Monte Carlo (SMC) techniques are too expensive for large models, several methods have been proposed to combine the EnKF with SMC methods. In this paper we apply, for the first time, the recently proposed adaptive Gaussian mixture filter (AGM), introduced by Stordal et al. (2009), to a reservoir model and compare the results with the traditional EnKF. The AGM loosens the requirement of a nearly linear/Gaussian model by combining a relaxed EnKF update with an importance-weight resampling approach, thereby taking advantage of some of the higher-order moment information, as in standard SMC methods such as the particle filter, whilst keeping the robustness of the EnKF. The reservoir is a 2D synthetic model with 4 producers and 1 injector; the permeability and porosity fields are estimated. Although both methods produce good history matches, the results show that the AGM better preserves the geology of the prior model. Moreover, the final updated fields from the AGM are closer to the truth than the corresponding EnKF results.
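For reference, the linear EnKF update that the AGM relaxes can be sketched on a scalar toy problem (perturbed-observation variant; the prior, the identity forward model and the noise level are illustrative assumptions, not the paper's reservoir setup):

```python
import numpy as np

rng = np.random.default_rng(5)
n_ens, obs_err, truth = 500, 0.5, 2.0

ens = rng.normal(0.0, 2.0, n_ens)            # prior ensemble of the parameter
obs = truth + rng.normal(0.0, obs_err)       # one noisy measurement
h_ens = ens                                  # identity forward model, H(m) = m

# Kalman gain built from ensemble statistics
gain = np.cov(ens, h_ens)[0, 1] / (np.var(h_ens, ddof=1) + obs_err**2)
perturbed_obs = obs + rng.normal(0.0, obs_err, n_ens)   # perturbed observations
ens_post = ens + gain * (perturbed_obs - h_ens)
print(ens_post.mean(), ens_post.std())       # mean shifts toward the truth, spread shrinks
```

Every member receives the same linear shift, which is exactly why a multimodal or skewed posterior cannot be represented; the AGM's relaxed update plus importance-weight resampling is designed to recover some of that non-Gaussian structure.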
Population MCMC Methods for History Matching and Uncertainty Quantification
Authors: L. Mohamed, B. Calderhead, M. Filippone, M. Christie and M. Girolami

This paper presents the application of a population MCMC technique to generate history-matched models. The technique has been developed and successfully adopted in challenging domains such as computational biology, but has not yet seen application in reservoir modelling. In population MCMC, multiple Markov chains are run on a set of response surfaces that form a bridge from the prior to the posterior. These response surfaces are formed from the product of the prior with the likelihood raised to a varying power less than one. The chains exchange positions, with the probability of a swap governed by a standard Metropolis accept/reject step, which allows large steps to be taken with high probability. We show results of population MCMC on the IC Fault Model, a simple 3-parameter model that is known to have a highly irregular misfit surface and hence be difficult to match. Our results show that population MCMC is able to generate samples from the complex, multi-modal posterior probability surface of the IC Fault Model very effectively. By comparison, previous results from stochastic sampling algorithms often focus on only part of the region of high posterior probability, depending on algorithm settings and starting points.
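The bridge of response surfaces is the parallel-tempering form of population MCMC, which can be sketched on a 1-D bimodal toy target (flat prior, hand-picked temperature ladder; both are illustrative assumptions, not the paper's IC Fault setup):

```python
import numpy as np

rng = np.random.default_rng(6)

def log_like(x):
    # bimodal likelihood with modes at -3 and +3, a stand-in for an
    # irregular multi-modal misfit surface; the prior is taken flat
    return np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

betas = np.array([0.05, 0.2, 0.5, 1.0])   # likelihood powers; last chain = posterior
x = np.zeros(len(betas))
samples = []
for _ in range(20000):
    # within-chain random-walk Metropolis on each tempered surface
    for i, b in enumerate(betas):
        prop = x[i] + rng.normal(0.0, 1.5)
        if np.log(rng.random()) < b * (log_like(prop) - log_like(x[i])):
            x[i] = prop
    # propose swapping one random adjacent pair of chains
    j = rng.integers(len(betas) - 1)
    d = (betas[j] - betas[j + 1]) * (log_like(x[j + 1]) - log_like(x[j]))
    if np.log(rng.random()) < d:
        x[j], x[j + 1] = x[j + 1], x[j]
    samples.append(x[-1])                 # keep only the untempered chain

samples = np.array(samples[5000:])        # discard burn-in
print((samples < 0).mean())               # both modes are visited
```

The hot chains (small powers) see a nearly flat surface and cross between modes freely; accepted swaps then carry those mode changes up to the untempered chain, which a single Metropolis chain would rarely achieve.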
Adaptive Pilot-point Strategy for History Matching of 3D Seismic Data
Authors: S. Da Veiga and V. Gervais

Matching seismic data in assisted history matching can be a challenging task. Local parameterization techniques such as pilot-point or gradual deformation methods can be introduced. While information related to the seismic data is sometimes used to initialize such local methods, it is no longer used in further steps, resulting in non-adaptive procedures. We propose to use the residual maps related to the 3D seismic data to drive the generation of pilot-point locations in an adaptive way. Residual maps are treated as probability density functions for the pilot-point locations. Once generated, these locations are changed according to a gradual deformation method applied to the random numbers associated with the locations. Finally, the optimal locations and values provide a reservoir model that is used to build a new residual map, and the procedure is repeated until convergence. A straightforward application of this technique leads to successive optimizations of a discontinuous objective function, since locations jump from one high-probability region to another. Consequently, we develop an innovative pre-treatment of the residual maps based on Gaussian mixtures. The combination of these steps leads to a highly flexible perturbation technique for 3D seismic matching.
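The residual-map-as-density step can be sketched directly with NumPy; the synthetic residual blob, its location and the number of pilot points below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)
ny = nx = 50
y, x = np.mgrid[0:ny, 0:nx]
# synthetic seismic residual map: one Gaussian blob of mismatch near (35, 15)
residual = np.exp(-((x - 35) ** 2 + (y - 15) ** 2) / 50.0)

pdf = residual / residual.sum()                   # residual map as a density
flat_idx = rng.choice(pdf.size, size=20, replace=False, p=pdf.ravel())
py, px = np.unravel_index(flat_idx, pdf.shape)    # pilot-point grid coordinates
print(px.mean(), py.mean())                       # points cluster near the mismatch
```

Pilot points land preferentially where the seismic mismatch is largest, which is what makes the perturbation adaptive from one outer iteration to the next.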