ECMOR XIII - 13th European Conference on the Mathematics of Oil Recovery
- Conference date: 10 Sep 2012 - 13 Sep 2012
- Location: Biarritz, France
- ISBN: 978-90-73834-30-9
- Published: 10 September 2012
New Formulation of the Objective Function for Better Incorporation of 4D Seismic Data into Reservoir Models
Authors: R. Derfoul, S. Da Veiga, C. Gout and C. Le Guyader
To build consistent reservoir models, 4D seismic data are an invaluable source of information on fluid displacements and geology over extensive areas of the reservoir. In this paper, we focus on integrating such data to improve the optimal model obtained in a history matching process. This is a challenging task that hinges on a proper definition of the objective function, which computes the discrepancy between observed data and the responses computed by the reservoir model. Classical formulations based on the least-squares mismatch are not suited to complex, noisy and voluminous data such as 4D inverted seismic data. The main focus of this paper is to define an experimental methodology to compare and classify seismic matching methods. In particular, we propose an efficient algorithm that focuses on the main trends in a seismic cube. This new algorithm is investigated in the context of seismic data, and its potential is demonstrated on several history matching reservoir examples.
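The classical least-squares mismatch that the abstract criticizes can be written in a few lines; the cube shape, noise level and standard deviation below are illustrative only, not from the paper:

```python
import numpy as np

def least_squares_mismatch(d_obs, d_sim, sigma=1.0):
    """Classical least-squares objective: half the sum of squared residuals,
    normalized by an assumed measurement standard deviation sigma."""
    r = (np.asarray(d_obs) - np.asarray(d_sim)) / sigma
    return 0.5 * float(np.sum(r * r))

# Toy 4D seismic cubes (illustrative shapes only).
rng = np.random.default_rng(0)
d_obs = rng.normal(size=(4, 5, 6))                   # "observed" inverted cube
d_sim = d_obs + 0.1 * rng.normal(size=d_obs.shape)   # simulated response

J = least_squares_mismatch(d_obs, d_sim, sigma=0.1)
```

Its sensitivity to noise is visible directly: every noisy voxel contributes quadratically, which is precisely why trend-focused alternatives are attractive for 4D data.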
Time Lapse Inversion Workflow Constrained by Reservoir Grid Parameterization
By P. Thore
Time-lapse seismic provides key information for assisted history matching (AHM). Qualitatively, geo-bodies extracted from 4D data reflect the front of a given flow event (e.g. the water flood due to an injector). But 4D data can also provide quantitative information on dynamic parameters. We propose a novel workflow for the quantitative use of geophysical data in AHM, which consists of three steps: (1) model-based inversion that keeps the layer parameterization of the reservoir grid; this parameterization introduces the high and low frequencies missing from the seismic bandwidth; (2) pressure and saturation inversion constrained by dynamic information and handling uncertainty in both data and model; (3) direct mapping of seismically derived information into the reservoir grid (without any time-to-depth conversion), which solves the main problem inherent in the vertical change of support from the (regular) seismic grid to the (irregular, layer-based) reservoir grid. Our paper is illustrated with real data examples.
Optimal Choice of a Surveillance Operation Using Information Theory
Authors: A.C. Reynolds and D.H. Le
We consider the problem of choosing among a suite of potential reservoir surveillance operations. We frame the problem in terms of two questions: (1) Which surveillance operation is the most useful? (2) What is the expected reduction in uncertainty in a reservoir variable J (e.g. cumulative oil production) if we were to conduct each surveillance operation and history-match the collected data? Note that the objective is to answer these questions given an uncertain reservoir description and without any actual measurements. We propose a procedure based on information theory to answer these questions. Question 1 is answered by calculating the mutual information between J and the vector of observed data. Question 2 is answered by estimating the expected standard deviation (or P90-P10) of J in the posterior model from the conditional entropy of J. We apply the proposed method to two simple problems: a nonlinear toy problem and a simple water flooding problem. The results are verified by an exhaustive history matching procedure, which is reasonably rigorous but computationally very demanding. We find that the mutual information approach is a fast and reliable alternative to the history matching approach.
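Under a joint-Gaussian simplification (a toy stand-in for the authors' full information-theoretic procedure), the mutual information between a scalar J and a scalar measurement reduces to a correlation formula, and an informative measurement scores higher than an uninformative one:

```python
import numpy as np

def gaussian_mutual_information(j_samples, d_samples):
    """Mutual information between scalar J and scalar data d under a
    joint-Gaussian assumption: I(J; d) = -0.5 * log(1 - rho^2)."""
    rho = np.corrcoef(j_samples, d_samples)[0, 1]
    return -0.5 * np.log(1.0 - rho**2)

rng = np.random.default_rng(1)
j = rng.normal(size=5000)                        # e.g. cumulative oil production
d_informative = j + 0.5 * rng.normal(size=5000)  # data correlated with J
d_useless = rng.normal(size=5000)                # data independent of J

mi_good = gaussian_mutual_information(j, d_informative)
mi_bad = gaussian_mutual_information(j, d_useless)
```

Ranking candidate surveillance operations then amounts to comparing their mutual-information scores, with no history matching required.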
Application of the Adaptive Gaussian Mixture Filter to History Match a Real Field Case
Authors: R. Valestrand, G. Nævdal and A.S. Stordal
Over the last decade the ensemble Kalman filter (EnKF) has attracted attention as a promising method for solving the reservoir history matching problem: updating model parameters so that the model output matches the measured production data. The method possesses unique qualities: it provides real-time updates and uncertainty quantification of the estimate, and it can estimate any physical property at hand. The method does, however, have its limitations; in particular, it is derived under an assumption of Gaussianity. A recent method proposed to improve upon the original EnKF is the adaptive Gaussian mixture filter (AGM). The AGM relaxes the requirements of a linear and Gaussian model by making smaller linear updates and including importance weights associated with each ensemble member, at a computational cost as low as that of the EnKF. In this paper we present results where the AGM algorithm is combined with localization. To validate the performance of the AGM, the results are compared with the EnKF, with and without localization. From the results, we are able to distinguish the performance of the different filters. In particular, all the methods provide a good history match, but the AGM stands out by better honoring the original geostatistics.
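The importance-weight bookkeeping that the AGM attaches to ensemble members can be sketched as follows; this is only the weighted-statistics step, not the filter itself, and the ensemble values are illustrative:

```python
import numpy as np

def weighted_posterior_stats(ensemble, log_weights):
    """Posterior mean, std and effective ensemble size of a 1-D ensemble
    carrying importance weights (log-weights are shifted for numerical
    stability before exponentiation)."""
    w = np.exp(log_weights - np.max(log_weights))
    w /= w.sum()
    mean = np.sum(w * ensemble)
    std = np.sqrt(np.sum(w * (ensemble - mean) ** 2))
    n_eff = 1.0 / np.sum(w**2)     # effective ensemble size
    return mean, std, n_eff

ensemble = np.array([0.0, 1.0, 2.0, 3.0])
log_w = np.zeros(4)                # uniform weights reduce to plain statistics
m, s, neff = weighted_posterior_stats(ensemble, log_w)
```

With uniform weights this reduces to the usual ensemble statistics; skewed weights shrink the effective ensemble size, which is what the AGM monitors when adapting.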
Neural Networks and their Derivatives for History Matching and other Seismic, Basin and Reservoir Optimization Problems
Authors: J. Bruyelle and D.R. Guérillot
Description: In geosciences, the complex forward problems met in geophysics, petroleum system analysis and reservoir engineering often have to be replaced by proxies, and these proxies are used in optimization problems. For instance, history matching of observed field data requires such a large number of reservoir simulation runs (especially when using geostatistical geological models) that it is often impossible to use the full reservoir simulator. Several techniques have therefore been proposed to mimic the reservoir simulations using proxies. Owing to the use of experimental-design approaches, most authors propose second-order polynomials. In this paper we demonstrate that: (1) neural networks can also represent second-order polynomials, so a neural network proxy is much more flexible and adaptable to the nonlinearity of the problem to be solved; (2) first-order and second-order derivatives of the neural network can be obtained, providing gradients and Hessians for optimizers. For the first point, a complete description of a neural network equivalent to a second-order polynomial is given. For inverse problems met in seismic inversion, well-by-well production data, optimal well locations, source rock generation, etc., gradient methods are most often used to find an optimal solution. The paper describes how to calculate these gradients from a neural network built as a proxy. When needed, the Hessian can also be obtained from the neural network approach. Application: On a real case study, the ability of neural networks to reproduce complex phenomena (water-cuts, production rates, etc.) is shown. Comparisons with second-order polynomials (and kriging methods) demonstrate the superiority of the neural network approach as soon as nonlinear behavior is present in the responses of the simulator.
The gradients and the Hessian of the neural network are compared to those of the real response function. Results and conclusions: (1) neural networks can advantageously replace polynomial and kriging approaches as proxies for inverse problems and uncertainty analysis; (2) a neural network yielding a bilinear polynomial is given explicitly; (3) gradients and Hessians of a neural network can be calculated and used by optimizers. Keywords: Proxies, History Matching, Gradient Methods, Optimizers, Basin Modelling, Seismic Inversion, Uncertainty Analysis, Hessian
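The first claim — that a network can reproduce a second-order polynomial while exposing closed-form gradients and Hessians — can be illustrated with a one-hidden-layer net using squared activations. This is a sketch of the idea only; the weights below are illustrative and not the paper's construction:

```python
import numpy as np

class QuadraticNet:
    """One hidden layer with squared activation:
    y(x) = sum_i c_i * (a_i . x + b_i)^2 + d.
    Such a net is exactly a second-order polynomial in x, and its gradient
    and Hessian are available analytically."""
    def __init__(self, A, b, c, d):
        self.A, self.b, self.c, self.d = A, b, c, d   # A has shape (hidden, n)

    def value(self, x):
        z = self.A @ x + self.b
        return float(self.c @ (z * z) + self.d)

    def gradient(self, x):
        # dy/dx = sum_i 2 c_i (a_i . x + b_i) a_i
        z = self.A @ x + self.b
        return 2.0 * self.A.T @ (self.c * z)

    def hessian(self, x):
        # constant Hessian: sum_i 2 c_i a_i a_i^T
        return 2.0 * self.A.T @ (self.c[:, None] * self.A)

# Two hidden units reproducing y = x1^2 + x2^2 (illustrative weights).
net = QuadraticNet(A=np.eye(2), b=np.zeros(2), c=np.ones(2), d=0.0)
x = np.array([1.0, 2.0])   # value 5, gradient 2x, Hessian 2I
```

An optimizer can consume `gradient` and `hessian` directly, which is the point of building the proxy this way.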
North Sea Chalk Reservoir – Seismic History Matching Workflow
Authors: H. Sudan, E. Tolstukhin and A. Janssen
This presentation outlines an integrated workflow that incorporates 4D seismic data into the North Sea chalk reservoir history matching process. Successful application and the associated benefits of the workflow are also presented. A number of 4D seismic surveys have been acquired over this field between 1989 and 2008, and these data are becoming a quantitative tool for describing the spatial distribution of reservoir properties and compaction. The seismic monitoring data are used to optimize the waterflood by providing insights into water movement and subsequently to improve infill well placement. Reservoir depletion and water injection in this field lead to rock compaction and fluid substitution. These changes are revealed in space and time through 4D seismic differences. Inconsistencies between predicted 4D differences (calculated from reservoir model output) and actual 4D differences are therefore used to identify reservoir model shortcomings. This process is captured using the following workflow: prepare and upscale a geologic model; simulate fluid flow and the associated rock physics using a reservoir model; generate a synthetic 4D seismic response from the fluid and rock-physics forecasts; and update the reservoir model to better match actual production data and 4D seismic observations. This Seismic History Matching (SHM) workflow employs rock-physics modeling to quantitatively constrain the reservoir model and develop a simulated 4D seismic response. Different parameterization techniques and seismic misfit formulations were validated and used to calibrate and update the reservoir model. The workflow updates the parameters in a closed loop by minimizing a misfit function with a customized particle swarm optimization algorithm.
In summary, the Seismic History Matching workflow is a multi-disciplinary process that requires strong collaboration between geological, geomechanical, geophysical and reservoir engineering disciplines to optimize reservoir management.
Efficient Solution of the Optimization Problem in Model-reduced Gradient-based History Matching
Authors: S. Szklarz, M. Rojas and M. Kaleta
Adjusting parameters in reservoir models by minimizing the discrepancy between the model's predictions and actual measurements is a popular approach known as history matching. One of the most effective techniques is gradient-based history matching. For reservoir models, the number of grid blocks, and therefore the size of the problem, can become very large. In recent years, model-order reduction techniques, which aim to replace large, complex dynamic systems with lower-dimensional models, have been incorporated into history matching. In both gradient-based and model-reduced gradient-based history matching, first-order optimization methods are used to minimize the mismatch between simulated and observed well-production data. In this work, we investigate the performance of several optimization methods on the minimization problem in model-reduced gradient-based history matching. The methods were tested on the history matching of a small reservoir model with synthetic measurements. Our results show that fast first-order techniques such as the spectral projected gradient method can compete with the popular quasi-Newton BFGS approach.
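The spectral projected gradient method mentioned above combines a Barzilai-Borwein step length with projection onto the feasible set. A minimal monotone sketch on a box-constrained toy mismatch (no line search, which suffices for this convex example; the problem is illustrative, not from the paper):

```python
import numpy as np

def spg_box(grad, x0, lo, hi, iters=200, alpha=1.0):
    """Spectral projected gradient with Barzilai-Borwein (BB1) step length,
    projecting iterates onto box constraints [lo, hi]."""
    proj = lambda x: np.clip(x, lo, hi)
    x = proj(x0)
    g = grad(x)
    for _ in range(iters):
        x_new = proj(x - alpha * g)
        s = x_new - x
        if np.linalg.norm(s) < 1e-12:       # projected step vanished: done
            break
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        alpha = (s @ s) / sy if sy > 1e-12 else 1.0   # BB1 spectral step
        x, g = x_new, g_new
    return x

# Toy mismatch ||x - t||^2 with the target t partly outside the box.
t = np.array([2.0, -3.0, 0.5])
grad = lambda x: 2.0 * (x - t)
x_star = spg_box(grad, np.zeros(3), lo=-1.0, hi=1.0)
```

The minimizer is simply the projection of `t` onto the box, which the iteration reaches in a handful of steps.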
Deterministic Linear Bayesian Updating of State and Model Parameters
Authors: O. Pajonk, B.V. Rosić and H.G. Matthies
Bayesian estimation has become an important topic for inverse problems in the context of hydrocarbon recovery. The conceptual and computational advantages of direct integration with uncertainty quantification workflows are appealing. In particular, linear Bayesian techniques like the ensemble Kalman filter (EnKF) have been used successfully in numerous cases. However, such techniques have difficulties in some applications, often caused by sampling errors, a limited ensemble size, or the sometimes large number of required samples. In this work we present and discuss a closely related linear Bayesian technique which is based on orthogonal expansions of the stochastic spectrum of the involved random variables and random fields. Being essentially a family of fully deterministic implementations of the well-known projection theorem of Hilbert spaces, the technique is conceptually simple, yet powerful. Since these methods are fully deterministic, they avoid all sampling errors. First combined parameter and state estimation results with a low-dimensional chaotic model are presented, using a specific choice of orthogonal expansion. These are compared to results obtained with the ensemble square root filter (EnSRF), a close relative of these spectral estimation methods. Challenges and opportunities for application to the inverse problem of identification for hydrocarbon reservoirs are discussed.
A New Global Upscaling Technique for 3D Unstructured Grids
Authors: M. Karimi-Fard and L.J. Durlofsky
New procedures for unstructured coarse-model generation are presented and applied. The underlying fine-grid model is considered to be unstructured, and the coarse-model cells are defined as groupings of fine-grid cells. The key flow quantity that must be computed for the coarse model is the upscaled transmissibility for each cell-to-cell connection. We introduce a global upscaling procedure for this computation. The method first requires several (minimum of three) global single-phase flow solutions. Appropriately defined linear combinations of these solutions are used to compute each upscaled transmissibility. This approach circumvents some of the limitations of existing (local and global) upscaling procedures. It also enables transmissibility to be quickly computed for a number of different coarse grids without performing any additional pressure solutions. Results are presented for an idealized two-phase flow problem. The fine grid contains nearly 200,000 cells, and coarse models of varying resolution are considered. Accurate results for total injector-producer flow rate are observed for all grid-resolution levels for the three different well configurations considered. Oil rate as a function of time is shown to improve in accuracy with increasing resolution, and is quite accurate for a model of about 10,000 cells.
Grid Adaption for Upscaling and Multiscale Method
Authors: K.-A. Lie, J.R. Natvig, S. Krogstad, Y. Yang and X.H. Wu
A Dirichlet-Neumann representation (DNR) method was recently proposed for upscaling. The method expresses coarse fluxes as linear functions of multiple discrete pressure values along the boundary and at the center of each coarse block. The number of pressure values can be adjusted to improve the accuracy of simulation results, and in particular to resolve important fine-scale details. The improvement over existing approaches is substantial, especially for reservoirs that contain high-permeability streaks or channels. Multiscale methods obtain fine-scale fluxes or pressures at the cost of solving a coarsened problem, but can also be utilized for flexible upscaling. We compare the DNR method and a multiscale mixed finite-element (MsMFE) method. Both can be expressed in mixed form, with local stiffness matrices obtained as inner products of basis functions with fine-scale subresolution determined from local flow problems. Piecewise linear Dirichlet boundary conditions are used for DNR and piecewise constant Neumann conditions for MsMFE. Adding discrete pressure points in the DNR method corresponds to subdividing coarse faces and hence increasing the number of basis functions in the MsMFE method. The methods show similar accuracy for 2D Cartesian cases, but the MsMFE method is more straightforward to formulate in 3D and implement for general grids.
Reduced-order Modeling for Thermal Recovery Processes
Authors: M.A.H. Rousset, C.K. Huang, H. Klie and L.J. Durlofsky
Thermal recovery typically entails higher costs than conventional oil recovery, so the application of computational optimization techniques may be beneficial. Optimization, however, requires many simulations, which incurs substantial computational cost. Here we apply a model-order reduction technique, which aims at large reductions in computational requirements. The technique considered, trajectory piecewise linearization (TPWL), entails the representation of new solutions in terms of linearizations around previously simulated (and saved) training solutions. The linearized representation is projected into a low-dimensional space, with the projection matrix constructed through proper orthogonal decomposition of solution `snapshots' generated in a training step. We consider two idealized problems, specifically primary production of oil driven by downhole heaters, and a simplified model for steam assisted gravity drainage, where water and steam are treated as a single `effective' phase. The strong temperature dependence of oil viscosity is included in both cases. TPWL test-case results for these systems demonstrate that the method can provide accurate predictions relative to full-order reference solutions. The overhead associated with TPWL model construction is equivalent to the computation time for several full-order simulations (the precise overhead depends on the number of training runs). Observed runtime speedups are very substantial -- over two orders of magnitude.
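The proper-orthogonal-decomposition step that builds the TPWL projection matrix can be sketched on its own; the synthetic snapshots below stand in for saved simulation states and are not from the paper:

```python
import numpy as np

def pod_basis(snapshots, energy=0.9999):
    """Proper orthogonal decomposition of a snapshot matrix (one state per
    column): take the SVD and keep enough left singular vectors to capture
    the requested fraction of snapshot 'energy' (squared singular values)."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    frac = np.cumsum(s**2) / np.sum(s**2)
    k = int(np.searchsorted(frac, energy)) + 1
    return U[:, :k]

# Synthetic snapshots lying in a 2-D subspace of R^50 (rank-2 by construction).
t = np.linspace(0.0, 1.0, 30)
phi1 = np.sin(np.linspace(0.0, np.pi, 50))
phi2 = np.cos(np.linspace(0.0, np.pi, 50))
X = np.outer(phi1, 1.0 + t) + np.outer(phi2, t**2)

Phi = pod_basis(X)                 # should recover a 2-column basis
x = X[:, 0]
x_rec = Phi @ (Phi.T @ x)          # project into the reduced space and back
err = np.linalg.norm(x - x_rec) / np.linalg.norm(x)
```

TPWL then works with the reduced coordinates `Phi.T @ x` instead of the full state, which is where the runtime savings come from.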
Enabling Optimal Production Strategies under Uncertainties via Non-Intrusive Model Reduction Methods
Authors: H. Klie, H. Chen, Q. Wang and K. Willcox
The present work proposes an alternative approach to generating nonlinear reduced-order models for optimization and control under uncertainty without explicit knowledge of all the equations governing the physics of the simulation. Hence, the proposed method is amenable to legacy simulation codes. To cope with the lack of physical information, in conjunction with the inherent curse of dimensionality associated with the number of parameter coefficients, control and state variables of the problem, we combine the projection operators obtained from the proper orthogonal decomposition with neural net interpolation. In this way, the proposed Black-Box Stencil Interpolation Method (BSIM) is capable of exploiting both spatial and temporal variable locality. The method can be seen as a competitive but non-intrusive alternative to the Trajectory Piece-Wise Linear (TPWL) method and the Discrete Empirical Interpolation Method (DEIM), both recently proposed in the literature. We illustrate the capabilities of BSIM on a suite of different black-oil and compositional field models subject to multiple well controls under geological uncertainty. We show that the results are comparable in accuracy to DEIM despite the non-intrusive character of BSIM.
Reservoir Management Using Two-stage Optimization with Streamline Simulation
Authors: T. Wen, M.R. Thiele, D. Echeverría Ciaurri, K. Aziz and Y. Ye
Waterflooding is a common secondary oil recovery process. Performance of waterfloods in mature fields with a significant number of wells can be improved with minimal infrastructure investment by optimizing injection/production rates of individual wells. However, a major bottleneck in the optimization framework is the large number of reservoir flow simulations often required. In this work we propose a new method based on streamline-derived information that significantly reduces these computational costs in addition to making use of the computational efficiency of streamline simulation itself. We seek to maximize the long-term net present value of a waterflood by determining optimal individual well rates, given an expected albeit uncertain oil price and a total fluid injection volume. We approach the optimization problem by decomposing it into two stages which can be implemented in a computationally efficient manner. The two-stage streamline-based optimization approach can be an effective technique when applied to reservoirs with a large number of wells in need of an efficient waterflooding strategy over a 5 to 15 year period.
Response Surface Approaches for Large Decision Trees: Decision Making Under Uncertainty
By H. Gross
Traditionally, the connection between simulation and decision analysis is made by using simulation outputs as inputs to decision algorithms. We propose to use simulation input uncertainties directly in decision algorithms by extending existing probabilistic reservoir simulation tools (experimental design, proxy models) and existing decision analysis tools (decision trees, Pareto fronts). This approach addresses questions on field development options under uncertainty (facility sizing, completion decisions or data collection campaigns). When linking probabilistic simulation with decision analysis, three practical problems arose. First, the number of reservoir uncertainties creates huge decision trees. We solve this problem by creating composite solutions, with some branches evaluated exhaustively and others evaluated with calibrated response surfaces. Second, the frequently made assumption of independence between uncertainties was too restrictive for practical use. We therefore specify probabilities on all uncertainty branches. Last, we must handle multiple decision drivers and understand the consequences of decisions on several metrics. We have therefore implemented multi-objective optimization capabilities. The technique developed here extends beyond the capabilities of existing decision analysis and uncertainty quantification tools. Its practical value is demonstrated on two field problems, where it proves useful for identifying optimal decision paths.
A Workflow for Decision Making Under Uncertainty
Authors: D. Busby, S. Da Veiga and S. Touzani
We propose a workflow for decision making under uncertainty, aimed at comparing different development plan scenarios under uncertainty. The approach applies to mature fields where the residual uncertainty is estimated using a probabilistic inversion approach. Moreover, a robust optimization method is discussed to optimize controllable parameters in the presence of uncertainty. A key element of this approach is the use of response surface models to reduce the very large number of simulator evaluations needed. To build efficient and reliable response surfaces for this application, we discuss an experimental design method for correlated input variables, where the correlation is induced by the probabilistic inversion process. For the problem of optimization under uncertainty, an iterative approach is proposed that refines the response surface iteratively, so as to effectively reduce approximation errors and converge faster to the true solution. The workflow is illustrated on a realistic test case of a mature field, where the approach is used to compare two new development plan scenarios, in terms of both expectation and risk mitigation, and to optimize well position parameters in the presence of uncertainty.
Estimation of Production Rates Using Transient Well Flow Modeling and the Auxiliary Particle Filter
Authors: R. Lorentzen, A.S. Stordal, G. Nævdal, H.A. Karlsen and H.J. Skaug
Improved recovery of oil from existing petroleum fields is increasingly important. A better representation of production zone information leads to better flow-rate control and reservoir management. To achieve this, one can exploit the fact that smart wells with multiple zones and laterals are becoming more common and may be equipped with permanent instrumentation and control. Today, accurate flow-rate measurements or estimates for each zone are lacking, and existing tools are often limited to steady-state models with no uncertainty analysis. Here we combine a transient well flow model and estimation techniques into a tool for interpretation of wellbore measurements. The estimation technique applied here is the auxiliary sequential importance resampling (ASIR) filter, which has the advantage of being more robust than the traditional particle filter (PF). The ASIR filter is used to tune the output of specific stochastic models of the flow rates. For this tuning we have chosen a regime-type model for the flow rates; specifically, the model implies that the flow-rate process changes structure governed by an underlying Markov jump process. Models of this type capture both smooth transitions as well as more abrupt changes in the flow rates.
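The ASIR filter adds an auxiliary look-ahead stage on top of the basic sequential importance resampling step; only that basic step is sketched here, on an illustrative scalar "flow-rate" state that is not the authors' well model:

```python
import numpy as np

rng = np.random.default_rng(3)

def systematic_resample(weights):
    """Systematic resampling: map normalized weights to particle indices
    using a single uniform offset and evenly spaced positions."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    return np.searchsorted(np.cumsum(weights), positions)

# One bootstrap-filter assimilation step for a scalar state (toy model):
n = 1000
particles = rng.normal(0.0, 2.0, size=n)       # prior ensemble of flow rates
y_obs, sigma = 1.5, 0.5                        # observation and its noise std

w = np.exp(-0.5 * ((y_obs - particles) / sigma) ** 2)
w /= w.sum()                                   # Gaussian likelihood weights
particles = particles[systematic_resample(w)]  # resample by weight

post_mean = particles.mean()
```

For this linear-Gaussian toy the exact posterior mean is about 1.41, and the particle estimate lands close to it; the regime-switching Markov jump model of the paper would replace the trivial prior-sampling step.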
Generalized Field Development Optimization: Coupled Well-Placement and Control under Geologic Uncertainty
Authors: B. Jafarpour and L. Li
Well placement optimization is often formulated as an integer-programming problem and is typically carried out assuming known well control settings. Similarly, finding optimal well controls is usually formulated and solved as a control problem in which the well locations are fixed. Solving each problem independently, without accounting for the coupling between them, leads to suboptimal solutions. We propose to solve the coupled well placement and control optimization problem for improved production performance. We present two alternative methods: (i) sequential solution of the decoupled well placement and control subproblems, where each subproblem is resolved after updating the decision variables of the other subproblem from the previous step; (ii) simultaneous solution, concurrently changing well locations and controls during the iterations using a generalized simultaneous perturbation stochastic approximation algorithm. The first approach allows the application of well-established methods from the literature to each subproblem individually, while the second approach requires the development of new methods for mixed-integer optimization problems. We consider field development optimization under geologic uncertainty and discuss computationally efficient approximate solution techniques for robust optimization under ensemble model representations. Several numerical experiments with the PUNQ model and a layer of the SPE10 benchmark model demonstrate the applicability of these methods.
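The core of simultaneous perturbation stochastic approximation (SPSA) is that one random perturbation yields a full gradient estimate from just two function evaluations, regardless of dimension. A minimal continuous-variable sketch (the paper's generalized, mixed-integer variant is more involved; the objective and gains below are illustrative):

```python
import numpy as np

def spsa_step(f, x, a=0.05, c=0.05, rng=None):
    """One SPSA descent step: estimate the gradient from two evaluations
    along a random Rademacher direction, then take a gradient step."""
    rng = rng or np.random.default_rng()
    delta = rng.choice([-1.0, 1.0], size=x.shape)   # +/-1 perturbation
    # For +/-1 entries, 1/delta_i == delta_i, so multiplying works.
    g_hat = (f(x + c * delta) - f(x - c * delta)) / (2.0 * c) * delta
    return x - a * g_hat

# Minimize a toy negated-NPV stand-in: a quadratic with optimum at 1.
f = lambda x: float(np.sum((x - 1.0) ** 2))
rng = np.random.default_rng(4)
x = np.zeros(5)
for _ in range(500):
    x = spsa_step(f, x, rng=rng)
```

With two simulations per iteration independent of the number of decision variables, SPSA is attractive when each evaluation is a full reservoir simulation.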
A Derivative-Free Methodology with Local and Global Search for the Joint Optimization of Well Location and Control
Authors: O.J. Isebor, L.J. Durlofsky and D. Echeverría Ciaurri
In oil field development, the optimal location for a new well depends on how it is to be operated. Thus, it is generally suboptimal to treat the well location and well control optimization problems sequentially. Rather, they should be considered as a joint problem. In this work, we present noninvasive, derivative-free, easily-parallelizable procedures to solve this joint optimization problem. Specifically, we consider Particle Swarm Optimization (PSO), a heuristic global stochastic search algorithm, Mesh Adaptive Direct Search (MADS), a local search procedure, and a hybrid PSO-MADS technique that combines the advantages of both methods. Nonlinear constraints are handled through use of filter-based treatments that seek to minimize both the objective function and constraint violation. We also introduce a formulation to determine the optimal number of wells, in addition to their locations and controls, by associating a binary variable (drill/do not drill) with each well. Example cases of varying complexity, which include bound constraints, nonlinear constraints, and the determination of the number of wells, are presented. The PSO-MADS hybrid procedure is shown to consistently outperform both standalone PSO and MADS when solving the joint problem. The joint approach is also observed to provide superior performance relative to a sequential procedure.
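A minimal global-best PSO covers the global-search half of the hybrid; the MADS polling stage and the filter-based constraint handling are omitted, and all parameters and the test objective are illustrative:

```python
import numpy as np

def pso_minimize(f, lo, hi, n_particles=30, iters=200,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best particle swarm optimizer for a box-constrained
    problem: inertia w, cognitive pull c1 toward personal bests, social
    pull c2 toward the swarm best."""
    rng = np.random.default_rng(seed)
    dim = lo.size
    x = rng.uniform(lo, hi, size=(n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_f = np.array([f(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)            # respect bound constraints
        fx = np.array([f(p) for p in x])
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, float(f(g))

lo, hi = -5.0 * np.ones(3), 5.0 * np.ones(3)
f = lambda p: float(np.sum((p - 2.0) ** 2))   # toy objective, optimum at 2
best_x, best_f = pso_minimize(f, lo, hi)
```

In the hybrid of the paper, promising swarm iterates would be handed to a MADS poll for local refinement; here the swarm alone suffices for the smooth toy problem.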
Well Placement Optimization under Uncertainty with CMA-ES Using the Neighborhood
Authors: Z. Bouzarkouna, D.Y. Ding and A. Auger
In the well placement problem, as in other field development optimization problems, geological uncertainty is a key source of risk affecting the viability of field development projects. Well placement problems under geological uncertainty are formulated as optimization problems in which the objective function is evaluated using a reservoir simulator on a number of possible geological realizations. In this paper, we present a new approach that handles geological uncertainty in the well placement problem with a reduced number of reservoir simulations. The proposed approach uses already-simulated well configurations in the neighborhood of each well configuration when evaluating the objective function. We thus use only a single reservoir simulation, performed on a randomly chosen realization, together with the neighborhood, to estimate the objective function, instead of running multiple simulations on multiple realizations. This approach is combined with the stochastic optimizer CMA-ES. On the benchmark reservoir case PUNQ-S3, the proposed approach is shown to capture the geological uncertainty with a smaller number of reservoir simulations. Compared to the reference approach, which uses all possible realizations for each well configuration, it significantly reduces the number of reservoir simulations (by around 80%).
Optimization of Well Trajectory under Uncertainty for Proactive Geosteering
Authors: Y. Chen, R.J. Lorentzen and E.H. Vefring
Various logging-while-drilling (LWD) and seismic-while-drilling (SWD) tools offer opportunities to obtain geological information near the bottom-hole assembly during the drilling process. These real-time in-situ data provide relatively high-resolution information around, and possibly ahead of, the drilling path compared to data from a surface seismic survey. The use of these in-situ data offers substantial potential for improved recovery through continuous optimization of the remaining well path while drilling. We present an automated workflow for proactive geosteering through continuous updating of the estimates of the earth model and robust optimization of the remaining well path under uncertainty. A synthetic example illustrates the proposed workflow. The estimates of the reservoir surfaces, reservoir thickness, and depth of the initial oil-water contact, with their associated uncertainty, are obtained through the ensemble Kalman filter using directional resistivity measurements. Robust optimization is then used to compute the well position that minimizes the average cost function evaluated over the ensemble of geological models estimated from the EnKF.
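A stochastic-EnKF analysis step of the kind used to update such earth-model estimates can be sketched as follows; the two-component state, observation operator and noise level are illustrative, not the paper's resistivity setup:

```python
import numpy as np

def enkf_update(ensemble, d_obs, H, sigma, rng):
    """Stochastic EnKF analysis step. ensemble: (n_state, n_ens) matrix of
    prior members; H: linear observation operator; sigma: observation
    noise standard deviation. Observations are perturbed per member."""
    n_ens = ensemble.shape[1]
    A = ensemble - ensemble.mean(axis=1, keepdims=True)   # state anomalies
    Y = H @ A                                             # obs-space anomalies
    C_yy = Y @ Y.T / (n_ens - 1) + sigma**2 * np.eye(Y.shape[0])
    C_xy = A @ Y.T / (n_ens - 1)
    K = C_xy @ np.linalg.inv(C_yy)                        # Kalman gain
    d_pert = d_obs[:, None] + sigma * rng.normal(size=(len(d_obs), n_ens))
    return ensemble + K @ (d_pert - H @ ensemble)

rng = np.random.default_rng(5)
n_ens = 200
ens = rng.normal(0.0, 1.0, size=(2, n_ens))   # prior: e.g. surface depth, OWC
H = np.array([[1.0, 0.0]])                    # only the first state is observed
d_obs = np.array([2.0])
post = enkf_update(ens, d_obs, H, sigma=0.1, rng=rng)
```

The observed component is pulled toward the measurement and its ensemble spread collapses, which is exactly the uncertainty reduction the geosteering loop then exploits when re-optimizing the well path.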