ECMOR XV - 15th European Conference on the Mathematics of Oil Recovery
- Conference date: 29 Aug 2016 - 01 Sep 2016
- Location: Amsterdam, Netherlands
- ISBN: 978-94-6282-193-4
- Published: 29 August 2016
Distinguishing Signal from Noise in History Matching - Analysis of Ensemble Collapse on a Synthetic Data Set
Authors P. Roe, A. Almendral Vazquez and R. Hanea
Underestimation of posterior parameter uncertainty is one of the main problems encountered in history matching with ensemble-based methods. When history-matching results exhibit partial or full ensemble collapse, it is very hard to distinguish updates caused by spurious correlation with noise in the data from actual updates attributable to information in the data. History matching of porosity and permeability to well production data using the ensemble smoother with multiple data assimilation (ES-MDA) has been performed on a synthetic data set. The presence of ensemble collapse has been evaluated in several ways: by examining the stability of the update with respect to the starting ensemble, by adding dummy parameters that do not affect the forward model, and by examining how well the parameters used to generate the production data match the posterior parameter distributions. Ensemble collapse can be avoided by increasing the ensemble size; this is, however, a prohibitively expensive strategy for cases with a large number of history data. Localization methods have been proposed in the literature as a way to increase the ensemble spread, and hence avoid collapse, while keeping the ensemble size low, for example by limiting the analysis update to the regions of influence of the data. A local analysis was performed to reduce the problems related to ensemble collapse, and the localized history matching produces a posterior distribution that better matches the original data set. Since our test data set is synthetic, we can measure the posterior uncertainty estimates against the true solution, with and without localization.
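The ES-MDA analysis step and the dummy-parameter diagnostic described above can be sketched in a few lines. This is a minimal, hypothetical 1D illustration (the forward model, ensemble size and all values are ours, not the paper's): a dummy parameter that the forward model ignores should keep its spread, so any change in its spread measures spurious updating.

```python
import numpy as np

def esmda_update(M, D, d_obs, Cd, alpha):
    """One ES-MDA analysis step.
    M: (Nm, Ne) parameter ensemble; D: (Nd, Ne) predicted data;
    d_obs: (Nd,) observations; Cd: (Nd, Nd) observation-error covariance;
    alpha: inflation factor (the 1/alpha_i over all steps must sum to 1)."""
    Ne = M.shape[1]
    dM = M - M.mean(axis=1, keepdims=True)
    dD = D - D.mean(axis=1, keepdims=True)
    Cmd = dM @ dD.T / (Ne - 1)          # parameter-data cross-covariance
    Cdd = dD @ dD.T / (Ne - 1)          # data auto-covariance
    # perturb the observations with inflated noise, one draw per member
    rng = np.random.default_rng(0)
    E = rng.multivariate_normal(np.zeros(len(d_obs)), alpha * Cd, size=Ne).T
    K = Cmd @ np.linalg.inv(Cdd + alpha * Cd)
    return M + K @ (d_obs[:, None] + E - D)

# toy problem: d = 2*m1, while m2 is a dummy parameter the model never uses
rng = np.random.default_rng(1)
Ne = 200
M = rng.normal(0.0, 1.0, size=(2, Ne))
D = 2.0 * M[:1, :]                       # forward model ignores m2
d_obs = np.array([3.0])
Cd = np.array([[0.1]])
Ma = esmda_update(M, D, d_obs, Cd, alpha=1.0)
# dummy-parameter spread is a collapse diagnostic: it should stay near 1
print(M[1].std(), Ma[1].std())
```

With a healthy ensemble the informed parameter m1 tightens around the value implied by the data, while the dummy parameter's spread barely moves; a large drop in the dummy spread would signal spurious updating of the kind the paper diagnoses.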
Estimating Observation Error Covariance Matrix of Seismic Data from a Perspective of Image Processing
Estimating the observation error covariance matrix properly is key to successful seismic history matching. Observation errors of seismic data are usually correlated; the observation error covariance matrix is therefore non-diagonal. Estimating such a non-diagonal covariance matrix is the focus of the current study. We decompose the estimation into two steps: (1) estimate the observation errors; and (2) construct the covariance matrix from the estimated observation errors. Our focus is on step (1); at step (2) we use a procedure similar to that of Aanonsen et al. (2003), where step (1) is carried out using a local moving-average algorithm. By treating seismic data as an image, this algorithm can be interpreted as a discrete convolution between the image and a rectangular window function. Following this image-processing perspective, we consider three types of image-denoising methods: local moving average with different window functions (as an extension of the method in Aanonsen et al., 2003), non-local means denoising, and wavelet denoising. The performance of the three algorithms is compared using both synthetic and field seismic data, and the wavelet denoising method gives the best performance in the cases investigated.
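The rectangular-window variant of step (1), and a residual-based correlation estimate for step (2), might look roughly like the sketch below. The synthetic "seismic attribute", window sizes and the simple stationary 1D correlation estimate are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.signal import convolve2d

def estimate_obs_errors(img, window=5):
    """Step (1): denoise the seismic 'image' with a local moving average
    (discrete convolution with a rectangular window) and take the
    residual image as the estimated observation error."""
    kernel = np.ones((window, window)) / window**2
    smoothed = convolve2d(img, kernel, mode="same", boundary="symm")
    return img - smoothed

def error_covariance(residuals, max_lag=3):
    """Step (2): a stationary 1D autocorrelation estimate from the residuals;
    a full non-diagonal covariance matrix would be assembled from it."""
    r = residuals.ravel()
    r = r - r.mean()
    # '-k or None' makes the k=0 slice cover the whole array
    acf = [np.mean(r[:-k or None] * r[k:]) for k in range(max_lag + 1)]
    return np.array(acf) / acf[0]

rng = np.random.default_rng(0)
# synthetic 'seismic attribute' = smooth signal + spatially correlated noise
x, y = np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
signal = np.sin(4 * x) * np.cos(3 * y)
noise = convolve2d(rng.normal(0, 0.1, (64, 64)),
                   np.ones((3, 3)) / 9, mode="same", boundary="symm")
res = estimate_obs_errors(signal + noise)
print(error_covariance(res))
```

Because the injected noise is correlated, the estimated residuals retain positive short-lag correlation, which is exactly why a diagonal covariance would be a poor model here.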
Pilot Design Analysis Using Proxies and Markov Chain Monte Carlo Method
Authors B. Chen, J. He, X. Wen, W. Chen and A. Reynolds
A pilot project is a crucial step in reservoir management that enables the minimization of subsurface risks and improves the quality of decisions on full-field development. Selecting a pilot project involves evaluating the expected uncertainty reduction and the value of information (VOI) attainable from a set of plausible pilot projects. Proxy-based pilot analysis (PBPA) represents a promising approach for characterizing the uncertainty reduction and VOI of each of a set of feasible pilot projects. In the PBPA method, multiple plausible realizations of observed data from a pilot are generated, and probabilistic history matching (based on filtering) is performed for each realization of the vector of observed data to obtain the corresponding posterior distribution. The multiple history-matching runs are accomplished with a manageable number of simulations with the help of proxies. Previously, PBPA was successfully applied to quantify the expected value of uncertainty reduction in cases where the history-matching tolerance is high, but as shown here, the filtering-based history-matching procedure can fail when the tolerance is low. Moreover, it had not previously been demonstrated that the PBPA method can quantify VOI. In this paper, enhancements to PBPA that eliminate these two shortcomings are introduced. First, a Markov chain Monte Carlo (MCMC) method is used in place of the filtering procedure to calculate the posterior distribution. The combined MCMC-PBPA procedure is shown to outperform the filtering-based PBPA when the history-matching tolerance is low. Second, we define a framework that combines the MCMC-PBPA method with decision-tree analysis to calculate the VOI. The proposed framework is demonstrated on a synthetic waterflooding pilot in the Brugge reservoir, where it successfully quantifies the VOI of different pilot designs.
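A generic random-walk Metropolis-Hastings sampler of the kind that could replace the filtering step can be sketched as follows. The toy forward model `g(m) = m**2`, the noise levels and the chain settings are invented for illustration and are not the paper's proxy-based setup.

```python
import numpy as np

def log_posterior(m, d_obs, sigma_d, m_prior=0.0, sigma_m=1.0):
    """Toy history-matching posterior: forward model g(m) = m**2 with a
    Gaussian prior and Gaussian data mismatch (all values illustrative)."""
    g = m**2
    return (-0.5 * ((g - d_obs) / sigma_d) ** 2
            - 0.5 * ((m - m_prior) / sigma_m) ** 2)

def metropolis_hastings(logp, m0, n_steps, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings: propose, accept/reject, record."""
    rng = np.random.default_rng(seed)
    chain = np.empty(n_steps)
    m, lp = m0, logp(m0)
    for i in range(n_steps):
        m_prop = m + rng.normal(0.0, step)        # symmetric proposal
        lp_prop = logp(m_prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis test
            m, lp = m_prop, lp_prop
        chain[i] = m                              # rejected steps repeat m
    return chain

chain = metropolis_hastings(lambda m: log_posterior(m, d_obs=1.0, sigma_d=0.2),
                            m0=1.0, n_steps=20000)
burned = chain[5000:]                             # discard burn-in
print(np.abs(burned).mean())
```

In the paper's setting each `logp` evaluation would query a proxy of the reservoir simulator rather than an analytic function; the accept/reject mechanics are unchanged.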
Uncertainty quantification using a self-supervised surrogate-assisted parallel Metropolis-Hastings algorithm
The parallel tempering algorithm is a Markov chain Monte Carlo technique that propagates uncertainty in the parameters of interest using multiple Metropolis-Hastings chains. Despite its effectiveness, it is computationally intensive, since a numerical reservoir simulation must be executed at every step. To reduce the CPU time of this process, an online-learning surrogate-assisted algorithm is proposed in which two surrogates (one for the high-temperature chains and one for the low-temperature chains) are used together with the exact function (the numerical simulation). After each swap step, both surrogates are re-trained and their fidelity is estimated. Based on the estimated fidelities, a frequency of use for each surrogate is set with a heuristic fuzzy rule, and the algorithm stochastically decides between the simulation and the surrogates according to these frequencies. This creates a self-supervised strategy that can optimize the use of the numerical simulation throughout the sampling process. The robustness of the proposed algorithm is analysed using the IC-fault model. The outcomes are compared with the results of a typical (unassisted) parallel tempered Metropolis-Hastings algorithm over a range of chain lengths. The comparison indicates that the proposed algorithm delivers a significantly better approximation of the probability density function with the same amount of computation.
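The tempered chains and the swap step can be sketched as below, without the surrogate machinery that is the paper's contribution. The bimodal toy target, temperature ladder and step size are illustrative assumptions; the point of the sketch is that swaps let the cold chain inherit mode jumps found by the hot chains.

```python
import numpy as np

def log_target(m):
    """Toy bimodal 'posterior' (an illustrative stand-in for a
    history-matching misfit surface): modes at m = -1 and m = +1."""
    return -((m**2 - 1.0) ** 2) / 0.1

def parallel_tempering(temps, n_steps, step=0.4, seed=0):
    rng = np.random.default_rng(seed)
    m = np.zeros(len(temps)) + 1.0           # all chains start in one mode
    lp = log_target(m)
    cold = np.empty(n_steps)
    for i in range(n_steps):
        # within-chain random-walk Metropolis at each temperature
        prop = m + rng.normal(0.0, step, size=len(temps))
        lp_prop = log_target(prop)
        acc = np.log(rng.uniform(size=len(temps))) < (lp_prop - lp) / temps
        m[acc], lp[acc] = prop[acc], lp_prop[acc]
        # swap step between adjacent temperatures
        for k in range(len(temps) - 1):
            log_alpha = (1/temps[k] - 1/temps[k+1]) * (lp[k+1] - lp[k])
            if np.log(rng.uniform()) < log_alpha:
                m[k], m[k+1] = m[k+1], m[k]
                lp[k], lp[k+1] = lp[k+1], lp[k]
        cold[i] = m[0]                        # record the T = 1 chain
    return cold

cold = parallel_tempering(temps=np.array([1.0, 3.0, 10.0]), n_steps=20000)
```

Every `log_target` call here is where a reservoir simulation (or, in the paper's scheme, one of the two surrogates) would sit, which is why reducing the number of exact evaluations matters.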
On Obtaining Optimal Well Rates and Placement for CO2 Storage
Authors R.D. Allen, H.M. Nilsen, O. Andersen and K.A. Lie
Large-scale storage of CO2 in saline aquifers is considered an essential technology for mitigating CO2 emissions. Storage potential has mainly been estimated from volumetrics or from detailed simulations of specific injection scenarios. In practice, the achievable storage capacity will depend on engineering, economic, and political restrictions and will be limited by the length of the injection period, the costs associated with potential CO2 leakage, pressure management, etc. We show how achievable storage volumes can be estimated and maximized using adjoint-based optimization and a hierarchy of simulation methods. In particular, vertical-equilibrium models provide the simplest possible description of the flow dynamics during the injection and early post-injection periods, while percolation-type methods provide effective means of forecasting the long-term fate of the CO2 during the later migration stages. We investigate the storage volumes that can be achieved for several formations along the Norwegian Continental Shelf by optimizing well placement and injection rates, using production wells for pressure management when necessary. Optimal strategies are obtained under various objectives and simple but realistic constraints, namely penalization of CO2 leakage, minimization of well cost, and restriction of pressure buildup.
Gradient-based Production Optimization with Economic Constraints
Authors O. Volkov and M.C. Bellout
This work develops an analytical framework to study the effects of enforcing simulator-based constraints during gradient-based production optimization. In particular, it studies how the enforcement of this type of constraint affects the consistency of the adjoint-based gradient and the performance of a gradient-based algorithm. In reservoir management, production optimization is commonly performed using gradient-based algorithms that rely on the efficient computation of control gradients through an adjoint formulation. Production optimization is often implicitly coupled with economic constraints, typically implemented through well performance limits enforced within the reservoir simulator. We show that enforcing simulator-based economic constraints triggers non-differentiable, unscheduled changes both in the well-model equations and in the functions defining the economic criteria. These discontinuities lead to inconsistencies in the adjoint gradient formulation that ultimately translate into decreased algorithmic performance. The analytical framework developed in this study allows us to devise an efficient implementation of the simulator-based constraints that provides gradients consistent with the adjoint formulation. These analytical results are implemented for a production optimization test case, where they are shown to outperform common modes of economic constraint enforcement.
Well Placement Plan Optimization by Dynamic Pattern Adjustment with Existing Wells and Streamlines
We present an original method that automatically generates an optimal well placement plan (WPP) by dynamic pattern adjustment with existing wells and streamlines. The new method is a pattern-based optimization of a field development plan in which the WPP is optimized using a constrained downhill-simplex approach. In existing methods, a secondary production pattern, e.g. a 5-spot waterflood, is generated from five control variables. We extend this approach while still honoring the topology of a pattern: we distort the pattern geometry to account for existing wells and to honor the underlying fluid and reservoir heterogeneities, which is essential for mature brownfields and geologically complex reservoirs. A fast streamline simulator is used to identify allocation factors for each producer-injector pair, which are then used to guide the optimizer toward the optimal irregular geometry of the pattern. This is analogous to the fast cloth-simulation algorithms heavily employed in various industries. Existing wells are handled by introducing a radius constraint that defines the maximum distance within which an existing well, when found, can be re-used either as a producer or an injector, with producer/injector conversion allowed. Modeling production with streamline simulations and combining many individual drilling decisions into a consolidated WPP does not significantly affect convergence time or optimization problem complexity. At the same time, the algorithm is capable of producing realistic plans with much higher NPV and sweep efficiency than a basic repatterning approach. Results are illustrated on the Brugge synthetic field, widely used for history-matching and prediction benchmarking.
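The constrained downhill-simplex idea can be illustrated with SciPy's Nelder-Mead on a toy pattern objective. The five-spot parameterization (center, spacing, rotation), the lease penalty and the re-use reward below are invented stand-ins for the streamline-guided objective, not the paper's simulator-based one.

```python
import numpy as np
from scipy.optimize import minimize

def five_spot(params):
    """Generate a 5-spot pattern from control variables
    (cx, cy, spacing, rotation): one injector plus four producers."""
    cx, cy, a, theta = params
    corners = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]]) * a / 2
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    producers = corners @ rot.T + [cx, cy]
    injector = np.array([cx, cy])
    return injector, producers

def objective(params, existing_wells):
    """Toy stand-in for the simulator-based NPV: reward producer spacing,
    penalize producers outside the unit 'lease', and reward placing
    producers near existing wells so they can be re-used (weights made up)."""
    inj, prod = five_spot(params)
    spread = np.linalg.norm(prod - inj, axis=1).sum()
    out_of_lease = np.clip(np.abs(prod) - 1.0, 0.0, None).sum()
    reuse = sum(np.min(np.linalg.norm(prod - w, axis=1))
                for w in existing_wells)
    return -(spread - 10.0 * out_of_lease - 0.5 * reuse)

existing = [np.array([0.9, 0.9]), np.array([-0.8, 0.7])]
res = minimize(objective, x0=[0.0, 0.0, 0.5, 0.0], args=(existing,),
               method="Nelder-Mead")
inj, prod = five_spot(res.x)
```

The simplex search distorts the pattern (shift, stretch, rotate) while the penalty terms play the role of the radius constraint and lease boundary; in the actual method the objective value would come from a fast streamline simulation.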
Manifold-mapping Optimization Applied to Oil Field Operations
Authors D. Echeverria Ciaurri and C. Wagenaar
Manifold Mapping has been reported as an efficient surrogate-based optimization approach in several engineering applications (e.g., design of electromagnetic devices, microwave structures and antennas, friction-stir welding, and analysis of fluid-structure interaction). Manifold Mapping relies on iteratively solving a “coarse” (approximate, fast-to-evaluate) formulation of the optimization problem of interest. This formulation is progressively corrected using scarce evaluations of a satisfactorily accurate (and often computationally more expensive) “fine” formulation of the same problem. To obtain optimized solutions of acceptable quality by means of Manifold Mapping, we typically need to perform about as many evaluations of the fine formulation as the number of degrees of freedom the optimization problem really has (as can be expected, this behavior depends strongly on the particular coarse formulation considered). Manifold Mapping can be combined with gradient information, but it is normally used in a derivative-free manner. We first present results of Manifold Mapping for a relatively simple production optimization case based on a small reservoir model with eight wells. The fine formulation is constructed by means of reservoir flow simulation and the coarse formulation through approximation/interpolation using simulation output. Manifold Mapping yields speed-up factors of 2-3 with respect to direct optimization of the fine formulation. We also introduce geological uncertainty in that example by means of a fine formulation with 20 realizations. A coarse formulation built on approximation using simulation output from only one realization again accelerates the optimization of the fine formulation 2-3 times. We conclude the paper with additional experiments that include different numbers of control variables and nonlinear optimization constraints.
Inter-well Connectivity in Waterfloods - Modelling, Uncertainty Quantification, and Production Optimization
Authors T. Wen, X. Zhai and S.F. Matringe
Improving the performance of waterfloods with minimal capital investment is important while the crude price is low. Adjusting well controls to achieve a more efficient sweep pattern is more economic than side-tracking or infill drilling. This paper presents a methodology designed to guide well controls and maximize the recovery of the remaining oil in large, mature waterfloods by modeling and optimizing the inter-well connectivities. The workflow includes three steps: modeling, uncertainty quantification (UQ), and production optimization. First, the reservoir is modeled as a connected network characterized by the strength and efficiency of each injector-producer connection. The concept is similar to the flux pattern derived from streamlines (Thiele and Batycky, 2006), but the presented approach does not use streamlines; instead, it simulates the tracer concentration between each well pair to quantify the strength of the energy support from injectors to producers. The technique is a generalized form of the work by Shahvali et al. (2012). The efficiency of a connection measures its oil contribution, which identifies water cycling; it is history matched by a data-driven technique. In the UQ step, the method estimates the possible range of the efficiency due to the non-uniqueness of the history-matching solution. The efficiency of a connection carrying less flux over the entire history tends to be more uncertain. We quantify the uncertainty by evaluating the upper and lower bounds of the efficiency subject to similarly good history matching. The formulation of this maximization/minimization was inspired by the work of Van Essen et al. (2010), but the optimization algorithm differs and is a nonlinearly constrained pattern search method.
For production optimization, a nonlinear optimization problem is formulated based on the connectivity model to find well controls that strengthen efficient connections and weaken inefficient ones. The optimization algorithm takes advantage of the linearity of the network model to achieve faster performance than pattern search, and the UQ regulates the risk of the recommended well-control strategy. The methodology was tested on a full simulation model of a real field with 200+ wells, which was regarded as the true reservoir in this study. We trained our network model for 3 years and then optimized the waterflooding strategy for six months. The results demonstrate that the optimized strategy maintained oil production and reduced water production by 50% without adding new wells, while the historical operation met the oil target by drilling tens of new wells and accepting a higher water cut.
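A toy illustration of why a linear network model makes the optimization fast: with fixed allocation factors and connection efficiencies, the field oil rate is linear in the injector rates, so bound-constrained rate optimization reduces to a linear program. The connectivity matrix, efficiencies, bounds and rates below are all invented for illustration; the paper's actual formulation is more involved.

```python
import numpy as np
from scipy.optimize import linprog

# hypothetical inter-well 'strength' matrix T[i, j] between injector i and
# producer j (in the paper this would come from the tracer simulation)
T = np.array([[4.0, 1.0, 0.0],
              [2.0, 3.0, 1.0]])
alloc = T / T.sum(axis=1, keepdims=True)   # allocation factors per injector

# hypothetical efficiency of each connection: fraction of its throughput
# that shows up as oil (low efficiency = water cycling)
eff = np.array([[0.4, 0.1, 0.0],
                [0.2, 0.5, 0.3]])

# choose injector rates q (fixed total water, per-well bounds) to maximize
# the network oil rate: oil(q) = sum_i q_i * sum_j alloc[i, j] * eff[i, j]
c = -(alloc * eff).sum(axis=1)             # linprog minimizes, so negate
res = linprog(c, A_eq=np.ones((1, 2)), b_eq=[100.0],
              bounds=[(20.0, 80.0)] * 2)
q_opt = res.x
print(q_opt)
```

The LP pushes water toward the injector whose connections carry the most oil per barrel injected; the UQ step described above would then check how robust that reallocation is to the uncertainty in `eff`.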
On Solving Large-scale Well Positioning Problems
Authors A.M. Kuvichko, N.Y. Andrianov and A.I. Ermolaev
We consider several mathematical aspects of positioning technological objects on oil and gas fields. The best results are obtained for small fields with complex structure; the algorithms are suitable for irregular well-location problems, where one cannot define a strict pattern for the well positions. The problems are formulated in terms of Boolean programming. We present a new iterative method for solving the described Boolean problems that finds an optimum within a reasonable time; the method is a clustering algorithm designed to deal with Boolean parameters. We also introduce a Monte-Carlo-like approach for forecasting the optimal value of a given problem. Several applications of the formulated algorithms to test field models with properties similar to real fields are presented, together with solutions of both Boolean problems. By comparing different problem scales, we illustrate the complexity of the proposed algorithm, and we compare the computational time and number of iterations for different initialization schemes. The optimal solutions of the formulated problems serve well as initial points for other iterative (gradient-based) well-location algorithms. The proposed algorithms are practical for the development of complex oil and gas fields.
Optimization of Cyclic CO2 Flooding through the Gas Assisted Gravity Drainage Process under Geological Uncertainties
Authors W.J. Al-Mudhafar, D.N. Rao and S.M. Hosseini Nasab
The purpose of this research is to determine an actual optimal solution through cyclic optimization of the CO2 Gas-Assisted Gravity Drainage (GAGD) process in a heterogeneous sandstone reservoir under geological uncertainty. We propose an integrated approach to optimize the durations of gas injection, soaking, and oil production under geological uncertainty. To this end, 100 stochastic reservoir realizations of the 3D permeability and porosity distributions were created honouring geological constraints. Ranking based on the reservoir oil response was applied to select the P10, P50, and P90 realizations that represent the overall reservoir uncertainty. More than 400 training simulation runs covering the cycle durations and geological uncertainty parameters were created through Latin Hypercube Design to build a second-order proxy model, along with approximately 200 extra verification runs. The verification runs helped keep the solutions near the global optimum and yielded a satisfactory proxy model through an iterative validation procedure. The cyclic optimization increased oil recovery through the GAGD process from 71.5% to 75.5%, an incremental cumulative oil production of 225 million barrels. The presented robust optimization workflow under geological uncertainty led to a higher recovery factor than optimization on a nominal realization, while giving the decision-maker degrees of freedom to significantly reduce the project risk.
Multi-fidelity Proxy Models for Reservoir Engineering
Authors A. Thenon, V. Gervais and M. Le Ravalec
Proxy models are built to approximate, from few evaluations, outputs that depend on many uncertain parameters. In reservoir engineering, they can strongly reduce the number of flow simulations required for sensitivity analysis, history matching, or production optimization. However, a large number of simulations can still be necessary to compute predictive proxy models. We propose to build multi-fidelity proxy models based on co-kriging to speed up the process. This approach introduces coarser resolution levels of the reservoir model that are less informative but faster to evaluate. These coarser levels can be obtained using a fluid-flow simulator with simplified physics or an upscaled reservoir model. The fine- and coarse-level evaluations are then combined to build a proxy model of the reference, or fine, level. The objective is to retrieve as much information as possible from the faster levels in order to limit the calls to the fine, but more expensive, one. Sequential design strategies can also help reduce the number of simulations required to obtain predictive proxy models, by iteratively choosing the location of the added point or, in the multi-fidelity context, the appropriate fidelity level to evaluate. Sequential design strategies that take advantage of kriging/co-kriging features (the kriging variance and cross-validation predictions) were thus introduced to fully exploit the potential of the proposed approach. Comparisons of the time savings of the single- and multi-fidelity proxy modeling methodologies were then performed through a sensitivity analysis of the Brugge field.
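The two-level idea can be sketched without the full co-kriging machinery: a scale factor rho on the coarse response plus an interpolated additive discrepancy, in the spirit of the rho-and-delta correction underlying co-kriging-based multi-fidelity models. The analytic fine/coarse pair below is the standard Forrester multi-fidelity test pair, not the paper's reservoir models.

```python
import numpy as np

# hypothetical fine and coarse 'simulators' (cheap analytic stand-ins):
fine = lambda x: (6*x - 2)**2 * np.sin(12*x - 4)     # expensive level
coarse = lambda x: 0.5 * fine(x) + 10*(x - 0.5) - 5  # cheap, biased level

# many coarse runs, only a few fine runs
x_c = np.linspace(0, 1, 21)
x_f = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
y_c, y_f = coarse(x_c), fine(x_f)

# level 1: interpolate the coarse response everywhere
coarse_hat = lambda x: np.interp(x, x_c, y_c)

# level 2 correction: fine ~ rho * coarse + delta(x)
rho = np.polyfit(coarse_hat(x_f), y_f, 1)[0]         # scale factor
delta = y_f - rho * coarse_hat(x_f)
delta_hat = lambda x: np.interp(x, x_f, delta)       # additive discrepancy
proxy = lambda x: rho * coarse_hat(x) + delta_hat(x)

x_test = np.linspace(0, 1, 101)
rmse = np.sqrt(np.mean((proxy(x_test) - fine(x_test))**2))
print(rmse)
```

The corrected proxy interpolates the five fine runs exactly and is substantially more accurate than the coarse level alone; co-kriging replaces the linear interpolants with Gaussian-process models and also supplies the variance used by the sequential design strategies.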
Organization of High-performance Computing on the Mobile Platforms to Calculate the Oil Recovery of Reservoir
Authors T.S. Imankulov, D. Akhmed-Zaki, B.S. Daribayev and O.N. Turar
A variety of novel parallelization technologies and approaches has been developing rapidly in recent years. The spread of high-powered graphics processors and parallelization tools on mobile platforms calls for a detailed comparison of their computational capabilities on practical problems. This paper considers the numerical investigation of the oil displacement process by polymer/surfactant flooding, taking into account water salinity and temperature effects; studies parallel algorithms for it on several platforms; and describes the organization of high-performance computing using the GPUs of mobile devices. The calculation times, efficiency, and speed-up of these algorithms on the various platforms were compared in order to identify the optimal technology, or combination of technologies, for the effective and well-optimized development of an industrial-scale simulator. The basic parallelization technologies used are MPI and CUDA. The calculations are performed on a Xiaomi MiPad mobile device with an NVIDIA Tegra K1 and on personal computers with NVIDIA GeForce and Tesla K20 GPUs. In addition to comparing the run times of the different versions, we analyse the practicality of each platform in light of the observed performance benefits. Standard, proven parallelization tools are used as baselines to assess the competitiveness of the new approaches. The research leads the authors to the main conclusion: mobile devices can be used as computers to solve problems in the oil industry.
A Hybrid DFN with Elastic Properties to Construct a Seismic Forward Model
Fractured reservoirs present significant modeling uncertainties. This paper describes a methodology for generating a hybrid Discrete Fracture Network (DFN) model to be employed in dynamic fluid-flow simulation, and then introduces a mathematical construct to create elastic properties from the simulation results. The hybrid DFN model is constructed by extracting the in-situ, meso- to macro-scale DFN population of a fracture system from 3D seismic data, and stochastically modeling the micro- to meso-scale DFN population of the same fracture system using the population properties of the DFN extracted from seismic data together with 1D fracture data from multiple boreholes. This hybrid model makes it possible to better constrain the 3D network connectivity with a multi-scale fracture system and thereby to carry out more realistic flow simulation. It also provides the basis for the computation of anisotropic elastic properties, which are then used for rock physics/fluid substitution modeling and the computation of representative seismic data. The seismic-simulation loop is then closed by comparing the modeled seismic data to an acquired seismic survey and using the observed discrepancies to improve the match to observed production data through guided adjustments to the modeling approach. The calibrated model is used to optimize a surface network and allow for the introduction of other reservoirs into the gathering facility.
Asphaltene Flow Assurance Risk Mitigation through Emerging Approach of Inhibitor Numerical Modelling
This work was motivated by the need for a comprehensive method of evaluating asphaltene mitigation using an inhibitor in an oil field with a high risk of asphaltene precipitation in the tubing. Application of an asphaltene inhibitor is a typical countermeasure, widely applied in many fields; however, most applications bring only temporary relief. Over an entire field life, the production operating conditions vary: pressure declines, water cut and GOR increase, and so on. As a result of these variations, the efficiency of the inhibitor formulation once selected as the most effective fades away, and another screening process is required to select an alternative and/or modify the original formulation for the new operating conditions. This paper demonstrates a comprehensive estimation of inhibiting efficiency over a whole field life by generating a numerical model based on the results of an asphaltene dispersant test (ADT) performed to experimentally select an inhibitor for one of our oil field assets. The best asphaltene inhibitor, IB-23, was selected from a total of nineteen samples through the two-stage ADT, because IB-23 showed the highest inhibiting efficiency, more than 80% at a 200 ppm concentration, and maintained an efficiency of more than 90% even at 12.5 ppm. Based on this testing result, an emerging technique was applied to generate a numerical model reproducing the inhibiting efficiency. This technique treats the asphaltene inhibitor as a pseudo-component defined using the physical data available in the publicly accessible material safety data sheet (MSDS). To date, no commercial software is available for modelling asphaltene inhibitors because the inhibitors' physical data are confidential; our approach, however, expresses the inhibiting efficiency as a size reduction of the asphaltene precipitation envelope (APE) on a thermodynamic plot.
The model was generated using the cubic-plus-association (CPA) EoS for fluid characterization, and the model was validated by comparison with the ADT data. Assuming natural depletion, the APEs were compared with vertical lifting curves (VLCs) in the tubing. Two VLCs were assumed, representing early and late field conditions (i.e. high wellhead/reservoir pressures and depleted ones). The no-inhibitor case revealed a precipitation risk over most of the tubing section. In contrast, the inhibitor-dosed case significantly reduced the risk, particularly in the early stage. Even in the late stage the risk was reduced, as the interval where the VLC intersects the APE became shorter than in the no-inhibitor case.
Investigation of Water Diversion by a Novel Polymer Gel System for Enhancing Oil Recovery
Authors A. Jahanbani Ghahfarokhi, J. Kleppe and O. Torsaeter
A numerical study of water diversion by gel treatment is presented in this paper, particularly for layered reservoirs where crossflow may be an important recovery mechanism. The transport through the porous medium of the new gel system, which, unlike many other systems, is environmentally acceptable, and the mechanism of permeability reduction are evaluated. The permeability reduction causing the water diversion is mathematically modelled through the interaction of the gelants or gel with the rock matrix in terms of both reversible and irreversible equilibrium adsorption. The long half-life of the gelants is applied in the gelation kinetics model to simulate the controlled release of the crosslinkers. Modelling the blockage for both the aqueous and oil phases and including the inaccessible pore volume gives a more realistic description. The location of the high-permeability streak is analysed to closely investigate gravity and crossflow; permeability reduction and crossflow are the main mechanisms involved. Results indicate that a high permeability reduction in the thief zone is needed to improve the recovery, as confirmed by the spread of the residual resistance factor. Gel treatment is generally more efficient than polymer flooding in terms of increased oil recovery and reduced water cut; however, the gel does not form completely in the case of high crossflow between layers, since some reactants are lost to the low-permeability zone, causing damage and additional water crossflow. Investigation of the individual layers shows a peak in the oil saturation and production rate of the high-permeability layer, due to resaturation of this depleted layer by crossflow from the low-permeability layer. The oil production rate of the low-permeability layer adjacent to the thief zone increases after gel treatment, reflecting the effectiveness of the water diversion treatment.
This effect is more significant in the case of low crossflow between layers, as discussed above. A study of the injection strategy, in terms of alternating water, polymer, and gel injection, shows that a small slug of gel injected soon after water breakthrough is more effective than injection following a polymer flood. In real cases, crossflow and the permeability contrast between layers are beyond control, and the injection conditions should be optimized when designing a treatment. The effects of blocking properties, injection time, concentration, and reaction rate are therefore studied in detail in this work.
An Application of Green's Function Technique for Computing Well Inflow without Radial Flow Assumption
Authors A. Korneev, A.V. Novikov, D.V. Posvyanskii and V.S. PosvyanskiiWell modeling plays an important role in numerical reservoir simulation. The main difficulty in well modeling is the difference in scale between the wellbore radius and the well block grid dimension used in the simulation. Peaceman’s formula is widely used in reservoir simulation in order to match the cell pressure to the local solution of the diffusivity equation describing the flow near the well. However, it was developed under the assumption of radial flow. The objective of this study is to calculate a semi-analytical expression for the well productivity index without making any assumption about radial flow, and to subsequently use it in numerical reservoir simulation. Radial flow may not occur due to boundary conditions at the top (bottom) of reservoir or well trajectory. The well inflow equation can be solved through the Green’s function method (GFM), which may take into account various boundary conditions and different well trajectories. The GFM solution of the diffusivity equation is presented as a series over the eigenvalues of the Laplace differential operator, but the series converges conditionally and its direct summation is time-consuming. This makes the GFM solution impractical for well modeling. Reference [1] presented the method of fast summation of such a series, which was successfully applied for analyzing pressure build up curves. In this paper we apply the same techniques for calculating the well productivity index for horizontal, deviated and partially penetrating wells . It is shown that for reservoirs with a gas cap (or underline water), using Peaceman’s model for the well productivity index leads to a significant discrepancy between the numerical and presented semi-analytical solution. Although local grid refinement around the well leads to a reduction in the discrepancy, it introduces its own set of numerical problems. 
It is demonstrated that using the new expression for the well index models the well inflow with high accuracy even on a coarse grid. [1] E.S. Makarova, D.V. Posvyanskii, V.S. Posvyanskii and A.B. Starostin, ECMOR XI, P26, 2008.
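For reference, the radial-flow baseline that this paper replaces can be sketched as Peaceman's well-block productivity index for a vertical well in an anisotropic Cartesian grid (a minimal illustration, not the authors' semi-analytical GFM expression; the function name and the per-unit-viscosity SI unit convention are assumptions):

```python
import math

def peaceman_well_index(kx, ky, dx, dy, h, rw, skin=0.0):
    """Peaceman well-block productivity index (per unit viscosity) for a
    vertical well in an anisotropic Cartesian grid; consistent SI units."""
    a = math.sqrt(ky / kx)
    b = math.sqrt(kx / ky)
    # Peaceman's equivalent well-block radius for an anisotropic block
    r_eq = 0.28 * math.sqrt(a * dx**2 + b * dy**2) / (a**0.5 + b**0.5)
    k_eff = math.sqrt(kx * ky)  # geometric-mean areal permeability
    return 2.0 * math.pi * k_eff * h / (math.log(r_eq / rw) + skin)

# Isotropic square block: r_eq reduces to ~0.198 * dx
wi = peaceman_well_index(kx=1e-13, ky=1e-13, dx=100.0, dy=100.0,
                         h=10.0, rw=0.1)
```

The derivation assumes radial flow into the well block, which is exactly the assumption the abstract points out breaks down near a gas cap, underlying water, or a non-vertical trajectory.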
-
Evaluation of Ultra Low Concentration Surfactant System for Chemical Flooding
Authors I. Sagbana, P. Diaz and M. CentenoIn order to select a surfactant formulation for chemical flooding, the surfactant has to be evaluated at reservoir conditions to determine its compatibility with the reservoir into which it will be injected. This avoids the formation of gels and precipitates in the reservoir, which can make surfactant enhanced oil recovery unsuccessful. In several studies, surfactants have been tested in the laboratory at room temperature using only sodium chloride in the brine, whereas in oilfield scenarios the temperature is higher and the reservoir brine contains divalent ions. In this study, a very low concentration alcohol alkoxy sulfate, with and without a co-surfactant, has been evaluated in hard brine with a medium crude oil. The results from the salinity scan, phase behaviour and core flooding experiments at 60°C show that the alcohol alkoxy sulfate is tolerant to divalent ions and that its stability can be improved by adding methyl ester sulfonate and internal olefin sulfonate as co-surfactants. These co-surfactants reduced the viscosity of the microemulsion phase, created a lower interfacial tension by increasing the solubilisation ratio, and increased oil recovery by at least 20%.
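The link the abstract draws between solubilisation ratio and interfacial tension is usually quantified with the Chun-Huh correlation; a minimal sketch follows (the constant C ≈ 0.3 mN/m and the function name are assumptions, and the abstract does not state which correlation, if any, the authors applied):

```python
def chun_huh_ift(solubilization_ratio, c=0.3):
    """Estimate oil/microemulsion interfacial tension (mN/m) from the
    solubilization ratio via the Chun-Huh correlation: sigma = C / SP**2."""
    return c / solubilization_ratio ** 2

# Raising the solubilization ratio lowers IFT quadratically,
# the mechanism the abstract credits to the co-surfactants
ift_low_sp = chun_huh_ift(5.0)    # 0.012 mN/m
ift_high_sp = chun_huh_ift(15.0)  # roughly an order of magnitude lower
```

The inverse-square dependence is why even a modest gain in solubilisation from a co-surfactant can push the system into the ultra-low IFT regime needed for mobilizing residual oil.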
-
Statistical Analysis and Mapping of Oil and Gas Development Cost Based on Field Development Plan in Indonesia
Authors A. Azizurrofi, A. Asnidar, J. Simanjuntak and R. FirdausThe oil price has sunk to its lowest level in 11 years, at 37.13 US$/bbl (as of December 2015). In this economic downturn, oil and gas companies must take extra measures to cope. In a country that adopts a Production Sharing Contract fiscal regime, such as Indonesia, several changes can be proposed: accelerating depreciation, providing incentives (Investment Credit and Interest Cost Recovery), a DMO Holiday, or readjusting the First Tranche Petroleum or Split Ratio. Another solution is to discover new oil and gas areas that combine low development cost with large commercial reserves. This paper provides insight into the development of the oil and gas industry in Indonesia during the oil price downturn and presents a bubble map of development cost and commercial reserves by geographic area of Indonesia, showing the attractiveness of investing in the oil and gas industry based on statistical data analysis. There were 387 Plans of Development (POD) approved during 2003 - 2015. For the purpose of this paper, the geography of Indonesia is simplified into 6 areas (Sumatera, Natuna Sea, Java, Kalimantan, Sulawesi and Papua). The total development cost and commercial reserves of each respective contract area in the PODs are calculated and redistributed into these clusters. Based on the data analysis, the oil and gas industry in Indonesia is still considered attractive for investors despite the low oil price, because the maximum development cost needed in Indonesia is around 15.48 US$/bbl, which is smaller than the current oil price. Furthermore, several areas, such as Papua and the Natuna Sea, hold untapped wealth in commercial reserves.
Under the current adverse conditions, this paper provides good insight for investors and helps them create and revisit their strategy and portfolio for investing in Indonesia's oil and gas industry.
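The per-area aggregation behind such a bubble map can be sketched as follows (the POD records below are entirely hypothetical; only the six-area clustering and the US$/bbl unit-cost metric come from the abstract):

```python
from collections import defaultdict

# Hypothetical POD records: (area, development cost in MUS$, reserves in MMbbl)
pods = [
    ("Sumatera", 120.0, 15.0), ("Sumatera", 80.0, 6.0),
    ("Natuna Sea", 300.0, 40.0), ("Java", 60.0, 5.0),
    ("Kalimantan", 150.0, 12.0), ("Papua", 200.0, 30.0),
]

total_cost = defaultdict(float)
total_reserves = defaultdict(float)
for area, cost, reserves in pods:
    total_cost[area] += cost
    total_reserves[area] += reserves

# Unit development cost (US$/bbl) per area: the bubble-map colour/size metric
unit_cost = {a: total_cost[a] / total_reserves[a] for a in total_cost}
```

In a bubble map each area would then be plotted at its geographic position, with bubble size proportional to commercial reserves and the unit cost annotated or colour-coded.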
-
An Efficient Hybrid Model for Fractured-vuggy Reservoir Based on Discrete Fracture-vug Network Model
The numerical simulation of fractured-vuggy reservoirs has received much attention in recent years because of their significant contribution to oil and gas reserves and production. In this paper, an efficient hybrid method is proposed to simulate two-phase flow in fractured-vuggy reservoirs with fractures and vugs of multiple length scales, which cannot easily be modeled by continuum models or discrete models alone. In this hybrid model, small fractures and vugs are modeled by a continuum model, and long fractures are modeled by the embedded discrete fracture model. First, the coarse grid system is constructed according to the structural characteristics of the fractures and vugs. Then, the discrete fracture-vug network (DFVN) model is implemented in each coarse grid block to calculate an equivalent permeability tensor based on homogenization theory, and an analytical procedure is used to obtain pseudo relative permeability curves for each grid block containing fractures and cavities. The long fractures are embedded in the homogenized media system. After that, an efficient numerical simulator is devised to solve the coupled system of long fractures and homogenized media based on the mimetic finite difference method. Finally, several numerical examples are presented to verify the validity and accuracy of the hybrid model.
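As a simplified illustration of how small fractures raise the equivalent permeability of a coarse block, the parallel-plate cubic law can be sketched as follows (a textbook approximation for flow parallel to a set of parallel fractures, not the homogenization procedure the authors implement; the function name and SI units are assumptions):

```python
def fracture_block_permeability(km, aperture, spacing):
    """Equivalent permeability (m^2) of a block containing parallel
    fractures, for flow parallel to them, via the cubic law:
    k_eff = k_m + w**3 / (12 * s), with aperture w and spacing s in m."""
    return km + aperture ** 3 / (12.0 * spacing)

# Fractures of 0.1 mm aperture spaced 1 m apart in a ~1 mD matrix:
# the fracture term dominates the equivalent permeability
k_eff = fracture_block_permeability(km=1e-15, aperture=1e-4, spacing=1.0)
```

Even this crude estimate shows why upscaling matters: a sparse set of hairline fractures can raise the block permeability by orders of magnitude, which is what the DFVN-based homogenization captures rigorously, including the full tensor character.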
-