First Break - Volume 22, Issue 3, 2004
Tigress out to capture bigger share of market for integrated reservoir interpretation solutions
By A. McBarnet

The low-key announcement in January that Petroleum Geo-Services (PGS) had divested its subsidiary PGS Tigress via a management buy-out ends the company’s 10-year proprietorial association with possibly the most ambitious geoscience and engineering integration software development ever launched. Andrew McBarnet takes up the story.

The thing that amazes David Sullivan most is that the Tigress project was ever launched in the first place. ‘It is extraordinary that the software ever got written, because it set out to create a database from scratch covering everything from geophysics to reservoir simulation. As such it was way ahead of its time.’ Sullivan is chairman of Tigress Geosciences, the management buy-out from PGS, based in Marlow, UK, now managing the continued evolution of The Integrated Geoscience and Reservoir Engineering Software System (Tigress), best described as a suite of reservoir interpretation software tightly integrated around an Oracle database.

For PGS the disposal was, according to Diz Mackeown, president of PGS Marine Geophysical, part of the company’s policy to concentrate on core activities. He noted that ‘the deal allows a successful relationship for both companies by way of an ongoing corporate licence agreement for the complete Tigress software portfolio.’ Prior to the buy-out, the Tigress operation had been organised into an independent division, but the actual transaction was delayed by the financial restructuring of PGS following the collapse of merger talks with Veritas DGC. Sullivan claims that Tigress Geosciences represents ‘the industry’s only independent software producer with a totally independent product covering interpretation tools from geophysics to reservoir simulation.’ The description, so relevant to current calls for integrated E&P software products, belies the fact that the Tigress concept dates back to 1988 and has been a reality since the early 1990s.

It was then that David Wilson, a former academic at Imperial College and later a reservoir engineer with Shell, coaxed the first substantial funding out of the UK Department of Energy’s Offshore Supplies Office and company sponsors Shell, Enterprise Oil and ARCO British for work on Tigress, initially at Robertson Research in North Wales and then at Energy Resource Consultants (ERC) in Marlow, near London. In 1991 First Break recorded that after three years’ work Robertson ERC (as it was then) was ready to launch Tigress. The report said that 50 software engineers were involved in the project development work being carried out by Robertson ERC with Winfrith Petroleum Technology (a spin-off from the UK Atomic Energy Agency which focused on the reservoir simulation end of the spectrum, led by Dr Joe King). Then as today, Tigress was ‘designed to forecast and plan the optimum development of a reservoir from early appraisal through to maturity.’

With a background in the software industry, Sullivan joined Robertson ERC to manage the installation and rollout of Tigress. The very first installation was for Shell in Rijswijk, The Netherlands. He recalls: ‘the system was, and arguably still is, the most tightly integrated E&P system available. The concept of the asset team and the need to minimise development lead times by taking an integrated approach is commonplace today. But in 1991 these were radical ideas indeed.’ Sullivan admits that, in the short term, the Tigress team might have been better advised to limit the scope of its aspirations when it first started.
‘In those days companies like Landmark and Schlumberger were focused on workstations and the functionality of individual products in individual disciplines, and clearly that met the needs of the market. Tigress, on the other hand, was a major new technical project and it was also asking industry specialists to work in a different way. That was a tall order.’
Shell shock highlights dilemma over reserves reporting system
By K. Chew

Shell's recent announcement that 20% of its reserves were no longer 'proved' has raised questions over the workings of the current system of reserves reporting. Kenneth Chew, vice president, industry performance and strategy, IHS Energy, considers the issues surrounding reserves reporting and looks at relevant regulatory controls.

Why do companies report reserves anyway? The purpose of reserves reporting is undoubtedly commendable and there are good fiscal and regulatory reasons for doing so. It also allows investors and the public at large to put a value on the assets of an E&P company and to make comparative analyses between companies. However, this only works if all companies report reserves to the same standards, and this is where industry regulators enter the picture.

Financial accounting rules (and thereby reserves reporting) vary from country to country. The most commonly used accounting standard, which is a requirement for all companies quoted on the New York Stock Exchange, is that of the US Financial Accounting Standards Board (FASB). Issued in November 1982, FAS 69 (Disclosures about Oil and Gas Producing Activities) provides the reporting and accounting rules and reporting formats for (1) proved oil and gas reserve quantities, (2) capitalized costs relating to oil and gas producing activities, (3) costs incurred for property acquisition, exploration and development activities, (4) results of operations for oil and gas producing activities, and (5) a standardized measure of discounted future net cash flows relating to proved oil and gas reserve quantities.

The fifth item is one of the most contentious because it requires future cash inflows to be calculated by applying year-end oil and gas prices (no matter how atypical they may be) to year-end reserves, and bases future costs on year-end costs and the assumed continuation of existing economic conditions. As Shell said in its statement: ‘The information so calculated does not provide a reliable measure of future cash flows from proved reserves…’

Although the FASB makes the rules, the resultant company reports are filed with the United States Securities and Exchange Commission (SEC). Furthermore, it is the SEC that has defined what is actually meant by the various terms incorporated into FAS 69, such as ‘oil and gas producing activities’, ‘exploration costs’ and, perhaps most importantly, ‘proved reserves’. Regulation S-X, Rule 4-10, where these definitions appear, was first published in 1978 and, although slight modifications have been made to it since that time, it remains essentially unchanged. This has created numerous problems for companies because technology has not stood still in the meantime. As an example, advances in seismic such as 3D, 4C and direct hydrocarbon detection, the use of huge computing power to make reservoir simulations, and probabilistic calculation methods all allow far greater precision in establishing in-place and recoverable hydrocarbon volumes than was possible a quarter of a century ago.
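To see why the standardized measure is so sensitive to year-end conditions, the hedged sketch below computes a heavily simplified version of item (5): net cash flows from a fixed production schedule, priced at the year-end price and discounted at the 10% rate used in the FAS 69 standardized measure. The production profile, price and cost figures are invented for illustration only and are not taken from Shell's or any other company's filings.

```python
# Simplified illustration of the FAS 69 "standardized measure" idea:
# net cash flows from proved reserves, priced and costed at year-end
# values and discounted at 10% per year. All numbers are hypothetical.

def standardized_measure(production, year_end_price, year_end_cost_per_bbl,
                         discount_rate=0.10):
    """production: list of annual volumes (bbl) from proved reserves."""
    total = 0.0
    for year, volume in enumerate(production, start=1):
        net_cash_flow = volume * (year_end_price - year_end_cost_per_bbl)
        total += net_cash_flow / (1.0 + discount_rate) ** year
    return total

profile = [1_000_000, 800_000, 600_000, 400_000, 200_000]  # bbl per year

# The same reserves, valued at two different year-end prices, give very
# different "standardized" values -- the point Shell makes about atypical prices.
print(standardized_measure(profile, year_end_price=30.0, year_end_cost_per_bbl=12.0))
print(standardized_measure(profile, year_end_price=18.0, year_end_cost_per_bbl=12.0))
```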
Soft computing for qualitative and quantitative seismic object and reservoir property prediction. Part 1: Neural network applications
Authors: F. Aminzadeh and P. de Groot

Fred Aminzadeh and Paul de Groot of dGB Earth Sciences begin a major series of three articles on the increasing use of soft computing techniques for E&P geoscience applications, focusing first on how neural networks can enhance seismic object detection.

Soft computing has been used in many areas of petroleum exploration and development. With the recent publication of three books on the subject, it appears that soft computing is gaining popularity among geoscientists. In this paper we focus on one aspect of soft computing: neural networks in qualitative and quantitative seismic object detection. In subsequent papers we will review other aspects of soft computing in exploration. Highlighted here will be the role neural networks play in combining different seismic attributes and effectively bringing together data with the interpreter’s knowledge to decrease exploration risk in four categories (geometry, reservoir, charge and seal).

Three new books in the general area of soft computing applications in exploration and development, Wong et al. (2002), Nikravesh et al. (2003) and Sandham et al. (2003), represent a comprehensive body of literature on recent applications of soft computing in exploration. Soft computing comprises neural networks, fuzzy logic, genetic computing, perception-based logic and recognition technology. Soft computing offers an excellent opportunity to address the following issues:
■ Integrating information from various sources with varying degrees of uncertainty
■ Establishing relationships between measurements and reservoir properties
■ Assigning risk factors or error bars to predictions.

Deterministic model building and interpretation are increasingly being replaced by stochastic and soft computing-based methods. The diversity of soft computing applications in oil field problems and the prevalence of their acceptance can be judged by the increasing interest among earth scientists and engineers. Given the broad scope of the topic, we will limit the discussion in this paper to neural network applications. In subsequent papers we will review other aspects of soft computing, such as fuzzy logic, in exploration.

Neural networks have been used extensively in the oil industry. Approximately 10 years after McCormack’s review (1991) of neural network applications in geophysics, much work has been done to bring such applications into the mainstream of geophysical interpretation. Some of these efforts are documented in Wong et al. (2002), Nikravesh et al. (2003) and Sandham et al. (2003), which include many papers and extensive references on neural network applications. Most of these applications have been in reservoir characterization, seismic object detection, creating pseudo logs, and log editing.

In the next section, we will focus on two general areas of application of neural networks. The first comprises qualitative methods whose main aim is to examine seismic attributes to highlight certain seismic anomalies without access to very much well information; in this case neural networks are primarily used for classification. The second category involves quantitative methods, where specific reservoir properties are quantified using both seismic data and well data, and neural networks serve as an integrator of the information.
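As a hedged illustration of the qualitative (classification) use described above, the sketch below trains a small multi-layer perceptron to separate ‘object’ from ‘background’ samples using a handful of seismic attributes. The attribute names, the synthetic training data and the use of scikit-learn's MLPClassifier are assumptions for demonstration only; this is not the authors' implementation.

```python
# Minimal sketch: a neural network combining seismic attributes into a
# qualitative object/background classification (synthetic data throughout).
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Hypothetical attributes per sample: amplitude, coherence, dominant frequency.
n = 500
background = rng.normal(loc=[0.2, 0.8, 35.0], scale=[0.1, 0.1, 5.0], size=(n, 3))
anomaly    = rng.normal(loc=[0.6, 0.5, 25.0], scale=[0.1, 0.1, 5.0], size=(n, 3))

X = np.vstack([background, anomaly])
y = np.concatenate([np.zeros(n), np.ones(n)])  # 0 = background, 1 = seismic object

# Small multi-layer perceptron acting as the attribute "integrator".
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X, y)

# Classify a new attribute vector; applied trace by trace, the probability
# can be read as a crude 'chance of object' volume.
new_sample = np.array([[0.55, 0.55, 27.0]])
print(clf.predict(new_sample), clf.predict_proba(new_sample))
```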
Benefits of rapid data assessment and visualization prove themselves in exploration scenarios
By C. Burns

Use of mapping and visualization software is growing rapidly in exploration. Carmel Burns of Canadian company Geosoft, provider of geospatial solutions to earth science industries, describes how some of its customers are adapting to the possibilities.

The ability to effectively display, rapidly assess and dynamically experiment with multiple datasets has helped to reduce risk and increase prospecting capabilities in exploration. Increasingly, what’s required in exploration is software that can handle large volumes of data and multiple data sources and data types, such as geophysical data, geochemical data, drillhole data, satellite imagery, GIS data and any kind of mapping data, within one single environment or transparently linked environments. Utilizing today’s visualization tools, geoscientists are able to reduce risk and increase understanding by looking at as much different data as they can, in as many different ways as they can, within compressed project time frames.

Despite the fact that exploration companies are leaner, with fewer people and shorter project time frames, Dr Michal Ruder, principal of US-based Wintermoon Geotechnologies, says she has seen exponential improvements in productivity and data quality as a result of new software for mapping and visualization. Whereas it used to take weeks to process and interpret geoscience datasets, today it is not uncommon for geoscientists to address the salient issues of interpretations in the course of one or two days. ‘I can remember doing batch maps in paper copies back in the 1980s,’ says Dr Ruder. ‘Since then, the ability to image geoscientific datasets on a computer screen in real time and continual improvements in visualization software have had an amazing impact on what we can do as geoscientists, and how quickly we can do it.’ Interpretation results are also more accurate because geoscientists have the tools to view the quality of the data in every single phase, from initial data processing and quality control through to visualization, integration and the final interpretations.
Doing business the 'e-savvy' way pays off for oil company and suppliers
This report on how Canadian oil and gas company EnCana has worked with Schlumberger on improving the business processes of its US operations is a follow-up to the Special Topic feature from Digital Oilfield, published in the January issue of First Break, on the emerging use of e-technology to conduct business more efficiently.

Just as every detail of reservoir information is processed to derive the most value, every detail of electronic business transactions can yield immediate and long-term rewards. Within the energy sector, the advantages gained from real-time data for on-the-spot decision-making now extend beyond improvements in exploration, exploitation and production technology to include business systems and processes. One example of how trading partners can examine ledger processes and implement application-to-application integration comes from EnCana Oil & Gas (USA) and Schlumberger, who now claim short- and long-term benefits from doing business electronically.

A first milestone in automated electronic exchange of PIDX XML (Petroleum Industry Data Exchange Extensible Markup Language) invoice information was reached in early 2003. Industry players were introduced to true application-to-application integration and a new mechanism that would reduce costs, automate repetitive and manual processes, and shorten cycle times. Since then, energy companies and service firms have been attempting to implement eBusiness processes in the most cost-effective and efficient manner.

EnCana first piloted the Digital Oilfield OpenInvoice internet-based eInvoicing system at its Rifle, Colorado field in December 2001. OpenInvoice allowed EnCana’s suppliers, employees and authorized users to collaborate on the creation, processing, and approval of invoices and field tickets. For suppliers, the OpenInvoice system was used to automate paper-driven processes by automatically coding line items, tracking invoice inquiries, resolving disputes, and seamlessly linking spend information.

Some suppliers already possess automated field data capture and invoicing processes linked to back-end Enterprise Resource Planning (ERP) systems. For over 10 years these systems have successfully exchanged large numbers of Electronic Data Interchange (EDI) documents with customers, and today capture and analyze critical business information. At the time EnCana piloted its programme, the OpenInvoice system required suppliers to manually input service delivery information, scan and upload delivery tickets, code services, and log in for dispute resolution as part of the browser-based process. However, it was recognized that the integrated suppliers of EnCana needed an alternative solution to achieve the same benefits. The solution was systems integration.
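For a sense of what application-to-application invoice exchange involves, the hedged sketch below assembles a heavily simplified XML invoice in Python. The element names, party names and structure are invented stand-ins and do not follow the actual PIDX invoice schema; the point is only that a machine-readable document replaces manual re-keying of the same data.

```python
# Illustrative only: a toy XML invoice in the spirit of application-to-application
# exchange. Element names are hypothetical and do NOT follow the real PIDX schema.
import xml.etree.ElementTree as ET

invoice = ET.Element("Invoice", attrib={"number": "INV-0001", "currency": "USD"})
ET.SubElement(invoice, "Buyer").text = "ExampleOperator"
ET.SubElement(invoice, "Supplier").text = "ExampleServiceCo"

lines = ET.SubElement(invoice, "LineItems")
for description, quantity, unit_price in [("Cementing services", 1, 12500.00),
                                          ("Mileage", 120, 1.25)]:
    item = ET.SubElement(lines, "LineItem")
    ET.SubElement(item, "Description").text = description
    ET.SubElement(item, "Quantity").text = str(quantity)
    ET.SubElement(item, "UnitPrice").text = f"{unit_price:.2f}"
    ET.SubElement(item, "Amount").text = f"{quantity * unit_price:.2f}"

# Serialized document that one back-end system could post to another,
# replacing manual entry of the same data into a browser form.
print(ET.tostring(invoice, encoding="unicode"))
```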
IODP plans its first drilling expeditions in the world’s oceans
By M.F. Coffin

Millard F. Coffin from the Science Advisory Office, Integrated Ocean Drilling Program (IODP), Ocean Research Institute, University of Tokyo outlines the first operations of IODP, the international collaboration of earth, ocean and life scientists which came into existence last October.

Building upon the successes of previous scientific ocean drilling programmes, the IODP offers scientists worldwide unprecedented opportunities to address a vast array of scientific problems in all submarine settings. The scientific advisory structure of the proposal-driven IODP recently planned the inaugural drilling expeditions, targeting critical scientific problems in the eastern Pacific, central Arctic, and north Atlantic Oceans in 2004 and 2005 (Figure 1, Table 1). Co-led by Japan and the United States, with initial significant contributions from the European Consortium for Ocean Research Drilling (ECORD), the IODP is guided by an initial science plan, Earth, Oceans, and Life (www.iodp.org/isp.html), developed with broad input from the international geoscientific community.

For the first time, scientists will have permanent riser and non-riser drilling vessels and mission-specific capabilities such as drilling barges and jack-up rigs for shallow water and Arctic drilling at their disposal. Japan is providing the new riser vessel, Chikyu, to the IODP beginning in 2006; the United States is supplying the non-riser drilling vessel, currently JOIDES Resolution, beginning in 2004; and ECORD is furnishing mission-specific platforms beginning in 2004.

The planned IODP expeditions for 2004 and 2005 directly address principal themes of Earth, Oceans, and Life:
■ The Juan de Fuca Ridge Flank Hydrogeology expedition: the deep biosphere and the subseafloor ocean.
■ The Central Arctic Paleoceanography, North Atlantic Neogene-Quaternary Climate, and Norwegian Margin Bottom Water expeditions: environmental change, processes, and effects.
■ The Atlantis Oceanic Core Complex expedition: solid Earth cycles and geodynamics.

The Central Arctic Paleoceanography expedition will drill the central Arctic Ocean for the first time, and it is also the first mission-specific platform expedition performed under the auspices of international scientific ocean drilling.
Visualization for pore pressure prediction
By J. Wang, D. Dopkin and H. James

Joanne Wang, Duane Dopkin and Huw James (Paradigm Geophysical) discuss how visualization and interpretation of multi-disciplinary data volumes can improve risk assessment of predicted overpressures.

Seismic data transformations are used routinely by geoscientists to enhance geometric features and physical property descriptions of the subsurface. These transformations can range from the application of one-dimensional operators to migrated and stacked seismic amplitude data, to the application of complex transforms that make use of multiple dimensions (e.g. amplitude versus angle) or multiple data inputs (e.g. seismic inversion procedures). They can be extended with the application of linear or non-linear operators (e.g. neural networks) to relate the resultant elastic properties (e.g. impedance) to other rock properties or reservoir conditions, making use of petrophysics and rock physics data.

While these transformations exploit the dynamic properties of seismic data, other transforms make use of the kinematic properties of multi-offset seismic data to estimate or predict subsurface conditions. Pore pressure predictions and transformations that make use of seismic velocity measurements, for example, have had a huge impact on drilling safety and the economics of drilling design and well construction. The pore pressure models derived from the integration and careful calibration of wireline, petrophysical, seismic velocity and field test measurements provide the data needed to make critical pre-drill stress and overpressure predictions in order to secure a safe and economic well program.

Because most exploration and development projects incorporate many data transformations, a heavy burden is often placed on the geoscientist and drilling engineer to integrate a broad range of data volumes and models. Although automated procedures (e.g. principal component analysis, neural network classification) are available to facilitate multi-volume integration, high-end visualization technologies and strategies often provide the most effective and informative data integration vehicles. The multi-disciplinary pore pressure prediction and transformation process that integrates seismic, well log, petrophysical and field test data with structural frameworks and directional well paths can return a wide range of deliverables (outputs) that are natural for creative co-visualization and concurrent interpretation by geophysicists, geologists, and drilling engineers. These co-visualizations are often the most effective way to understand the ‘interplay’ or dependencies of one transformation with another.
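The abstract does not spell out a specific velocity-to-pressure transform, so as a hedged illustration the sketch below uses Eaton's method, a widely used relation that maps the ratio of observed to normal-trend velocity into a pore pressure between the hydrostatic and overburden bounds. The gradients, depths and exponent are placeholder values, and this is a generic textbook transform rather than Paradigm's workflow.

```python
# Generic Eaton-style pore pressure prediction from interval velocity.
# Illustrative values only; not a specific commercial workflow.
import numpy as np

def eaton_pore_pressure(depth_m, v_obs, v_normal,
                        overburden_grad=22.5e3,   # Pa per metre (~1 psi/ft)
                        hydrostatic_grad=10.0e3,  # Pa per metre
                        exponent=3.0):
    """Pore pressure (Pa) via Eaton: Pp = S - (S - Ph) * (Vobs/Vn)**n."""
    overburden = overburden_grad * depth_m
    hydrostatic = hydrostatic_grad * depth_m
    return overburden - (overburden - hydrostatic) * (v_obs / v_normal) ** exponent

depth = np.array([1500.0, 2000.0, 2500.0])       # m
v_observed = np.array([2400.0, 2500.0, 2450.0])  # m/s, slower than trend at depth
v_trend = np.array([2450.0, 2700.0, 2950.0])     # m/s, normal-compaction trend

pp = eaton_pore_pressure(depth, v_observed, v_trend)
print(np.round(pp / 1e6, 1))  # pore pressure in MPa; values above hydrostatic
                              # flag the overpressured interval
```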
Fresnel aperture prestack depth migration
Authors: H. Tabti, L. J. Gelius and T. Hellmann

We investigate possible improvements in seismic imaging. We discuss how the Fresnel zone relates to the migration aperture and introduce the concept of the Fresnel aperture, which is the direct time-domain equivalent, at the receivers’ surface, of the subsurface Fresnel zone. Through these concepts we propose a new and efficient method for optimal aperture selection and migration.

For complex media, multipathing will occur and multiple Fresnel apertures can exist for a given image point. In practice, due to inaccuracies and smoothing of the background velocity macromodel, inaccuracies in the ray-tracing method used for Green’s function computations and possible noise corruption of the data, the true Fresnel apertures will, in many cases, be replaced by ‘false’ ones, with apparently new Fresnel apertures being added. Hence, contributions from these ‘false’ Fresnel apertures cause a noise-corrupted image of the subsurface. It is assumed that the single-scattered events are quite robust with respect to the above-mentioned distortions, and that their corresponding Fresnel apertures will remain essentially undistorted, with the strongest amplitudes. Based on this main assumption, we propose a method, analogous to velocity analysis, in which the strong-amplitude Fresnel apertures can be picked interactively, or at least semi-automatically. However, as in velocity analysis, a certain amount of user interaction has to be assumed.

When this technique is combined with a prestack Kirchhoff-type depth-migration method, we call it Fresnel-aperture PSDM. This imaging method has been applied to data from both the Marmousi model and the North Sea. In both cases the improvements, when compared to conventional imaging, were considerable.
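To make the role of the migration aperture concrete, here is a hedged, minimal sketch of a constant-velocity zero-offset diffraction-stack (Kirchhoff-style) depth migration in which only traces within a chosen lateral aperture contribute to each image point. The fixed spatial window stands in for the data-driven Fresnel apertures of the paper, which are picked from the data and used with ray-traced Green's functions in a prestack setting; the grid sizes and velocity are arbitrary.

```python
# Minimal constant-velocity, zero-offset diffraction-stack migration with a
# restricted summation aperture (a stand-in for a picked Fresnel aperture).
import numpy as np

def aperture_limited_migration(data, dt, dx, velocity, aperture):
    """data: (n_traces, n_samples) zero-offset section; aperture in metres."""
    n_traces, n_samples = data.shape
    n_depths = n_samples                      # image on the same grid, dz = v*dt/2
    dz = velocity * dt / 2.0
    image = np.zeros((n_traces, n_depths))
    x_recv = np.arange(n_traces) * dx
    for ix in range(n_traces):                # image point lateral position
        offsets = x_recv - ix * dx
        inside = np.abs(offsets) <= aperture  # aperture restriction
        for iz in range(1, n_depths):
            z = iz * dz
            # two-way diffraction traveltime from image point to each receiver
            t = 2.0 * np.sqrt(offsets[inside] ** 2 + z ** 2) / velocity
            it = np.rint(t / dt).astype(int)
            valid = it < n_samples
            image[ix, iz] = data[np.flatnonzero(inside)[valid], it[valid]].sum()
    return image

# Synthetic example: a single point diffractor, migrated with a 300 m aperture.
nx, nt, dt, dx, v = 101, 400, 0.004, 10.0, 2000.0
section = np.zeros((nx, nt))
zd, xd = 600.0, 50 * dx                       # diffractor depth and position
for ix in range(nx):
    t = 2.0 * np.hypot(ix * dx - xd, zd) / v
    it = int(round(t / dt))
    if it < nt:
        section[ix, it] = 1.0
img = aperture_limited_migration(section, dt, dx, v, aperture=300.0)
print(np.unravel_index(np.argmax(img), img.shape))  # peak near trace 50, depth zd
```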