First Break - Volume 24, Issue 5, 2006
Modern 2D and 3D VSP: reservoir imaging from downhole
Authors B. Fuller, M. Sterling and L. Walter
Brian Fuller and Marc Sterling, Sterling Seismic Services, and Larry Walter, Geospace Engineering Resources International (GERI), provide a guide to modern VSP and see further potential as the technology matures. Check-shot surveys and VSPs have been used for decades to obtain a reliable time-depth tie for seismic reflections, constrain depth-migration results, and generally improve seismic interpretations. Along the way, seismologists also noted that seismic data recorded in the borehole generally contain higher frequencies than surface seismic data. The empirically derived rule of thumb is that VSP data contain twice the frequency content of surface seismic data. It is commonly assumed that frequency attenuation is smaller for borehole seismic data because the VSP raypath is shorter than surface seismic raypaths and because the VSP wavefield passes through the near-surface zone only once. Figure 1 shows a direct comparison between a 3D surface seismic image and a 3D VSP image. Abutting slices from the respective 3D volumes both show discontinuous sand bodies where gas production is strongly influenced by faulting and stratigraphic variations. The 3D VSP image, however, contains about twice the frequency content of the surface seismic data and shows many more details of the reservoir. This higher frequency content makes 2D and 3D VSP a powerful reservoir development tool: the details of reservoir fault architecture and stratigraphy are simply easier to see with 120 Hz data than with 60 Hz data. The purpose of this article is to give the reader an overview of modern 2D and 3D VSP methods and some of the factors driving the method's current rapid growth in use.
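As a rough illustration of why the doubling from 60 Hz to 120 Hz matters, the conventional quarter-wavelength vertical-resolution limit can be computed for both bandwidths. The sketch below assumes a hypothetical interval velocity of 3000 m/s; the article does not quote one.

```python
def tuning_thickness_m(velocity_m_s, freq_hz):
    # Quarter-wavelength (Rayleigh) vertical-resolution limit: roughly
    # the thinnest bed whose top and base reflections remain distinct.
    wavelength = velocity_m_s / freq_hz
    return wavelength / 4.0

# Assumed interval velocity of 3000 m/s (illustrative only).
surface_res = tuning_thickness_m(3000.0, 60.0)   # ~12.5 m resolvable at 60 Hz
vsp_res = tuning_thickness_m(3000.0, 120.0)      # ~6.25 m resolvable at 120 Hz
```

Halving the resolvable bed thickness is what makes the fault and stratigraphic detail in the 3D VSP slices visibly sharper than in the surface seismic image.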
Designing borehole seismic VSPs for complex subsalt or near-salt reservoir evaluation
Authors A. Campbell, L. Nutt, R. Smith and H. Chang
Allan Campbell, Les Nutt, Ric Smith, and Hungyu Chang discuss the advantages of Schlumberger's latest approach to vertical seismic profiles in challenging subsalt environments such as the Gulf of Mexico. Borehole seismic technology used to acquire vertical seismic profiles (VSPs) has proven successful in gathering information and building understanding of complex subsalt or near-salt structural environments. However, the success of any VSP survey intended to influence drilling and development decisions under these challenging conditions depends heavily on careful three-dimensional (3D) pre-survey planning and modelling, as well as on efficient seismic data acquisition and processing techniques. An integrated borehole seismic system from Schlumberger has been developed to help operators accomplish this by optimizing all aspects of borehole seismic services during drilling, acquisition, and processing operations. The technology is discussed along with three VSP survey examples of its application, from job planning through data acquisition, transmission, processing, and interpretation of the results.
VSP fracture imaging to bridge the well-to-seismic scale gap for a fractured carbonate reservoir
Steve Rogers, Calin Cosma, Peter Shiner, Simon Emsley, and Nicoleta Enescu discuss the power of a vertical seismic profile processing technique that uses three-component recorded data to image a structurally complex carbonate reservoir. Vertical seismic profile surveys address a wide range of reservoir objectives, from simple velocity or checkshot surveys (with applications including well-log and surface-seismic time-depth correlation, acoustic log calibration, and synthetic seismogram generation) through to 2D and 3D imaging, salt imaging, and AVO attribute analysis. Although three-component recording is the near-universal standard for such surveys, the horizontal components frequently remain unused. This article describes the results of a VSP processing technique that utilizes all three recorded components, allowing key structural features to be imaged from 2D VSP data sets and faults and fracture systems to be positioned in 3D space. The technology was applied to a structurally complex fractured carbonate reservoir with a hierarchy of fracture elements produced by several phases of tectonic activity (Emsley et al., 2002). Typically, characterizing the geometry of these fracture elements involves integrating features imaged in the well bore from image logs with the interpretation of lower-resolution but wider-coverage surface seismic data. Integrating these two data sources was proving problematic for two main reasons: the interpretation of image logs, with a high intensity of widely dispersed fractures, was unclear in the context of larger-scale seismic features, and the often poor quality of the seismic data made interpretation of seismic-scale faults difficult. This resulted in diminished confidence in the interpretation of several faults or fracture zones from the seismic data.
Feeling the heat, can’t stand the pressure? Risks to borehole integrity when drilling in ultra-HPHT environments
By S.M. Willson
Finding new hydrocarbon reserves is not getting any easier. Despite historically high oil and gas prices, there remain significant cost pressures to exploit economically and efficiently the hydrocarbon reserves accessible to multinational companies. One potentially cost-effective way of adding new reserves and production is 'infrastructure-led exploration': exploring for reserves that are deeper than, or laterally offset from, developed fields and existing production facilities. Both pose significant, albeit different, drilling challenges. It is the challenges associated with drilling in deeper, hotter, and more highly pressured environments that are described here. The term 'HPHT' is commonly used to describe wells that are hotter or higher pressure than most. The term came into use upon the 1990 release of the Cullen report on the Piper Alpha platform disaster in the UK sector of the North Sea. Here, HPHT is formally defined as a well having an undisturbed bottomhole temperature greater than 300°F (149°C) and a pore pressure of at least 0.8 psi/ft (ca. 15.3 lbm/gal, or 1.83 g/cc), or requiring a blowout preventer rated in excess of 10,000 psi (68.95 MPa). With combined bottomhole conditions of 20,000 psi (137.9 MPa) and 420°F (215°C), Mobile Bay, off the coast of Alabama, USA, still holds the record as the world's hottest and highest-pressure offshore producing environment. As exploration focuses on deeper and deeper formations, a new terminology is developing that further classifies HPHT conditions (Figure 1). Three levels of HPHT severity are defined. Tier I refers to wells with reservoir pressures up to 15,000 psi (103.4 MPa) and temperatures up to 350°F (177°C); most HPHT wells drilled to date fall into this category. Tier II, the current focus of 'ultra-HPHT' wells, is defined by bottomhole fluid pressures of up to 20,000 psi (137.9 MPa) and temperatures of up to 400°F (204°C).
Several deep gas wells currently being planned or drilled in coastal regions of the Gulf of Mexico fall into this category. Tier III defines 'extreme-HPHT' conditions, with reservoir pressures up to 30,000 psi (206.8 MPa) and temperatures up to 500°F (260°C). Several deep gas reservoirs onshore North America and in the Gulf of Mexico, both on the Shelf and in deep water, would fall into this category (see Figure 2, adapted from Baker Hughes, 2005).
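The unit equivalences behind the HPHT thresholds quoted above can be checked with a few standard conversions (the helper names below are illustrative; the article only states the converted figures):

```python
# Standard unit conversions used in the HPHT definitions.
PSI_TO_MPA = 0.00689476          # 1 psi in megapascals
PSI_PER_FT_TO_PPG = 1.0 / 0.052  # pressure gradient (psi/ft) to mud weight (lbm/gal)

def fahrenheit_to_celsius(deg_f):
    return (deg_f - 32.0) * 5.0 / 9.0

# HPHT threshold: 300 degF, 0.8 psi/ft pore-pressure gradient, 10,000 psi BOP.
temp_c = fahrenheit_to_celsius(300.0)        # ~148.9 degC, rounds to 149
mud_weight = 0.8 * PSI_PER_FT_TO_PPG         # ~15.4 lbm/gal (article rounds to ca. 15.3)
bop_mpa = 10000.0 * PSI_TO_MPA               # ~68.95 MPa
```

The same factors reproduce the Tier II and Tier III figures (e.g. 20,000 psi is 137.9 MPa and 400°F is 204°C).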
Pore pressure prediction using well-conditioned seismic velocities
Authors L.D. Den Boer, C.M. Sayers, Z.R. Nagy, P.J. Hooyman and M.J. Woodward
Abnormal pore pressures are encountered worldwide, often resulting in drilling problems such as borehole instability, stuck pipe, lost circulation, kicks, and blow-outs (Dutta, 1997). To optimize the choice of casing and mud weight while drilling abnormally pressured formations, a pre-drill prediction of pore pressure is required. Such an estimate can be obtained from seismic velocities using a velocity-to-pore-pressure transform calibrated with offset well data. However, velocities obtained from processing seismic reflection data often lack the spatial resolution needed for accurate pore pressure prediction, owing to assumptions such as layered media and hyperbolic moveout. In addition, the uncertainty in velocity is often not quantified. In this example from the Gulf of Mexico, seismic velocities obtained using reflection tomography are combined with well data to produce a refined velocity field that honours the available well information. The refined velocity field is then used to predict pore pressure.
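A widely used form of such a velocity-to-pore-pressure transform is Eaton's relation. The sketch below assumes Eaton's method with its classic exponent of 3; the article does not state which transform the authors calibrated, and all numbers are illustrative.

```python
def eaton_pore_pressure(overburden, hydrostatic, v_observed, v_normal, n=3.0):
    """Eaton-style velocity-to-pore-pressure transform.

    Slower-than-normal velocities (undercompacted, overpressured rock)
    map to pore pressures above hydrostatic. All pressures must be in
    consistent units; v_normal is the expected velocity on the normal
    compaction trend at the same depth.
    """
    return overburden - (overburden - hydrostatic) * (v_observed / v_normal) ** n

# Illustrative values (psi): at the normal-trend velocity the prediction
# collapses to hydrostatic; a 20% slowdown predicts strong overpressure.
p_normal = eaton_pore_pressure(10000.0, 4650.0, 3000.0, 3000.0)  # 4650.0
p_over = eaton_pore_pressure(10000.0, 4650.0, 2400.0, 3000.0)    # ~7260.8
```

The transform's sensitivity to the velocity ratio is why the authors stress spatial resolution and quantified velocity uncertainty: small velocity errors propagate directly into the predicted pressure.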
Microgravity as a tool for the detection, characterization and prediction of geohazard posed by abandoned mining cavities
Authors P. Styles, S. Toon, E. Thomas and M. Skittrall
The presence of mining-related cavities or karstic features in the rock mass, and their actual or potential collapse, poses a severe geohazard and a range of subsidence-related problems for both current and future users of the land. Cavities constitute a hazard to both development and redevelopment because their migration to the surface, as sinkholes or as fractured and disturbed ground, may seriously damage property and services and, in cases of severe and catastrophic failure, cause significant loss of life. The most common natural targets in karst environments are solution-related features such as voids, extended cavern systems, and the collapse and drainage features associated with swallow holes (or sinkholes). Man-made cavities, including mine workings, shafts, and tunnels, are just as hazardous and can be even more prevalent than natural features, particularly in industrialized environments. Prior to the development (or redevelopment) of a site, the most common method of site investigation has been to drill an extensive pattern of boreholes over the target area in an attempt to locate and then define the spatial extent of any cavities. Indirect techniques such as geophysics offer a cost-effective, non-invasive method of cavity delineation, with targeted drilling used as a verification tool rather than as a primary search technique. The existence of a cavity alters the physical state of the strata and produces a contrast between the cavity and the host stratum that can be detected by suitable geophysical methods, provided the contrasts are large enough and the features are of sufficient size (McDowell, 2002). Microgravity involves measuring minute changes in the gravitational pull of the Earth and inferring the presence of subsurface density variations, such as those produced by voids and cavities, from an analysis of these readings.
A cavity usually has a lower density than the surrounding material and may be filled with water, sediment, collapse material, or a mixture of all of these. A void therefore represents a mass deficiency in the subsurface, and a very small reduction in the pull of the Earth's gravity is observed; this is called a negative gravity anomaly. Although the method is simple in principle, measuring variations in the Earth's gravity field to a few parts per billion requires highly sensitive instruments, strict data acquisition procedures, stringent quality controls, careful data reduction, and sophisticated digital analysis techniques to evaluate and interpret the data. These gravity anomalies are superimposed on much larger variations produced by elevation, topography, latitude, Earth tides, and regional geological variations, and are usually almost undetectable by conventional gravity investigations. Microgravity surveying has developed considerably over the last 10 years with the advent of modern high-resolution instruments, careful field acquisition procedures, sophisticated data reduction methods, and advanced analysis techniques. Qianshen (1996) presents a thorough review of the fundamentals of the microgravity technique, although interpretation in particular has developed significantly since then. It is now possible to detect and interpret anomalies as small as 10 microgal with a repeatability of a few microgal. Isolated anomalies not only reveal the location of mines, caverns, and voids, whether natural or man-made, but also provide information on their depths, shapes, and morphology. Through the use of Euler deconvolution and Gauss's theorem, the topology and the 'missing mass' associated with a void can be calculated, providing vital information for developing remediation strategies and, ultimately, for estimating the costs of cavity filling.
Through the targeted use of repeated post-remediation microgravity surveys, the success (or otherwise) of the remediation process can be assessed, helping to verify the location and distribution of the materials used to fill the void space. These attributes have led to the method becoming widely used in hydrogeological, engineering, and geotechnical investigations, with the significant advantage of leaving the ground completely undisturbed. Conventional site investigation techniques, nowadays sometimes guided by laser cavity scanning, are then employed as directed by the microgravity results to verify the areas deficient in mass. Emsley et al. (1992) and Bishop et al. (1997) describe the application of the microgravity method to the detection of both karstic and man-made cavities, and also describe how the resulting data can be enhanced by image processing to better define the anomalies associated with the targets. This paper describes two detailed applications of the microgravity technique for the delineation of mining-related geohazards: the first in a currently operational open-cut gold mine at Kalgoorlie in Western Australia, and the second for the detection of historic chalk mining in the United Kingdom, which caused the collapse of the main A2 trunk road into central London in 2002. Both required detailed terrain corrections: in the first case for the effects of the main open-cut workings, and in the second for the influence of surrounding buildings as well as topography. The methods by which these corrections are calculated differ greatly between the two environments, but they are essential if small-amplitude, subtle anomalies are to be interpreted.
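The magnitude of the negative anomaly produced by a mass deficiency can be estimated with the textbook point-mass (buried sphere) approximation. The void radius, depth, and density contrast below are illustrative assumptions, not values from either case study.

```python
import math

G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
MS2_TO_MICROGAL = 1e8    # 1 m/s^2 = 1e8 microgal

def sphere_peak_anomaly_microgal(radius_m, depth_m, density_contrast):
    """Peak vertical gravity anomaly directly above a buried sphere.

    density_contrast (kg/m^3) is negative for a void, i.e. a mass
    deficiency, so the result is a negative anomaly as described above.
    """
    excess_mass = (4.0 / 3.0) * math.pi * radius_m ** 3 * density_contrast
    return G * excess_mass / depth_m ** 2 * MS2_TO_MICROGAL

# Air-filled void of 2 m radius, 5 m deep, in rock of ~2000 kg/m^3:
anomaly = sphere_peak_anomaly_microgal(2.0, 5.0, -2000.0)  # ~ -18 microgal
```

An anomaly of this size sits comfortably above the ~10 microgal detection threshold quoted for modern instruments, which is why shallow mine workings of even modest dimensions are viable microgravity targets.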