Exploration Geophysics - Volume 29, Issue 3-4, 1998
Articles
Creating image gathers in the absence of proper common-offset gathers
Authors: Gijs J.O. Vermeer

Current velocity model building techniques have been developed specifically with parallel geometry in mind. In this geometry it is possible to create common-offset gathers, to migrate individual gathers, and then to analyse moveout in the image gathers directly as a function of offset. In practice, well-sampled 3D common-offset gathers with constant azimuth are not available, at least not in land data acquired with the orthogonal geometry or other crossed-array techniques, and not even in data acquired with the parallel geometry.
Therefore, alternative data gathers have to be sought which are suitable for migration and which still allow migration velocity analysis. The method proposed in this paper is based on an extension of the notion of a minimal data set: a single-fold, alias-free data set suitable for migration. Examples of minimal data sets are common-offset gathers with constant azimuth, and cross-spreads. However, proper minimal data sets cannot always be constructed, or they do not extend across the entire survey area. In such cases pseudo-minimal data sets must be constructed. Each pseudo-minimal data set is an approximation of a minimal data set; their number should equal the fold count.
In parallel geometry the pseudo-minimal data sets are still close to common-offset gathers. These gathers can be used directly for velocity analysis. In other geometries, the pseudo-minimal data sets encompass a wide range of offsets. Then it is necessary to determine from all traces in a pseudo-minimal data set which trace is the imaging trace, and what is its offset. A possible technique to determine this offset is the vector-weighted diffraction stack.
The proposed data gathering and velocity-analysis technique needs further research and testing for the best results.
Mapping of a granite batholith using geological and remotely sensed data: the Mount Edgar Batholith, Pilbara Craton
Authors: Peter Wellman

The gamma-ray spectrometric data over an exposed granite batholith contain detailed information on its structure and composition. This information is poorly displayed in conventional red-green-blue or hue-saturation-intensity colour-space images. Variation diagrams can be prepared showing the relationship between K, Th and U, and the data can be displayed as a map of distance along the average variation path, together with a map of the deviation of the observed concentrations from this path. Alternatively, the rocks can be separated into subdivisions of granite types, and these types mapped using separate classification layers.
For the Mount Edgar Batholith, gamma-ray spectrometry results are compared with the results of other techniques – rock sample geochemistry, petrography, air photo interpretation, field observations, structural studies, magnetic anomalies, and Landsat-5 Thematic Mapper data enhanced for geology. In this outcropping batholith the spectrometric data are considered the superior technique for determining batholith structure and compositional variation, because they have greater resolution than field mapping and rock sample geochemistry, and because of their tie to rock geochemistry, which magnetic anomalies lack.
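The along-path distance and off-path deviation mapping described in this abstract can be sketched numerically. A minimal illustration, which assumes the average variation path can be approximated by the first principal component of the mean-centred K/Th/U concentrations; the function name and sample values are hypothetical, not from the paper:

```python
import numpy as np

def variation_path_coords(samples):
    """Describe each K/Th/U sample by its distance along the average
    variation path and its deviation from that path.

    The path is approximated by the first principal component of the
    mean-centred concentration data.
    """
    X = np.asarray(samples, dtype=float)
    centre = X.mean(axis=0)
    # First right singular vector = direction of maximum K/Th/U covariation
    _, _, vt = np.linalg.svd(X - centre, full_matrices=False)
    path_dir = vt[0]
    along = (X - centre) @ path_dir                 # distance along the path
    residual = (X - centre) - np.outer(along, path_dir)
    deviation = np.linalg.norm(residual, axis=1)    # distance off the path
    return along, deviation

# Synthetic K (%), Th (ppm), U (ppm) values for four stations; the first
# three covary linearly, the fourth departs from the trend
samples = [[2.0,  8.0, 2.0],
           [3.0, 12.0, 3.0],
           [4.0, 16.0, 4.0],
           [3.0, 10.0, 4.5]]
along, dev = variation_path_coords(samples)
```

The station that departs from the average K/Th/U trend shows up with the largest deviation, which is the quantity the abstract proposes to map alongside the along-path distance.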
Tomographic velocity model building for pre-stack depth migration
Authors: Peter Whiting

It is now commonly accepted that pre-stack depth migration is the best method available for accurate imaging. However, the technique is only used in relatively extreme circumstances. Its implementation is restricted by relatively high costs and turnaround times, as well as by its sensitivity to errors in the interval velocity/depth model.
For a pre-stack depth migration project to be successful, a reliable interval velocity/depth model is essential. Multiple iterations of pre-stack depth migration are often required to achieve a sufficiently reliable model. Obviously, this iterative nature contributes significantly to higher costs and turnaround times.
Reflection tomography has been considered, in recent years, as a potential method for finding a reliable velocity model more easily. This method has also been held back by its own generally high costs and turnaround times. Reflection tomography normally requires interpretation of many reflectors on pre-stack data, which is time consuming and prone to error. Also, in efforts to simplify the overall procedure, the inversion itself has sometimes been compromised.
A reflection tomography algorithm has been developed that does not require manual picking of events and does not compromise the inversion process. The scheme depends on a method of tracing reflected raypaths that does not require reflector definition and that allows automatic picking to be successful. It also utilises entropy constraints in a subspace inversion with stages of decreasing model space smoothing. The aim of this inversion is to help ensure that reliable velocity models are created and that local minima are avoided.
This approach to reflection tomography is quite automated and has been applied successfully to many datasets from Australia and around the world. Two examples from offshore Australia demonstrate that the velocity models resulting from this reflection tomography algorithm clearly improve the results of pre-stack depth migration.
Pre-stack depth migration experience in less complicated geological environments
Authors: R. Gareth Williams, Bob Gosling and Steve Hollingsworth

Depth migration differs from time migration in that it images seismic data correctly in the presence of lateral velocity changes. However, for mild lateral velocity gradients we often use time migration for reasons of cost and stability. The time spent deriving an accurate velocity field, and the increased sensitivity of depth migration to that velocity field, make depth migration more difficult to apply. For these reasons, depth migration has often been used only when time migration is perceived to fail to image the data properly. Very often, this means that depth migration is applied only in particularly difficult and complex geological environments. Unfortunately, in these environments depth migration often fails to provide a clear image where none existed before, either because the ray paths diverge, leaving parts of the subsurface unilluminated by the recording, or because current model building techniques are inadequate for such complicated cases. For example, many model updating techniques assume that the starting model is close to the correct answer, that it is slowly varying, or that flat horizons exist.
Experience in the Browse Basin and the North Sea has shown that pre-stack depth migration can substantially improve imaging in comparatively simple geological environments. In the Browse Basin, rugged seabed topography can cause imaging problems that can be addressed with pre-stack depth migration. It is worth noting that in this context no well information or geological model from an interpreter was necessary; building the model became almost entirely a velocity picking exercise. In many parts of the North Sea a chalk layer with gentle dips lies above the oil and gas bearing targets. The chalk has an interval velocity that is typically twice that of the overburden and the underlying strata. Consequently, the gentle dips at the top and bottom of the chalk are sufficient to cause image distortion at the reservoir level that can only be corrected using pre-stack depth migration. Pre-stack depth migration can thus be viewed as a tool to obtain better images in regions of either steep dip and moderate velocity changes between layers, or moderate dip and high velocity changes between layers. It is not just a tool for extreme geological cases.
Geologists and geophysicists: getting them on the same planet
Authors: A. J. Willocks and B. A. Simons

The results of new detailed airborne geophysical surveys over Victoria have been lauded by industry as being a great incentive to increase mineral exploration in the State. These data become especially useful when combined with new semi-detailed geological mapping. The Geological Survey of Victoria has now developed a new methodology to integrate geological mapping with the interpretation of the geophysical data to produce a single composite understanding of the rocks and their relationships. It has required a reappraisal of the way geologists and geophysicists map, both together and separately, and additional training to make the process work.
Sufficiently detailed data acquired prior to the geological mapping allow a fully integrated interpretation, using the available geophysical and geological data, to produce maps that reflect both geological and geophysical reality. Previously, geologists and geophysicists worked in partial or complete isolation. Too often geophysicists gave geologists lineament or line maps that bore little resemblance to geological reality, lacked credibility and were almost immediately discarded by geologists as being “unhelpful”. The new process requires geologists and geophysicists to work as a team to reconcile all the geophysical and geological observations to produce an accurate, integrated geological map. It demands that the geologist understands the geophysical responses and the geophysicist understands the geology. Both need to acknowledge the limitations inherent in each method.
Presenting the results provides a further series of challenges to the mappers, interpreters, managers and cartographers. We have also yet to integrate the mineralisation history into this mapping process. Meeting these challenges to produce a full and accurate understanding of the geology and geophysics, rather than of one or the other, is essential to ensure increased exploration success.
How to find localised conductors in GEOTEM® data
Authors: Peter Wolfgram, Marina Hyde and Stephen Thomson

Orebodies, mineralised zones, faults, folds, contacts, etc. may represent localised electrical conductors that create airborne electromagnetic (AEM) responses of interest to explorationists. However, typical AEM datasets in conductive regimes exhibit numerous features besides those of interest, and it is left to the interpreter to identify the ones that are of significance to the interpretation task at hand.
Synthetic data can be used to illustrate typical effects of host medium and conductive overburden on target responses, and how these might be identified in the presence of noise such as variations in aircraft ground clearance. Although an understanding of the complex anomaly features is possible, analysing large data sets will require rapid methods of pinpointing anomalous areas and allowing the user to employ visual correlation over a map to aid in the interpretation.
Different transformations of the data can enhance different features of interest to the explorationist. The conductivity depth transform (CDT) maps broad conductive zones and their depths - it is less suitable for detecting localised conductors. The stationary current image (SCI™) on the other hand indicates areas where electric currents become trapped in localised conductive features such as isolated bodies, faults, folds, etc. The SCI emphasises structural features because it is optimised for lateral contrasts in electrical conductivity.
A new method for crosswell reflector imaging
Authors: Jingping Zhe and Stewart Greenhalgh

Imaging seismic reflectors with crosshole and VSP tomographic data is only occasionally carried out using Kirchhoff-style migration schemes. We introduce a new kinematic method for tomographically imaging reflectors. The scheme first picks the traveltimes of each reflected event from common-shot gather tomographic data. It then uses the picked times to image the reflector interfaces, exploiting an important principle: two reflection points on the same interface that produce reflection pulses at two neighbouring receivers share a common tangent line.
The same scheme can be used for surface or crosshole reflection data, but the latter requires some special processing to separate upgoing and downgoing waves. This paper will show how to apply the scheme in tomographic surveys. Traveltimes of each reflection event are initially read from a common-shot gather tomographic seismic section. Then each pair of traveltimes from neighbouring traces is processed in sequence. The two isochronal curves are calculated from the two traveltimes. A common tangent to the two isochronal curves is then found and a reflection segment is defined. In this way, all traveltimes in pairs are processed and a straight or curved interface, corresponding to the locus of tangents, is reconstructed to map the reflection interface.
The processing difference between surface seismic reflection data and tomographic (crosshole) seismic reflection data is that the crosshole data contain downgoing as well as upgoing waves. The program therefore has to decide whether a reflection event comes from above or below the source, and then processes the data to obtain the reflector orientation.
This imaging method can handle multiple layers and curved interfaces for an arbitrary 2D velocity distribution.
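The common-tangent construction described above can be illustrated for the simplest case of a constant-velocity medium, where each isochron is an ellipse with the source and receiver as foci and total path length v·t. This is only a sketch under that assumption (the paper handles arbitrary 2D velocity distributions); all names and geometry values are illustrative:

```python
import numpy as np

def isochron(source_x, receiver_x, t, v):
    """Constant-velocity reflection isochron: an ellipse with the source
    and receiver (both on the surface z = 0) as foci and total path
    length v*t.  z is positive downwards."""
    cx = 0.5 * (source_x + receiver_x)           # ellipse centre
    A = 0.5 * v * t                              # semi-major axis
    half_offset = 0.5 * abs(receiver_x - source_x)
    B = np.sqrt(A**2 - half_offset**2)           # semi-minor axis
    return cx, A, B

def deep_tangent_intercept(m, cx, A, B):
    """Intercept q of the line z = m*x + q tangent to the ellipse on its
    deep (z > 0) side."""
    return np.sqrt(A**2 * m**2 + B**2) - m * cx

def common_tangent(e1, e2, m_lo=-2.0, m_hi=2.0, n_iter=60):
    """Find the slope at which the two deep-side tangents coincide, by
    bisection on the difference of their intercepts.  The resulting line
    is the reflector segment defined by this traveltime pair."""
    f = lambda m: deep_tangent_intercept(m, *e1) - deep_tangent_intercept(m, *e2)
    assert f(m_lo) * f(m_hi) <= 0, "bracket does not contain a common tangent"
    for _ in range(n_iter):
        m_mid = 0.5 * (m_lo + m_hi)
        if f(m_lo) * f(m_mid) <= 0:
            m_hi = m_mid
        else:
            m_lo = m_mid
    m = 0.5 * (m_lo + m_hi)
    return m, deep_tangent_intercept(m, *e1)

# Two neighbouring receivers recording a reflection from a flat
# reflector at 500 m depth in a 2000 m/s medium
v, depth, src = 2000.0, 500.0, 0.0
t1 = np.hypot(1000.0 - src, 2 * depth) / v       # traveltime at receiver 1
t2 = np.hypot(1200.0 - src, 2 * depth) / v       # traveltime at receiver 2
e1 = isochron(src, 1000.0, t1, v)
e2 = isochron(src, 1200.0, t2, v)
m, q = common_tangent(e1, e2)
```

For this synthetic pair the recovered segment z = m·x + q is flat at 500 m depth, matching the reflector that generated the traveltimes; processing all neighbouring pairs in sequence traces out the locus of tangents, as the abstract describes.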
Radio frequency tomography trial at Mt Isa Mine
Authors: B. Zhou, P.K. Fullagar and G.N. Fallon

A cross-hole RFEM (Radio Frequency Electromagnetic) tomographic survey was conducted at the Mt Isa Copper Mine in 1995 as part of a CMTE/AMIRA project investigating the application of geophysics in metalliferous mines. The primary objective of the survey was to evaluate the capability of RFEM for orebody delineation, in a section of the mine where a correlation had previously been established between conductivity and copper grade.
An absorption tomogram constructed from the limited 52.5 kHz data set demonstrated that RFEM has potential in this environment for resolving orebody boundaries and establishing ore continuity between drill holes. The calculated absorption coefficients on the tomogram lie between 0.94 and 5.165 dB/m, consistent with laboratory absorption measurements on rock samples from the survey site.
The continuity of the footwall orebody, paralleling the Paroo Fault, was not well represented in the tomogram, due to low ray coverage in the corner of the image. However, a simple amplitude mask, depicting only the less attenuated ray paths, provided evidence for continuous ore between the holes. This provides encouragement for efforts to combine amplitude masking with tomography.
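The dB/m absorption coefficients quoted above follow the standard definition of amplitude attenuation per unit path length. A generic sketch of that conversion (not the authors' processing code; it assumes geometric spreading has already been removed from the amplitudes):

```python
import math

def absorption_coefficient(amp_in, amp_out, path_length_m):
    """Absorption coefficient in dB/m from the amplitude decay along a
    straight ray path, assuming geometric spreading is already removed."""
    return 20.0 * math.log10(amp_in / amp_out) / path_length_m

# A signal decaying by a factor of 10 over 20 m loses 20 dB, i.e. 1 dB/m
alpha = absorption_coefficient(1.0, 0.1, 20.0)
```

Masking rays by attenuation, as in the abstract, amounts to thresholding this quantity along each source-receiver path.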
Crosshole acoustic velocity imaging with full-waveform spectral data: 2.5-D numerical simulations
Authors: Zhou Bing and S.A. Greenhalgh

This paper focuses on full-waveform inversion using crosshole full-waveform spectral data. We examine the differences in effectiveness of each form of data (real, imaginary, amplitude, phase and Hartley spectra) and determine which is best for imaging the velocity distribution. From 2.5-D numerical simulations for three models, we found that, except for the phase data, the monochromatic real, imaginary, amplitude and Hartley spectra can all be used to image the targets between boreholes by full-waveform inversion; the real and imaginary spectra produce nearly the same quality of images. The images obtained from the amplitude data exhibit more artefacts than the others. The inversion of the Hartley spectral data gives the best image of all. By computing the data misfit functions, we found that the misfit function of the phase data is more complicated than the others; for example, discontinuities or fluctuations occur in the neighbourhood of the true solutions. This complexity may be the main cause of the failure to image satisfactorily with the phase data alone: local minima trap the misfit function during the attempted global optimisation.
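The Hartley spectrum compared above is related to the Fourier spectrum in a simple way: for a real signal, H(f) = Re F(f) − Im F(f), equivalently the transform with the cas kernel cas(t) = cos t + sin t. A small numerical check of that identity (illustrative only, not the authors' code):

```python
import numpy as np

def hartley_spectrum(x):
    """Discrete Hartley spectrum via the FFT: H = Re(F) - Im(F)."""
    F = np.fft.fft(x)
    return F.real - F.imag

def hartley_direct(x):
    """Direct definition: H[k] = sum_n x[n] * cas(2*pi*k*n/N)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    n = np.arange(N)
    arg = 2.0 * np.pi * np.outer(n, n) / N
    return (np.cos(arg) + np.sin(arg)) @ x

# The two definitions agree for any real trace
trace = np.array([0.0, 1.0, 0.5, -0.3, 0.2, -1.0, 0.0, 0.4])
H_fft = hartley_spectrum(trace)
H_cas = hartley_direct(trace)
```

This makes the paper's comparison natural: the Hartley spectrum is a real-valued combination of the same real and imaginary parts that are inverted separately.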