First Break - Volume 7, Issue 4, 1989
Deep VSP study in a complex Alpine overthrust area
Authors: G.R. Winkler and B.R. Cassell

During the last decade, exploration activity in the Alpine overthrust area has been encouraged by evidence of oil and gas in commercial amounts in and below the Calcareous Alpine zone, where exploration is seen to be economically viable. In this paper we concentrate on the Vorarlberg area of western Austria, with the objective of delineating features in the limestone cover, the underlying sequence and the sub-Alpine basement. As expected, seismic exploration is impeded by limited accessibility due to topography consisting of narrow valleys and high mountains. Interpretation of seismic sections derived from the relatively sparse network of lines is also generally made difficult by variable data quality. In addition to these difficulties, geophysical and geological interpretation is anything but straightforward owing to the predominance of complex overthrust nappe structures with large velocity contrasts between them. A 4200 m deep wildcat well (Au-1) was drilled in order to obtain a better definition of seismic velocities as well as structural and dip information in an area where the depth of the exploration targets ranges from 2000 to 9000 m below the surface. A vertical seismic profile (VSP) survey was carried out in order to provide a more accurate estimate of the velocity profile and, hopefully, to obtain an image of the local structure. In this article we discuss the borehole seismic acquisition and processing techniques used to produce an image of the structure, and compare our results with the existing model derived from surface seismic and geological data.
Computer sciences for geophysicists. Part IX: the reliability of software, geophysical or otherwise
By L. Hatton

I expect some of you are still wondering what happened to the article in this series on graphics, devices, standards usage and so on. I still am. I suppose I ought to come clean and admit the real problem, which is not so much indolence on my part as the essentially ephemeral nature of graphics standards. It seems as though these standards possess a fifth force of nature to accompany the weak, strong, electromagnetic and gravitational forces. This force is the political force. In essence, it works like this. Different graphics committees come together and exchange a DISON, an elementary particle of unbelievable mass which carries a Draft International Standard, no charm and little agreement. Having done this, the exchangers then diverge as rapidly as possible, often exceeding the speed at which free food and drinks disappear at Icebreakers. Thus it is with three-dimensional (3D) graphics (and FORTRAN 8X). After the various international organizations came together fleetingly, like mayflies on a balmy day, producing GKS, the Graphical Kernel System for two-dimensional (2D) graphics, they have now set off in opposite directions to produce Draft Standards for 3D which are almost, but not quite, completely dissimilar. If you wish to keep up with things, the current protagonists are PHIGS (look it up) and GKS 3D. Coupled with other standards which seem to be vying for attention, such as CGI (Computer Graphics Interface), which is a NQTS (Not Quite The Same) standard, I have temporarily given up and clutch at GKS 2D as a rock in an otherwise uneasy ocean. The very best thing you can do with graphics standards seems to be nothing as yet, unless you are brave and rich enough to go it alone. The real subject I would like to address here is one which is rapidly assuming great importance to everybody involved in the use of software. This is the subject of reliability.
Improving the reliability of seismic data processing
By L. Hatton

This paper is about a practical technique for the processing of seismic data which will help to isolate problems due to software error and algorithmic inconsistencies, as reported for example by Hatton (1989). It is quite short, requires no new processing techniques and can be done using existing software. What then is the drawback? The simple answer is that it costs more, although there has never been a better time to use such a methodology, as commercial seismic data processing today is more of a charity than a viable business.