Computer science for geophysicists. Part I: elements of a seismic data processing system

Author: L. Hatton
Journal name: First Break
Issue: Vol 25, No 1, January 2007 pp. 83 - 86
DOI: 10.3997/1365-2397.2007003
Language: English
Info: Article, PDF (319.42 KB)

Summary:
Of the many subjects of interest to the practising geophysicist, computer science is at once one of the most important and one of the least well understood. This article is the first in a series discussing computer science and its relevance to exploration geophysics. The material in this article is extracted from a series of lectures on computer science which form a small part of the M.Sc. course in Petroleum Seismology at Oxford University. It is common to find geophysical programming considered a ‘black art’, amongst both geophysical programmers and the geophysical users of their products. One of the main reasons for this is that the demands of seismic data processing have traditionally pushed computer technology to its limits. At any stage in the last twenty years there have always been a number of processes which we could apply to the data but could never afford. Nothing has changed. Algorithmically, we know how to do full three-dimensional depth migration before stack, but it would take the world’s most powerful computers many years to do it. (I must confess, however, that our knowledge of the necessary velocities might be a bit shaky.) Even something as mundane as demultiplexing can induce early retirement in a computer centre manager if the tapes concerned happen to be uncorrelated, thirty-odd second sweep, 2 msec Vibroseis at 6250 fpi (more on these arcane subjects later). These difficulties in implementing geophysical techniques have unfortunately tended to breed the computer ‘guru’ who, by dint of erudite and obscure solutions, manages to get them to work. Such solutions tend to be extremely difficult either to maintain (correct errors) or to enhance (add new functionality). Hence, each time the computer hardware laboriously (and temporarily) catches up, the whole lot often has to be re-written.
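As a rough illustration of why an uncorrelated Vibroseis tape is so demanding, the sketch below cross-correlates a single recorded trace with a 30 s linear sweep sampled at 2 msec, matching the geometry cited in the summary. It is a minimal sketch only: the 8-80 Hz sweep band, the 5 s listen time, and the use of NumPy/SciPy are assumptions for illustration, not details from the article.

import numpy as np
from scipy.signal import correlate

# Illustrative parameters echoing the summary: a 30 s sweep recorded
# at a 2 msec sample interval. The 8-80 Hz band and the 5 s listen
# time are hypothetical choices, not figures from the article.
dt = 0.002                                 # 2 msec sampling
T = 30.0                                   # sweep length, seconds
t = np.arange(0.0, T, dt)                  # 15 000 sweep samples
f0, f1 = 8.0, 80.0                         # assumed linear sweep band, Hz
sweep = np.sin(2.0 * np.pi * (f0 + (f1 - f0) * t / (2.0 * T)) * t)

listen = 5.0                               # assumed listen time, seconds
n_rec = int((T + listen) / dt)             # uncorrelated record length
trace = np.random.randn(n_rec)             # stand-in for one recorded trace

# Vibroseis correlation: cross-correlate the record with the sweep and
# keep the causal lags up to the listen time, collapsing the 35 s
# uncorrelated record to a 5 s correlated trace.
full = correlate(trace, sweep, mode='full')
zero_lag = sweep.size - 1
correlated = full[zero_lag : zero_lag + int(listen / dt)]

Every trace on the tape needs such a correlation against a 15 000-sample sweep, which gives some sense of why the demultiplex-and-correlate job was such a burden on the hardware of the day.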

