Volume 22, Issue 8
  • ISSN: 0263-5046
  • E-ISSN: 1365-2397

Abstract

Murray Roth, executive vice president of marketing & systems, Landmark Graphics, explains how seismic data processing will have to adapt to the looming scenario of a seriously shrinking pool of expertise facing an explosion in the growth of data volumes. Since the dawn of digital seismic processing more than 40 years ago, the basic processing workflow has changed very little, despite huge strides in the evolution of computing hardware and algorithms. Seismic processors still follow time-honoured steps to take field data through static corrections, noise attenuation and parameter selection, ultimately ending up with a migrated, stacked image of the subsurface. Unfortunately, industry resources are shrinking while data volumes are exploding. Revolutionizing the way seismic processors do their daily work is not only desirable, but essential. This article reviews where we have come from, where we are today, and where we can reasonably expect to go in the near future based on emerging innovations in information technology.

Origins of seismic processing

The first real computer application in the oil patch was seismic data processing. During the 1950s, analogue seismic surveys were acquired, processed and interpreted in the field by a single individual known as a ‘human computer’. He laid out the survey lines and supervised the shoot by day. At night, he ‘processed’ the analogue shot records, marked subsurface reflectors of interest, hand-timed and plotted each trace on paper, and drew a rough cross-section, looking for structural highs to recommend for drilling.

By the late 1950s, analogue seismic records were converted into digital form using big number-crunching computers (Figure 1). To do this, of course, petroleum companies had to pull the seismic processing step out of the field and move it into a centralized computer facility. Now field crews recorded data on magnetic tape and sent them to a processing centre, where raw data were turned into clean paper sections, which were then forwarded to an office somewhere else, where seismic interpreters focused on mapping structures and identifying prospects. The original, unified prospect generation ‘value chain’, from seismic acquisition through processing to interpretation, although enhanced by new technology, was nevertheless fragmented into increasingly isolated specialties. Only in recent years have these specialties begun to reunite, through even more advanced information technologies.

From the 1960s through the late 1980s, batch seismic processing sequences were executed overnight on mainframe computers. In the 1990s, certain parts of the processing workflow moved onto a range of powerful new computers, from interactive workstations to massively parallel supercomputers. During that decade, interpreters also adopted increasingly sophisticated computer systems for the analysis and visualization of processed seismic data.
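For readers unfamiliar with the sequence named above (statics, noise attenuation, stack, migration), the following is a minimal Python/NumPy sketch of that conventional flow applied to a synthetic shot gather. It is purely illustrative: every function name, the toy data and the simplistic placeholder algorithms are assumptions made for this example, not the workflow of Landmark Graphics or of any processing system discussed in the article.

import numpy as np

def apply_statics(gather, shifts, dt):
    """Shift each trace up by its static correction (seconds)."""
    out = np.zeros_like(gather)
    for i, (trace, shift) in enumerate(zip(gather, shifts)):
        n = int(round(shift / dt))
        out[i] = np.roll(trace, -n)
    return out

def attenuate_noise(gather, kernel=5):
    """Crude noise attenuation: running mean along each trace."""
    k = np.ones(kernel) / kernel
    return np.array([np.convolve(tr, k, mode="same") for tr in gather])

def nmo_stack(gather):
    """Stand-in for NMO correction plus stack: average the traces."""
    return gather.mean(axis=0)

def migrate(stacked):
    """Placeholder; real migration repositions reflectors in space."""
    return stacked

# Synthetic shot gather: 24 traces, 500 samples, 2 ms sampling.
dt = 0.002
rng = np.random.default_rng(0)
gather = rng.normal(0.0, 0.1, size=(24, 500))
gather[:, 200] += 1.0                    # a flat "reflector" at sample 200
statics = rng.uniform(0.0, 0.01, size=24)  # small residual statics per trace

image = migrate(nmo_stack(attenuate_noise(apply_statics(gather, statics, dt))))
print(image.shape)  # (500,) - a single stacked "image" trace for this gather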

DOI: 10.3997/1365-2397.22.8.25985
Published: 2004-08-01
Article Type: Research Article