
Abstract

Seismic exploration relies on the collection of massive data volumes that are subsequently mined for information during seismic processing. While this approach has been extremely successful in the past, the current trend towards higher quality images in increasingly complicated regions continues to reveal fundamental shortcomings in our workflows for high-dimensional data volumes. Two causes can be identified. First, there is the so-called "curse of dimensionality" exemplified by Nyquist's sampling criterion, which puts disproportionate strain on current acquisition and processing systems as the size and desired resolution of our survey areas continue to increase. Second, there is the recent "departure from Moore's law" that forces us to lower our expectations of computing our way out of this curse of dimensionality. In this paper, we offer a way out of this situation through deliberate randomized subsampling combined with structure-exploiting transform-domain sparsity promotion. Our approach is successful because it reduces the size of seismic data volumes without loss of information. As such, we end up with a new technology where the costs of acquisition and processing are no longer dictated by the size of the acquisition but by the transform-domain sparsity of the end product.
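As an illustration only, and not the workflow used in the paper, the following sketch shows the basic mechanism the abstract describes: a signal that is sparse in a transform domain (a DCT here, standing in for the curvelet-type transforms typically used on seismic data) is deliberately subsampled at random and then recovered by iterative soft thresholding, a simple sparsity-promoting solver. All sizes, names, and parameter choices below are illustrative assumptions.

import numpy as np
from scipy.fft import dct, idct

rng = np.random.default_rng(seed=0)

# Synthetic 512-sample "trace" that is sparse in the DCT domain
# (10 nonzero coefficients); an illustrative stand-in for real data.
n = 512
true_coeffs = np.zeros(n)
support = rng.choice(n, size=10, replace=False)
true_coeffs[support] = rng.standard_normal(10)
signal = idct(true_coeffs, norm="ortho")

# Deliberate randomized subsampling: keep 25% of the samples at random.
mask = np.zeros(n, dtype=bool)
mask[rng.choice(n, size=n // 4, replace=False)] = True
observed = signal[mask]

# Sparsity-promoting recovery via iterative soft thresholding (ISTA),
# with the forward operator "restrict the inverse DCT to the kept samples".
x = np.zeros(n)
step, lam = 1.0, 0.01  # step size and threshold, chosen ad hoc
for _ in range(500):
    residual = np.zeros(n)
    residual[mask] = idct(x, norm="ortho")[mask] - observed
    x = x - step * dct(residual, norm="ortho")                  # gradient step
    x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)    # shrinkage

recovered = idct(x, norm="ortho")
print("relative recovery error:",
      np.linalg.norm(recovered - signal) / np.linalg.norm(signal))

When the number of randomly placed samples is sufficient relative to the transform-domain sparsity, the printed relative error is small, which is the sense in which a subsampled volume can carry essentially the same information as the fully sampled one.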
