Tackling the "Data Deluge": a Dimensionality-reduction Approach
- Publisher: European Association of Geoscientists & Engineers
- Source: Conference Proceedings, 73rd EAGE Conference and Exhibition - Workshops 2011, May 2011, cp-239-00089
- ISBN: 978-90-73834-13-2
Abstract
Present-day imaging and inversion technology increasingly relies on faithful sampling and simulation of seismic wavefields. This reliance on full sampling and high-fidelity wavefield simulations strains our acquisition and processing systems, and overcoming this impediment is becoming one of the main challenges faced by our industry. Using randomized dimensionality-reduction techniques, we propose a new strategy in which acquisition and computational costs are no longer dictated by the sampling grid but by the transform-domain compressibility of the image. To arrive at this result, we combine recent findings from machine learning and stochastic optimization—where (nonlinear) inversions are carried out on random subsets of data—with compressive sensing—where data that permit compressible representations are deliberately subsampled. The key idea of the stochastic approximation is to reduce computational costs by computing each gradient update on a different, randomly selected subset of the data. In seismic exploration, this corresponds to carrying out migrations with one or a few incoherent supershots, each made of a superposition of randomly source-encoded experiments. While this approach introduces source crosstalk, it has been applied successfully to reduce the cost of least-squares migration and full-waveform inversion because it reduces the number of wave-equation solves. However, the method is sensitive to noise and relies on relatively large numbers of supershots and wave simulations to obtain reasonable results.
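The stochastic-approximation idea—computing each gradient update from a random source-encoded supershot rather than from all shots—can be sketched on a toy linear problem. The sketch below is illustrative only: a random matrix stands in for the per-source wave-equation modeling operator, and all names and sizes are assumptions, not the paper's implementation. With random ±1 source encodings, the cross-talk terms cancel in expectation, so the supershot gradient is an unbiased estimate of the full gradient at a fraction of the simulation cost.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for wave-equation modeling: one linear operator A_i per source,
# observed data d_i = A_i @ m_true (noise-free, for illustration).
n_src, n_rec, n_model = 32, 20, 15
A = rng.standard_normal((n_src, n_rec, n_model))  # (source, receiver, model)
m_true = rng.standard_normal(n_model)
d = A @ m_true                                    # one data record per source

def full_gradient(m):
    # Gradient of (1/n_src) * 0.5 * sum_i ||A_i m - d_i||^2 over ALL sources:
    # this is the expensive reference that requires n_src simulations.
    r = A @ m - d
    return np.einsum('irm,ir->m', A, r) / n_src

def supershot_gradient(m, batch=4):
    # One random supershot: superpose `batch` randomly chosen sources with
    # random +/-1 encodings. A single encoded "simulation" replaces `batch`
    # individual ones; the induced crosstalk averages to zero over encodings.
    idx = rng.choice(n_src, size=batch, replace=False)
    w = rng.choice([-1.0, 1.0], size=batch)
    A_enc = np.einsum('j,jrm->rm', w, A[idx])     # encoded operator
    d_enc = w @ d[idx]                            # encoded data
    r = A_enc @ m - d_enc
    return A_enc.T @ r / batch

# Stochastic iterations: each update costs one supershot, not n_src shots.
m = np.zeros(n_model)
step = 1e-3
for _ in range(2000):
    m -= step * supershot_gradient(m)

rel_err = np.linalg.norm(m - m_true) / np.linalg.norm(m_true)
print(f"relative model error after 2000 supershot updates: {rel_err:.3e}")
```

Because the toy data are noise-free, the iterates converge toward the true model; with noisy data the same scheme stalls at a noise floor, which is the sensitivity the abstract notes.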