
Abstract

Seismic imaging relies on the collection of multi-experiment data volumes in combination with a sophisticated back-end to create high-fidelity inversion results. While significant improvements have been made in linearized inversion, the current trend of pushing for ever higher-quality models in increasingly complicated regions reveals fundamental shortcomings in handling growing problem sizes numerically. The so-called "curse of dimensionality" is the main culprit, because it leads to an exponential growth in the number of sources and in the corresponding number of wavefield simulations required by wave-equation migration. We address this issue by reducing the number of sources with a randomized dimensionality-reduction technique that combines recent developments in stochastic optimization and compressive sensing. As a result, we replace current imaging formulations that rely on all data with a sequence of smaller imaging problems, each using the output of the previous inversion as input for the next. Empirically, we find speedups of at least one order of magnitude when each reduced experiment is treated as a separate compressive-sensing experiment.
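The randomized source reduction described above can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's exact encoding: the survey dimensions, the linear data matrix `D`, and the i.i.d. Gaussian mixing matrix `W` (one common compressive-sensing choice for forming "supershots") are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

n_sources, n_receivers = 128, 64   # hypothetical survey dimensions
n_supershots = 8                   # reduced number of simultaneous-source experiments

# Stand-in for the full multi-experiment data volume: one record per source.
D = rng.standard_normal((n_sources, n_receivers))

# Randomized dimensionality reduction: mix all sources into a few supershots
# with i.i.d. Gaussian weights, so each reduced experiment is a random
# superposition of all source records.
W = rng.standard_normal((n_supershots, n_sources)) / np.sqrt(n_supershots)
D_reduced = W @ D

# The imaging back-end now works on n_supershots experiments instead of
# n_sources, a 16x reduction in wavefield simulations in this toy setup.
print(D_reduced.shape)  # (8, 64)
```

In the sequential scheme the paper describes, a fresh random `W` would be drawn for each smaller imaging problem, with the previous inversion result used as the starting model for the next.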


DOI: 10.3997/2214-4609.20149172
Published: 2011-05-23