Quantitative Quality Control: a Tool for Seismic Data Processing Monitoring and Comparison
- Publisher: European Association of Geoscientists & Engineers
- Source: Conference Proceedings, Saint Petersburg 2018, Apr 2018, Volume 2018, p.1 - 5
Abstract
Quality Control (QC) is an essential means of monitoring seismic data processing and is necessary to ensure the appropriateness of the final deliverables. A qualitative approach, based mainly on visual analysis of spectra, gathers, sections, slices, etc. before and after key processing steps, is most commonly used for such QC.
The introduction of the so-called "well-driven/guided" methodology can be regarded as a shift towards a more quantitative approach to seismic data processing QC. In Total, the "Quantitative QC" methodology was extended even further beyond well-to-seismic ties by integrating different seismic attributes, the numerical statistics of which quantify seismic data quality in different domains (signal quality; its vertical, lateral and AVO stability and consistency; etc.) after the key stages of any particular processing workflow. The current paper describes the main principles of the "Quantitative QC" implementation, followed by a real case example showing that it allowed not only evaluation of the re-processing potential compared to the legacy dataset, but also comparison and ranking of the five contractors participating in a pilot re-processing competition.
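To make the idea of attribute-based numerical statistics concrete, the sketch below computes two simple scalar QC metrics on a 2-D gather: a lateral amplitude-stability measure and a trace-to-trace continuity measure. These particular metrics and function names are illustrative assumptions for this summary, not the specific attributes used in the paper.

```python
import numpy as np

def qc_statistics(gather):
    """Illustrative quantitative QC metrics for a 2-D gather/section.

    gather: array of shape (n_traces, n_samples).
    Returns a dict of scalar metrics (hypothetical examples, not the
    paper's actual attribute set).
    """
    # Per-trace RMS amplitude; its coefficient of variation is a simple
    # proxy for lateral amplitude stability (lower = more stable).
    rms = np.sqrt(np.mean(gather ** 2, axis=1))
    lateral_cv = np.std(rms) / np.mean(rms)

    # Zero-lag normalized cross-correlation between adjacent traces,
    # averaged over the gather: a proxy for lateral signal consistency.
    a, b = gather[:-1], gather[1:]
    num = np.sum(a * b, axis=1)
    den = np.sqrt(np.sum(a ** 2, axis=1) * np.sum(b ** 2, axis=1))
    continuity = np.mean(num / den)

    return {"lateral_cv": lateral_cv, "continuity": continuity}
```

Computing such statistics on each contractor's pilot volume after the same processing stage would allow the numeric side-by-side comparison and ranking the abstract describes.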
However, it is believed that the quantitative approach is not intended to fully substitute for the qualitative one, but rather to complement it, providing a broader view of (re-)processing effectiveness and its eventual impact on seismic data quality.