First Break - Volume 37, Issue 12, 2019
Tutorial — time conversion of depth migrated data: part II, TTI preSDM
By Ian F. Jones

Abstract
Contemporary depth imaging projects often require final pre-stack depth migrations (preSDM) to be converted to time for comparison with vintage pre-stack time migrations (preSTM), or to facilitate conversion to 'geological' depth through calibration to well and check-shot data. Here, I consider the situation of having performed an anisotropic tilted transverse isotropy (TTI) preSDM and wanting to convert it to time via vertical stretch in order to compare it, say, with an anisotropic preSTM.
Such a comparison is inherently invalid, as time-migration will explicitly treat any anisotropy as if it were Vertical Transverse Isotropy (VTI), and, in addition, the lateral positioning error inherent in preSTM will render such comparisons questionable on steeply dipping structures.
I show here that the most appropriate type of 'velocity' to use for time conversion of TTI preSDM reflection events is the vertical component of the phase velocity. Conversely, for point-to-point measurements, such as the direct-arrival travel time in a down-hole or check-shot survey, the group velocity should be used, as it is at this speed that the energy travels. In addition, any subsequent depth conversion of a time product for interpretational purposes is best accomplished using a velocity calibrated to well check-shots.
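The vertical stretch described in this abstract amounts to integrating vertical slowness down each trace. As a rough illustration only (not the author's implementation), the sketch below converts a depth-migrated trace to two-way time using an invented two-layer vertical velocity profile; in the TTI case the velocity used would be the vertical component of the phase velocity, as the abstract argues.

```python
import numpy as np

# Hypothetical 1D vertical velocity profile (m/s), sampled every dz metres.
dz = 10.0
v_vertical = np.full(500, 2000.0)       # 2000 m/s down to 2500 m
v_vertical[250:] = 3000.0               # faster layer below 2500 m

# Two-way time to each depth sample: t(z) = 2 * integral of dz / v(z).
twt = 2.0 * np.cumsum(dz / v_vertical)

# A depth-migrated trace is then vertically stretched to time by
# interpolating its samples onto a regular time axis.
depth_trace = np.random.default_rng(0).standard_normal(500)
t_axis = np.arange(0.0, twt[-1], 0.004)  # 4 ms output sampling
time_trace = np.interp(t_axis, twt, depth_trace)
```

For a point-to-point measurement such as a check-shot first arrival, the same integral would instead use the group velocity along the ray path, per the distinction drawn in the abstract.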
Optimizing performance of data processing
Authors: Samuel Brown and Tony Martin

Abstract
The industry associates data processing with the application of geophysical algorithms to seismic data, but this definition is too narrow. We should consider a broader meaning, particularly in these times of increased automation and of time and cost consciousness. If processing is a method of producing information from the manipulation of data, then we should consider the framework that enables this process. A system must exist to facilitate the work. Generally, systems are a collection of computers, networks, processes and, to a lesser extent, people. For any given input, they create an output, which in seismic data processing can be the application of a geophysical algorithm, or the reordering, reduction or aggregation of data. In this system, the headline-grabbing geophysical process may be only a small part of the work. The underlying framework needs to be as efficient as it can be, whether managing the data, enabling the geophysical process, or both.
Processing and imaging of towed-streamer electromagnetic data with synthetic aperture method
Authors: Michael S. Zhdanov and Xiaolei Tu

Abstract
The synthetic aperture (SA) method is one of the key techniques in remote sensing using radio-frequency signals. Introduced in the early 1950s, Synthetic Aperture Radar (SAR) revolutionized reconnaissance surveying using radio and microwave signals. The key idea of SAR is to treat the entire flight path of an airborne or spaceborne platform as one huge synthetic antenna, and to process simultaneously all the signals collected along this path. As a result, the clarity and resolution of the images produced by SAR can be improved dramatically.
Post-migration inverse Q filtering to enhance amplitude-supported prospectivity evaluation
Abstract
Our primary investigative tool for understanding subsurface prospectivity is seismic data, in which energy reflected from different seismic horizons can reveal the physical properties of the subsurface rocks. The deeper into the earth we look, the weaker the reflected signal will be, so finding ways to boost this signal without altering its properties is critical to being able to analyse it. Furthermore, seismic interpretation requires a quantitative understanding of the seismic data, and it is common practice to assume that the seismic amplitudes can be understood in terms of either acoustic or elastic wave propagation. Preparing 'interpretation-ready' seismic data therefore faces two possibly conflicting challenges. On the one hand, seismic processing should conserve the amplitude fidelity of the data. On the other hand, compensation of viscous effects such as attenuation and dispersion facilitates data analysis.
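As background to this abstract: attenuation compensation is commonly expressed as a time-variant frequency-domain gain of the form exp(pi f t / Q), stabilized by a gain ceiling so that noise at high frequencies and late times is not amplified without bound. The sketch below is a generic, amplitude-only illustration with invented parameters, not the authors' method; the dispersion (phase) correction mentioned in the abstract is omitted for brevity.

```python
import numpy as np

def inverse_q_gain(trace, dt, q=100.0, gain_limit_db=40.0):
    """Minimal amplitude-only inverse Q compensation sketch.

    For each output time sample, applies the gain exp(pi * f * t / Q)
    in the frequency domain, clipped at gain_limit_db for stability.
    Brute-force (one inverse FFT per sample), purely for illustration.
    """
    n = len(trace)
    freqs = np.fft.rfftfreq(n, d=dt)     # positive frequencies (Hz)
    spectrum = np.fft.rfft(trace)
    max_gain = 10.0 ** (gain_limit_db / 20.0)
    out = np.zeros(n)
    for i in range(n):
        t = i * dt
        gain = np.minimum(np.exp(np.pi * freqs * t / q), max_gain)
        # Output sample i of the trace filtered with this time's gain.
        out[i] = np.fft.irfft(spectrum * gain, n=n)[i]
    return out
```

At t = 0 the gain is unity, so early samples are untouched; at later times higher frequencies are boosted progressively until the stabilization ceiling is reached, which is the amplitude-fidelity trade-off the abstract describes.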
An automated quantitative multi-stage approach to invert velocity models for microseismic event locations
Authors: Fernando Castellanos, Mike Preiksaitis, Ryan Nader, Vlad Shumila, Steve Falls, Dan Hook and Doug Angus

Abstract
Complexity in hydraulic fracturing programmes has motivated microseismic service providers to innovate and propose creative methods to monitor drilling, completion and field development. Although microseismic analysis and interpretation have moved beyond the 'dots-in-a-box' solution, velocity model (VM) calibration using inversion plays a critical role in the initial phase of accurate microseismic event (event) location, to ensure the accuracy of subsequent higher-order microseismic attributes given data quality and monitoring geometry. Knowing that business decisions are, at times, required in real time, it is imperative to provide confident event locations efficiently through the construction of well-constrained VMs based on quantitative and objective methodologies. The most time-consuming aspects of microseismic data processing are optimal VM construction and inversion. In this paper, we demonstrate improved efficiency in microseismic data processing by developing and implementing an automated approach to perforation shot (perf) detection and VM inversion using Particle Swarm Optimization (PSO). These primary tasks (perf detection and VM inversion) are critical in the event location workflow and can benefit significantly from increased efficiency. Although more advanced Green's functions can provide more accurate solutions to the source location problem (e.g., Angus et al., 2014), we focus on ray-based approaches due to their high computational efficiency, especially for anisotropic media and hydraulic fracture monitoring, where large volumes of microseismic data (commonly in excess of 100,000 events) must be processed.
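Particle Swarm Optimization, named in this abstract, maintains a population of candidate models that are pulled toward each particle's personal best and the swarm's global best at every iteration. The toy sketch below inverts an invented two-layer straight-ray problem for layer velocities from synthetic perf-shot travel times; it illustrates the swarm update only and is not the authors' workflow.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy forward model: straight-ray travel times from a perforation shot to
# three receivers, one path segment per layer of a two-layer model.
distances = np.array([1200.0, 1500.0, 1800.0])   # path lengths (m)
true_v = np.array([2800.0, 3400.0])              # "unknown" velocities (m/s)
observed = distances[:, None] / true_v           # synthetic perf travel times

def misfit(v):
    """Sum of squared travel-time residuals for candidate velocities v."""
    return np.sum((distances[:, None] / v - observed) ** 2)

# Minimal particle swarm: each particle is a candidate two-layer model.
n_particles, n_iter = 30, 200
pos = rng.uniform(2000.0, 4500.0, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_f = np.array([misfit(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 2))
    # Inertia + pull toward personal and global bests (w=0.7, c1=c2=1.5).
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 2000.0, 4500.0)     # keep models in bounds
    f = np.array([misfit(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()
```

In a real calibration the forward model would be ray tracing through an anisotropic layered medium, and the unknowns would include anisotropy parameters as well as velocities; the swarm update itself is unchanged, which is what makes PSO convenient to automate.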
Elastic imaging and its benefits — Permian Basin example
Authors: David Langton, Alex Biholar, Kenton Shaw, Steve Adams, Mike Bradshaw, Jeff Codd, Xiaoling Tan, Allon Bartana and David Kessler

Abstract
Seismic imaging has been advancing continuously since the early days of the computer revolution in the 1970s. Practical imaging at that time was carried out only in two dimensions, using simplified wave equations on poststack data. Subsequently, in the early 1980s, algorithmic improvements in wave-equation migration followed the introduction of one-way phase-shift methods and two-way reverse time migration. Concurrently, improvements in ray-based Kirchhoff migration emerged after the introduction of eikonal and wavefront-reconstruction solvers for the calculation of travel times. In the late 1980s, 3D prestack Kirchhoff migration came into use.