First Break - Volume 22, Issue 1, 2004
Why the future lies in national data centres
Authors: S.V. Gmyzin and B. Bouffard

Many countries around the world own E&P data whose value has not, up to now, been fully realized. It is an issue that Schlumberger has been addressing with its concept of national data centres. Sergei V. Gmyzin, head of natural resources regulation and oil and gas complex development, Administration of Yamal-Nenetz Autonomous District, Russia, and Brice Bouffard, manager information management, Schlumberger Information Solutions, explain the background.

National data repositories (NDRs) of E&P data are to be found in some form or other in most countries where there are, or have been, significant oil and gas operations. Schlumberger believes that many of these NDRs could be the source of substantial untapped value if the data they hold were fully exploited and made available to E&P companies. Typically, NDRs are simply storage places for data: effectively the vault where E&P data is gathered and archived. They are usually operated by an official agency representing some level of government (national, state, provincial or regional) which is seen principally as custodian of the E&P data assets. Such agencies need not, of course, be purely passive custodians; their role may include:

- to manage and preserve all data generated from oil and gas E&P activities in the geopolitical region;
- to report to government regulatory agencies, fulfilling the industry's legal obligation to track and report activities, and to archive reserves information;
- to assist in promoting investment in the country's oil and gas industry by making data of known quality accessible to prospective new investors;
- to provide a central repository of knowledge for the geotechnical community at large, including oil companies, government and universities;
- to manage technical publications on the country's petroleum resources.
Updating seismic data management for the 4C/4D world
Jill Lewis and Robert Firth of Troika International say it is time to update exchange formats to meet the challenge of 4C/4D surveys, highlighted by a huge explosion in data volumes.

Information management and data exchange have always been requirements in the world of hydrocarbon exploration. As an industry we have a completely different life cycle of data value from other industries. Record management experts often quote that 95% of data loses its value in the first 18 months, and only the remaining 5% is ever looked at again. This is certainly not the case with exploration data, especially in those parts of the world where values may change dramatically due to geopolitical or environmental issues. To support this extraordinary longevity, it is imperative that we are able to exchange information and have some commonality between data models and exchange formats.

In the early days of exploration we had a very complex mixture of formats that were driven mainly by the architecture of the hardware. In most cases the format of digital field data was dictated by the acquisition system being used, and every processing contractor used its own proprietary format for processed data. This Tower of Babel situation was addressed by the standardisation efforts of the SEG, resulting in the SEG A/B/C formats for field data and SEGY for processed data. Although by this time general-purpose computer systems were being used for seismic data processing, the gapless record structure used by the field formats meant that specialist tape-controller hardware was required to read the data. This was a trade-off between the requirements of the acquisition and processing communities; these issues were subsequently addressed by the SEGD format. SEGY, on the other hand, has proven extremely successful as an exchange format, although its limitations became increasingly apparent with the advent of 3D data, resulting in the recent revision of the SEGY standard.
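What makes SEGY workable as an exchange format is its fixed-layout headers. As a minimal sketch of how a few key fields are typically inspected (the byte positions follow the published SEG-Y rev 1 layout; the file name is hypothetical):

```python
# Minimal SEG-Y header inspection sketch. Byte positions follow the
# published SEG-Y rev 1 layout; 'survey.sgy' is a hypothetical file.
import struct

with open("survey.sgy", "rb") as f:
    text_header = f.read(3200)    # 3200-byte EBCDIC textual header
    binary_header = f.read(400)   # 400-byte binary file header

# The textual header is 40 "card images" of 80 characters, EBCDIC-encoded.
print(text_header.decode("cp500")[:80])

# Selected binary-header fields, big-endian 16-bit, at fixed offsets:
sample_interval_us = struct.unpack(">H", binary_header[16:18])[0]  # bytes 3217-3218
samples_per_trace = struct.unpack(">H", binary_header[20:22])[0]   # bytes 3221-3222
format_code = struct.unpack(">H", binary_header[24:26])[0]         # bytes 3225-3226

print(f"{sample_interval_us} us sampling, {samples_per_trace} samples/trace, "
      f"data format code {format_code}")
```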
Herding cats - the challenge of data and application integration
By C. Harter

Clay Harter, chief technology officer, OpenSpirit Corporation, explains how the 'middleware' solution to application integration has taken shape to manage data across multiple data repositories.

The upstream oil industry is highly data-intensive, and oil companies face a variety of challenges in their quest to effectively manage data across multiple data repositories. Focusing just on the geotechnical realm, companies have many projects containing data of various types (engineering, geologic, and geophysical), and a host of issues arises from employing the different applications that need to utilize this data.

Let's consider the case of Jack, a hypothetical geologist in an oil company who needs to run a reservoir characterization program on his desktop PC to build a reservoir model covering several offshore lease blocks. The program needs to integrate diverse data types, including seismic data and its interpretation, well information, logs, and formation tops. Jack doesn't know exactly where all the data he needs is located, so he asks his IT specialist, Jill, to help him locate and load the required data. Jill will have to determine which projects have data covering the lease blocks of interest, export data from these projects into appropriate exchange formats (e.g., SEGY for the seismic data, LAS for the well logs, and ASCII for the horizons and faults), then FTP the exported data to the PC, where Jill, perhaps with assistance from another IT specialist, has to run the data loaders for Jack's application. Additionally, Jill needs to make sure that all the data is in the same coordinate system and units before loading it. Once all these tasks are complete, Jack is ready to roll. Elapsed time: two weeks…minimum.

Jack's workflow, while not out of the ordinary, illustrates many of the software integration problems that companies have faced ever since this industry first started doing computer-based analysis and interpretation. As a community, we have tried various approaches to address these problems, with limited success. Only recently has a solution appeared that allows effective management of diverse data in multiple repositories and, in addition, offers end users a window to innovation by allowing flexible multi-vendor application interoperability.
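The middleware idea can be pictured as a thin common data model over heterogeneous project stores, with one adapter per vendor repository. The sketch below is purely illustrative: every class and method name is invented for this example and does not represent the actual OpenSpirit API.

```python
# Illustrative middleware-style accessor. Every name here is hypothetical
# and invented for this sketch; it is not the OpenSpirit API. The idea:
# each adapter maps one vendor project store onto a shared data model,
# so applications query one interface instead of running N data loaders.
from dataclasses import dataclass
from typing import Iterable, Protocol


@dataclass
class Well:
    name: str
    x: float              # easting, metres, in an agreed common CRS
    y: float              # northing, metres
    kb_elevation: float   # kelly bushing elevation, metres


class RepositoryAdapter(Protocol):
    """One adapter per vendor store (project database, flat files, etc.)."""
    def wells_in_area(self, min_x: float, min_y: float,
                      max_x: float, max_y: float) -> Iterable[Well]: ...


def gather_wells(adapters: Iterable[RepositoryAdapter],
                 area: tuple[float, float, float, float]) -> list[Well]:
    """Query every registered store once, already in common units and CRS."""
    found: list[Well] = []
    for adapter in adapters:
        found.extend(adapter.wells_in_area(*area))
    return found
```

With adapters doing the format, unit, and coordinate-system normalization, the two-week export-convert-FTP-load cycle in Jack's workflow collapses to a single query against the common interface.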
Supplier integration: driving costs out of the energy industry supply chain
By R. Munro

Canadian company Digital Oilfield has been developing e-technology aimed at reducing oil companies' expenditure with suppliers through automation of invoicing and contract-compliance processes. Rod Munro, president and CEO, explains the background and the potential for savings estimated at anything between 5% and 10%.

The global oil and gas industry spends nearly a trillion dollars per year in upstream and downstream operations. Although Digital Oilfield anticipated that there was significant value in supplier integration for both upstream and downstream, the focus of its analysis centred on E&P expenditure because this portion of the industry is heavily outsourced to suppliers. More than $400 billion is spent annually in E&P operations around the world. Driven by continued high commodity prices and global uncertainty, these expenditures are forecast to keep increasing over the next five years. E&P expenditures come from more than 500 different companies operating in the US, Canada and internationally. Forecast expenditures by region are shown in the chart.

Digital reviewed public company financial information to derive a 'typical' E&P company profile. While a 'typical' E&P company doesn't exist, the following graph provides a reasonably average profile of E&P company expenditures, normalized to $100 million in total annual spend. Capital acquisitions of major producing properties or peer companies, as a percentage of total expenditure, vary widely from year to year. Digital defines the 'supplier-based spend' as capital, operating and development expenditures made with suppliers to find, develop and operate producing oil and natural gas reserves. It is this supplier-centric spend that is the target of a supplier integration initiative. The graphic below shows that even in an E&P company with reasonable annual acquisition costs, the supplier-based spend represents more than 50% of a typical company's total annual expenditures.
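As a back-of-envelope illustration of what that range implies (the profile figures are the article's; the arithmetic below is only a worked example):

```python
# Back-of-envelope check of the quoted 5-10% savings range, using the
# article's normalized profile: $100M total annual spend, of which
# more than 50% is supplier-based. The arithmetic is illustrative only.
total_spend = 100_000_000       # USD, normalized 'typical' E&P company
supplier_share = 0.50           # lower bound on supplier-based spend
supplier_spend = total_spend * supplier_share

for rate in (0.05, 0.10):       # the 5-10% savings range quoted above
    print(f"{rate:.0%} of supplier-based spend: "
          f"${supplier_spend * rate:,.0f} per year")
# 5% of supplier-based spend: $2,500,000 per year
# 10% of supplier-based spend: $5,000,000 per year
```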
Designing for interoperability using a multi-repository approach to E&P data
By N. Chart

Nicholas Chart, solution manager, data management solutions, Paradigm, describes the move away from the holy grail of integration towards the more manageable goal of interoperability.

In recent years we have seen various attempts to integrate software and databases in the E&P world. The result is a number of widely used heavyweight database systems and associated software packages which are more or less integrated with them. Early in the evolution of these systems, data management professionals were already saying that interoperability is more desirable than integration. Integration is perceived as enabling some packages to work together closely while locking out others, preventing users from choosing their preferred software tools; interoperability implies that software packages can be used together with maximum effectiveness in each package and minimum connectivity problems between them. While this facet of interoperability is undoubtedly a vital component of the modern E&P workflow, the real value of interoperability is that it enables oil and gas companies to protect their investment in data by providing flexible access to multiple repositories, for both new and existing data.
The stacking response: what happened to offset?
By I. Gausland

This paper is based on a presentation given by the author at the 2001 EAGE meeting in Amsterdam. At the end of his talk the author said: 'If my presentation has stimulated your thinking on what seismic data processing really does to the seismic data, then I have succeeded in my efforts. Some of you may disagree with me, and that is fine. Hopefully your disagreement will be constructive, and the net result will be an even better understanding of what stacking does to the data.' He continued by inviting others to present their views at future EAGE meetings, or in First Break. We hope prospective authors will accept the challenge, and we look forward to publishing any interesting comments and discussion on this very important topic.
Analogue (plaster) modelling and synthetic seismic representation of hanging-wall fault blocks
Authors: R. Lindanger, M. Øygaren, R. H. Gabrielsen, R. Mjelde, T. Randen and B. A. Tjøstheim

The complex structuring associated with ramp-flat-ramp extensional master faults, as well as the strain pattern in the hanging-wall fault block of such faults, has been analysed with the help of analogue Plaster of Paris models. This analysis shows that the shallow-dipping master fault commonly develops several fault branches, due either to asperity bifurcation or to hanging-wall and footwall splaying, and that the deeper parts of steeper early-generation faults are sometimes cut by younger faults. This generates a very complex fault pattern in the deeper part of the hanging-wall fault block.

Based on the analogue models, the possibility of generating synthetic reflection seismograms by simple modelling techniques (ray tracing and finite-difference models) has been investigated. It is concluded that the two methods produce significantly different results in terms of resolution and noise. The ray-tracing method resulted in a highly idealized image that can be viewed as the end product of an idealized processing sequence, in that no noise, multiples, or other artefacts are incorporated. Furthermore, all reflections are in the correct position and are displayed with the correct amplitude. Although the result can serve as a good reference for what simple seismic modelling can achieve, the weak aspect of this method is obviously the over-simplified and unrealistically 'clean' image it presents. A more realistic reflection seismic image is obtained when the synthetic seismogram is generated by the finite-difference method. However, the structural features stand out with less clarity, and it is less likely that the interpreter of such a section would be able to identify the two master fault planes or to distinguish the complex structural pattern in the deep part of the hanging-wall fault block.
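The contrast the authors draw comes down to what each method propagates: ray tracing tracks idealized reflection events, while a finite-difference scheme propagates the full wavefield, dispersion and artefacts included. As a hedged illustration (a minimal 1D acoustic scheme with assumed grid, velocities and wavelet, not the modelling code used in the paper):

```python
# Minimal 1D constant-density acoustic finite-difference sketch.
# Illustrative only: grid, velocities and wavelet are assumptions,
# not the modelling setup used in the paper.
import numpy as np

nx, nt = 400, 2000            # grid points, time steps
dx, dt = 5.0, 0.0005          # 5 m spacing, 0.5 ms time step
v = np.full(nx, 2000.0)       # 2000 m/s background velocity...
v[200:] = 2500.0              # ...with one interface at x = 1000 m

assert v.max() * dt / dx < 1.0  # Courant stability condition

p_prev = np.zeros(nx)
p_curr = np.zeros(nx)
src = 50                      # source/receiver grid index (x = 250 m)
trace = np.zeros(nt)          # recorded "seismogram"

for it in range(nt):
    lap = np.zeros(nx)
    lap[1:-1] = p_curr[2:] - 2.0 * p_curr[1:-1] + p_curr[:-2]
    p_next = 2.0 * p_curr - p_prev + (v * dt / dx) ** 2 * lap
    t = (it - 100) * dt       # 25 Hz Ricker wavelet, delayed 50 ms
    p_next[src] += (1.0 - 2.0 * (np.pi * 25.0 * t) ** 2) * \
                   np.exp(-(np.pi * 25.0 * t) ** 2)
    p_prev, p_curr = p_curr, p_next
    trace[it] = p_curr[src]

# 'trace' holds the direct arrival plus the interface reflection, along
# with numerical dispersion and (given these rigid boundaries) edge
# reflections: the kind of non-ideal artefacts a ray-traced image,
# which places only the computed events, would not show.
```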
Volumes & issues
- Volume 42 (2024)
- Volume 41 (2023)
- Volume 40 (2022)
- Volume 39 (2021)
- Volume 38 (2020)
- Volume 37 (2019)
- Volume 36 (2018)
- Volume 35 (2017)
- Volume 34 (2016)
- Volume 33 (2015)
- Volume 32 (2014)
- Volume 31 (2013)
- Volume 30 (2012)
- Volume 29 (2011)
- Volume 28 (2010)
- Volume 27 (2009)
- Volume 26 (2008)
- Volume 25 (2007)
- Volume 24 (2006)
- Volume 23 (2005)
- Volume 22 (2004)
- Volume 21 (2003)
- Volume 20 (2002)
- Volume 19 (2001)
- Volume 18 (2000)
- Volume 17 (1999)
- Volume 16 (1998)
- Volume 15 (1997)
- Volume 14 (1996)
- Volume 13 (1995)
- Volume 12 (1994)
- Volume 11 (1993)
- Volume 10 (1992)
- Volume 9 (1991)
- Volume 8 (1990)
- Volume 7 (1989)
- Volume 6 (1988)
- Volume 5 (1987)
- Volume 4 (1986)
- Volume 3 (1985)
- Volume 2 (1984)
- Volume 1 (1983)