First Break - Volume 24, Issue 1, 2006
Turnaround in UK’s oil and gas industry fortunes needs careful nurturing, says report
The latest Economic Update from the UK Offshore Operators Association, entitled ‘Taking the long term view’, argues that the recovery in investment and E&P activity on the UK Continental Shelf will require careful management if it is to be sustained against global competition. We publish here a slightly edited version of the report, which was published at the end of November 2005. The UK’s offshore oil and gas industry is taking the opportunity offered by the success of co-operative initiatives instigated by PILOT (the industry/government forum chaired by the UK Energy Minister) and by higher oil prices to raise investment levels. This investment can be seen in all areas of activity and will slow the rate of production decline, maximize recovery of existing reserves, increase tax revenues for the nation, and give the best chance of finding and producing the undiscovered reserves that remain. The recovery following the 2002 tax shock is almost complete and has been encouraged by several years of good government/industry co-operation, higher prices, and the anticipation of fiscal stability. However, competition for investment capital, equipment, and skills from other oil and gas provinces around the world is growing as investment opportunities expand in lower cost areas with bigger prospects.
PGS shows off electrical marine vibrator to capture ‘alternative’ seismic source market
By R. Tenghamn

One of the prize exhibits at the recent SEG annual meeting in Houston was the large, bright yellow, elliptical-shaped object standing on the Petroleum Geo-Services (PGS) booth. ‘The reaction was overwhelming,’ says Rune Tenghamn, vice president, marine technology. The oversize exhibit was in fact the first public showing of a prototype electro-mechanical marine vibrator for seismic surveys that can produce energy penetration and frequency bandwidth comparable to a small airgun or dynamite charge. Tenghamn says that PGS is responding to a perceived demand from industry for an alternative source of energy to meet very specific marine survey circumstances. Probably the most pressing of these involves environmentally sensitive areas, where the disturbance of marine mammals is an issue and the use of airguns is deemed inappropriate. The other area of potential interest for the marine vibrator is as a seabed source for permanent monitoring of reservoirs using seismic methods. ‘To date our main concern has been to demonstrate that the concept works and let everyone know,’ Tenghamn says. He began taking an interest in alternative seismic sources before he joined PGS, and in 1994 presented a paper at the EAGE Annual Conference in Paris on an electrical-type source. At PGS, ideas formulated in the mid 1990s were realized in 1999 with a series of field trials of a marine vibrator, followed by testing with a major oil company which ended in 2002. ‘Since then we haven’t really done very much with the technology, but believe that the time is now ripe to go public. We know there’s a lot of discussion about alternative sources and we hope that this will translate into a collaboration to turn our prototype into a commercial product. There are clear cost and operational advantages over any other option.’
E&P information management: beyond Web portals
Jamie Cruise and Ugur Algan of Volantice review the effectiveness of current information management (IM) systems for the E&P sector and predict changes which will come from a focus on business process modelling and from software advances outside the E&P sector.

As knowledge workers, we recognize that the outcome of a project will be affected by how effectively we are able to gather relevant data and feed these data into the variety of computer applications that underpin our work. However, most of us would rather focus our efforts on the primary goals of the projects we work on, which depend on our analysis and design skills, rather than on the mundane business of information management. Unfortunately, the history of the E&P business shows that the IM burden is increasing rather than decreasing, for information managers as well as end users, even though a number of new productivity tools have been introduced over the past decade. This situation adversely affects the effectiveness of E&P teams and the quality of their decisions. A number of factors contribute to this increase in the IM burden:
■ An increase in the volume and diversity of available information: real-time drilling operations and production monitoring data; smart wells; 4D and 4C seismic; pre-stack seismic; multi-scenario projects with stochastic models, simulation, and optimization.
■ An increasing gap between existing IM standards and best practices, and the IM needs of newly introduced, widely adopted, innovative tools and applications.
■ A tighter regulatory environment leading to an increased need to audit our processes for compliance (e.g. Sarbanes-Oxley, Basel II, IFRS). We must be able to identify the provenance of every piece of knowledge, information, and data that supports our key decisions.
■ Continued consolidation of operating companies, resulting in major challenges in resolving differences in technical platforms, skills, culture, and IM practices across previously disparate groups.
■ A change in demographics, leading to a shortage of skilled and experienced staff and increased pressure to preserve and disseminate corporate knowledge.
In this article, we explore opportunities for innovations that may solve the dual problems of delivering information to our geoscientists, engineers, and managers, and of capturing the results of their work in a way that allows easy re-use in other contexts. We discuss the use of business process modelling (BPM), which we believe will be an integral part of these future solutions. BPM drives the flow of information through the enterprise according to the needs of the end-users, rather than forcing the users to adapt their processes to fit convoluted IM systems. Furthermore, we believe that these innovations will be spurred on by advances in software developed and deployed by the broader (non-E&P) business community. The emergence of freely available, enterprise-class infrastructure and end-user components (such as Linux and other open source initiatives) should signal a further move away from proprietary technology for E&P systems. Of particular interest is the widespread acceptance of service oriented architecture (SOA) as the dominant approach for constructing dynamic, process-oriented systems that combine data and functionality from a variety of widely distributed sources.
Why data management is more important than ever
Authors B. Bouffard and L. Bayne

Brice Bouffard, vice president, information management, and Lester Bayne, product director, information management, both of Schlumberger Information Solutions (SIS), explain with examples from real E&P projects how a holistic approach to information management adds value and why, now more than ever, this approach is vital to governments, operators, and, indeed, all players.

Information management (IM) has traditionally been viewed as part of the information technology (IT) cost centre. But this view is changing, both in business in general and in the E&P industry, as organizations recognize data as a critical asset that, when properly managed and directed at specific business challenges, provides significant competitive advantage. Webb (2005), also of SIS, recently discussed the importance of a holistic approach to managing E&P data. Previously in this publication, Bouffard (2004) focused specifically on the value of transforming national data repositories (NDR), passive receptacles of a country’s E&P data, into national data centres (NDC), systems with enhanced information management including public access to data and applications. This evolution from NDR to NDC turns the repository into an essential asset for marketing opportunities to potential investors and a valuable data source for E&P professionals working within a country (Gmyzin and Bouffard, 2004). This article looks at the predictions previously made, the progress realized, and the value added in approaching IM holistically, including with respect to NDC implementations.

Reducing IT costs vs. reducing total operating cost
With oil hovering around $60 per barrel and operators reporting record profits, it might be difficult for an industry outsider to see cost as a reason not to invest in IM. But those of us inside the industry understand the cyclical nature of our business (it was only seven years ago that oil was languishing around $10 per barrel) and recognize that cost reduction remains a persistent goal. However, instead of targeting individual costs, it is the authors’ position that the goal should be to reduce total operating cost: the cost to find and produce a barrel of oil or, stated in today’s terms, the barrels found or produced per dollar invested, or per available resource. It is in these terms that the value of IM must be measured. And given the unique challenges the E&P industry is now facing, the case for a holistic approach to IM is compelling.
Adapting to the new world of utility computing
Authors E. V. Hawes and N. Weston

Edward V. Hawes, CEO, vCompute, and Nick Weston, director of oil and gas, Sun Microsystems, explain the concept of utility computing and the opportunities it presents for companies of every size in the E&P business.

‘Utility computing’, or the delivery of compute power as a service, is now a reality. Although many have questioned the feasibility – and even the value – of such services, vendors today are rolling out utility computing offerings that are changing the IT landscape for companies worldwide. Here we explore the benefits of utility computing, not only to the individual customer but also to the industry as a whole. We also outline the questions that a customer should ask when considering a utility computing model, as well as the steps that a customer should take once a decision has been made to purchase utility computing power.

Levelling the playing field
We are entering a new age of computing that will reshape the IT world more than any previous wave of advancement in computing technology. Think for a moment about the implications of massive amounts of computing resources that can be easily and cost-effectively accessed by anyone or any company that needs them. One major result is that smaller organizations, in service markets that need computing resources to compete, can now bid effectively for work against the biggest companies. With the IT resources of a much larger company available to them only when needed, even the smallest organization can have worldwide reach. One of the key benefits to the industry of this utility computing model is the level playing field it creates, fostering the growth of ideas and technology from companies large and small. In the seismic data processing world, for example, smaller geoscience companies that have limited compute resources can now take on larger contracts to process and interpret geophysical data by supplementing their compute environments with utility compute power. Another example is in the area of reservoir simulation: with easy access to a very large compute cluster, smaller companies will be able to perform more detailed and precise reservoir simulations that were previously available only to companies that owned and maintained large internal compute farms. The ability to quickly access huge amounts of computing resources on a short-term basis is invaluable to many markets. Any organization that utilizes high performance computing can benefit from immediate access to additional resources, whether its specific focus is seismic processing or reservoir simulation.
E&P data management in the real world
By D. Sullivan

Without an integrated, real-time E&P database a company will no longer be able to manage risk effectively, maximise shareholder value, or address a number of key corporate obligations. That is the view of David Sullivan, managing director of Tigress, the UK data management specialist which introduced one of the original integrated E&P databases some 15 years ago. This article is an edited version of a paper presented last month at the Petroleum Exploration Society of Great Britain Data Management Conference.

Very few people are against integration as a concept. At Tigress we support the concept with a passion; it is the whole basis for our business. However, most E&P professionals currently support integration the same way they support world peace, sustainable development, the abolition of poverty, and the achievement of social justice. By this I mean there are no obvious reasons to be against any of these things, and one can be supportive of the concept without putting in too much personal commitment. We believe that all this is about to change. In the industry we have always been acutely aware of how the realities of world and industry politics affect our business. Few would argue with propositions such as:
■ There is a relationship between the industry’s ability to satisfy current energy demand and the price of oil and gas.
■ There is a relationship between a company’s reserves of hydrocarbons (proven and potential) and its market capitalization.
■ Political instability in a company’s areas of operation is very likely to affect the share price.
■ Despite modern technology, there are no cast-iron guarantees about how much oil and gas a company owns or how much it will be able to produce.
■ Risk and uncertainty abound. They can only be mitigated by information and knowledge.
■ The digital information revolution is used by a variety of stakeholders, regulators, and investors. Information in real time is fast becoming the norm.
Finally, and most important, an oil company needs to present its case, support its argument, and justify its conclusions if it is to raise its game. Increasingly this can only be achieved by an integrated approach, and a real-time integrated E&P database is at the very heart of such a system. Surely it was always like this? In some ways that is true. But while the traditional technical and administrative approaches to oilfield data management are still important (indeed more important than ever), a whole new strategy is needed to meet the challenges of the modern world. In the early days of oil exploration, E&P data management was a specialist area of principal interest to industry professionals and of little interest beyond the industry. This is no longer the case. World demand for oil is closing the gap on the industry’s ability to keep pace. More importantly, the world has started to recognize this, and as a result interest in E&P data has widened far beyond the E&P department. The investor relations and PR departments now need to supply analysts, journalists, shareholders, and the world at large. These new users will not wait: the information must be accurate and it must be immediate. Nor do markets take the information provided by oil companies at face value; it must be supported and, above all, be consistent. Mistakes can be very costly. Modern markets react in seconds rather than at the end of an accounting quarter or financial year.
Towards a new revision of the SEG-D format standard
By B. Firth

Bob Firth, Troika International, explains the issues involved in updating the industry standard format for seismic field data and reports on progress so far. For several years there has been talk of the need for a new revision of the SEG-D standard for seismic field data. The last revision of the standard, Rev 2, was published in 1996, and in the intervening decade there have been significant developments in acquisition and computer technologies; a new revision would bring the standard into line with current industry techniques and practices. At the SEG convention in Denver in 2004, the SEG Technical Standards Committee decided to revive the SEG-D Format Subcommittee, with Jill Lewis of Troika International volunteering to chair the subcommittee and provide the necessary energy to drive it forward. Since then there has been considerable behind-the-scenes activity to ascertain what should be contained in the new revision, culminating in meetings of interested parties at the EAGE in Madrid and at the SEG in Houston in November.
Information life-cycle management in the upstream oil and gas industry
By R. Nicholson

In this white paper, Rick Nicholson, vice president of the IDC consultancy Energy Insights, highlights the importance of information life-cycle management in optimizing the use and value of E&P data.

The value of exploration and production (E&P) assets varies over their life cycle. At the beginning of the life cycle, during exploration, the assets have a relatively low actual value but a high potential value. During field development, the value increases as hydrocarbon extraction begins. Asset value peaks during production and declines as the assets reach maturity. Exploration and production is a highly information-intensive process, and like the E&P assets themselves, the information has a distinct life cycle, with the value of the information varying throughout the life of the associated assets. During exploration, a wide variety of surface and subsurface data is collected to guide the evaluation of reservoirs and the selection of drilling locations. Information value increases throughout the exploration phase, in advance of the increase in asset value. In the field development phase, well data is collected and the value of exploration data peaks as production begins. During production, a new set of operations and maintenance data is collected, which, in turn, varies in value along with the life cycle of the assets. If, when a reservoir is in decline, the assets are sold to another company, the value of information once again rises as the new owner evaluates the assets. Current upstream information management practices are typically not aligned with the needs of the business. E&P data is typically managed using a two-tier storage architecture, with active data stored on high-performance, high-availability disks and inactive data archived to tape. Because of the mismatch between this storage architecture and the changing value of information throughout the life of E&P assets, current practices do not enable oil and gas companies to get the maximum value from their information at the lowest total cost at every point in the information life cycle. Information life-cycle management (ILM) is a data storage methodology and architecture that aligns IT infrastructure with business needs, based on the changing value of information over time, at the lowest possible total cost of ownership (TCO). More specifically, it is a process by which information is moved between different tiers of storage media, based on the content of the data, to ensure that the service levels required by the business are met at the lowest TCO. ILM also progressively automates the storage management process over time, minimizing the risk of human error or interference and optimizing the movement of data between the storage tiers. Benefits of ILM can include:
■ Organizational agility (e.g. finding the right data faster and reducing the impact of ‘blindside’ events)
■ Reduced risk (e.g. regulatory compliance, business continuity, and security)
■ Lower storage costs (e.g. better IT asset utilization and lower cost per unit stored)
For example, one large independent upstream oil and gas company has estimated that its technical staff spends up to 80% of its time searching for and manipulating data. Reducing that figure to 50% and multiplying the time saved by the number of engineers and geoscientists on staff would equate to a saving of over 100,000 staff hours per year.
Furthermore, the company predicts that if, for every 100 technical staff members employed, the time spent searching for and manipulating E&P data can be reduced by just 10%, it will realize an effective staffing increase of five to eight people.
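As a rough illustration of how figures of this order arise, here is a minimal back-of-envelope sketch. The staff count, annual working hours, and the share of time spent on data handling are assumptions made for the example; the paper quotes only the percentages and the resulting totals.

```python
# Illustrative back-of-envelope calculation of the ILM time savings quoted
# in the text. Staff count and annual hours are ASSUMED for this sketch;
# the source gives only the percentages and the order-of-magnitude results.

TECHNICAL_STAFF = 200       # assumed number of engineers and geoscientists
HOURS_PER_YEAR = 1800       # assumed productive hours per person per year

before = 0.80               # up to 80% of time searching for/manipulating data
after = 0.50                # target share after improved information management

saved_per_person = (before - after) * HOURS_PER_YEAR   # 540 h/person/year
total_saved = saved_per_person * TECHNICAL_STAFF       # 108,000 h/year
print(f"saved per person: {saved_per_person:.0f} h/year")
print(f"total saved:      {total_saved:,.0f} h/year")  # over 100,000, as quoted

# Second figure: a 10% relative cut in data-handling time per 100 staff.
data_share = 0.65           # assumed share of time on data (midpoint of 50-80%)
fte_gain = 100 * data_share * 0.10                     # ~6.5 FTE
print(f"effective staffing gain per 100 staff: {fte_gain:.1f} FTE")
```

With these assumed inputs the sketch lands at 108,000 hours per year and roughly 6.5 extra full-time equivalents per 100 staff, consistent with the figures the company cites.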
High-resistivity anomalies at Modgunn arch in the Norwegian Sea
Authors A. H. Bhuiyan, T. A. Wicklund and S. E. Johansen

Typical seismic data provide information about subsurface stratigraphy and structure. Formation characteristics, such as lithology and fluid content, can also be predicted from seismic data, and well-log data can verify seismically extracted formation characteristics. However, drilling is relatively expensive and the success rate of commercially viable exploration wells based on seismic data alone is only about 10–30% (Johansen et al., 2005). Additional remote sensing methods for the detection of subsurface formation properties (e.g. resistivity) can therefore be used to reduce the uncertainties associated with drilling. The recently developed SeaBed Logging (SBL) method shows very promising potential for the detection of deeply buried high-resistivity layers (Eidesmo et al., 2002). Resistivity contrasts in the subsurface strata make SBL a potential tool for the detection of high-resistivity hydrocarbon reservoirs or other high-resistivity lithologies, such as salt domes, volcanic rocks, or igneous sills. The first full-scale SBL calibration survey was conducted offshore Angola in 2000 (Ellingsrud et al., 2002), opening a new frontier in hydrocarbon exploration. Subsequently, several surveys were performed over known hydrocarbon fields offshore Norway; SBL calibration surveys from Ormen Lange and the Troll Western gas province have been presented by Røsten et al. (2003) and Johansen et al. (2005), respectively. In this article, we present SBL data acquired across the Modgunn arch, which is located in the Norwegian Sea. The SBL data interpretation aims at finding the resistivity distribution within the seismically interpreted subsurface strata. The Modgunn arch is characterized by strong seismic anomalies, which may partially correspond to high-resistivity anomalies, and the SBL data of this area in parts show strata with high resistivity. SBL data analysis can predict the presence of high-resistivity layers and rocks, but, owing to low resolution, it is difficult to determine the exact geometry of the resistivity structure from the SBL data alone. To establish a quantitative relationship between the seismic anomalies and the resistivity distribution within the strata, SBL and seismic data interpretation play complementary roles: the integrated approach provides a realistic subsurface resistivity distribution with fewer uncertainties. An interpretation study based on electric field magnitudes taken from the same data set has been presented by Bhuiyan et al. (2005).
Seismic interpretation using seismic trace frequency attribute properties - a preliminary study
Authors C. Gengyi, X. Zhu, W. Wenbo, Y. Qinfan and L. Chaoying

Time-frequency (T-F) analysis of the seismic reflection character of thin formations (Widess, 1973; Mahradi, 1983; Yang, 1988; Yin et al., 2003) is increasingly being applied to high-resolution seismic processing and interpretation, especially for sequence interpretation of complex continental facies. This includes the identification and partitioning of multi-level sequences, micro-facies characterization of sedimentary layers and the cycle structure of thin sand-shale interbeds, and thickness prediction of thin reservoirs (Li, 1987; Jin, 1988; Mushen et al., 2000; Ling Yun, 2004). Using different mathematical transforms, e.g. the Fourier transform, wavelet transform, Wigner-Ville transform, and Gabor transform (Cohen, 1989; Williams et al., 1989; Boashash, 1991; Bahorich, 2002), time-frequency analysis allows us to decompose and depict in detail the local spectra of the input signals, and so to build an overall picture of their micro-structure from the relatively high resolution in the frequency domain. This paper uses the Fourier and inverse Fourier transforms (full frequency band, local frequency decomposition, and superposition) to conduct the time-frequency analysis (Mushen, 1992; Du, 1998; Wang, 2000), using a bi-octave triangular broad-band recursive (exponential increase) filter (Fig. 1) to bring out the tuning effects of thin-bed thickness and lithology (Widess, 1973; Mushen, 1992; Du, 1998; Wang, 2000). The main characteristic of the filter is that it can highlight the dominant frequencies of different frequency bands without missing gradual changes in the frequency attributes (Fig. 2). In a study of the frequency characteristics of seismic data from a 3D survey, using the filtering available in our time-frequency analysis software package, we find that there is a clear relationship between the multi-level ramifying structure of the frequency-time plot and the strata attributes and sedimentary cycles. This makes it feasible to study sequence stratigraphy and reservoirs directly from the frequency attributes of the seismic data, and also offers a new alternative for high-precision geological interpretation of seismic data.
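The bi-octave recursive filter described above belongs to the authors' own software, but the underlying idea of decomposing a trace into local spectra and reading off a frequency attribute can be sketched with a standard short-time Fourier transform. Everything in the sketch below (the synthetic trace, sample interval, and window parameters) is an assumption for illustration, not the authors' method:

```python
# Minimal sketch: time-frequency decomposition of a synthetic seismic trace
# with a short-time Fourier transform (scipy.signal.spectrogram). This shows
# the general T-F idea only, NOT the authors' bi-octave recursive filter.
import numpy as np
from scipy.signal import spectrogram

dt = 0.002                              # assumed 2 ms sample interval
t = np.arange(0.0, 2.0, dt)             # 2 s synthetic trace

# Two events with different frequency content, mimicking a shallow thin-bed
# response (60 Hz) and a deeper, lower-frequency reflector (15 Hz).
trace = (np.sin(2 * np.pi * 60 * t) * np.exp(-((t - 0.5) / 0.1) ** 2)
         + np.sin(2 * np.pi * 15 * t) * np.exp(-((t - 1.5) / 0.2) ** 2))

# Local spectra from 128 ms windows with 75% overlap (assumed parameters).
f, tau, Sxx = spectrogram(trace, fs=1.0 / dt, nperseg=64, noverlap=48)

# A simple frequency attribute: the dominant frequency in each window.
dominant = f[np.argmax(Sxx, axis=0)]
for w_t, w_f in zip(tau[::8], dominant[::8]):
    print(f"t = {w_t:.2f} s   dominant frequency ~ {w_f:.0f} Hz")
```

On this synthetic trace the dominant frequency shifts from about 60 Hz near 0.5 s to about 15 Hz near 1.5 s, which is the kind of time-varying frequency attribute the article relates to strata properties and sedimentary cycles.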