First Break - Volume 24, Issue 7, 2006
Mineral exploration using modern data mining techniques
Authors: C.T. Barnett and P.M. Williams

Colin T. Barnett and Peter M. Williams discuss how new data mining concepts such as visualization and probabilistic modelling can provide the key to improved exploration success in the mining industry. The article is a slightly abridged version of a contribution to the latest special publication from the Society of Economic Geologists, which this year celebrates its 100th anniversary. Following an analysis of the recent performance of the gold industry, Schodde (2004) concludes that gold exploration is currently only a break-even proposition. In the last 20 years, the average cost of a new discovery has increased nearly fourfold, and the average size of a deposit has shrunk by 30%. The average rate of return for the industry has been 5–7%, which is of the same order as the cost of capital. Why should this be, and what can be done about it? Paterson (2003) observes that, historically, discoveries have taken place in waves, after the introduction of new methods or advances in the understanding of ore genesis. For instance, discovery rates jumped sharply between 1950 and 1975, following the development of new methods and instruments in exploration geophysics and geochemistry. In the last quarter century, there has been a comparable surge in digital electronics and computing that has resulted in a great increase in the quality and quantity of exploration data. Yet these developments, on their own, evidently were not sufficient to reverse the downward trend in the discovery rate during this period.

So where should we look for new methods to drive the next wave of discoveries? It seems we are now collecting data faster than we can absorb it. But this is also true in bioinformatics with genome sequencing, or in making sense of the other huge corpora of data available on the Internet. It is the thesis of this paper that the new methods are to be found in techniques currently being developed for extracting meaningful information from data. Specifically, we should look to recent developments in visualization and data mining. Many data mining techniques are inspired by analogy with human intelligence and suggest a new idea of computing. Conventional computing is restricted to tasks for which a human can find an algorithm. Living creatures, however, are programmed by experience, not by spelling out every step of a process. Data mining is therefore about discovering how machines, like humans, might learn from data.

Machine learning broadly distinguishes between supervised and unsupervised learning. Supervised learning, or learning from examples, requires sufficiently many labelled cases to be available. These form a set of known input-output pairs, usually called a training set, and the task is to learn the true input-output mapping from these examples. In the exploration case, the training set typically consists of a collection of known deposits and known barren regions. For unsupervised learning, we know only the inputs and not the corresponding outputs. The aim, then, is to search for ‘interesting’ features of the data, such as clusters or outliers, or for some latent structure that would account for how they were generated. In this paper only the case of supervised learning is considered, but see Williams (2002) for further discussion of both approaches. The paper begins with a review of recent advances in visualization and supervised learning techniques, such as neural network models. The use of these ideas is then demonstrated in a study of gold exploration in the Walker Lane in the western United States. Finally, it is shown how the results can be applied to a quantitative analysis of exploration risk, and how improved targeting accuracy can reduce exploration costs and increase the probability of success.
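The article itself contains no code, but the supervised-learning setup it describes (labelled deposit and barren cells used to train a model that then scores every cell of an exploration grid) can be sketched in a few lines of Python. The sketch below is purely illustrative and is not the authors' method; the feature layers, labels, and network size are all hypothetical.

```python
# Illustrative sketch (not the authors' code): supervised prospectivity mapping.
# Each grid cell is described by geophysical/geochemical features; cells over
# known deposits are labelled 1 and known barren cells 0. A small neural
# network is trained on the labelled cells and then assigns a deposit
# probability to every unlabelled cell. Feature layers here are random
# placeholders standing in for real data.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Hypothetical training data: rows = grid cells, columns = layers such as
# residual gravity, magnetics, As-in-soil, distance to mapped faults.
X_train = rng.normal(size=(200, 4))
y_train = rng.integers(0, 2, size=200)     # 1 = known deposit, 0 = known barren

model = make_pipeline(
    StandardScaler(),                      # put all layers on a comparable scale
    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)

# Score the whole exploration grid: probability that each cell hosts a deposit.
X_grid = rng.normal(size=(10_000, 4))
prob = model.predict_proba(X_grid)[:, 1]
print("highest-ranked cells:", np.argsort(prob)[-10:])
```

In practice the probabilities would be mapped back onto the grid and inspected alongside the original data layers, which is where the visualization techniques discussed in the paper come in.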
Mapping the subsurface for mineral exploration
Increasing integration of exploration software, including 3D data, with broad-based GIS systems promises to improve discovery success rates and decision making, according to Louis Racic and Tim Millis of Geosoft. The mapping of the Earth's subsurface is critical to understanding our capabilities and limitations with regard to mineral exploration, oil and gas resources, and environmental management. Until recently, however, visualizing beneath the Earth's surface within GIS has been a complex technical challenge. There is the data challenge: mining exploration companies work with huge quantities of geological, geochemical, and geophysical data, and expert handling and visualization of these multiple data sets and maps has typically required specialized, standalone software. And there is the efficiency challenge: exploration is a moving target, and geoscientists are increasingly called upon to edit and refresh subsurface information based on new data, and to combine datasets collected in a variety of formats, such as geophysics, geochemistry, and geology. In recent years, however, we have seen increasing integration of exploration software with broad-based GIS systems. New methodologies for viewing and interpreting subsurface data in the GIS environment have simplified the management of drillhole, borehole, and monitoring-well data sets and added new capabilities, allowing this data to be processed within GIS and presented in a meaningful and appropriate fashion. Today's GIS subsurface visualization tools allow geoscientists to present data normally visualized on a map, or on a set of separate maps, in an integrated fashion, as well as to handle and process 'surface' data. Added to that is the ability to display drillhole, borehole, and other subsurface data in 2D and 3D. Users can manipulate huge volumes of surface and subsurface geochemistry, geophysics, and geology data in 3D within a single or transparently linked interactive environment. They can augment the drilling results or the environment where the data was collected, and they can plot geochemical surface data and query geological mapping regions. Armed with these tools, geologists can integrate all available data at every stage of an exploration program to gain a better understanding of the underlying subsurface geology, verify their assumptions, and share their ideas with others on their exploration teams.
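As a generic illustration of the kind of 3D subsurface display described above (not Geosoft's software or API), the following Python sketch plots a small, hypothetical table of desurveyed drillhole samples in 3D, coloured by assay value. Column names and values are invented for the example.

```python
# Generic illustration: viewing drillhole assay data in 3D. Assumes a flat
# table of down-hole samples already desurveyed to x, y, z coordinates;
# column names and values are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

samples = pd.DataFrame({
    "x": [1000, 1000, 1000, 1050, 1050],   # easting of each sample (m)
    "y": [2000, 2000, 2000, 2025, 2025],   # northing of each sample (m)
    "z": [350, 340, 330, 345, 335],        # elevation of each sample (m)
    "au_ppm": [0.1, 1.2, 3.4, 0.2, 0.9],   # assay value used for colouring
})

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
sc = ax.scatter(samples.x, samples.y, samples.z, c=samples.au_ppm, cmap="viridis")
fig.colorbar(sc, label="Au (ppm)")
ax.set_xlabel("Easting")
ax.set_ylabel("Northing")
ax.set_zlabel("Elevation")
plt.show()
```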
New technology approach needed for mining industry
Authors: R. Gordon

Rob Gordon of Quantec Geoscience, Toronto, Canada, suggests that improved mining exploration results will only come with acceptance of new geoscience and other technology. Fundamental methodologies for mining exploration are finally benefiting from the precedent of the oil industry. The technologies used in 3D distributed seismic systems have now been applied to electrical earth imaging. Recent advances in digital signal processing and faster computers, coupled with the ability to collect very high resolution and deep geophysical data, mean that physical property contrasts can now be discriminated from the surface with accuracy and depth penetration not previously seen. This provides new opportunities to further geoscientific investigation in mining environments at greater depths prior to drilling. Maximum value and more realizable returns are demanded from exploration expenditures today. Now the mining industry can systematically interrogate the ground in the search for orebodies while essentially sterilizing unfavourable ground in the process. In addition, as the oil sector continues to seek improvements in technology to increase exploration success rates, it has begun to experiment with electrical methods again. This technology should prove quite interesting to forward-looking groups in both sectors. Making a discovery is difficult, and arguably more difficult today because undiscovered deposits are more likely to be found at greater depths. The financial risk of deep drilling is hindering deep exploration. However, the mining industry has traditionally been slow to embrace new technology, particularly if it is not easily understood or when the cost paradigm is out of sync with traditional spending habits regarding drilling versus other technologies. Drill targeting has to be more focused, thereby providing better returns per metre drilled. In essence, high-potential ground may be under-explored. Economists have often said that a critical failure in exploration is inefficiency, even when targeting highly prospective regions. Today, imaging to depths of over 1500 m can assist with deeper exploration within favourable land packages. Moreover, technology can now provide a means to revitalize exploration in mature mining camps. A 'bottom-up' rather than 'top-down' exploration process begins to address economic concerns about drilling risk and discovery rates.
Alliance approach to geoscience training
Authors: D. Ireson and N. Harbury

Dick Ireson and Neil Harbury of Nautilus outline a new approach to geoscience training that has developed and matured over eight years. The oil price is high and likely to remain so, exploration is no longer a forbidden word, and improving recovery factors using new technology is a key driver, so the need for staff trained to support the burgeoning workload is increasing. Unfortunately, the demographics of the current workforce make worrying reading: a high percentage of staff will be eligible for retirement within the next 5-10 years. Figure 1 shows the range of career experience of geoscientists attending the training programme discussed in this article. The histogram shows the proportion of people from Europe and North America attending courses relative to their length of service over a five-year period from 2001 to 2005. Two peaks are clear, one correlating with new hires and the other with very experienced people. Similar figures for North America over the same period are shown in the background, with less of a peak at the low end and a very pronounced peak in the 20 to 30+ years of experience bracket. Until recently, particularly in North America, the information collected from the companies in this study suggested that the intake of new hires is nowhere near the actual and predicted exodus from the workforce. Suitable candidates attracted to, and appropriate for, our industry are a finite resource. In addition, the roles of current geoscience personnel and the techniques and methods they need to be acquainted with are changing rapidly. New hires need to be made effective as quickly as possible, and there is a need to train experienced staff in new techniques, new ideas, and new workflows.
Dealing with staff recruitment, retention, and training in the oil and gas business
Authors: D. O'Donnell

Deidre O'Donnell, managing director of Working Smart, discusses three research projects undertaken by her company investigating (1) what the E&P industry is looking for in terms of graduates, (2) university and student perceptions of, and relationships with, the E&P industry, and (3) trends in the use of consultants. Few personnel managers will be unaware of the looming personnel challenges due to the demographics of our industry, as illustrated by the age profile of SPE members. An estimated 50% of industry employees will become eligible to retire within the next 13 years, and this is probably an under-estimate if preferences for early retirement and/or late-career lifestyle changes are considered. A key reason for the shortage of young professionals is the historically cyclical nature of our business, which has resulted in several troughs in profitability, such as after the 1999 oil price collapse and slowdown in the global economy. Many would say that the problem has been exacerbated by knee-jerk reactions by companies, including freezing graduate recruitment programmes and mass lay-offs of experienced personnel, many of whom have become disillusioned and moved permanently to other industries. Most forecasts now indicate a sustained high oil price which, combined with several other economic and geopolitical factors, points to a buoyant and stable E&P market for the foreseeable future. Many oil and gas companies and service suppliers are struggling to adequately source and manage the workforce skills required for their current business, never mind the growth opportunities. To resolve this problem, we must address the different challenges across the complete spectrum of experience levels, including graduate recruitment, skills management, staff retention, and the optimal use of knowledge and experience.
Wireline tool to improve estimates of 3D borehole acoustic rock properties
Authors: V. Pistre and K. Schilling

Vivian Pistre and Keith Schilling of Schlumberger introduce the company's latest wireline technology, which is said to provide robust measurements of acoustic rock properties. When used with other formation measurements, the knowledge gained may help to reduce overall drilling and completion costs while improving well productivity and ultimate recovery. Since the tool was commercialized in November 2005, over 400 jobs have been run around the world. The new wireline acoustic tool has been developed with more transmitters (5) and receivers (104) than current-generation tools. Its high-quality waveforms and advanced processing techniques lead to improved estimates of compressional and shear slowness and their radial profiles, offer enhanced anisotropy detection and mechanism identification, and provide reliable through-casing slowness measurements. The tool's advanced inversion algorithms accurately estimate 3D rock properties, making use of the multiple transmitter and receiver spacings and wideband acoustic signals featured by the tool. Accurately determining relevant rock properties leads to timely decision-making in areas such as well placement, perforating, stimulation, production optimization, and sand control. Examples demonstrating the new tool's capabilities are shown from various wells around the world.
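The abstract does not describe the tool's inversion in detail, so the Python sketch below illustrates only the underlying principle of array sonic logging: slowness is the slope of first-arrival time against transmitter-receiver offset across the receiver array. The offsets, noise level, and slowness value are hypothetical, and this is not Schlumberger's processing.

```python
# Toy illustration of the principle behind array sonic slowness estimation
# (not Schlumberger's inversion): slowness is the slope of first-arrival
# time versus transmitter-receiver offset across the receiver array.
import numpy as np

rng = np.random.default_rng(1)
offsets_m = np.linspace(3.0, 6.0, 13)        # hypothetical receiver offsets (m)
true_slowness = 200e-6                       # assume 200 us/m compressional slowness
arrivals_s = true_slowness * offsets_m + 1e-6 * rng.normal(size=offsets_m.size)

# Least-squares line fit of arrival time against offset; the slope is the slowness.
slope_s_per_m, intercept_s = np.polyfit(offsets_m, arrivals_s, 1)
print(f"estimated slowness: {slope_s_per_m * 1e6:.1f} us/m")
```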
Reservoir mapping using the high frequency EM method: Taob oil field case study
This paper is an introduction to the high frequency electromagnetic (EM) method and its application. The method uses multiple sources (natural and artificial), and five electromagnetic tensors are recorded. After data processing, resistivity and phase anomalies can indicate reservoir boundaries and favourable areas for drilling. The method has been applied to the Taob oil field and the results are presented in the paper. In this case, the oil-bearing area was mapped accurately and another prospective area was discovered. Many successful cases in China have demonstrated that the high frequency EM technique can be widely used for reservoir mapping and for indicating prospective areas. The setting: the Taob oil field is located in the Songliao Basin, northeast China, where 2D seismic data have been acquired. The target formation K1 2 is shallow (300-400 m) and dips regionally eastward (Fig. 1). It is especially worth mentioning that the recently discovered overthrust in the western blocks, which trends northeastwards, acted as a seal for the oil and gas migrating from the central depression, and that oil is present in several faulted structural noses associated with the eastern side of the overthrust. This has made commercial oil production possible on the regional slope. The traps delineated in the area are mainly structural. Study of this shallow reservoir shows that there might be many other lithological and overlap traps, which is a challenge for seismic prospecting. In order to probe the non-structural traps economically, the high frequency EM technique was adopted in this case, and the results demonstrate its effectiveness.
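The abstract does not give the processing equations used to derive resistivity and phase from the recorded tensors, so the following Python sketch uses the standard magnetotelluric-style relations for a single impedance element as an illustration: apparent resistivity rho_a = |Z|^2 / (omega * mu0) and phase = arg(Z). The frequency and impedance value are hypothetical.

```python
# Hedged sketch: resistivity and phase derived from a single EM impedance
# element Z = E/H using the standard magnetotelluric relations. The actual
# paper's processing of its five recorded tensors is not reproduced here.
import numpy as np

mu0 = 4e-7 * np.pi                  # magnetic permeability of free space (H/m)
freq_hz = 1.0e3                     # one frequency in the "high frequency" band
omega = 2.0 * np.pi * freq_hz       # angular frequency (rad/s)

Z = 0.22 + 0.17j                    # hypothetical impedance element (V/m per A/m)
rho_a = abs(Z) ** 2 / (omega * mu0)         # apparent resistivity (ohm-m)
phase_deg = np.degrees(np.angle(Z))         # impedance phase (degrees)
print(f"rho_a = {rho_a:.1f} ohm-m, phase = {phase_deg:.1f} deg")
```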
Toward the low frequencies: land and marine equipment
Authors: D. Mougenot

A large improvement in the vertical resolution of surface seismic (one order of magnitude, from decametres to metres!) is viewed by oil companies as the most important step toward a wider use of seismic in reservoir description. Differential absorption of the higher frequencies during the propagation of the reflected signal, and the inability to create and record a broadband signal, have prevented the seismic industry from fulfilling this important need. Borehole seismic may improve the situation by shortening ray paths and by avoiding transmission through the highly absorbing weathering zone. Seismic data with a resolution of metres and dominant frequencies around 1 kHz have been recorded by cross-hole seismic (Sheline, 1998). However, the scarcity of wells with the appropriate geometry has led to the rather limited extent of cross-hole seismic images. Vertical resolution (i.e. the ability to discriminate an event) depends, among other parameters, on the S/N ratio and on the frequency content. If we consider a zero-phase seismic wavelet (Figure 1), its resolution depends on the width of the central peak and on its isolation with respect to the secondary lobes. The width is determined by the average signal frequency (Fav = (Fmax + Fmin)/2), and the isolation by the relative bandwidth measured in octaves n (2^n = Fmax/Fmin). The typical frequency range of surface seismic (10-80 Hz) represents three octaves, that is, three doublings of frequency (10-20 Hz, 20-40 Hz, 40-80 Hz; Figure 2). To gain vertical resolution, it should therefore be easier to add one octave by recording signal from 5 to 10 Hz than by recording it from 80 to 160 Hz. What has prevented the industry from using this opportunity? First, we have not wanted to record low frequencies because they have a reputation for being highly contaminated by noise and for causing subsurface structural damage. In addition, most low frequencies have been filtered out in the recording process to avoid high-amplitude surface noise and the corresponding loss of dynamic range. This explains why we have not focused on them in either source or recording technology, or in data processing. Today, we can take full advantage of new marine and land acquisition equipment that makes it possible to emit and record a broadband signal, including low frequencies.
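As a quick check of the bandwidth arithmetic above, the short Python snippet below computes the number of octaves n = log2(Fmax/Fmin) and the average frequency Fav = (Fmax + Fmin)/2 for the conventional 10-80 Hz band and for the two ways of adding a fourth octave. The band limits are simply the ones quoted in the text.

```python
# Bandwidth arithmetic from the text: n = log2(Fmax/Fmin) octaves, and the
# average frequency Fav = (Fmax + Fmin)/2 sets the width of the central peak
# of a zero-phase wavelet.
import math

def octaves(fmin_hz, fmax_hz):
    return math.log2(fmax_hz / fmin_hz)

def f_average(fmin_hz, fmax_hz):
    return (fmax_hz + fmin_hz) / 2.0

for fmin, fmax in [(10, 80), (5, 80), (10, 160)]:
    print(f"{fmin}-{fmax} Hz: {octaves(fmin, fmax):.1f} octaves, "
          f"Fav = {f_average(fmin, fmax):.1f} Hz")
# 10-80 Hz  -> 3.0 octaves, Fav = 45.0 Hz  (conventional surface seismic)
# 5-80 Hz   -> 4.0 octaves, Fav = 42.5 Hz  (extra octave added at the low end)
# 10-160 Hz -> 4.0 octaves, Fav = 85.0 Hz  (same gain requires doubling the top end)
```

Adding the low octave improves the isolation of the central lobe while barely changing Fav, whereas achieving the same extra octave at the high end would require recording out to 160 Hz, which is the case the author makes for moving toward the low frequencies.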