70th EAGE Conference and Exhibition - Workshops and Fieldtrips
- Conference date: 09 Jun 2008 - 12 Jun 2008
- Location: Rome, Italy
- ISBN: 978-94-6282-104-0
- Published: 09 June 2008
41 - 60 of 91 results
Seismic Imaging and the Road to Petascale Capacity: RTM and the Cell/B.E. Processor
Authors F. Ortigosa, J. M. Cela, M. Araya-Polo and R. de la Cruz
…, on the other hand, is around the corner. Seismic imaging is definitively a field in our industry where petascale capacity is needed. The question is not when this capacity will be widely available, but how. There are several hardware processors and devices that are candidates for the brain of the new generation of petascale supercomputers. The only thing they have in common is that all of them are difficult to program, and the programming will be different from that of today's x86 processor generation. We believe that, among all the hardware options, the Cell/B.E. processor has several characteristics that make it ideal for widely available petascale capacity. Besides the difficulty of programming the Cell, we present a benchmark for RTM between Cell and PowerPC processors. We show that, using an early generation of the Cell and a difficult kernel of a compute-intensive algorithm, we may expect almost one order of magnitude of performance increase.
Grid and Cloud Computing: Opportunities and challenges for e-Science
By F. Gagliardi
A new science paradigm has emerged in the last few years, referred to as electronic science (e-Science). It extensively uses simulation techniques based on software modeling, which run on distributed computing infrastructures. In addition, it makes use of huge amounts of distributed and shared data, captured by instruments or sensors and/or stored in databases, which are analyzed to provide new results for science. This distributed HPC and data environment allows sharing the acquired knowledge, accessing remote resources and enabling worldwide scientific collaboration.
The Implications of Multicore Processors for High Performance Computing
With some of the largest supercomputing clusters on the planet, the seismic imaging industry
has a tremendous appetite for computing resources. Increasing demand for energy and
societal pressures for greener energy will only accelerate this growing need. It is therefore
clear that seismic imaging companies will be early adopters for the newest generation of
supercomputers that enable petascale computing and beyond – machines that are perhaps as
close as only months away. However, as this transition to petascale sweeps through the
industry, the industry will also be one of the first groups to come face-to-face with the
significant changes that will be required in order to go beyond petascale. The industry will have to actively develop plans for how its business models and standard operations will change in order for practitioners to achieve the performance they require for their ever-growing technical challenges. The next few years will present very important challenges and
opportunities for high performance computing in research, industry and business. Petascale
computing and, eventually, “exascale” computing will bring the promise of capability to
deliver full solutions to some of the most challenging and complex issues facing the industry.
However, for well documented technology reasons, these new computing systems
architectures will be radically different in design from traditional high performance
computing platforms. For example, in response to growing technological obstacles, the
processor industry is moving down the multicore path. This development is driving a sea
change in the computer industry for which a new "Moore's Law" may be arising - dictating a
doubling of the number of cores per unit time. As more cores are squeezed on to a chip, the
old programming approaches will not be adequate to achieve the performance required by this
industry; and naive assumptions of linear scaling of performance with the number of cores
will be very wrong. Recent experience with multicore has identified key challenges which
will have to be overcome in order to realize the potential of the next generation of
supercomputing. These challenges include fundamental algorithm design; integration of novel
architectures with more traditional computational systems; management of the unprecedented
amounts of data which are now a key component in all high performance computing
activities; and the development, improvement and validation of new applications solutions
which address the full complexity of the problems which these novel architectures will make
tractable. This presentation will discuss how the industry will be impacted by these changes
and what practitioners can do to achieve the full potential of multicore-based petascale and
exascale supercomputers.
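The abstract's warning that naive assumptions of linear scaling "will be very wrong" can be illustrated with Amdahl's law, a standard model (not taken from the talk itself): even a small serial fraction caps the achievable speedup far below the core count.

```python
def amdahl_speedup(cores, serial_fraction):
    """Amdahl's law: attainable speedup on `cores` cores when
    `serial_fraction` of the work cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# With only 5% serial work, 512 cores deliver under 20x, not 512x.
for cores in (8, 64, 512):
    print(cores, round(amdahl_speedup(cores, 0.05), 1))
```

This is why the abstract stresses fundamental algorithm design: the serial fraction itself must shrink as core counts grow.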
Seismic Wave Propagation on GPUs
Authors A. Loddoch and W. R. Volz
In this work, we present the adaptation of a 3D Finite Difference acoustic wave propagation
code to GPU architecture using the NVIDIA CUDA framework. We demonstrate the general
concept of transforming a CPU-based code to fit the requirements of GPU computing. The
individual steps of this process are illustrated along with the necessary modifications to the
original program code and algorithm.
Problems that typically arise when porting a code to graphics cards, such as numerical stability issues due to reduced accuracy and memory availability/handling, are discussed, as well as the ability to use multiple GPUs simultaneously.
The described technique results in a GPU-based code that provides speedups of one order of
magnitude in execution time compared to the original CPU version.
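The kind of kernel the authors describe can be sketched as a CPU reference implementation (NumPy, second order in time and space); the grid size, boundary treatment and coefficient here are illustrative assumptions, not the authors' code. The GPU port maps the same stencil update to one CUDA thread per grid point.

```python
import numpy as np

def acoustic_step(p_prev, p_cur, c2dt2_h2):
    """One second-order FD time step of the 3D acoustic wave equation:
    p_next = 2*p_cur - p_prev + (c*dt/h)**2 * laplacian(p_cur).
    c2dt2_h2 is the precomputed (c*dt/h)**2 (scalar or array)."""
    # 7-point Laplacian stencil on the interior points
    lap = (-6.0 * p_cur[1:-1, 1:-1, 1:-1]
           + p_cur[2:, 1:-1, 1:-1] + p_cur[:-2, 1:-1, 1:-1]
           + p_cur[1:-1, 2:, 1:-1] + p_cur[1:-1, :-2, 1:-1]
           + p_cur[1:-1, 1:-1, 2:] + p_cur[1:-1, 1:-1, :-2])
    p_next = np.copy(p_cur)  # boundary values left unchanged in this sketch
    p_next[1:-1, 1:-1, 1:-1] = (2.0 * p_cur[1:-1, 1:-1, 1:-1]
                                - p_prev[1:-1, 1:-1, 1:-1]
                                + c2dt2_h2 * lap)
    return p_next
```

On the GPU the two slices per axis become neighbour reads by adjacent threads, which is why shared-memory tiling and reduced-precision arithmetic (the stability concern raised above) dominate the port.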
Experiences with Seismic Imaging on GPUs
Authors S. Morton, T. Cullison and P. Micikevicius
For years, graphics co-processing units (GPUs) have been growing in performance capabilities
faster than CPUs, and have recently surpassed them. Simultaneously the hardware and
corresponding APIs have grown more generally applicable to a wide range of computational
tasks. This has fostered the use of GPUs for general programming.
In this abstract we will describe our experiences monitoring, testing and ultimately using
GPUs to run various seismic imaging kernels. The performance improvement of GPUs over
CPUs varies with algorithm but is high enough that we've purchased a substantial GPU-based cluster. In our workshop talk, we will discuss our production experiences running code
on this system.
NVIDIA Tesla, a way to dramatically speed up seismic processing and reservoir simulation applications
Massive, fine-grained parallel computing capabilities will be needed to help researchers
effectively use petascale computing environments. In particular, petascale computing will
gain performance speed from the parallel processing capabilities of graphics processing units
(GPU). The concept behind the general-purpose GPU (GPGPU) is simple: Use the massively
parallel architecture of the graphics processor for general-purpose computing tasks. Because
of that parallelism, ordinary calculations can be dramatically sped up.
GPGPU is being used as a high-performance coprocessor for oil and gas exploration and other
applications—and it's much cheaper than a supercomputer. Scientists and researchers benefit
from the power of the massively parallel computing architecture. This availability of
supercomputing will unlock the answers to previously unsolvable problems in systems
ranging from a workstation to server clusters.
Using a GPU as a calculation unit may appear complex. It is not about dividing the task into a handful of threads, as with a multicore CPU; rather, it involves thousands of threads.
In other words, trying to use the GPU is pointless if the task is not massively parallel, and for this reason it can be compared to a supercomputer rather than a multi-core CPU. An application to be run on a supercomputer is necessarily divided into an enormous number of threads, and a GPU can thus be seen as an economical version of such a machine, without its complex structure.
NVIDIA CUDA is a software layer intended for stream computing and an extension of the C programming language, which allows marking certain functions to be processed by the GPU instead of the CPU. These functions are compiled by a CUDA-specific compiler so that they can be executed by a GPU's numerous calculation units. Thus, the GPU is seen as a massively parallel co-processor that is well adapted to processing well-parallelized algorithms, such as those in seismic and reservoir simulation.
The NVIDIA Tesla product line is dedicated to HPC. The Tesla Computing System is a slim 1U form factor that easily scales to solve the most complex, data-intensive HPC problems. The Tesla Computing System is equipped with four new-generation NVIDIA GPU boards, IEEE 754-compliant double-precision FP, and a total of 16 GB of video memory. The rack is used in
tandem with multi-core CPU systems to create a flexible computing solution that fits
seamlessly into existing IT infrastructure.
Hardware Hybrid Computing solutions
Parallel HPC applications benefit from multi-core CPU technology and have been able to multiply computation density by a factor of 2 to 4, and later by 8. This improvement is not enough compared to the computation requirements of today's applications. This is why people have been looking for new hardware and specialized processors that could give applications gains of 20x up to 100x.
Specialized processors like GPUs have improved performance at a greater pace than Moore's law predicts. They started 10 years ago with a 350 nm process and 5 million transistors at 75 MHz; they now use a 55 nm process with 700 million transistors at 800 MHz, and are able to deliver 512 GFlops, or more than 3.5 GFlops/watt.
This leads to improvement factors of 1.7x/year in transistor count, 1.3x/year in clock speed, 2.0x/year in processing units and 1.3x/year in memory bandwidth. Using such powerful dedicated processors, together with multi-core CPUs, in a highly parallel environment highlights the need to exploit this heterogeneous hybrid-computing environment in the most efficient way.
This hardware environment exists and can be used today. The first challenge is on the software development side. Development tools need to integrate heterogeneous programming, as well as multi-core support, at the core of their languages, being able to support code generation for different processor types as well as handling asynchronous behaviors. This requires compilers and libraries that support this and are designed or extended for it. Obviously, those tools need to support multiple hardware platforms in order to lead to some standards.
The second key challenge is the evolution of the buses and bandwidth linking together the different CPU and GPU cores, and the way they talk to each other. Fusion projects will address those evolutions in the future by defining new architectures around those processors to improve the data flow between them, which will be the key to using all the power available. Crossbar memory controllers will allow GPUs to talk to each other very quickly without breaking parallelism. The HyperTransport bus will improve communication between GPUs and CPUs. Finally, multi-core GPUs and CPUs on the same die will increase compute density even more.
Different benchmarks and application codes have already been used to demonstrate the benefits of such architectures. We will present SGEMM results as well as different algorithms. The results will highlight the fact that performance is currently affected by copying data in and out of the GPU, and that finer tuning allows huge jumps in performance. We will also show that changing the way algorithms were implemented for the CPU, to fit the GPU architecture, adds even more performance gains.
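The yearly improvement factors quoted above can be cross-checked against the hardware figures given (5 million transistors at 75 MHz to 700 million at 800 MHz over ten years); a quick sketch of the implied compound annual growth:

```python
def yearly_factor(start, end, years):
    """Compound annual growth factor implied by going from `start` to `end`."""
    return (end / start) ** (1.0 / years)

# Figures from the abstract: transistor count and clock speed over 10 years.
print(round(yearly_factor(5e6, 700e6, 10), 2))   # transistors: ~1.64x/year
print(round(yearly_factor(75.0, 800.0, 10), 2))  # clock: ~1.27x/year
```

These come out close to the 1.7x/year and 1.3x/year factors stated in the abstract.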
Characterizing controls of geothermal systems through integrated geologic and geophysical studies: Developing recipes for successful exploration of conventional and unconventional geothermal systems
Authors J. Faulds, M. F. Coolbaugh, G. S. Vice and V. Bouchot
Although conventional geothermal systems have been successfully exploited for electrical
production and district heating in many parts of the world, exploration and development of
new systems is commonly stymied by the risk of unsuccessful drilling. Problems include
drilling of hot, relatively dry wells with low flow rates, decreasing temperatures with depth as
wells penetrate relatively thin and shallow geothermal aquifers (overturn), and wells with
reasonable flow rates but relatively low temperatures. Due to the high cost of drilling, such
problems can effectively preclude geothermal exploration. Proposals to generate enhanced
geothermal systems (EGS) by artificially stimulating hot dry wells, commonly through
mechanical hydro-fracturing of rocks, have therefore gained in popularity.
Lithosphere tectonics and thermo-mechanical properties: an integrated modeling approach for EGS exploration in Europe
By F. Beekman
For geothermal exploration and production of enhanced geothermal systems (EGS),
knowledge of the thermo-mechanical signature of the lithosphere and crust is important to
obtain critical constraints for the crustal stress field and basement temperatures. The stress
and temperature field in Europe is subject to strong spatial variations, which can be linked to polyphase extensional and compressional reactivation of the lithosphere in different modes
of deformation. The development of innovative combinations of numerical and analogue
modeling techniques is key to thoroughly understand the spatial and temporal variations in
crustal stress and temperature. In this paper we present an overview of our advances in developing and applying analogue and numerical thermo-mechanical models to quantitatively assess the interplay of lithosphere dynamics and basin (de)formation. Field
studies of kinematic indicators and numerical modeling of present-day and paleo-stress fields
in selected areas have yielded new constraints on the causes and the expression of
intraplate stress fields in the lithosphere, driving basin (de)formation. The actual basin
response to intraplate stress is strongly affected by the rheological structure of the underlying
lithosphere, the basin geometry, fault dynamics and interplay with surface processes.
Integrated basin studies show that the rheological layering and strength of the lithosphere play an important role in the spatial and temporal distribution of stress-induced vertical motions,
varying from subtle faulting to basin reactivation and large wavelength patterns of
lithospheric folding, demonstrating that sedimentary basins are sensitive recorders of the intraplate stress field. The long-lasting memory of the lithosphere, in terms of lithospheric-scale weak zones, appears to play a far more important role in basin formation and
reactivation than hitherto assumed. A better understanding of the 3-D linkage between basin
formation and basin reactivation is, therefore, an essential step in research that aims at
linking lithospheric forcing and upper mantle dynamics to crustal vertical motions and stress,
and their effect on sedimentary systems and heat flow. Vertical motions in basins can
become strongly enhanced, through coupled processes of surface erosion/sedimentation
and lower crustal flow. Furthermore, patterns of active thermal attenuation by mantle plumes
can cause a significant spatial and modal redistribution of intraplate deformation and stress,
as a result of changing patterns in lithospheric strength and rheological layering. Novel
insights from numerical and analogue modeling aid in quantitative assessment of basin and
basement histories and shed new light on tectonic interpretation, providing helpful
constraints for geothermal exploration and production, including understanding and
predicting crustal stress and basin and basement heat flow.
Geophysical exploration methods at European sites
By D. Bruhn
Most geophysical exploration methods have been developed for the oil and gas industry, and
ever more sophisticated tools and refinements in the different approaches are designed to
solve specific problems associated with the detection and characterisation of hydrocarbon
reservoirs. The exploration of geothermal resources has profited greatly from these developments; however, the methods cannot always be directly transferred from oil and gas to hot water and/or steam. First of all, the physical properties of H2O differ from those of
hydrocarbons, resulting in differing responses of physical measurement methods. Secondly,
geothermal reservoirs can be found in highly varying geological environments, mostly
associated with volcanism, where hydrocarbons are usually not present. Thirdly, the
economically most interesting geothermal reservoirs are much hotter than any oil or gas
reservoir. At moderate temperatures comparable to those of hydrocarbon reservoirs, many of the advanced exploration methods are simply cost-prohibitive, as the economic potential of a
medium-enthalpy geothermal reservoir is much lower than for an oil or gas well. For these
reasons, some of the existing geophysical methods have to be adapted to meet the needs of
geothermal exploration or different methods have to be developed and applied.
Technological challenges of geothermal exploration
By A. Manzella
The most pressing technological challenges in the exploration and investigation of Enhanced Geothermal Systems (EGS) and Unconventional Geothermal Resources (UGR) are considered to be those associated with identifying the nature of geothermal heat concentrations and prospective reservoirs without drilling, improving methods for predicting reservoir performance and lifetime, and optimizing resource exploitation with the lowest possible impact on the environment.
A list of important research themes is provided, allowing a spatial and temporal reconstruction of subsurface geothermal conditions that might not only cut the time from discovery to production and improve efficiency, but also reduce environmental impacts by forecasting possible problems and finding solutions beforehand.
Available approaches for increasing the producibility of geothermal wells in natural fracture systems by optimizing their placement, design, and stimulation
Perhaps the greatest value of technology in geothermal energy issues can be assessed in terms of its ability to reduce risk. The present contribution only addresses subsurface matters
and, even more specifically, how naturally fractured geothermal reservoirs could be more
efficiently tapped and developed in the framework of EGS operations. Lastly, of the two aspects which have to be investigated in EGS projects, i.e. temperatures and flow rates, only the latter is considered here, assuming that the isotherm geometry can, for instance, be
constrained by MT (magneto-telluric) surveys looking for electrically conductive altered rocks
(clay alteration zones) or other techniques.
The ability to characterize NFSs (natural fracture systems) in the early field development
stage of an EGS project reduces economic risk because it enables the development team to
determine optimal well placement and trajectories. Characterizing, tapping and developing a
geothermal NFS, as well as predicting the heat and fluid flows in response to hot water
extraction and cooled water re-injection, is a challenging task that spans multiple disciplines
and multiple scales.
Induced seismicity: Setting the problem in perspective for EGS development
By K. F. Evans
Induced seismicity is a recognised hazard in practically all engineering endeavours where
stress or pore pressure are altered. This can be taken as a reflection of the realisation that
has dawned in the past 20 years that the Earth's crust generally supports high shear stress
levels and is often close to failure. Historically, the most damaging events, which have
sometimes caused many fatalities, are associated with the impoundment of reservoirs.
However, earthquakes of sufficient size to cause damage to localities have also been
associated with mining activity, long-term fluid withdrawal wells, and long-term fluid injection
wells. Given that massive stimulation injections into crystalline rocks have routinely been
performed at EGS sites since the early 70s, it is perhaps surprising that the issue of the
seismic hazard associated with these operations has only recently come to the fore. Massive
injections of fluid have been conducted at Fenton Hill, Rosemanowes, Hijiori and Soultz (3.5
km reservoir) without producing events large enough to disturb the local population. More
recently, events approaching or exceeding magnitude 3.0 have occurred during or shortly following injections at Soultz, Cooper Basin (Australia) and Basel, all of which were conducted at 4.5-5.0 km. These events, particularly the event at Basel because of its proximity to a major population centre and reports of damage, have galvanised attention on the seismic hazard posed by EGS development.
Understanding the stress field and potential fault activity – a key issue to drilling and stimulation in man-made geothermal reservoirs
Authors I. Moek, H. Schandelmeier and T. Backers
It has been recognized that in-situ stresses have significant impact, either positive or
negative, on the short- and long-term behaviour of fractured reservoirs. Knowledge of the stress conditions is therefore important for the planning and utilization of man-made geothermal reservoirs. The geothermal field Groß Schönebeck belongs to the key sites in the
north-eastern German Basin. We present a combined approach of stress field determination and application of the new knowledge to drilling and stimulation design at this key site, where 4100 m deep sandstones and volcanic rocks of the Lower Permian are currently being explored. In our comprehensive study we use detailed 3D fault mapping, based on
available well and 2D seismic data, stress regime determination based on empirical and
analytical methods, and slip-tendency analysis to estimate reactivation and leakage potential
of any fault population within the stress field under initial and changing pore pressure
conditions. We discuss the importance of various fault sets related to the stress field in terms
of their potential for conducting geothermal fluids based on the tendency of the faults to dilate
and slip. In particular, we demonstrate how the well path trajectory and mud weights can be
defined on the basis of principal stress orientation and magnitude to minimize formation
damage under mechanically stable borehole conditions and to optimise stimulation designs
of multiple fracs in multilayered rocks. Finally, the results of slip-tendency analysis can be used to control seismicity induced by massive stimulation campaigns at geothermal sites. Our approach can be adapted to any other geothermal site investigation.
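Slip-tendency analysis of the kind described here is commonly computed as Ts = τ/σn, the ratio of resolved shear to normal stress on each fault plane (after Morris et al.); a minimal sketch with a hypothetical stress state, not the Groß Schönebeck data:

```python
import numpy as np

def slip_tendency(stress, normal):
    """Slip tendency Ts = tau / sigma_n for a plane with normal `normal`
    under the 3x3 stress tensor `stress` (compression positive, MPa)."""
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    t = stress @ n                                # traction vector on the plane
    sigma_n = float(t @ n)                        # normal stress component
    tau = float(np.linalg.norm(t - sigma_n * n))  # resolved shear stress
    return tau / sigma_n

# Hypothetical principal stresses (MPa) aligned with the coordinate axes.
S = np.diag([100.0, 60.0, 40.0])
# A plane whose normal bisects sigma_1 and sigma_3 carries the maximum shear.
print(round(slip_tendency(S, [1.0, 0.0, 1.0]), 3))  # tau/sigma_n = 30/70
```

Faults with Ts near the frictional limit (around 0.6 for typical rock) are the candidates for reactivation and leakage that the abstract evaluates.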
Supercritical fluids and their properties for heat transmission and geochemical reactivity: example of supercritical CO2
Authors M. Azaroual, L. André, A. Lassin and A. Menjoz
The thermodynamic and thermophysical properties of supercritical carbon dioxide (CO2(sc))
are known and theoretical approaches are introduced in many numerical modeling codes.
Various studies have identified the key mechanisms of transport and the physical – chemical
behaviour of the field near the CO2(sc) injection wells in saline aquifers (André et al., 2007, and references therein). The contrast of thermophysical properties between water and
carbon dioxide is sufficiently large to envisage the use of CO2(sc) as a heat transmission fluid
in the context of enhanced geothermal systems - EGS (Brown, 2000; Pruess, 2006; Pruess
and Azaroual, 2006; Pruess, 2008). Carbon dioxide is a poor conductor of heat and a low-density, low-viscosity fluid, but it still offers quite attractive flow properties, especially because of its low viscosity and high buoyancy. It is also a poor solvent for solids and water.
Analysis of these thermodynamic functions reveals the complexity of the thermal perturbation
induced by the injection of CO2(sc) in the geothermal heat exchangers in which initial
conditions of temperature and pressure correspond to the field of supercritical CO2(sc).
Green field evaluation approach for geothermal energy
By R. Bertani
“Geothermal resource assessment for green fields” is the evaluation of the expected potential of geothermal electricity supply that might become available from the exploitation of a given reservoir.
The standard technique described (“stored heat method”) takes into account only the heat
reserves of the inferred geothermal field, without any consideration of the number of wells
and economical feasibility: the permeability of the system is simply not used.
This approach could be considered as “step zero” for obtaining a first, rough approximation of what it is possible to install on a given field, when the available information is very poor and speculative. We will discuss the physical and mathematical basis of the method, and we will present some applications to two real cases. We are restricting our analysis to water-dominated systems, both high enthalpy (flash plant) and medium enthalpy (binary plant). Some examples have been chosen as a benchmark of the technique.
With the introduction of a high (realistic) value of the abandonment temperature, the correction for the cooling-effect degradation of specific consumption, and the effective flash technology, a value significantly lower than that of the standard approach is obtained. We believe this result is a better estimate of the effective industrial capacity that can be supported by a realistic approach in a geothermal project. More detailed benchmarks will be conducted in the future, comparing effective geothermal field performance with the estimated capacity.
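The "stored heat method" the abstract builds on can be sketched as follows: heat in place above an abandonment temperature, times recovery and conversion efficiencies, spread over a plant life. All numbers here (recovery factor, conversion efficiency, plant life, volumetric heat capacity) are illustrative assumptions, not the paper's values:

```python
def stored_heat_mwe(volume_km3, t_res_c, t_abandon_c, recovery=0.1,
                    conv_eff=0.1, life_years=30.0, rho_c=2.5e6):
    """Volumetric 'stored heat' estimate of installable capacity (MWe).
    rho_c: volumetric heat capacity of the rock+fluid system, J/(m3 K)."""
    volume_m3 = volume_km3 * 1e9
    heat_j = rho_c * volume_m3 * (t_res_c - t_abandon_c)  # heat in place
    electric_j = heat_j * recovery * conv_eff             # recoverable electricity
    seconds = life_years * 365.25 * 24 * 3600
    return electric_j / seconds / 1e6                     # W -> MWe

# Illustrative field: 5 km3 at 250 C, with a realistic abandonment T of 180 C.
print(round(stored_heat_mwe(5.0, 250.0, 180.0), 1))
```

For this illustrative field the estimate is roughly 9 MWe; raising the abandonment temperature from, say, 100 C to 180 C cuts the result by more than half, which is exactly the correction the abstract argues for. Note that permeability plays no role in the formula.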
Advances in magnetotelluric sounding of geothermal zones
Advanced 3-D interpretation tools based on imaging, Bayesian inversion and artificial neural network (ANN) recognition, developed by the author (Spichak et al., 1999; Spichak, 2007), form the basis of a new paradigm in electromagnetic data interpretation that takes into
account the geological information known, noise level in the data, prior estimates of the
unknown parameters, hypotheses formulated in probabilistic terms, data available from other
methods and formalized expert estimates. Application of these methods to magnetotelluric
sounding data enables constructing 3-D electrical resistivity models of the geothermal areas,
mapping the geothermal reservoirs and monitoring macro-parameters of the fluid bearing
faults.
In particular, Spichak (2002) used Bayesian inversion of MT data in order to construct 3-D
resistivity model of the Minamikayabe geothermal area (Hokkaido, Japan). Spichak (2001)
has found the most suitable data transforms for adequate interpretation of MT measurements
carried out with the purpose of monitoring variations in the geothermal reservoir resistivity
with temperature. Finally, the use of an ANN expert system enabled estimation of the macro-parameters of the Minou fault (Kyushu, Japan) (Spichak et al., 2002).
The Green tuff units of Mt. Epomeo, Ischia Island (Italy): evidence of an exhumed fossil geothermal system
Authors P. Fulignati, A. Sbrana, M. Vietina and A. J. Boyce
The hydrothermally altered Green Tuff units that outcrop on the Western and North-Western
flanks of Mt Epomeo on Ischia island offer the rare opportunity to see a section of a
hydrothermal system exposed on the surface. The Ischia island fossil hydrothermal system
shows numerous common features with the typical active geothermal systems developed on
volcanic islands.
The mineralogy, the chemistry and the space distribution of the hydrothermal assemblages
observed in the Green Tuff units and in hydrothermally altered xenoliths agree with most geothermal environments (Browne, 1978). Based on (1) the occurrence of temperature-sensitive minerals such as mixed-layer I/S (<150°C), illite-phengite-chlorite (>220-240°C) and biotite (>320°C), and (2) chlorite and illite geothermometry, the secondary minerals of the
Ischia island fossil hydrothermal system indicate ambient paleo-temperatures ranging from
120-140°C to about 340°C.
The Ischia Island hydrothermal system: hydrogeochemical conceptual model
Authors P. Fulignati, G. Giudetti, A. Sbrana, I. Giulivo and L. Monti
Ischia island is located in the north-western part of the Gulf of Naples and is part of the Phlegrean Fields Volcanic District. Ischia is regarded as an example of a resurgent caldera and is characterized by a high heat flux of between 200 and 400 mW/m2.
Hydrothermal activity has been well known on the island since Roman times, and more than 200 hotels and spa resorts, located all over the island, use thermal waters (T ranging from 30°C to 99°C) for balneo-therapeutic medical cures. The deep wells drilled by SAFEN in the 1950s, and
subsequent investigations (De Gennaro et al., 1984; Panichi et al., 1992; Inguaggiato et al.,
2000), also revealed the occurrence of high temperature fluids (up to ~220°C) in the subsoil
of the island, hosted within a possible geothermal reservoir. The aim of this work is to
characterize the main geochemical processes that explain the water geochemistry of the
thermal fluids of Ischia Island, to classify the water composition data into genetic groups and
to delineate a conceptual model to explain the composition of the discharges.
Migration of fluids in the Boccheggiano-Montieri (southern Tuscany, Italy) fossil geothermal system: insights for the Larderello high-enthalpy active geothermal field
Authors A. Brogi, A. Dini, P. Fulignati, D. Liotta, G. Ruggieri and A. Sbrana
Understanding the migration of hydrothermal fluids represents a continuous task for
successful exploration of geothermal resources. Contributions to better constrain the
hydrogeological models in geothermal areas can derive from field and laboratory studies on
fossil geothermal systems, evidenced by the concentration of ore deposits in wide areas.
This work presents an integrated study based on fluid inclusion and structural analyses on a
Pliocene-Pleistocene fossil hydrothermal system, located to the south of the present active
Larderello geothermal field. Mineralization, mainly made up of quartz and pyrite, is widely distributed in the damage zone of the Pliocene Boccheggiano normal fault and, far from it, in the older cataclastic levels deriving from previous deformational events.