First Break - Volume 24, Issue 3, 2006
Trends in visualization for E&P operations
Duane Dopkin and Huw James of Paradigm provide a user-friendly guide to the uses of visualization in the E&P business today and what we can expect in the future.

Visualization of digital data has significantly enriched the arts and sciences and their industry sector derivatives (e.g., entertainment, government, architecture, engineering, medical, and oil and gas) by helping to communicate complex geometries, spatial models, simulations, and even ideas through sensory technology and processes. The complexity of the representations we attempt to communicate through visualization can often be attributed to their size and scale, their diversity and heterogeneity, their dynamic nature, and data sampling barriers that prevent direct image or model recovery. Visualization has not only come to play a strong supporting role in understanding complex geometries and spatial data relationships; it can also synergistically influence, and subsequently improve, the prerequisite data sampling, transformation, and modelling methods used to generate the visual representation. The assimilation of subsurface data and the construction of the subsurface geologic models critical to hydrocarbon exploration and production carry all of these complexities. It is therefore not surprising that visualization of digital data has impacted the work processes of geoscientists and engineers, who must work with a large diversity and volume of subsurface signals and signatures and associate them with rock and fluid properties.

Visualization of digital subsurface data can be traced back to the early 1980s and the introduction of interpretation workstations. Both the science and the application were significantly enhanced in the early 1990s with the introduction of 3D voxel-based visualization and interpretation technology, which provided subsurface insights through co-visualization and opacity rendering. Advances in computer graphics, high-performance computing, and application functionality continued to enable a broader range of subsurface visualization tasks. The intersection of these advances with higher levels of data access and application integration stimulated the creation and proliferation of large visualization centres for multi-disciplinary collaborative decision-making. Visualization centres are largely used for prospect review sessions, collaborative team and executive management problem resolution, joint venture presentations, and final collaborative decision-making.

Today, asset management teams work in a hub-and-spoke environment in which the Visionarium is the hub of decision-making, team rooms are used for collaborative efforts of four to six team members, and preliminary work is conducted at the desktop. In team rooms, small groups are able to gain a better appreciation and understanding of each other's discipline, selecting geologic targets with a higher propensity for economic success and developing safer well plans with less environmental impact. With performance and economic advances in computer graphics cards, microprocessors, and next-generation 'embedded' visualization applications, visualization has pushed to the desktop, enabling geoscientists and engineers to generate a more comprehensive 'picture story' from field data to geologic model to drilling completion, while capturing and understanding the transformation of subsurface field signals into rock and fluid properties.
This trend is indeed timely, as the search for quality prospects and the demand for increased production have migrated to more challenging environments. Geoscientists and engineers are tasked with qualifying subsalt reflections, unravelling the position of anisotropic over-thrusts, interpreting the signatures of fractured reservoirs, resolving stratigraphic detail, recovering reflectivities for reservoir property prediction, and qualifying differences between time-lapse surveys. To carry out these tasks, they require a continuous visualization connection between geophysics, petrophysics, interpretation, and drilling engineering solutions, rather than a decoupled post-visualization solution. To meet these challenges, the latest hardware and software advances have come together to support visualizations that incorporate prospect and regional scales; seismic and reservoir scales; static and dynamic models; exploration and (limited) production data; and transformations alongside interpretation for improved rock and fluid property interpretation.
Extending the limits of earth data visualization and exploration
By L. Racic

Increasingly powerful tools for geoscientific mapping, 3D visualization, and analysis respond to geoscientists' rising need for greater power, precision, and productivity in accessing, integrating, and understanding large volumes of diverse data. Louis Racic, industry product manager, Geosoft, describes some of the practical implications for the minerals and other industries.

Twenty years ago, earth exploration and computers were often thought of as mutually exclusive. Today they are inseparable, and software is a critical enabler of timely and informed geoscience decision-making. Industry's need to efficiently access, integrate, and visualize a larger volume and diversity of available data has given rise to more powerful productivity tools for geoscientific data access, analytics, mapping, and 3D visualization. Desktop applications are routinely used to collect and view sample data for quick site assessment in the field, while robust mapping systems in the office provide sophisticated visualizations and the construction of complex 3D earth models to guide subsurface exploration. In this article, we look at some of the factors driving demand for more power, precision, and productivity, and provide industry examples that illustrate the effectiveness of enabling technology in solving current data and workflow efficiency challenges in the geosciences.

Industry data needs

Earth science projects are growing in complexity and scope. As the generation and availability of digital geoscientific data grow, geoscientists are increasingly pressed to deliver and account for results. They must meet the challenge of accessing, integrating, and strategically using this rising volume of data within compressed project timeframes in order to support business decision-making within all industries and across all disciplines. Easy access, frequent updating, and continual manipulation of data, in real time, are in demand throughout the lifecycle of a geoscience project - from data analysis in the field, to collaborative interpretation in team meetings, to the presentation of results in the corporate boardroom.

Within mineral exploration, the need for more integrated, advanced, and subtle methods is partially driven by the fact that companies are trying to find ore bodies in complex environments. In many cases, geoscientists are working with larger volumes of geological, geophysical, and geochemical data. Exploration project data can include 500 or more drill holes, some of which are 1000 m deep or more, in combination with satellite imagery, GIS data, and surface and subsurface geology data. Geoscientists in a variety of fields now require software tools capable of efficiently processing large volumes of data from multiple sources and in diverse formats; analysing them for statistical variation, relationships, and other factors; and interpreting and clearly presenting the results. They must do so within a single or transparently linked interactive environment that allows for frequent data update, modification, and enhancement.
VR and immersive environments for oil and gas exploration
Do you know what today's immersive environments can provide to assist data analysis and interpretation? Wim Maes and Ken Hunter of Barco Presentation & Simulation offer this guide to what you can expect.

Virtual reality (VR) technology has exploded into a range of visualization tools that can be used by geoscientists, engineers, and other asset teams to enhance and speed up oil and gas exploration, drilling, and production. New emerging VR systems can be tailored into fit-for-purpose solutions offering operational integration for all asset teams, from the rig to the office and from office to office. A reasonable expectation is that a large-scale visualization capability reduces project costs and field errors by 5-10%; the return on investment is typically agreed to be less than one year.

VR and immersive environments combine advanced technology with social interaction to analyze complex problems and to take quick and accurate decisions. Their applications range from large-scale 3D collaborative viewing rooms, through relocatable and portable environments, to fully immersive spaces that completely surround the interpreters with their data. These large-screen visual display systems show large amounts of data - typically 2.5 to over 4 megapixels - on large flat or curved screens at the same time. They allow multi-disciplinary teams of up to 20 viewers to effectively visualize and evaluate geophysical data for oil and gas E&P in 3D. Applications by the world's leading oil and gas companies include real-time visualization, analysis, and decision-making with seismic data, complex reservoir models, well logs, and geologic cross-sections. Most systems are tailored to meet specific needs, with the screen size, the depth of the system, and the number and type of projectors customized to provide the optimum display solution. As standard they offer stereoscopic visualization and include interactive whiteboards and integrated videoconferencing systems. Large presentation environments usually carry a high price tag, but are reported to provide maximum return on investment; many of the world's leading oil and gas companies use them in their headquarters and in important subsidiaries all over the world.

Recent high-resolution three-chip LCD projectors deliver 1920 x 1080 pixels, enabling a stereoscopic large-screen display to be built using only two projectors. As the most advanced network-centric concepts even include a built-in, powerful, upgradeable display server, all information available in the company can be retrieved via the network and displayed in multiple windows on the large VR workroom screen. These windows, with mono or stereo content from the network or from diverse external sources (video or data), can be freely positioned and scaled. Teleconferencing windows can be added, and several locations can be visually linked for efficient collaboration. Frequently used content can be saved on the projector's hard disk and retrieved on demand. Easy, intuitive operation through the familiar Windows XP desktop interface enables all team members to access all sources and to control all content simply by using the wireless mouse and keyboard. By adding optical tracking and mouse-emulation technology, the VR workroom allows direct, wireless interaction with the data. Connecting several network-centric VR workrooms to an existing network enables companies to visually link multiple off- and onshore facilities, saving expensive travelling time and increasing business efficiency. All information can be shared from a remote location on one large canvas; such direct access to all information and collaborative viewing on a large canvas make for accurate analysis and fast decision-making. The networked VR workrooms themselves can be centrally managed to ensure optimal use of the investment.
How ModViz is putting the cluster into 3D visualization
By A. McBarnet

Seeing is believing may be the only way to appreciate the remarkable technology being pioneered by ModViz, a small US company based in California's Silicon Valley. Andrew McBarnet spoke to CEO Tom Coull about the company's drive to deliver supercomputing 3D visualization on clusters of low-cost computer nodes.

After four years of operation, ModViz is looking more and more like a textbook example of seed capital sown to develop a great idea, which has then blossomed into an enterprise with the potential to transform the costs and productivity of an industry's business processes. Early on, the company took aim at the compute-intensive 3D data visualization requirements of the oil and gas E&P industry, but its approach is equally valid for numerous other businesses and organizations. ModViz's vision from the beginning was to develop software that could deliver high-performance 3D visualization of large 3D data sets on computing platforms with a low cost of ownership and operation. This is of course exactly the vision that has for some years been a tantalizing but unrealized prospect for the geoscience community in the E&P business. Visual processing of large seismic and related data surfaced some time ago as one of the major challenges for oil and gas companies, simply because successfully rendered results frequently provide the key to unlocking new reserves at both the exploration and production phases of the field development cycle.

'Cheaper, faster, and better evaluations of the subsurface is what the industry is looking for,' according to Tom Coull, CEO of ModViz. His company believes that the need can be met by leveraging clusters of graphics processing units (GPUs) in much the same way that clusters of central processing units (CPUs) have made supercomputing affordable and accessible for a wide range of companies. Leveraging multiple CPUs is almost run-of-the-mill for today's seismic processing companies, which can provide greatly reduced cycle times for batch processing of large seismic data sets. In fact, the use of multiple GPUs to achieve impressive visualization effects is nothing new, but the technology has invariably been costly and confined to proprietary hardware. What ModViz has set out to achieve is standards-based software offering a common OpenGL-based computing platform. Among other things, this means making 3D visualization of highly complex or very large 3D data sets possible on individual workstations.

Tom Coull is not surprised that no one has come up with a solution earlier: 'There's nothing out there, mainly because it is very complicated, but also some of the technology to make our product possible simply wasn't available until a couple of years ago.' Right now ModViz is on the 1.5 release of its highly innovative flagship product, Virtual Graphics Platform (VGP). Coull stated that 'the superscaling level of performance provided by VGP 1.5 will help our clients significantly shorten existing design and exploration cycle time, providing end results in less time.' He cited one test in which VGP 1.5 running on an eight-GPU cluster demonstrated a 60-fold performance increase over a one-GPU system. These results have already translated into real value at a major oil and gas company evaluating VGP, Coull says. The geophysicist evaluating VGP explained that, working with a 4 million-triangle horizon on his current workstation, he was unable to use certain application functionality as it would 'hang' or lock up the computer. Manipulating the horizon or surface was also very jerky, as the system struggled to draw the 4 million polygons interactively. In order to perform his interpretation he had to cut the surface into four smaller pieces and work on each piece individually; once each piece was interpreted, he then knitted them back together.
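The article does not spell out how VGP merges the output of multiple GPUs. One standard approach to the problem, offered here purely as an illustration rather than as ModViz's actual mechanism, is sort-last (depth) compositing: each node renders its own share of the geometry, and the resulting frames are merged pixel by pixel on depth. A minimal sketch, with NumPy arrays standing in for each node's colour and z-buffers (all names hypothetical):

```python
import numpy as np

def depth_composite(colors, depths):
    """Sort-last compositing: for each pixel, keep the colour from
    whichever node rendered the nearest (smallest-depth) fragment.

    colors: (n_nodes, H, W, 3) per-node RGB frames
    depths: (n_nodes, H, W) matching z-buffers
    """
    nearest = np.argmin(depths, axis=0)        # winning node per pixel
    rows, cols = np.indices(nearest.shape)
    return colors[nearest, rows, cols]         # (H, W, 3) final frame

# Toy run: two "nodes", each having rendered half of a scene's triangles.
rng = np.random.default_rng(0)
colors = rng.random((2, 480, 640, 3))
depths = rng.random((2, 480, 640))
frame = depth_composite(colors, depths)
print(frame.shape)                             # (480, 640, 3)
```

Because each node touches only its own subset of the triangles, the per-node rendering load falls roughly linearly with cluster size, while the compositing cost depends only on image resolution - consistent with the superscaling behaviour Coull describes, though the actual VGP design may differ.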
Comparison of spectral decomposition methods
Authors J.P. Castagna and S. Sun

John P. Castagna, University of Houston, and Shengjie Sun, Fusion Geophysical, discuss a number of different methods for spectral decomposition before suggesting some improvements possible with their own variation of 'matching pursuit' decomposition.

In seismic exploration, spectral decomposition refers to any method that produces a continuous time-frequency analysis of a seismic trace: a frequency spectrum is output for each time sample of the trace. Spectral decomposition has been used for a variety of applications, including layer thickness determination (Partyka et al., 1999), stratigraphic visualization (Marfurt and Kirlin, 2001), and direct hydrocarbon detection (Castagna et al., 2003; Sinha et al., 2005). Spectral decomposition is a non-unique process, so a single seismic trace can produce various time-frequency analyses.

There are a variety of spectral decomposition methods, including the DFT (discrete Fourier transform), MEM (maximum entropy method), CWT (continuous wavelet transform), and MPD (matching pursuit decomposition). None of these methods is, strictly speaking, 'right' or 'wrong'. Each has its own advantages and disadvantages, and different applications require different methods. The DFT and MEM involve explicit use of windows, and the nature of the windowing has a profound effect on the temporal and spectral resolution of the output. In general, the DFT is preferred for evaluating the spectral characteristics of long windows containing many reflection events, with the spectra generally dominated by the spacing between events. The MEM is often difficult to parameterize and may produce unstable results. The CWT is equivalent to temporal narrow-band filtering of the seismic trace and has an advantage over the DFT for broadband signals in that the window implicit in the wavelet dictionary is frequency dependent. The CWT has a great disadvantage, however, in that the wavelets utilized must be orthogonal. The commonly used Morlet wavelet, for example, has poor vertical resolution due to multiple side lobes. Furthermore, for typical seismic signals, the implicit frequency-dependent windowing of the CWT is not particularly important, and experience has shown that a DFT with a Gaussian window of appropriate length produces almost the same result as a CWT with a Morlet wavelet.

MPD (Mallat and Zhang, 1993) is a more computationally intensive process than the others but, as will be shown in this paper, it has superior temporal and spectral resolution if a compact mother wavelet is utilized. Matching pursuit decomposition involves cross-correlation of a wavelet dictionary against the seismic trace. The projection of the best-correlating wavelet on the seismic trace is subtracted from that trace; the wavelet dictionary is then cross-correlated against the residual, and again the best-correlating wavelet projection is subtracted. The process is repeated iteratively until the energy left in the residual falls below some acceptable threshold. As long as the wavelet dictionary meets simple admissibility conditions, the process will converge. Most importantly, the wavelets need not be orthogonal. The output of the process is a list of wavelets with their respective arrival times and amplitudes for each seismic trace. The inverse transform is accomplished simply by summing the wavelet list and the residual, thus reconstructing the original trace.
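To make the loop just described concrete, here is a minimal matching pursuit sketch in Python over a dictionary of Ricker wavelets. It illustrates the generic algorithm, not the authors' implementation; the frequency grid, tolerance, and wavelet choice are placeholders.

```python
import numpy as np

def ricker(f, dt=0.002, length=0.128):
    """Unit-energy Ricker wavelet with center frequency f (Hz)."""
    t = np.arange(-length / 2, length / 2, dt)
    w = (1 - 2 * (np.pi * f * t) ** 2) * np.exp(-(np.pi * f * t) ** 2)
    return w / np.linalg.norm(w)

def matching_pursuit(trace, freqs, dt=0.002, tol=0.01, max_iter=200):
    """Greedily peel wavelets off `trace` until residual energy drops
    below `tol` times the original energy. Returns a list of
    (arrival time, center frequency, amplitude) atoms and the residual."""
    dictionary = {f: ricker(f, dt) for f in freqs}
    residual = trace.astype(float)
    target = tol * np.dot(trace, trace)
    atoms = []
    for _ in range(max_iter):
        # Cross-correlate every dictionary wavelet against the residual
        # and pick the single best-fitting (frequency, lag) pair.
        f_best, xc_best = max(
            ((f, np.correlate(residual, w, mode="same")) for f, w in dictionary.items()),
            key=lambda fx: np.max(np.abs(fx[1])),
        )
        i = int(np.argmax(np.abs(xc_best)))
        amp = xc_best[i]                    # projection onto a unit-energy wavelet
        w = dictionary[f_best]
        start = i - len(w) // 2             # wavelet is centered on sample i
        lo, hi = max(0, start), min(len(residual), start + len(w))
        residual[lo:hi] -= amp * w[lo - start:hi - start]
        atoms.append((i * dt, f_best, amp))
        if np.dot(residual, residual) < target:
            break
    return atoms, residual

# Demo: two interfering 30 Hz reflections, 30 ms apart.
dt = 0.002
refl = np.zeros(512)
refl[200], refl[215] = 1.0, -0.6
trace = np.convolve(refl, ricker(30.0, dt), mode="same")
atoms, res = matching_pursuit(trace, freqs=np.arange(10.0, 61.0, 5.0))
print(atoms[:2])                            # strongest atoms near 0.40 s and 0.43 s
```

Summing amp × wavelet over the returned atom list, plus the residual, reconstructs the input trace, mirroring the inverse transform just described.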
The wavelet list is readily converted to a time-frequency analysis by superposition of the wavelet frequency spectra. Simple matching pursuit has difficulty in properly determining the precise arrival time of interfering wavelets: usually it will slightly misplace the wavelets, which also results in a slightly incorrect wavelet center frequency. The process is also path dependent: a slight change in the seismic trace may result in an entirely different order of subtraction, and thus in lateral instability of the non-unique time-frequency analyses. Cross-correlation of the wavelet dictionary against the seismic trace is essentially a continuous wavelet transform, so the method involves iteratively performing hundreds, if not thousands, of wavelet transforms for each seismic trace.

In this paper, we utilize a variation of matching pursuit called exponential pursuit decomposition (EPD). The method treats complex interference patterns as containing 'gravity wells' at the correct wavelet locations, and the selected wavelet location is iteratively attracted to the correct location. The profound advantage of EPD over other methods is that there is no windowing and no corresponding spectral smearing. The spectra of reflections from isolated interfaces that can be resolved by the method are the same as the spectrum of the seismic wavelet producing those reflections. The method can thus be used with confidence for direct hydrocarbon indication and stratigraphic visualization of thin beds.

The classical Heisenberg uncertainty principle states that the product of temporal and frequency resolution is bounded below by a constant: one must normally pay the price of decreasing resolution in one domain to increase resolution in the other. In EPD there is no windowing, and it is the bandwidth of the digital seismic data that limits resolution, not a windowing process; the Heisenberg uncertainty principle therefore does not come into play, and EPD provides better temporal and spectral resolution than the other methods. In comparing spectral decomposition methods, it is important to keep in mind what the goal of the analysis is.
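For reference, the trade-off the authors invoke is usually stated as the Gabor form of the uncertainty relation, given below; window-based methods (DFT, MEM, CWT) are bound by it, whereas the EPD argument is that, with no analysis window, resolution is limited by the data bandwidth instead.

```latex
% Gabor limit: any analysis window with temporal spread \sigma_t and
% spectral spread \sigma_f obeys
\sigma_t \, \sigma_f \;\ge\; \frac{1}{4\pi}
% with equality only for a Gaussian window -- which is why the
% Gaussian-windowed DFT is the best-behaved of the windowed methods.
```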
Virtual outcrop models of petroleum reservoir analogues: a review of the current state-of-the-art
Authors J. K. Pringle, J. A. Howell, D. Hodgetts, A. R. Westerman and D. M. Hodgson

A subsurface reservoir model is a computer-based representation of petrophysical parameters such as porosity, permeability, fluid saturation, etc. Given that direct measurement of these parameters is limited to a few wells, it is necessary to extrapolate their distribution. As geology is a first-order control on petrophysics, it follows that an understanding of facies and their distribution is central to predicting reservoir quality and architecture. The majority of reservoir modelling systems used for the subsurface are based on correlation of seismically derived surfaces to define reservoir zones. Well data are then used to define further, sub-seismic-scale horizons and to determine the zone properties that are represented in grid cells. Understanding the distribution of both sub-seismic surfaces and the potentially heterogeneous geology between them remains a significant challenge. Furthermore, as the typical grid cell size is c. 50-200 m², it is challenging to incorporate small-scale heterogeneities. It is critical, therefore, to use realistic values for both key stratigraphic horizons and internal facies distributions.

Depositional facies is a fundamental control on petrophysics. However, facies-scale heterogeneities are not resolvable using current seismic methods, and well data provide little or no information on 3D geometries beyond the well bore. Studies of modern sedimentary systems can give some indication of the link between depositional processes and facies distribution (e.g., Kenyon et al., 1995); however, preserved depositional architecture is also strongly controlled by changes in accommodation through time (Jervey, 1988). Laboratory-based experiments (e.g., Kneller & Buckee, 2000) and process-based modelling (e.g., Aigner et al., 1989; Peakall et al., 2000) further illustrate the link between depositional mechanism and facies architecture. However, such models are typically on a scale far smaller than the typical field and are more applicable to upscaling studies (Nordhal et al., 2005; Ringrose et al., 2005).

Outcrop studies have long been employed as a mechanism for studying analogues and understanding petroleum fields (Collinson, 1970; Glennie, 1970; Breed & Grow, 1979). Once the type of depositional system and the accommodation history of a hydrocarbon field are derived from subsurface data, appropriate outcrop analogue(s) can be identified (e.g., Alexander, 1993). Suitable analogues are those that are geologically comparable to the system being studied and also have excellent 3D outcrop exposure over an area large enough to capture the scale of heterogeneity required (Clark & Pickering, 1996). Outcrop analogue studies are thus a key way of improving understanding of reservoir facies architecture, geometry, and facies distributions. They have been undertaken both qualitatively and, more recently, quantitatively. Traditional quantitative studies (e.g., Dreyer et al., 1993; Chapin et al., 1994; Bryant & Flint, 1993; Clark & Pickering, 1996; Reynolds, 1999) have focused on the collection of outcrop data to populate inter-well reservoir model areas by stochastic, object-based methods (Floris & Peersmann, 2002). However, it can be difficult to extract usable data from traditional outcrop studies, especially when the data need to be integrated with petroleum engineering databases or visualized in 3D.

Furthermore, outcrops, which represent a topographic cut through solid geology, are 2D, and while rare examples show multiple sections through the solid geology at different orientations, geological expertise is still required to fully understand and interpret the 3D nature of the bodies. Such work may also need geostatistical data manipulation to overcome outcrop orientation and size issues (Geehan & Underwood, 1993; Vissa & Chessa, 2000), but ideally the data should be reconstructed in 3D. Accurate 3D reconstruction is the only way that parameters such as channel sinuosity, connectivity, and continuity of target sandbodies may be defined in 3D. Such parameters are a key control on hydrocarbon production, including sweep efficiency (Pringle et al., 2004a; Larue & Friedmann, 2005). Software for representing geology in 3D is routinely used to model subsurface reservoirs. This paper will show how recent advances in digital data capture techniques aid the interpreting reservoir geologist by providing accurate and quantitative outcrop analogue datasets with which to build, and perhaps modify, the reservoir model.
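The stochastic, object-based population the review refers to can be pictured with a toy sketch: drop channel bodies, with widths and thicknesses drawn from outcrop-derived distributions, into a facies grid until a target net-to-gross is reached. All distributions and parameters below are hypothetical placeholders, not values from the cited studies.

```python
import numpy as np

def object_model(nx=200, nz=50, target_ntg=0.3, seed=1):
    """Toy object-based facies model: a 2D inter-well section (nx cells
    wide, nz cells deep) filled with rectangular 'channel' objects until
    the net-to-gross (fraction of channel cells) reaches target_ntg."""
    rng = np.random.default_rng(seed)
    grid = np.zeros((nz, nx), dtype=int)                 # 0 = background, 1 = channel
    while grid.mean() < target_ntg:
        width = max(2, int(rng.lognormal(mean=2.5, sigma=0.4)))  # placeholder stats
        thickness = max(1, int(rng.normal(3, 1)))
        x, z = rng.integers(0, nx), rng.integers(0, nz)
        grid[z:z + thickness, x:x + width] = 1           # drop one channel body
    return grid

model = object_model()
print(f"net-to-gross: {model.mean():.2f}")
```

Real object-based simulators condition such placements to the wells and draw on geometry statistics (sinuosity, width/thickness ratios, connectivity) of exactly the kind that virtual outcrop models are designed to supply.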
Seismic imaging of basalts at Glyvursnes, Faroe Islands: hunting for future exploration methods in basalt covered areas
Authors U. K. Petersen, M. S. Andersen and R. S. White

Obtaining good sub-basalt seismic images is known to be problematic (Ziolkowski et al., 2003; White et al., 2003). Although the properties of basalts are quite different from those of most sediments, Planke (1999) suggested that seismic energy is transmitted through basalt in much the same way as through sediments, so that the problem of seismic imaging through basalts amounts to the conventional task of separating primary energy from noise, even though the noise, including multiples, may be considerable.

The physical properties of basalt are markedly different from those of the overlying and underlying sediments. Strong reflections due to high impedance contrasts at the top (and bottom) of the basalts lead to significant loss of transmitted seismic energy (Fruehn et al., 2001). Large variations of intrinsic properties along vertical cross-sections of basalt flows have been demonstrated and quantified by analyses of logs from wells penetrating successions of flood basalts (Planke, 1994) and by surface mapping (e.g., Self et al., 1998; Thordarson and Self, 1998). This causes the stratigraphic filtering effect of basalt successions to be more severe than that of sediments (Maresh and White, 2005; Shaw et al., 2004). Lateral variations in the thickness of sediments interbedded between basalt flows, and in the thickness of the upper porous part of basalt flows, have been demonstrated by detailed investigations in exposed flood basalts (Self et al., 1998; Thordarson and Self, 1998). The roughness of inter-beds causes 3D scattering of seismic energy, as demonstrated in studies comparing stratigraphic filtering and the effective quality factor, Q, of basalt successions (e.g., White et al., 2005; Shaw et al., 2005; Shaw et al., 2004).

Taking these problems into consideration, experiments have been performed in the last decade using: longer offsets (both synthetic aperture and longer streamers) to improve the signal-to-noise ratio and NMO resolution and to allow processing of post-critical reflections; larger energy sources to increase the general energy level; low-frequency tuning to allow better penetration through basalts (characterized by low Q values); and shot-by-shot recording of the source signature, together with combination of OBS and seismic reflection data to improve velocity estimates. In one experiment all of the above-mentioned techniques were applied, providing considerable improvements in sub-basalt imaging relative to previous work (Spitzer and White, 2005; White et al., 2005). Another acquisition parameter is the orientation of the seismic line relative to the flow direction of the basalt flows (Reshef et al., 2003). However, poor effective transmission of seismic energy, scattering, strong multiple reflections, multiple mode conversions, and low-pass filtering of the energy that propagates through a layer of stacked basalt flows still hamper routine imaging for petroleum exploration in sedimentary basins covered by basalts (Maresh and White, 2005). This was demonstrated by the UK164/07-01 well, where the base of a basaltic succession was found 700 m deeper than anticipated from interpretation of seismic reflection data (Archer et al., 2005). In order to obtain imaging quality and detail comparable to those obtained in other sedimentary basins, further improvements are necessary.
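The low-pass filtering described above follows from the standard constant-Q amplitude decay, A(f) = A0 exp(-π f t / Q). The sketch below applies that decay in the frequency domain to a broadband spike; the travel time and Q values are illustrative round numbers, not SeiFaBa measurements.

```python
import numpy as np

def constant_q_filter(signal, dt, traveltime, Q):
    """Apply constant-Q attenuation exp(-pi * f * t / Q) to a signal's
    amplitude spectrum (phase dispersion is ignored in this sketch)."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), dt)
    spec *= np.exp(-np.pi * freqs * traveltime / Q)
    return np.fft.irfft(spec, n=len(signal))

# Broadband spike after 1 s in sediments (Q ~ 100) vs basalt (Q ~ 20).
dt, n = 0.002, 512
spike = np.zeros(n)
spike[n // 2] = 1.0
through_sediment = constant_q_filter(spike, dt, traveltime=1.0, Q=100)
through_basalt = constant_q_filter(spike, dt, traveltime=1.0, Q=20)
```

At 40 Hz and one second of travel time, Q = 20 leaves exp(-π·40/20) ≈ 0.002 of the amplitude against ≈ 0.28 for Q = 100, which is why a basalt overburden strips away the high frequencies needed for detailed sub-basalt imaging.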
The SeiFaBa project (Seismic and petrophysical properties of Faroes Basalts), sponsored by the Sindri group, aims to create data-derived models for the propagation of seismic energy in basalt to provide a basis for better sub-basalt imaging. The project comprises the drilling of the Glyvursnes-1 well near Tórshavn in the Faroe Islands (Figure 1), core analysis for intrinsic physical parameters, the recording of VSP and offset-VSP data in the Glyvursnes-1 and Vestmanna-1 wells, and surface seismic wide-angle and reflection data around the Glyvursnes-1 and Vestmanna-1 wells (Japsen et al., 2005). At both sites these investigations of the elastic properties of basalts are made at a number of different scales. In this paper we present surface seismic reflection data from the SeiFaBa experiment at Glyvursnes in the summer of 2003, illustrating that basalts can be imaged effectively using relatively small energy sources (250 g of dynamite; a 2.6-litre airgun cluster) and that stratigraphic details of flood-basalt constructions can be identified and characterized from analysis of the seismic data and then correlated to well data. We also demonstrate how different acquisition and processing techniques influence the effective frequency content of seismic reflection data, and thus the effective propagation through, and imaging of, the basalts.