What is the current state of the oil industry and the geoscientist's place in it? According to the recent Cambridge Energy Research Associates conference, CERAweek, the industry faces an uncertain future. Venezuelan market dislocations and Iraqi tensions are keeping crude oil prices high, allowing the oil companies to freshen their balance sheets and produce a return for their shareholders. There is plenty of oil available to meet the world's needs with one caveat: reserve levels and quick accessibility are diminishing.
Long-term oil supply is the looming problem. Presently, the world economy is using 76-78 MMb/d of crude oil. This demand is expected to increase to 90-110 MMb/d over the next 20 years. When added to the 1.6 Bbbl/yr that must be replaced due to depletion, the industry faces a major challenge.
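The scale of that challenge is easier to see with a little arithmetic. The sketch below uses only the figures quoted above (midpoints of the quoted demand ranges) and assumes a 365-day year for the unit conversion:

```python
# Back-of-the-envelope supply math from the figures quoted above.
# Assumes a 365-day year; demand midpoints are taken from the quoted ranges.

current_demand = 77.0        # MMb/d (midpoint of 76-78)
future_demand = 100.0        # MMb/d (midpoint of 90-110, ~20 years out)
depletion_per_year = 1.6e9   # bbl/yr that must be replaced due to depletion

# Depletion expressed as a daily rate, in millions of barrels per day
depletion_mmbd = depletion_per_year / 365 / 1e6   # roughly 4.4 MMb/d

# Demand growth the industry must also cover over the next two decades
demand_growth = future_demand - current_demand    # 23 MMb/d

print(f"Depletion replacement: {depletion_mmbd:.1f} MMb/d, every year")
print(f"Demand growth to cover: {demand_growth:.0f} MMb/d over 20 years")
```

In other words, depletion alone silently erases the equivalent of a major producing region's output every year, before a single barrel of new demand is met.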
Natural gas demand is projected to grow from 90 tcf/yr to 135 tcf/yr over the next 20 years. Gas sources are plentiful, but infrastructure will need to expand. Large reserves exist in Russia and the Middle East, but major investments in pipelines and LNG facilities are needed to bring them to market.
Where does the geoscientist fit in this scenario? New reserves must be found, new fields developed, and mature fields redeveloped. A 30% recovery factor from older fields is inadequate given the world's expanding energy needs.
Geoscientists' productivity has improved through computer-assisted prospecting. But this approach has reached the limit of its current ability to deliver "quality" prospects: major reserves at a finding cost of under $1/bbl or a hurdle rate of $12-15/bbl. The industry needs to rebuild the professional ranks, transfer oil-finding knowledge, reach farther into new areas, and adjust hurdle rates upward to meet the world's future demand.
Meanwhile, alternative sources are being studied. New research to manufacture fuels, using genetically engineered microbes producing hydrogen or methane, is in the very early stages, but hydrogen/methane generation cannot meet the world's demand in the next 10-20 years. Only new petroleum sources can meet that demand, and only geoscientists can locate the needed reserves.
Industry's newest tools
The daunting task ahead of geoscientists – demand expanding to 110 MMb/d and 135 tcf/yr, with depletion replacement needs of 1.6 Bbbl/yr – cries for attention. More people and better tools are needed to address the challenge. Training a new geoscientist requires eight to 10 years, and new technologies take at least that long to develop. Fortunately, the industry has been steadier with its tool investments than it has with its people.
The state of the art was examined at the recent 2003 Multi-Component Symposium sponsored by Veritas DGC and Input/Output. Multi-component seismic (4C), new processing techniques, and enhanced interpretation software to tease more detail from older fields are now available to meet the challenge.
Multi-component systems are now ready to implement the next advance in marine seismic. This cut-away of the I/O VectorSeis ocean bottom sensor shows the internal damping members that isolate the sensor case (finned box) from the towing cable.
Extracting the remaining oil in older fields requires more detailed information than that needed for discovery. Seismic acquisition expanded from 2D to 3D, increasing the interpreter's information by an order of magnitude. The move to 3D/4C will expand that by at least two additional orders of magnitude due to nine-component data vectors plus the pressure vector.
The key to interpretation is linking compressional (P-wave) and shear (S-wave) data. New software developed by Hampson-Russell, supported by an industry consortium, ties these data sets together to derive S-wave velocities. This combination creates a host of new interpretation possibilities. Multi-component data can readily:
- Interpret through gas clouds
- Define salt domes and subsalt plays
- Image low contrast "P-wave transparent" reservoirs
- Validate amplitude versus offset indicators
- Predict lithologic "sweet spots"
- Improve shallow zone resolution
- Identify fracture orientation and density.
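The P-to-S tie that makes these applications possible rests on a standard converted-wave relation: for the same interval, the Vp/Vs ratio (gamma) follows directly from the registered PP and PS travel times. The sketch below illustrates that relation only; it is not the consortium software, and the horizon times and velocity are hypothetical values chosen for illustration:

```python
# Standard converted-wave relation (illustration only, not the actual
# consortium software): for a given interval,
#     gamma = Vp/Vs = 2 * (t_ps / t_pp) - 1
# where t_pp and t_ps are the PP and PS travel times for the same interval.

def vp_vs_ratio(t_pp: float, t_ps: float) -> float:
    """Estimate interval Vp/Vs from registered PP and PS event times."""
    return 2.0 * (t_ps / t_pp) - 1.0

# Hypothetical horizon picks: the same interval seen at 1.0 s on the
# PP section and 1.6 s on the PS section
gamma = vp_vs_ratio(t_pp=1.0, t_ps=1.6)
print(f"Vp/Vs = {gamma:.2f}")          # 2.20

# With an interval P velocity from conventional processing (hypothetical
# value below), the S-wave velocity follows directly
vp = 3000.0                 # m/s
vs = vp / gamma
print(f"Vs = {vs:.0f} m/s")            # 1364 m/s
```

Because gamma varies with lithology and fluid content, mapping it across a survey is what drives applications such as sweet-spot prediction and imaging through gas clouds, where P-waves alone are attenuated but S-waves pass through.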
High-resolution aeromagnetic data has been used to map basement structure and salt diapirs and is a valuable tool for deep gas and subsalt plays. Fugro Airborne Surveys recently acquired a high-resolution aeromagnetic survey over a large portion of the Louisiana shelf in the Gulf of Mexico.
Fugro collected a high-resolution aeromagnetic survey in the Gulf of Mexico for basement structure, salt diapir, and subsalt mapping.
The data was acquired on a 0.5-mi x 1-mi grid at a survey altitude of 500 ft to complement Fugro's existing data on the shelf, slope, and deepwater portions of the Gulf. Approximately 30,000 line mi of magnetic profiles are now available.
Compagnie Générale de Géophysique opened a regional computing hub at CGGAP in Kuala Lumpur, Malaysia. The new hub will provide the data processing power for both the Kuala Lumpur (CGGAP) and Perth (CGG Australia Pty Ltd.) data processing centers using Linux PC clusters with the latest release of CGG's Geocluster seismic data processing package.
The hub follows the same model used in the CGG Group's London and Houston regional centers: huge Linux PC clusters and fast links to the data processing centers. PC clusters will produce 3D Kirchhoff pre-stack time and depth migration on a routine basis.
Fakespace Systems and Xi Graphics released a new software driver that enables stereoscopic immersive environments to run on Linux. Fakespace visualization systems that incorporate digital projection technology can now run Linux-based applications using off-the-shelf graphics cards.
The driver is based on an open, text-based customization scheme using XiG's Accelerated-X driver software. Any stereoscopic-capable Linux application running on supported graphics cards can display in both active and passive stereo modes with digital projection technology.