Geophysical instrumentation no longer the limiting constraint

Oct. 1, 2000
History suggests need for applications

"People will not look forward to prosperity who never look backward to their ancestors." Edmund Burke (The Revolution in France, 1790).


Breakthroughs in geophysical technology and techniques have significantly affected our business in the last 20 years. Reliance on geophysical methods to find and develop oil and gas fields has rapidly increased.

Geophysics is defined as "the study of the earth by quantitative physical methods, especially by seismic reflection and refraction, gravity, magnetic, electrical, and radiation methods." This study dates back to at least 136 AD, when the Chinese scientist Choko designed a seismograph, using an earthen jug and balls to detect the foci of earthquakes.

However, the commercial use of geophysical methods did not emerge until the early 20th century. Throughout that century, breakthroughs and technological advances improved geophysical acquisition, processing, and visualization. Sometimes breakthroughs were accepted immediately; at other times years passed before general acceptance. Trends are best seen in hindsight, and past trends provide valuable perspective and insight for the early 21st century. As Edmund Burke suggested in 1790, future prosperity is built upon knowledge of what happened before.

Beginnings (1910-1945)

The 1910s and 1920s saw the development of devices and techniques that founded the geophysical industry. Reginald A. Fessenden, one of the most prolific inventors, demonstrated in 1912 that his submarine telegraph apparatus was capable of detecting icebergs. Recognizing the potential of this device for locating subsurface bodies, he filed for a patent in 1914 on his original fathometer, the first clear use of the reflection method.

In 1917, he filed for and received a patent for his "Method and Apparatus for Locating Ore Bodies," which used explosive charges as sources and microphones or oscillators suspended in water-filled holes as receivers. Originally designed to locate ore veins for mining, this method combined reflection and refraction techniques and was later used successfully in petroleum exploration.

Several other inventors were also active, among them Dr. John C. Karcher, whose first patent filings came as early as 1919, while he was a graduate student at the University of Pennsylvania. Later, while conducting fieldwork in Oklahoma, Karcher recorded the first exploration reflection seismograph in 1921. He helped form the first seismic company, Geological Engineering Company, and in 1925 became Vice President and General Manager of Amerada's newly formed Geophysical Research Corporation (GRC).

GRC successfully used Fessenden's methods and devices in 1925 and subsequently entered into an agreement for exclusive use. Fessenden's widow noted in 1940 that the agreement was still providing "eminently satisfactory financial returns to the corporation as well as to us."

Refraction and gravity methods also progressed. Refraction techniques, first developed by Mallet in 1845, were steadily refined. Dr. L. Mintrop applied for German patents in 1919 on a refraction seismic method, with which he located a salt dome near Hannover in 1920. His methods migrated to the US and were used to discover several salt dome-related fields, such as Orchard Dome in Texas in 1924.

In the same period, gravity methods developed around the torsion balance, which measures changes in the earth's gravitational field. This tool was the primary method used to identify several salt domes along the US Gulf Coast, such as Nash Dome in 1924.

Subsurface geophysical methods arose as well. Starting with subsoil prospecting methods in 1919, Conrad and Marcel Schlumberger developed downhole instruments to measure the electrical properties of formations and tested them in 1928 in the Pechelbronn field in France. In 1932, they developed the measurement of spontaneous potential (SP).

Electrical logging became a standard subsurface tool. By 1930, the reflection method had gained dominance over the refraction method. During the next decade, advances such as automatic gain control, the use of multiple geophones per group, and reproducible recordings appeared. By 1940, reflection recording systems generally had up to 12 channels, with six or more geophones per channel. In 1941, 24-channel systems appeared, and the industry remained at this level for 30 years. Static corrections and several different recording configurations were common. By the end of World War II, large-scale marine surveying had appeared.

Post-WWII (1945-1959)

The post-war industry quickly began applying and improving technology developed for the wartime effort. Originally developed to detect wartime enemies, radar facilitated offshore survey positioning. Supersonic depth recorders were used for seafloor mapping. Miniaturization of equipment proceeded rapidly: a seismograph amplifier that weighed about 8 lb in 1949 dropped to less than 3 lb by 1950.

Several companies simultaneously developed a significant technique still in common use today. In 1950, W. Harry Mayne of Petty Geophysical Engineering Company filed the original patent application for the common-depth-point (CDP) method; the patent was issued in 1955. At least two oil companies also developed similar ideas independently and refined them for proprietary use. Different companies used different terms, such as common-midpoint (CMP) and horizontal stacking, which are now often used interchangeably with CDP.
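
To illustrate the terminology, the following minimal sketch shows the idea behind horizontal stacking: traces that share a common midpoint are averaged after moveout correction, so coherent reflections reinforce one another while random noise is suppressed roughly in proportion to the square root of the fold. The gather size, fold, and noise values here are purely illustrative assumptions, not taken from the patent or from this article.

```python
import numpy as np

# Minimal sketch of CMP/CDP "horizontal stacking" (illustrative values only).
# Assumes the traces in the gather have already been moveout-corrected, so the
# reflection arrives at the same time sample on every trace.
rng = np.random.default_rng(0)
n_traces, n_samples = 24, 500             # e.g., 24-fold coverage

signal = np.zeros(n_samples)
signal[200:205] = 1.0                      # a crude spike standing in for a reflection
gather = signal + rng.normal(0.0, 0.5, size=(n_traces, n_samples))

# Stacking = averaging the corrected traces in the gather.
stack = gather.mean(axis=0)

# Random noise drops roughly as 1/sqrt(fold); the reflection amplitude is preserved.
print("noise std before stack:", round(gather[:, :150].std(), 3))
print("noise std after stack: ", round(stack[:150].std(), 3))
```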

The era of magnetic tape recording began with analog recording appearing by 1952 and moveable magnetic heads by 1955. Although paper sections appeared in the 1930s, seismic record sections became widespread only after magnetic tape recording.

Vibroseis and weight-dropping were introduced as onshore sources. A number of playback systems were advertised as analog processing advanced. Papers on synthetic seismograms first appeared in Geophysics in 1955. Solid-state electronics led to the development of all-transistorized seismograph systems.

Seismic crew activity in the US peaked in 1952 at about 660 crews. Afterward, it declined by about 30 crews per year until 1962, then by 15 crews per year, reaching around 200 crews in 1970. From 1952 to 1958, most of the decline was balanced by an increase in overseas work, but overall the seismic industry suffered a steady decrease in activity, with contractors leasing geophones for as little as $0.75 per month in 1960. Meanwhile, the posted price for WTI oil was $2.90 per barrel in 1959.

The 1960s

Acquisition advances, such as the common-depth-point (CDP) method and Vibroseis sources, were commonly available by 1960. By 1965, onshore non-explosive sources were in regular use, with marine Vibroseis and air guns appearing by 1969. Offshore seismic streamers were improved, and satellite navigation assisted positioning. By the end of the decade, standard subsurface coverage had increased from 12-fold to 24-fold.

The computer industry flourished, fed by technology developed for the space race. Increasing computing power facilitated new seismic processing methods, including deconvolution and automatic migration. Digital recording systems for land and marine work appeared, although both analog and digital systems were used for some time.

As the seismic crew count continued to decline, contractors increased "speculative" surveys as a way to keep crews active. The contractors acquired and processed seismic data in anticipation of selling it to many companies, which benefited by acquiring the data at much lower cost than for conventional exclusive surveys. During the 1950s and 1960s, the price of oil remained at $2.50-3.00/bbl.

1970-1999

Like all industries, the seismic business saw ever-increasing returns from improvements in technology. Amplitude processing for "bright spots" appeared. New radio telemetry systems and at-the-geophone digitizing aided acquisition in remote areas. Increased recording and processing capabilities provided the foundation for early 3D work, with time slices appearing as a display and interpretation tool. In fact, a 1971 advertisement for an interactive interpretation system foretold the workstation systems that would develop in the next decade.

Publications on refraction work for subsalt exploration appeared in 1970. VSP (vertical seismic profiling) and shear-wave surveys increased. Seismic stratigraphy appeared in the mid-1970s. Other geophysical methods saw dramatic increases in instrument sensitivity; a number of new tools, for example, provided detailed pictures of the seafloor and shallow subsea sediments.

Then the oil embargo shocked the world in late 1973. Posted oil prices rose from $3.01/bbl to $5.12/bbl within two weeks in mid-October and then to $11.65/bbl by mid-December. After decades of relatively constant prices, oil prices nearly quadrupled over the course of two months.

The seismic crew count in the US had already begun rising, from about 200 in 1970 to a peak of around 690 in 1981. Over the next five years, the domestic count fell rapidly, to 210 crews in 1986, and it continued to decline to about 80 US crews in 1993, when the crew count outside the US was about 240. Since 1993, both the domestic and foreign counts have slowly increased.

By the mid-1980s, several migration techniques were used, including wave-equation migration, pre-stack time migration, and post-stack and pre-stack depth migration. AVO (amplitude versus offset) analysis augmented "bright spot" technology. The new display techniques included "movies" of vertical and horizontal slices through 3D seismic volumes and the use of colors for amplitude, velocity, and frequency sections. Interpretation workstations and sequence stratigraphy significantly impacted the application of 3D seismic in exploration and development. Personal computers (PCs) became affordable, providing geoscientists a quicker means to perform basic computations.

In the 1990s, coherency processing and subsalt imaging became popular. 3D seismic became the standard for selling prospects. Companies experimented with 4D (multiple 3D surveys acquired through time) and 4C (four-component) acquisition. Interpretation workstations became common. The concept of using cross-disciplinary teams gained widespread acceptance, and these teams required integrated databases. Voxels, the 3D equivalent of pixels, were developed to aid 3D interpretation. PCs became increasingly powerful and less expensive, and specialty software applications proliferated. Advances in operating systems and hardware enabled massive computational capability.
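
As a rough illustration of what a voxel volume is and how the time slices mentioned earlier relate to it, the sketch below treats a 3D seismic survey as a simple array of voxels. The dimensions and values are assumptions for demonstration only, not figures from this article.

```python
import numpy as np

# Illustrative voxel representation of a 3D seismic amplitude volume.
# Axes: inline, crossline, two-way-time sample (all sizes are assumed).
rng = np.random.default_rng(1)
volume = rng.normal(size=(200, 150, 1000)).astype(np.float32)

# A "time slice" is all voxels at one time sample...
time_slice = volume[:, :, 400]        # shape (200, 150)

# ...while a vertical section is all voxels along one inline.
inline_section = volume[50, :, :]     # shape (150, 1000)

print(time_slice.shape, inline_section.shape)
```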

New emphasis was placed on the hardware, software, and methodology for visualization. Teams, buried by the enormous amounts of data available, desperately needed new methods to see and understand the information they had to interpret. Computers generate more data than humans can process with the established visual displays of the late 20th century. Borrowing from the film and computer-game industries, 3D visualization and immersion technology quickly entered the industry.

Several companies invested significant research in displaying data for human consumption and integration. In the late 1990s, Texaco led the industry trend to develop immersion facilities, providing exploration and development teams with the ability to integrate information and to visualize in 3D. By 2000, there are several commercially available immersion facilities worldwide. Ever lower-cost alternatives are quickly developing and will soon be affordable even to the smallest companies. Companies now experiment in virtual reality centers, allowing the interpreter to "step into" the data.

2000: looking forward

This review emphasizes two major trends for the geophysical industry in recent years: the advancing importance of computer power and the decrease in independent research. In 1985, Robert Sheriff noted the following in his review of geophysical technology, as revealed in Geophysics:

"Throughout most of the fifty years geophysical ideas were ahead of the instrumentation needed to permit their efficient execution ... Perhaps instrumentation is no longer the limiting constraint, and this may pose a challenge to us to generate application ideas."

The current rate of growth in computer power is overwhelming, challenging us to find ways to effectively use the technology available. The hardware capabilities are increasing almost exponentially, with concurrent decreases in cost and physical size. New programming languages and techniques facilitate rapid software development.

The technology and display methods that enable teams to easily integrate and visualize all available data - geophysical, geological, engineering, petrophysical, cultural data, and infrastructure - are the "cutting edge" of today and will be a defining advantage for future prosperity.

So we are now challenged, rather than limited, by the instrumentation. One lingering question, though, is "Who dreams for the future?" If we can dream it, we can have it.

About the author

Sandi Barber is President of Barber & Associates, consultants in workstation interpretation, training, and support. Barber can be contacted at 713-723-1480.