The ability of 3D full tensor gradient (FTG) technology to predict the base of salt, delineate subsalt structures, detect shallow hazards, and delineate 3D density regimes has been widely reported. The purpose of this article is to explain how 3D FTG data is acquired and processed.
Several unique aspects of 3D FTG enable gradient data to be acquired and processed significantly faster and more cost-effectively than in marine seismic operations. In gradient data acquisition, the gradiometer is a self-contained unit, housed and operated near the waterline of the acquisition vessel.
With no streamers to drag during the collection process, navigation of the vessel is easier, allowing quicker, sharper turns and data collection at speeds of up to 10 knots, compared to seismic at approximately 5 knots. Gradient data can also be acquired around the clock allowing faster surveys.
Because the gradiometer is self-contained and does not need to be deployed, 3D FTG operations require only a small crew, reducing the costs associated with conventional acquisition. Because 3D FTG technology measures minute changes in gravity gradients passively, it is not affected by operations of seismic vessels nearby, nor does it interfere with their operations. Passive measurement also leaves the environment undisturbed.
Unprocessed tensor gravity data displays the "spiky" nature of raw measurements.
The acquisition vessel is about 200 ft long, allowing operations in seas of 6-8 ft for periods of up to three weeks. The three-person acquisition crew has expertise in offshore surveying, electronics, and engineering, and GPS positioning ensures that the actual survey conforms to the pre-designed plan. Ship speed, line spacing, and direction are tailored to the water depth and geologic situation.
Line spacing is usually 1 km by 2 km in deepwater and 1 km by 1 km or less in shallow water. Gradient data is collected by a gravity gradiometer located on the centerline of the vessel. The gradiometer stands about 3 ft high, weighs 500 lb, and is housed in a climate-controlled case mounted in a gimbal ring on a gyro-stabilizing platform. The instrument acquires approximately 400 megabytes of data an hour, including navigation data and acceleration information on the ship's motions. The gradiometer consists of a slowly rotating horizontal carousel that supports three rotating disks known as gravity gradient instruments (GGIs).
The gradiometer contains 12 accelerometers, mounted in orthogonal pairs on each GGI and rotated to minimize calibration error, motion-induced components, and noise. The gradient is measured as the difference in readings between opposing pairs of accelerometers. This differencing cancels platform acceleration so that meaningful high frequencies can be recorded. The 3D FTG system simultaneously measures all nine tensor components. Four components are redundant; five (Txx, Txy, Txz, Tyy, and Tyz) remain as independent physical constraints.
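As an illustration (not Bell Geospace's actual software), the relationship among the nine components can be sketched in a few lines of Python: symmetry supplies the mirrored off-diagonal terms, and the Laplace constraint supplies Tzz from the two other diagonals, so five measured components determine the full tensor. The numeric values here are hypothetical.

```python
import numpy as np

def full_tensor(txx, txy, txz, tyy, tyz):
    """Assemble the full 3x3 gravity gradient tensor from the five
    independent components. Symmetry gives the mirrored off-diagonal
    entries; the Laplace constraint gives Tzz = -(Txx + Tyy)."""
    tzz = -(txx + tyy)
    return np.array([
        [txx, txy, txz],
        [txy, tyy, tyz],
        [txz, tyz, tzz],
    ])

# Hypothetical component values in Eotvos units
T = full_tensor(10.0, 2.0, -3.0, -4.0, 1.5)
assert abs(np.trace(T)) < 1e-12   # traceless by construction
assert np.allclose(T, T.T)        # symmetric by construction
```

The assertions at the end simply confirm the two properties the text describes: the tensor is symmetric, and its diagonal sums to zero.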
Prior to surveying, self-calibration and self-gradient procedures are performed. The results are later used during processing to remove the gradients generated by the vessel itself, which vary with yaw, pitch, roll, the amount of fluid in the fuel and ballast tanks, and the amount of hull below the waterline.
Final processed tensor gravity after compensation and balancing of the dataset produces an interpretable data volume.
The first processing step, high rate post mission compensation (HRPMC), is done on the vessel during the acquisition process. HRPMC performs corrections on data sampled at the highest frequency (1,024 hertz), which reduces the amount of data to be transmitted back to the processing center and provides initial quality control (QC) information used to evaluate system performance.
Processing runs are made on groups of files that contain two hours of data. Usually 12 files with time overlaps are included in one HRPMC run to process 24 hours of data. During this step, predetermined self-gradient corrections are used to remove the effects of the gradients created by the mass of the instrument itself.
Calibration tables are also read during this process to remove the varying effects of the vessel's mass relative to each GGI. Additional calculated corrections are made to compensate for immovable objects near the instrument, large physical changes onboard the vessel, and centripetal accelerations caused by the rotation of each of the three GGIs and the rotation of the carousel. The HRPMC process automatically generates power spectrum density graphs and various log files containing run parameters and error estimates.
Once the HRPMC corrections are completed, the data is decimated to a sampling rate that is appropriate for geologic application, usually 10 seconds in deepwater applications. The plots, log files, and a decimated data set are transmitted to Bell Geospace via satellite and further analyzed as an additional QC method and to produce preliminary maps.
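The decimation step can be sketched as follows. This is a toy block-average stand-in for the production decimation filter (which is not described in the article); only the 1,024 Hz input rate and the 10-second output interval come from the text.

```python
import numpy as np

def decimate_block_mean(samples, fs_in=1024, dt_out=10.0):
    """Reduce a compensated high-rate channel to a geologic sampling
    interval by block averaging -- a simple anti-alias stand-in for
    the production decimation filter."""
    n = int(fs_in * dt_out)                  # input samples per output point
    usable = (len(samples) // n) * n         # drop any ragged tail
    return samples[:usable].reshape(-1, n).mean(axis=1)

x = np.ones(1024 * 30)                       # 30 s of synthetic data at 1,024 Hz
y = decimate_block_mean(x)                   # one output value per 10 s -> 3 values
```

Averaging 10,240 raw samples into each output point illustrates why the decimated data set transmitted to shore is so much smaller than the raw record.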
Next, the high rate compensated data is converted from binary to 24-column ASCII, and the results of the 2-hour runs are merged into one file containing the entire survey. At this point, the data is in one contiguous, time-ordered file, ready for input into the low rate process.
During the low rate post mission compensation (LRPMC) process, lower frequency errors, mainly relating to acceleration caused by horizontal vessel motions and instrument drift, are corrected. The resulting data provides highly accurate position, heading, and speed measurements that are used to calculate the Eötvös correction for the gravity errors caused by low rate vessel motions. Errors inherent to each GGI, primarily low frequency non-linear drift, are then analyzed and removed.
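For readers unfamiliar with the Eötvös correction, the textbook form for a moving platform can be sketched in Python. This is the classical formula from gravity surveying texts, not Bell Geospace's proprietary routine; the coefficients assume speed in knots and give the correction in milligals.

```python
import math

def eotvos_mgal(speed_knots, heading_deg, lat_deg):
    """Classical Eotvos correction (mGal) for a platform moving at
    speed_knots on heading_deg (degrees east of north) at latitude
    lat_deg. Textbook coefficients: 7.503 and 0.004154."""
    alpha = math.radians(heading_deg)
    phi = math.radians(lat_deg)
    return (7.503 * speed_knots * math.cos(phi) * math.sin(alpha)
            + 0.004154 * speed_knots ** 2)

# Worst case for this survey speed: eastbound at 10 knots on the equator
e_max = eotvos_mgal(10.0, 90.0, 0.0)   # about 75.4 mGal
```

At the 10-knot survey speed quoted earlier, the correction reaches tens of milligals, which is why accurate heading and speed measurements matter so much at this stage.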
The tensor components are mathematically related to each other, as specified in part by the Laplace equation reduced for this application: Tzz + Txx + Tyy = 0. Complicated leveling procedures must be used to keep all components leveled both internally and relative to one another so that this relationship is maintained.
This process involves smoothing and segmenting the data into short, equal length, segments joined at "knot" points. Data values are examined at every survey line intersection and pro-rated corrections are applied at knot points until the differences at line intersections are minimized. The result is a set of low frequency, non-linear corrections which are applied to the unsmoothed tensors, producing a tensor set that can be mapped.
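The pro-rating between knot points can be illustrated with a toy example. The production scheme solves for the knot values by minimizing misties at line intersections; here the knot corrections are simply assumed, and linear interpolation stands in for the pro-rating.

```python
import numpy as np

def prorate_correction(t, knot_times, knot_values):
    """Pro-rate low-frequency corrections between knot points by
    linear interpolation, giving a non-linear correction curve that
    can be subtracted from the unsmoothed channel. (Toy stand-in:
    the knot values here are assumed, not solved for.)"""
    return np.interp(t, knot_times, knot_values)

t = np.arange(0.0, 100.0, 10.0)          # sample times along a segment
knots = [0.0, 50.0, 100.0]               # knot points joining segments
corr = [0.0, -2.0, 1.0]                  # hypothetical corrections at knots
drift_fix = prorate_correction(t, knots, corr)
```

Between knots the correction varies linearly, so adjusting a knot value shifts the two adjacent segments smoothly rather than introducing a step.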
At this point, high frequency misties remaining due to varying noise content from line to line are corrected, vessel survey turns are removed and the data is cut into separate survey lines. Each tensor is examined for spikes and noise content and appropriate low pass filters are selected.
Filtering may reintroduce small misties at line intersections, but these are easily corrected with statistical line leveling. To apply statistical line leveling to tensor data, some recalculation is required to preserve the mathematical relationship between components. First the difference Txx-Tyy is leveled separately; then Txx, Tyy, and Tzz are recalculated from the leveled Txx-Tyy. When this leveling is complete, the dataset is ready for equivalent source gridding, the last step in the processing sequence.
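One self-consistent reading of this recalculation can be sketched as follows, assuming a leveled Tzz is the other input (the article does not spell this out). Given the leveled difference D = Txx - Tyy, the Laplace constraint Txx + Tyy = -Tzz then determines both diagonals, so the trace stays exactly zero after leveling.

```python
def diagonals_from_difference(d_leveled, tzz_leveled):
    """Recover Txx and Tyy from the leveled difference D = Txx - Tyy
    and a leveled Tzz, using the Laplace constraint
    Txx + Tyy = -Tzz. A sketch of one self-consistent recalculation,
    not necessarily the production algorithm."""
    txx = (d_leveled - tzz_leveled) / 2.0
    tyy = (-d_leveled - tzz_leveled) / 2.0
    return txx, tyy

# Hypothetical leveled values in Eotvos units
txx, tyy = diagonals_from_difference(4.0, -6.0)
```

Whatever leveling shifts are applied to D and Tzz, the recomputed Txx and Tyy automatically satisfy Tzz + Txx + Tyy = 0, which is the point of the recalculation.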
Equivalent source gridding (ESG) is used to eliminate spatial wavelengths that cannot be related to geology. Unlike 2D line filtering, ESG simultaneously considers all tensor components on each line to distinguish noise from geology. ESG uses the minimum distance to a geologic horizon (the water depth) to eliminate frequencies too high to be related to anything in the subsurface. ESG corrects each tensor component along with the gravity so that only geologic responses remain in the dataset, and the proper mathematical relationship between the components is preserved.
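The idea behind equivalent source methods can be illustrated with a heavily simplified 1D sketch: fit fictitious sources buried at the water depth to the observed profile by damped least squares, then forward-compute the field from those sources. Because a source at depth d cannot produce wavelengths much shorter than d, the re-computed field is automatically stripped of sub-geologic frequencies. Unit constants and a single scalar channel are used here for clarity; the production ESG operates on all tensor components at once.

```python
import numpy as np

def kernel(x_obs, x_src, depth):
    """Vertical-field response (unit constants) at surface points x_obs
    from unit point sources buried at x_src at the given depth."""
    dx = x_obs[:, None] - x_src[None, :]
    return depth / (dx ** 2 + depth ** 2) ** 1.5

def esg_profile(x_obs, g_obs, x_src, depth, x_out, damping=1e-6):
    """Fit equivalent source strengths by damped least squares, then
    forward-compute the field at x_out. Wavelengths shorter than the
    source depth (standing in for water depth) cannot be reproduced
    and are thereby filtered out."""
    A = kernel(x_obs, x_src, depth)
    m = np.linalg.solve(A.T @ A + damping * np.eye(A.shape[1]), A.T @ g_obs)
    return kernel(x_out, x_src, depth) @ m

x = np.linspace(-5.0, 5.0, 41)
g_true = kernel(x, np.array([0.0]), 2.0)[:, 0]   # field of one buried source
noisy = g_true + 0.001 * np.sin(40.0 * x)        # short-wavelength noise
smooth = esg_profile(x, noisy, x[::2], 2.0, x)   # noise largely rejected
```

The short-wavelength noise cannot be matched by any combination of sources two units deep, so it falls out of the least-squares fit while the geologic-scale signal survives.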
Processing a typical survey usually requires 2-3 days for each day of acquisition, after which the data is passed to the client or to the Bell Geospace geoscience department for optimization with the seismic data.
Acquiring and processing gradient data is an environmentally neutral, cost-effective, and efficient means of optimizing data to predict the base of salt, delineate subsalt structures, detect shallow hazards, and delineate 3D density regimes.