DOT 2010: New ways to process seismic data in salt in development
HOUSTON – The knotty problem of getting dependable seismic images under the Gulf of Mexico salt formations in deepwater requires new data processing approaches, Dr. Arthur B. Weglein of the University of Houston told the opening plenary session of DOT on Tuesday morning.
Weglein said that seismic challenges and failures (i.e., blow-outs and/or dry-hole drilling) occur when the assumptions behind seismic processing and imaging methods are violated. There are three different types of assumptions:
1. Data acquisition
2. Computing power
3. Innate algorithmic assumptions.
The first two challenges are addressed by such advances as wide-azimuth data acquisition and ever-increasing computer capabilities. It is the third challenge, the innate algorithmic assumptions required to process data, that is the “secret problem nobody talks about in polite society.” This third type of assumption or prerequisite is not caused by limited data collection or limited compute capability, and hence is not addressed by responding to those two important and useful issues alone. A comprehensive response to deepwater GoM challenges must begin by recognizing these three distinct issues as residing behind drilling failures. Defining the challenge in terms of the first two issues alone will be useful but will not lead to a comprehensive and fully effective response. Only a frank and forthright problem statement has any hope of a solution, Weglein said.
He went on to say that there are two ways to overcome violated assumptions and make seismic data processing more effective: (1) remove the assumption violation by finding a way to satisfy the assumption, e.g., by meeting the underlying need for data acquisition, compute power, or a velocity model; and (2) derive a new method that avoids the assumption and does not require it to start with. Each of these two approaches is appropriate for different issues under different circumstances. For data collection and compute power, the aim should be to satisfy the assumption and requirement. For a velocity model and imaging tools, there are cases where it is possible to improve and satisfy the assumptions, and other cases where it is beyond our capability to match the needs of a highly complex and rapidly varying subsurface or reflector, e.g., in imaging a highly rugose top salt.
Through the joint industry project of the Mission-Oriented Seismic Research Program, Weglein is addressing this question of new processing algorithms by developing a method to perform depth imaging directly and without a velocity model. The new imaging method is an extension of earlier methods that he and his colleagues pioneered to separate primaries from multiples without subsurface information. “The key to processing seismic data without subsurface information is for the data to get more involved in helping to achieve our processing goals. Our job is to provide seismic data with a guide and template for that involvement and cooperation.” Weglein illustrated the theory behind the new method as it applies to fault shadow zones and subsalt imaging, and said that a deepwater test of the approach is scheduled for this year. He expressed his gratitude to his sponsors for the chance to fill the gap between useful and comprehensive, and thanked Eldon Ball, Gail Killough, and the DOT organizers for the invitation to speak in the Plenary Session of DOT 2010.