MANAGEMENT & ECONOMICS - From exploration to enhanced production: obtaining seismic services 'by the drink'

Mimicking cell telephones, bar coding

Each of these libraries holds up to 6,000 high-density media tapes with seismic data. Using the latest technology in secure data storage and handling, new seismic data management solutions increase profitability by reducing the cost of finding and using data.

The use of seismic data has become pervasive throughout the oil and gas exploration and production business. While seismic data began life as an exploration tool, it is now used extensively in the development and production phases of a field's life.

During exploration, seismic determines the location of reservoirs and aids in well placement. In development, seismic assists in mapping reservoir attributes from the producing wells to other prospective locations. During production, seismic provides a baseline that shows changes in reservoir attributes for 4D monitoring to enhance recovery.

Advanced classification processes, such as multivariate statistics, assist in reservoir characterization and block evaluation for acquisition or divestment. However, the very nature of this expanded role, coupled with the large data volumes generated, is creating a challenge for geoscientists and oil and gas information technologists. The need to manage this data, and to capture the knowledge and information related to it, is therefore as pervasive as the data itself.

Traditional seismic data management tools provided mechanisms to manage the large data volumes, but did little to ease the burden of moving, loading, tracking, and manipulating those volumes.

Until recently, obtaining seismic data from the data providers was a purely manual process, where a request was made for certain data. Tapes were generated and then shipped via courier to the requesting customer. Today, some companies request that their data be delivered electronically, directly from the processing systems to the interpretation support systems.

While this procedure works for most post-stack seismic data volumes, it requires a substantially sized telecommunications pipe to support such transfers in a reasonable time span. While these larger data pipes have become increasingly cost-effective and prevalent in the developed world, they are still expensive and scarce in less developed locations.
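As a rough illustration of the time spans involved (the volume sizes, line rates, and utilization figure below are assumptions chosen for the example, not figures from any particular survey or network), transfer time can be estimated as follows:

```python
# Rough transfer-time estimate for shipping a seismic volume over a network link.
# Volume sizes, link speeds, and link utilization are illustrative assumptions only.

def transfer_days(volume_gb: float, link_mbps: float, efficiency: float = 0.7) -> float:
    """Days needed to move volume_gb over a link_mbps line at the given utilization."""
    bits = volume_gb * 8e9                          # gigabytes -> bits
    seconds = bits / (link_mbps * 1e6 * efficiency)
    return seconds / 86400.0

# A modest post-stack volume versus a large pre-stack volume
for label, gb in [("post-stack, 100 GB", 100), ("pre-stack, 5,000 GB", 5000)]:
    for mbps in (2, 45, 155):                       # e.g. E1-, T3-, and OC-3-class links
        print(f"{label} over {mbps} Mbps: {transfer_days(gb, mbps):,.1f} days")
```

Even with generous assumptions, a multi-terabyte pre-stack volume ties up a leased line for days, which is what drives the argument below for accessing data in place rather than shipping it.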

Unfortunately, as data volumes increase and more pre-stack data is used, even these large pipes will have difficulty transmitting the larger volumes. Addressing these constraints requires more than an incremental change; it requires a step change in the methodologies used in seismic data management. Thus, we see a need to move from a model of request and transmission to a model of access as required.

New model needed

The obvious next step is for seismic data to be utilized under an access model whereby data would not be physically transmitted. Instead, the application and data are served together using a thin-client methodology. Only with co-location of data and application, plus adoption of and adherence to standards, can such a model, encompassing life cycle management of seismic data, be achieved.
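A minimal sketch of what such an access interface could look like appears below; the class, method, and endpoint names are hypothetical illustrations, not an existing provider API. The point is that the application holds a lightweight handle to provider-hosted data and requests only the subset it needs.

```python
# Hypothetical sketch of an "access as required" interface: the data stays with
# the provider, and the client works through a lightweight remote handle instead
# of receiving tapes or bulk file transfers. All names here are illustrative.

class RemoteSeismicVolume:
    """Thin-client handle to a survey hosted at a data service provider (DSP)."""

    def __init__(self, dsp_url: str, survey_id: str, license_token: str):
        self.dsp_url = dsp_url                # provider endpoint; nothing is copied locally
        self.survey_id = survey_id
        self.license_token = license_token    # ties each access back to the usage plan

    def read_traces(self, inline_range, crossline_range):
        # A real service would issue an authenticated request and stream back only
        # the requested subset; here the call is simply described.
        return (f"traces {inline_range} x {crossline_range} "
                f"of {self.survey_id} via {self.dsp_url}")

# The interpretation application sees a volume object, not a local file copy.
volume = RemoteSeismicVolume("https://dsp.example.com", "NS-2001-3D", "demo-token")
print(volume.read_traces((1200, 1300), (400, 450)))
```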

As most professionals in the geophysical industry understand, connectivity to the data service providers (DSPs) is of key importance to clients and application providers alike. While this methodology would suggest a centralized data repository, the utopian dream of a single data location is not likely. That approach, while good in theory, does not address the concerns of the data owners, whose reliance on that data and its related services is their lifeblood.

Therefore, a group of large vendor-managed data locations, positioned on a high-speed, secure network fabric, will most likely arise. Embracing this change requires convergence toward better standards and closer working relationships between the data providers, application and service providers, and the oil companies themselves.

Licensing changes

In terms of multi-client data, changes to the licensing schemes will most likely be required to match the new access model. Pressure to change these models has begun creeping into the marketplace, but little has taken place thus far.

The slow uptake is a defense mechanism against the pricing pressures that have been put on contractors over the past several years for speculative surveys. Contractors cannot afford a model that could further reduce the already strained revenue generated from speculative surveys. Therefore, until these apprehensions abate, new models for data usage will be difficult to implement.

However, the model that might be adopted is much like that of cell phone service plans, whereby the user selects and pays for a minimum block of access and levels of associated services. In one possible model, data users would pay for access to a block of data, applications, data management services, and other service offerings.

These monies would fund pre-commitment requirements so the survey could be shot, processed, and made available for access. If the user requires access to the data beyond the allotted amount, he or she pays additional fees at an increased rate per "access unit."
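As a sketch of how such a plan might be billed (the block size, block fee, and overage rate below are invented for the illustration), the charge for a period could be computed along these lines:

```python
# Illustrative "cell phone style" charge calculation for seismic data access.
# All plan parameters (block size, fees, overage rate) are assumed values.

def period_charge(units_used: int,
                  included_units: int = 500,     # access units in the prepaid block
                  block_fee: float = 50_000.0,   # fee for the block plus bundled services
                  overage_rate: float = 150.0):  # increased per-unit rate beyond the block
    """Return the total charge: the flat block fee plus any overage units."""
    overage_units = max(0, units_used - included_units)
    return block_fee + overage_units * overage_rate

print(period_charge(420))   # within the block: block fee only
print(period_charge(650))   # 150 units over: block fee plus overage charges
```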

Such access plans are customizable to meet the needs of the data provider, user, or survey, and might include data access, application and data services, storage services, and other offerings. The access method gives data owners more control over their data, including increased data security, while giving users increased access to data on demand. The access model also opens the door to new collaborative services that until now have not been possible.

Collaboration model

The collaboration model would shift to a more interactive services form, where the geoscientist will access not only data and applications, but also expertise. Before the availability of these services, using applications on a "per drink" basis was not possible, as the training and experience curve for a technical application was generally long and steep. Under the new assisted-service model, the user will be able to truly use applications "by the drink," as support will be available to run or pilot the applications for the user.

The proposed model requires a system that controls, tracks, and manages the seismic data and is closely interfaced with the acquirers, processors, and interpreters of that data. This will become even more critical as near-real-time seismic monitoring of production becomes more common and the line between processing and interpretation becomes less distinct.

To implement the ability to track data from its origins to the final product, we need a standardized data tag, analogous to a digital barcode. As we have seen throughout the years, tracking information about data is one of the most difficult areas to broach. However, by implementing a tagging system built on a persistent, standard data tag, we could carry information about the data such as:

  • Acquisition information
  • Processing history and parameters
  • Interpretation parameters.

These seismic information tags (SeisTag™) need to be in a standardized form, updateable by the controlling process, and persistent, to allow history and information to move from program to program and process to process. While such a standard will require cooperation between the software developers, DSPs, and the data user community, its adoption would reduce the complexities of data management and access.
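One way to picture such a tag, purely as an illustrative sketch rather than a published specification, is a small structured record that travels with the data and is appended to by each controlling process:

```python
# Hypothetical sketch of a persistent seismic information tag ("digital barcode").
# Field names and structure are illustrative; a real standard would be agreed
# between software developers, DSPs, and the data user community.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class SeisTag:
    tag_id: str                                   # persistent identifier for the data volume
    acquisition: Dict[str, str]                   # survey, contractor, dates, geometry
    processing_history: List[Dict[str, str]] = field(default_factory=list)
    interpretation_params: List[Dict[str, str]] = field(default_factory=list)

    def add_processing_step(self, step: str, parameters: Dict[str, str]) -> None:
        """Append a processing step so the history travels with the data."""
        self.processing_history.append({"step": step, **parameters})

tag = SeisTag("NS-2001-3D-001",
              {"survey": "NS-2001-3D", "contractor": "unspecified", "year": "2001"})
tag.add_processing_step("prestack time migration", {"velocity_model": "v2"})
```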

Conclusion

Due to the proliferation and pervasiveness of seismic data across the exploration, development, and production operations of an oilfield, there is a need to create a new model of operation and thinking as to how this data is managed and used.

The traditional models of shipping data on tapes to a location and storing multiple versions of the same data at each data owner will give way to the access model. Under this model, archival of the data volumes will shift toward the data providers, resulting in a more cooperative structure for data access and management.

Data and applications will co-reside at, or near, these data providers, delivering a modified version of the current application service provider model that can be augmented by expert collaborative services. Movement to this model will provide increased information tracking, greater confidence in the data, increased data security, and a more efficient industry.
