Moving toward fully populated models, long-distance collaboration

Oil company scientists and engineers have long dreamed of the day when they could create fully populated models of the earth, including real-time information, to manage the field through remote automated systems.

By Bill Bartling

This concept goes by various names, including the automated oil field, the e-field, and real-time reservoir management. Whatever it is called, the challenge is the same: extract more information from an oil field, put that information into a working model, and use that model to make decisions and manage the field remotely. The key is connecting people, processes, and data remotely through collaborative visualization, which allows specialists to take part in real-time decision making.

What was a dream 10 years ago is rapidly becoming a reality. Different types of real-time data can now be delivered from a variety of evolving sources. New, inexpensive, and diverse instrumentation placed downhole in the reservoir can send information back to the office in real time. Advanced time-series (4D) seismic analyses are starting to replace today's conventional surveys, which rely on surface-based sources such as vibrators, air guns, water guns, and explosives to generate reflections. Researchers are also experimenting with using the earth's natural seismicity to produce better images of the subsurface. A company can lay out an array of passive geophones across a field to collect information about the activity of the field itself. These micro-events, registering at small or even negative Richter magnitudes, constitute the microseismicity associated with oilfield operations, and they characterize the behavior of the reservoir. Passive arrays collecting these signals provide direct information about how the field is being produced. Combined with conventional seismic surveys, well logs collected at the time of drilling, and downhole sensors, this information creates decision models that are fully populated, with far better resolution than ever before.
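The passive-monitoring idea above can be sketched in a few lines of code. The sketch below is illustrative only: the event fields, magnitude threshold, and function names are assumptions for the example, not part of any real acquisition system. It simply separates the micro-events that characterize reservoir behavior from larger regional shocks.

```python
from dataclasses import dataclass

@dataclass
class MicroEvent:
    # Hypothetical record of one passive-array detection; all field
    # names and units here are assumptions made for illustration.
    time_s: float       # detection time, seconds
    magnitude: float    # local (Richter-style) magnitude
    x_m: float          # estimated location, meters
    y_m: float
    depth_m: float

def microseismic_only(events, max_magnitude=0.0):
    """Keep only micro-events (small or negative magnitudes), which
    reflect oilfield operations, discarding larger regional events."""
    return [e for e in events if e.magnitude <= max_magnitude]

catalog = [
    MicroEvent(12.4, -1.8, 150.0, 420.0, 2300.0),
    MicroEvent(98.1, 3.2, 900.0, 100.0, 1200.0),   # regional event, excluded
    MicroEvent(140.7, -0.4, 180.0, 410.0, 2280.0),
]
print(len(microseismic_only(catalog)))  # -> 2
```

In practice the interesting work happens after this filter: locating each micro-event in space to map where the reservoir is responding to production.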

"Fully populated" refers to an evolution in how data is represented. In the past, 2D panels, cross-sections, and maps were built without detailed information on most of the reservoir being produced. Today, volumetric models are built, but much of the time they are not fully populated: properties have not been distributed throughout the model so that each point in space carries a value for each important parameter, such as porosity, saturation, seismic attributes, and relative permeability. With a fully populated model, the distribution of these properties is represented, and thus the dynamics of how the fluids are moving can be understood. In other words, the relative permeability field is known.

The challenges of receiving such a wide variety and volume of seismic data are clear. It will be necessary to process and understand the data in real time, using software powered by large, distributed computational systems. As data volumes grow and analysis moves toward real time, another challenge is quickly assessing the pertinent parts of that information in order to make decisions. This is best accomplished with large-scale visualization systems that connect the right people to solve problems.

Once a team is making decisions in real time, its members will not want to make those decisions alone, but collaboratively with other specialists in the company. Therefore, they will have to interconnect the models, the computer systems, and the people without moving the people or the data. The ability to share models and images across long distances will become increasingly important. The ability to reassess the data model as the oil field changes in real time, and to visualize large models from different perspectives, will be critical as well.

Long-distance collaboration using fully populated data models for decision systems is no longer a dream. It is quickly becoming the future of the oil industry, and the capabilities and supporting technology are here now.

Author
Geologist Bill Bartling is Director of Global Energy Solutions for SGI (www.sgi.com), responsible for SGI's strategy and position in the oil and gas industry. Prior to joining SGI in April 2001, he held management posts with Chevron Corp. and Occidental Oil and Gas, among other leading energy companies.


This page reflects viewpoints on the political, economic, cultural, technological, and environmental issues that shape the future of the petroleum industry. Offshore Magazine invites you to share your thoughts. E-mail your manuscript to billf@pennwell.com. Please write "Beyond the Horizon" in the subject line.
