Visualization rooms increasing analytical quality, productivity
John Hybertsen

Oil companies are struggling to find more oil and to recover more from their producing reservoirs, while at the same time wanting to reduce cycle time in their E&P activities. To do so, new technology must be developed and used. Statoil is now investigating virtual reality (VR) technology for use in the geosciences.
VR is a technology for creating, modeling and interacting within a virtual environment. A virtual environment can, for example, be a room where the users are immersed in a 3D data model, and where special purpose I/O devices are used to manipulate the data model. Earth models, for example, can be put into a virtual environment. Within this environment, geoscientists can together view, discuss and interact with their models.
Much time and many resources are spent analyzing and interpreting data in inefficient ways. To do something about this, Statoil has built several visualization rooms where data is displayed on several large screens, side by side. Here, seismic data and geologic and reservoir models, among other things, are displayed on the different screens. In this way, different data types can be seen together and compared on the same wall, though still not integrated in the same 3D view.
Statoil's visualization rooms are very popular, and are currently fully booked. These rooms have increased the productivity and the quality of subsurface analysis significantly, and are viewed as a success. Statoil's next step in visualization rooms is to provide integrated solutions. This includes developing software that enables visualization of different subsurface data types together in the same 3D data model on a single screen.
Why VR?

VR allows geoscientists to optimize 3D visualization and to interact more intuitively and effectively with subsurface data and models. VR offers the chance to be immersed in a 3D data model, which is much more dynamic and powerful for the user than viewing a 3D model from the outside.
The user can navigate, zoom, and scale the data within a virtual environment, and so can examine it in detail from any preferred viewpoint. Special-purpose interaction techniques have already been developed in the VR industry. Using gloves to grab and manipulate 3D geometries, for example, is much more effective than using a mouse and keyboard, and it also provides physical feedback to the user.
Geoscience demo

In conjunction with the Virtual Environment Technology Laboratory (VETL) in Houston, Statoil tested VR on an integrated data set as a proof of concept for geoscience applications. Seismic data, interpreted horizons, well logs, well trajectories, a geologic horizon, and a simulation grid were combined into one model. Users were able to move around in the data set and investigate in detail whether the interpreted horizons matched the seismic data. Geoscientists confirmed this part of the test.
For the internal Statoil demo, a wall-sized back-projected screen was used. The projection was in stereo, and was viewed with stereo-glasses. Navigation and interaction with the model was achieved by having a tracking unit hanging from the ceiling monitoring the movement of one sensor that was attached to the user's neck, one sensor in the user's left hand, and a combination sensor/pointer/selector in the user's right hand.
The sensors communicated with the tracker continuously to update the positions of the user's hands and body within the data model, letting the user move around and interact with it. By bending down, the user could see the model from underneath, and by turning or moving the body, view it from different angles.
Moving around in the data is easy if there are not too many data types present and the model is small enough for the computer to redraw continuously. Movements within the data model must be reflected immediately and continuously for the user to keep track of position in the virtual environment.
The sensors in the user's hands have interaction facilities attached. The menus developed for the demo were attached to the left-hand sensor. The data cube could also be attached to the left-hand sensor, such that all positioning would be performed by that hand. A pointer device was held in the right hand. It provided the opportunity to select functions from the menus, grab data, or interact with the data. The sensors were easy to control, easy to learn, and very intuitive.
Switching data types

Some interaction was demonstrated, for example the ability to turn data types on and off. The different steps in analyzing subsurface data need the integration of different data types; seismic and reservoir-simulation data, for instance, are both necessary to verify certain properties of a reservoir.
Data can be combined and then investigated in a variety of ways. One is to use an interactive clipping-plane or light. The clipping-plane can be moved around and positioned in the data, with only the data at the position of the plane shown. A directional light can be used to focus on details, for example to cast shadows that enhance the contours of a horizon.
The clipping-plane or directional light is controlled by the hand-held sensors. By moving or twisting an arm, the user can place a clipping-plane in any position or orientation. Interaction with the data itself was also demonstrated through the ability to edit a well path: by grabbing the bends of a well trajectory, its geometry and position could be changed completely.
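As a minimal sketch of the clipping-plane idea (function and parameter names are hypothetical, not from the demo): each frame, the hand sensor supplies a plane position and orientation, and only the data samples near that plane are displayed.

```python
import numpy as np

def clip_points(points, plane_point, plane_normal, thickness=0.5):
    """Keep only the data samples within `thickness` of the clipping plane.

    points:       (N, 3) array of sample coordinates in model space.
    plane_point:  a point on the plane (e.g. the hand sensor's position).
    plane_normal: the plane's orientation (e.g. from the sensor's twist).
    """
    pts = np.asarray(points, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)
    # Signed distance of every sample from the plane.
    d = (pts - plane_point) @ n
    return pts[np.abs(d) <= thickness]
```

Re-evaluating this each frame with the sensor's latest position and orientation gives the effect described above: the visible slice follows the user's arm.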
Additionally, the model could be scaled in the depth direction, making it easier to navigate or zoom close to details. This is a vital function when many data types are displayed together and the scene becomes cluttered. After scaling the model, it was much easier to get close and inspect well logs in the data cube.
Test experiences

The first challenge encountered was taking data from its traditional application and transforming it into a virtual environment. This process was demanding and time consuming. One difficulty was giving the geoscience data a proper representation of both its geometry and its content.
Seismic was represented as a stack of 2D texture-mapped parallel faces, which gave a good 3D appearance from most angles. For the 3D geo-models, files were read to find the dimensions of the grid that built the model. Geological objects were then created by assigning different grid blocks to the different objects, each object with a separate color.
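The textured-stack representation can be sketched roughly as follows (assuming the seismic cube is already loaded as a 3D amplitude array; names are illustrative):

```python
import numpy as np

def seismic_to_slices(cube, axis=0):
    """Split a 3D seismic amplitude cube into a stack of 2D sections.

    Each 2D array would then be texture-mapped onto its own parallel
    quad in the scene; rendered together, the stack approximates a
    solid volume from most viewing angles.
    """
    cube = np.asarray(cube)
    return [np.take(cube, i, axis=axis) for i in range(cube.shape[axis])]
```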
Horizons posed problems because of their geometric complexity, and also because of undefined areas, which blocked the view of the defined areas and hurt performance when working with the model.
A program was developed to remove the undefined areas, even though the results did not always look optimal. Several ways to simplify the geometry of the horizons were also investigated. One was to drop points from the horizon, for example keeping only every fourth point in the X and Y directions. This proved unsatisfactory, as important details were lost after even a minimal amount of simplification.
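The every-fourth-point approach amounts to simple array striding; a sketch (assuming the horizon is stored as a regular 2D depth grid, with illustrative names):

```python
import numpy as np

def decimate_horizon(depth_grid, step=4):
    """Naive simplification: keep every `step`-th point in X and Y.

    This cuts the point count by a factor of step**2, but it discards
    small-scale detail along with the redundancy, which is why a
    smarter reduction technique was needed.
    """
    return np.asarray(depth_grid)[::step, ::step]
```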
With further investigation, a technique was found and implemented that reduced the complexity of horizons to 1/20th while still retaining important details. This solution provided a good compromise between performance, complexity, and accuracy.
The level of detail brings us to a tricky point in VR in the geosciences. The user must be able to be immersed in the data with immediate response to interaction, yet must also be able to interact with the virtual models, modify them, and have these changes reflected properly in the databases. This is difficult to solve when operating on only a subset of the total model.
Entering the Cave

During the development period with VETL, a few virtual environments and interaction techniques were also tested. Single-user equipment is not seen as offering any benefit beyond a screen with stereo view, since VR is intended as a collaborative environment. Another virtual environment is the Cave, a 10 ft by 10 ft by 10 ft room with rear projection on three side-walls and direct projection on the floor.
Much time was spent in the Cave at VETL. In the Cave, the data models covered 270° of the room plus the floor. The Cave proved to be very good for visualization, presentation, and collaboration. Users adjusted to working in the Cave, and did not feel any discomfort such as dizziness or eye strain. The negative points were that standing in the Cave strained the legs, and the users had no place to lean or rest their arms.
The responsive workbench is a table-like device where the data models are projected from underneath. When using stereo-glasses, the images float in the air above the table surface in front of you. The workbench proved to be very good for interacting with the models, and is a good collaborative environment. In addition, working at the workbench did not strain legs or arms, since users could sit and lean their arms on it. If haptic feedback is wanted, the workbench is better suited than the Cave.
Statoil tour

The VR demo was taken to different Statoil sites and presented. Half-hour shows were given in the one-wall Cave, accompanied by presentations. Employees were impressed by the visualization and felt they were literally inside the data model.
Even though the interaction shown was simple, possibilities were immediately seen for enhancements and beneficial use. The most enthusiastic groups were the asset teams that perform the day to day operations on the Statoil fields. More than 600 people viewed the demo, and there were few that did not see short-term and long-term benefits of VR in the geosciences.
Statoil sees virtual reality as a tool that must be used on complex 3D work. An example of this is seismic interpretation. Interpreters have expressed a wish to be able to interpret 3D seismic data three-dimensionally. Work is currently in progress outside of Statoil to develop VR facilities for this, but so far without concrete results.
Not only should one be able to see the structures from any angle or from within; it could also be advantageous to feel them at one's fingertips. Haptic feedback has also been tried. Seismic interpretation is complex, so being able to use senses other than sight would seem beneficial. Sound is one example: when the user moves his fingers through the seismic, sound could make him aware of changes not visible to the eye.
Another example of the use of VR is reservoir simulation. Simulations or time-lapse changes in a reservoir are visualized much better in 3D. These are events that happen in 3D over time and cannot be sufficiently explained in any representation of lower dimension. Perhaps moving symbols or color changes can express processes developing over time.
Presently, a simple reservoir simulator that demonstrates flow toward a well is being developed for another oil company. This has proved quite beneficial when evaluating the behaviour of a reservoir. The goal is to alter parameters for a reservoir and view the change in behaviour immediately.
For example, it should be possible to grab a well and move it around in the reservoir. The flow, pressure or other parameters that depend on the well position would then respond to these changes. The well can be moved around with a glove or another haptic device, flow can be shown as streamlines, and development of pressure can be seen as a change of color or pressure lines. This can be studied from anywhere within the reservoir, and many users can experience this simultaneously. By this, one can exploit the real benefits of VR.
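One way such immediate feedback could be computed is with an analytic flow model. The sketch below uses steady-state radial Darcy flow; this choice, the function name, and all parameter values are assumptions of the illustration, not a description of the simulator mentioned above.

```python
import math

def radial_pressure(r, p_well=200.0, q=500.0, mu=1.0, k=100.0,
                    h=20.0, r_well=0.1):
    """Pressure at distance r from a vertical well, steady-state radial flow:

        p(r) = p_well + (q * mu / (2 * pi * k * h)) * ln(r / r_well)

    Because this is a closed-form expression, moving the well simply
    re-evaluates the field around the new position, giving the kind of
    immediate response described above. Units are consistent but arbitrary.
    """
    return p_well + (q * mu / (2 * math.pi * k * h)) * math.log(r / r_well)
```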
Viewing 4D surveys

Being able to see and investigate differences between models is of great value. These can be differences between two seismic surveys of the same area acquired at different times, or differences between data models that were supposed to have similar geometries. Proper tools to adjust geometries must be developed; one solution could be a "virtual hammer" to adjust geometries the way metal is shaped. Much of this work is very time-consuming today because much of it is done in 2D, which gives a very limited view. Tools to view integrated data must also be developed; perhaps a "magnifying glass" showing data together and separately, with a choice of colors or contrasts, could separate and clarify.
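The simplest form of such a difference view is a subtracted volume; a sketch (assuming the two surveys are already resampled onto the same grid, which in practice requires cross-equalization first; names are illustrative):

```python
import numpy as np

def timelapse_difference(base, monitor):
    """Difference volume between two surveys of the same area.

    Large absolute values highlight where the seismic response changed
    between acquisitions, e.g. from production effects in the reservoir.
    """
    base = np.asarray(base, dtype=float)
    monitor = np.asarray(monitor, dtype=float)
    if base.shape != monitor.shape:
        raise ValueError("surveys must be resampled onto the same grid")
    return monitor - base
```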
Having a good representation of uncertainties is also useful. All subsurface models have uncertainties attached to them, and a good representation of these gives an idea of the margins one should operate within. When planning wells, there are certain areas one wants to avoid and others one wants to hit. Working one's way through the data set, one can visually evaluate the best path for a well, leaving a track behind as one moves.
Statoil has taken the initiative to create a consortium for VR in the geosciences. The purpose of the consortium is to further develop VR technology for use in the geosciences. During the first year, the consortium will concentrate on well planning, which requires both geoscience data and interaction techniques.
Interaction techniques will be developed by the German National Research Center for Information Technology, GMD, and VETL from Houston. Data used will come from Statoil's Gullfaks field. Eleven companies are members of the consortium so far, and more partners are welcome.
Acknowledgements

For their contribution to the project, the author wishes to thank Kjell Arne Jakobsen, reservoir engineer in Statoil; P. Christian Hagen and Endre Molster Lidal, two students who did their master's theses on VR in the geosciences; and Adolfo Henriquez, chief engineer in reservoir technology in Statoil, for support of the project.
Copyright 1998 Oil & Gas Journal. All Rights Reserved.