These were themes that ran through much of the new technology presented at this year's Offshore Technology Conference. Companies have made huge investments in computers and equipment to gather, process, and store all types of data related to exploration and development activities. This data is processed and made available to teams of experts, who in turn use it for benchmarking, with the goal of improving processes at all levels.
Technology push or pull
It can be tricky determining which came first. With rapid advances in seismic acquisition and processing, downhole sensors, and telecommunications technology, it has become relatively easy and inexpensive to gather more information than can practicably be put to good use. It is often argued that real-time downhole data, to take one example, provides more information than is needed for most wells.
On the other hand, the move into deepwater has led to a whole new generation of exceedingly complex wells. Many of these have required the development of new technology just to be viable. While the prize in deepwater can be large indeed, the risk and upfront costs are staggering. Suddenly, there's no such thing as too much data. An operator uses the latest in seismic technology to build an earth model and plans its wells with as little uncertainty as possible. While drilling, a company has to closely monitor every parameter of these wells to ensure that they are not only successful, but also steered into the most productive area of the reservoir. The wells need to pay off big, and they need to come online rapidly.
It is in these projects that the idea of benchmarking, gathering as much data as possible, and using the latest communication systems begins to make perfect sense. Driven by operators, service companies have developed special shore-based centers where drilling activity is monitored and key decisions are made. To ensure every decision is the right one, each is evaluated and recorded as a best practice.
It is no accident that photos of these centers remind one of NASA's Mission Control. In a sense, they perform similar functions. A rig crew cannot be expected to gather and process the massive amount of data available from downhole and topside sensors. These workers, after all, have to keep the rig itself running. Even if they had the time, the data can be incredibly complicated to interpret, and without years of experience it would be easy to draw incorrect conclusions. Fortunately, most companies have a core group of individuals with decades of hard-won experience. These people can rapidly read and evaluate data specific to their area of expertise and offer a solution that will work. The problem is that, with an aging workforce, their numbers are dwindling. It is not practical to place such an expert on every rig where a company is drilling or evaluating a well.
The high-tech solution is to staff these mission-control centers with each company's best, most experienced engineers. Not only can they advise rig crews on several projects at once, but they also help develop and mentor workers in the field, who can then gain experience more rapidly.
The valuable offset data gathered on these projects is married with a much-improved earth model and a systematic approach to benchmarking and best practices. While the technology looks impressive, the key is reducing risk and cycle times. Improved rates of penetration, early production rates, and safety are three major benefits touted by these systems. Another, less concrete advantage is the knowledge younger employees gain by working alongside the best, most seasoned individuals their company has to offer. There is no question the industry is profiting from these systems in the short term; without them, many deepwater projects might be too expensive ever to bring online. In the long term, however, the lessons learned and the experience gained will be the biggest benefits. These lessons are what younger workers will take with them into the future, long after the exciting projects of the day are P&A'd.