Turning big data into information benefits the bottom line

Dec. 12, 2013

Ian Verhappen
Industrial Automation Networks Inc.

As we all know, the amount of data being generated today has never been greater. In fact, the amount of data is now so large that it is placing a strain on much of the infrastructure on which we rely. Unfortunately, much of the data is simply consuming bandwidth and is not adding value to the owners and generators of the trillions of bytes of potentially useful information. A significant portion of this new data is being generated by business, and it should come as no surprise that today's "smart" field devices are capable of generating even more data than is presently being captured, let alone used.

A "typical" transmitter used in the hydrocarbon industry contains in excess of 300 parameters, with 10-20% of those parameters being updated on a regular basis. And, at a minimum, there are usually up to four process variables measuring the process conditions, updating on at least a 1-second basis.

If we consider instead a variable frequency drive (VFD), which is commonly used in place of a control valve to regulate the flow or pressure of a process, together with its associated motor control center (MCC), we have an order of magnitude more real-time data available. All of this data can be processed and turned into information by computing platforms and algorithms to better predict not only the actual operating conditions but also the status and health of the equipment itself.
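As a minimal sketch of what such an algorithm might look like, the fragment below tracks a slow-moving baseline of motor current and flags a sustained rise; the smoothing factor, the alarm limit, and the sample stream are illustrative assumptions rather than a recommended design.

```python
# Sketch of turning raw VFD/motor data into an equipment-health indicator.
# The smoothing factor, alarm limit, and sample stream are illustrative
# assumptions; a real system would be tuned to the specific drive and motor.

def health_monitor(current_readings, alpha=0.05, limit=1.15):
    """Yield (reading, baseline, alarm) for a stream of motor-current samples."""
    baseline = None
    for amps in current_readings:
        if baseline is None:
            baseline = amps                                  # seed with the first sample
        baseline = (1 - alpha) * baseline + alpha * amps     # slow-moving expected value
        alarm = amps > limit * baseline                      # sustained rise may signal a fault
        yield amps, baseline, alarm

# Example: a step increase in current (e.g., a developing mechanical fault)
# raises the alarm shortly after it appears.
stream = [100.0] * 300 + [130.0] * 100
for amps, base, alarm in health_monitor(stream):
    if alarm:
        print(f"Investigate drive: {amps:.1f} A vs. baseline {base:.1f} A")
        break
```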

Taking this information to the next step by converting it to knowledge is where the true benefits begin to be realized, through reduced downtime, improved maintenance, and increased system reliability. Reliability is paramount offshore, where getting a specialist on site may be expensive, but that expense is still much less than the potential cost of an unplanned outage or system failure.

Managing this sort of "big data" is already providing benefits in the consumer realm, with the tracking of purchasing habits through credit, debit, and loyalty card usage. Making use of big data starts with acquiring the data, then organizing it using massive parallelism so that all of the data can be analyzed at once, completing the process by finding patterns that could not be seen in the individual data points. Big data solutions can be based on relational or NoSQL (unstructured and schema-less) databases, with the NoSQL option becoming more widely used.
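The acquire-organize-analyze sequence can be sketched with Python's standard multiprocessing module; the record layout and the "pattern" being sought here (per-device averages) are assumptions made purely for illustration.

```python
# Minimal map-reduce style sketch of analyzing many sensor records in parallel.
# The record format (device id, value) and the pattern being sought (per-device
# averages) are illustrative assumptions, not a specific vendor solution.

from collections import defaultdict
from multiprocessing import Pool

def summarize(chunk):
    """Map step: reduce one chunk of (device_id, value) records to partial sums."""
    sums = defaultdict(lambda: [0.0, 0])
    for device_id, value in chunk:
        sums[device_id][0] += value
        sums[device_id][1] += 1
    return dict(sums)

def merge(partials):
    """Reduce step: combine partial sums from all workers into device averages."""
    totals = defaultdict(lambda: [0.0, 0])
    for part in partials:
        for device_id, (s, n) in part.items():
            totals[device_id][0] += s
            totals[device_id][1] += n
    return {d: s / n for d, (s, n) in totals.items()}

if __name__ == "__main__":
    # Stand-in for data pulled from a historian or NoSQL document store.
    chunks = [[("FT-101", 50.2), ("PT-202", 9.8)], [("FT-101", 51.0), ("PT-202", 10.1)]]
    with Pool() as pool:
        averages = merge(pool.map(summarize, chunks))
    print(averages)   # approximately {'FT-101': 50.6, 'PT-202': 9.95}
```

The same split-summarize-combine structure is what the large-scale frameworks apply across many machines instead of a handful of local worker processes.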

Machine learning, or programming computers to perform tasks that humans perform well but that are difficult to specify algorithmically, such as recognizing patterns (seeing a shape in the clouds, or the image in abstract art, for example), provides a principled way to build high-performance information processing systems. Common examples of data mining include search engines, information retrieval, and adaptive and personalized user interfaces. Personalized assistants (information systems) are also being applied: analysts train the software to learn their "thought processes," and the software then applies those same processes, continuing to learn as a by-product of the data processing itself.
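A minimal sketch of that "trained by analysts" idea is shown below using a decision-tree classifier; it assumes scikit-learn is available, and the features and labels are invented solely for illustration.

```python
# Sketch of training software on analyst judgments with a decision-tree
# classifier. Assumes scikit-learn is available; the features (vibration,
# temperature) and the labels are purely illustrative.

from sklearn.tree import DecisionTreeClassifier

# Historical operating points that analysts have already labeled.
features = [
    [0.2, 60], [0.3, 65], [0.25, 62],      # normal operation
    [1.5, 85], [1.8, 90], [1.6, 88],       # flagged by analysts as degraded
]
labels = ["normal", "normal", "normal", "degraded", "degraded", "degraded"]

model = DecisionTreeClassifier().fit(features, labels)

# The trained model now applies the analysts' judgment to new operating points.
new_points = [[0.28, 63], [1.7, 86]]
print(model.predict(new_points))   # expected: ['normal' 'degraded']
```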

The industry term being used for managing all this data is "digital oil field," though the biggest companies have trademarked their own versions. At Chevron, it's the "i-field." BP has the "Field of the Future," and Royal Dutch Shell likes "Smart Fields."

By applying similar data mining techniques, companies are finding benefits in all phases of the development life cycle, including exploration and geophysics. Multivariate statistical analysis of the mined data also enables improved well placement decisions. Then, once the well is drilled, wave modeling is used in real time to manage fracturing stages and maximize the efficiency of that operation.
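As one illustration of the kind of multivariate analysis involved, the sketch below applies principal component analysis (via NumPy's singular value decomposition) to a small matrix of candidate well attributes; the attributes and their values are assumptions, not field data.

```python
# Sketch of multivariate statistical analysis (PCA via SVD) of the kind used to
# compare candidate well locations. The attribute matrix below (porosity,
# permeability, net pay, water saturation) is an illustrative assumption.

import numpy as np

# Rows = candidate well locations, columns = geophysical attributes.
X = np.array([
    [0.18, 120.0, 30.0, 0.25],
    [0.22, 150.0, 35.0, 0.20],
    [0.12,  60.0, 18.0, 0.40],
    [0.20, 140.0, 33.0, 0.22],
])

# Standardize each attribute, then extract principal components.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)

explained = S**2 / np.sum(S**2)
scores = Z @ Vt.T                      # each location projected onto the components

print("variance explained:", np.round(explained, 2))
print("first-component score per location:", np.round(scores[:, 0], 2))
```

Collapsing many correlated attributes into a few components like this makes it easier to rank and compare locations than working with the raw variables one at a time.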

A recent MIT Technology Review article indicates how the digital oil field is affecting the way companies operate, noting that Chevron's internal IT traffic alone exceeds 1.5 terabytes a day. While digital oil field initiatives were initially intended to make better use of the expertise of an aging and retiring workforce, industry-wide estimates now suggest that a fully optimized digital oil field can deliver 8% higher production rates and 6% higher overall recovery. Chevron anticipates savings of $1 billion per year as a result of its i-field initiative, with the biggest benefits coming from the larger projects such as the Sanha field offshore southern Africa, the deepwater Gulf of Mexico, off the coast of Nigeria, and off the coast of Australia at the $37-billion Gorgon project, the single largest natural gas project in Australia's history.

Making better use of all available data and managing it so that it is transformed from data into knowledge not only improves the "bottom line" through better operations, but also increases opportunities for reliable, safe operations, especially in high-risk areas such as those in the offshore sector.

About the author

Ian Verhappen, P.Eng., is an ISA Fellow, ISA Certified Automation Professional (CAP), Automation Hall of Fame member, and a recognized authority on process analyzer sample systems, Foundation Fieldbus and industrial communications technologies. Verhappen provides consulting services in the areas of field-level industrial communications, process analytics and hydrocarbon facility automation. Feedback is always welcome via e-mail at [email protected].

Reference

"Big Oil Goes Mining for Big Data" MIT Technology Review,http://www.technologyreview.com/news/427876/big-oil-goes-mining-for-big-data/ 2013-11-11