'Digital twin' concept underpins successful digitization strategy
Creating order from data overload is essential for the decision-making process. Imposing structure on this massive amount of data is critical. This is the role of the digital twin.
The extended low oil price environment has led to an increased focus on managing and maintaining assets and improving efficiencies. In this current economic environment, data plays a prominent role. Oil and gas companies are collecting and analyzing more operational data than ever before in an effort to make better and smarter decisions, but the enormous volume of data from disparate sources poses a problem that continues to expand in complexity with the addition of new data streams.
According to a publication by Northeastern University, the total amount of data in the world was 4.4 zettabytes in 2013. To put that in more digestible terms, one zettabyte equals 1 trillion gigabytes. This volume is expected to grow to 44 zettabytes by 2020.
While individual companies are not wrangling with zettabytes of data, they struggle to manage a significant volume of information and find it difficult to determine the hidden value of data.
Exploring the concept
Before explaining the value of the digital twin, it is important to understand the concept. Although new to the oil and gas industry, the digital twin concept has been around since 2002. With the introduction of the Internet of Things, the digital twin is now cost-effective to implement, and the concept is rapidly becoming imperative for obtaining a competitive advantage in efficient business operations.
Pairing the virtual and physical worlds via a digital twin helps improve operations. (Images courtesy DNV GL)
Gartner Inc., which has the largest base of IT research analysts and consultants in the world, includes the digital twin in its list of Top 10 Strategic Technology Trends for 2017, listing it among technologies that analysts believe will have “substantial disruptive potential across industries.”
In simple terms, a digital twin is a virtual model. In the case of oil and gas operations, it is a model of any production and processing asset, such as a semisubmersible or drillship. Pairing the virtual and physical worlds via a digital twin allows analysis of data and system monitoring in a way that dramatically improves operations, preventing downtime, reducing maintenance costs, and providing data that can be used to streamline operations throughout the lifecycle of the asset.
The digital twin uses smart sensors to gather and communicate real-time performance data from an asset to both onsite and remotely located teams that can leverage extensive data sets to monitor operations, identify trends, and more rapidly implement lessons learned to improve operating efficiency.
A digital twin can be built with varied layers of complexity – for example, separate models can be developed for structural, machinery, control systems, and process systems. A digital twin also can plug into enterprise systems to gather business performance data like maintenance costs and contractor performance data and can manage these data on the same platform.
Key benefits include the ability to:
• Analyze production rates and identify system bottlenecks
• Analyze equipment failure rates and optimize maintenance programs
• Conduct root cause analysis of equipment failure
• Manage structural integrity
• Integrate knowledge management systems
• Optimize new designs based on historical data
• Explore hypothetical “what if” scenarios allowing operators to be ready for the unplanned
• Visualize asset risk and key performance indicators on a single platform.
Analyzing business and equipment performance data holistically allows business decisions like field life extensions to be made on the basis of accurate data and a clear understanding of asset integrity.
The decision to extend the life of an existing asset depends on a number of things, including the future production rate from the asset, market price of the product that the asset produces, cost of modifications, refurbishment, repair, and the cost of future investment in operating the asset. A key concern in determining whether to extend the life of an existing asset is finding a way to minimize unplanned maintenance and repairs. These can prove very costly, especially if there is an associated loss of production, and trying to get a handle on unplanned interruptions to production is extremely difficult in the absence of historical and real-time data.
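The tradeoff described above can be sketched as a simple discounted cash flow calculation. This is an illustration only: the production volumes, prices, and costs below are hypothetical figures, not from the article, and a real life-extension study would also account for production decline, price uncertainty, and unplanned downtime.

```python
# Illustrative life-extension economics: hypothetical numbers throughout.
def life_extension_npv(annual_production, unit_price, annual_opex,
                       refurbishment_cost, years, discount_rate):
    """Net present value of operating the refurbished asset for `years` more."""
    npv = -refurbishment_cost
    for year in range(1, years + 1):
        cash_flow = annual_production * unit_price - annual_opex
        npv += cash_flow / (1 + discount_rate) ** year
    return npv

# Extend life 5 years: 1 MMbbl/yr at $60/bbl, $40M/yr opex,
# $50M refurbishment, 10% discount rate (all assumed values)
npv = life_extension_npv(1_000_000, 60.0, 40_000_000, 50_000_000, 5, 0.10)
```

A positive result favors extension under these assumptions; the decision is highly sensitive to the opex and downtime figures, which is exactly where historical and real-time asset data matter.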
A data-driven systems approach is an essential first step in prioritizing future investment decisions and maximizing the performance of an existing asset.
The digital twin is a valuable tool in the data-driven asset management process because it is a virtual representation of the asset, which includes multiple components that can be used to manage the asset throughout its lifecycle. These components can include drawings and reports, inspection and survey results, numerical models (such as finite element models), sensor data and other information particular to the asset.
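The grouping of components described above can be sketched as a simple data model. This is a minimal illustration of the idea, not DNV GL’s actual platform; the class and field names are assumptions made for the example.

```python
# Minimal sketch of a digital twin as a container for the asset's
# lifecycle data: documents, inspections, numerical models, sensor streams.
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    asset_name: str
    documents: list = field(default_factory=list)         # drawings, reports
    inspections: list = field(default_factory=list)       # survey results
    numerical_models: list = field(default_factory=list)  # e.g. FE models
    sensor_streams: dict = field(default_factory=dict)    # tag -> readings

    def record_sensor(self, tag, value):
        """Append a reading to the named sensor stream."""
        self.sensor_streams.setdefault(tag, []).append(value)

twin = DigitalTwin("Semisub-A")
twin.documents.append("hull_structural_drawing_rev3.pdf")
twin.inspections.append({"location": "column_2_brace", "finding": "corrosion"})
twin.record_sensor("heave_motion", 0.42)
```

Keeping these heterogeneous records behind one object is what lets later analyses (fatigue, inspection planning, risk visualization) draw on the same source of truth.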
Currently, DNV GL is developing digital twins that integrate data such as metocean conditions, survey results, and sensor data with advanced numerical structural modeling of the entire unit for structural integrity management of offshore assets. What sets this effort apart from others is the work that has gone into streamlining and fasttracking modeling capabilities.
The computational engine for the digital twin is based on the Reduced Basis Finite Element Analysis (RB-FEA) method developed at the Massachusetts Institute of Technology and commercially available in the Akselos software. Because the RB methodology works with component-based 3D models of structures, it enables very fast, fully 3D structural analysis. For industrial-scale simulations, this approach routinely runs 1,000 times faster than conventional FEA while handling models 1,000 times larger. This speedup and level of detail are key enablers of DNV GL’s vision for digital twin technology, allowing true condition-based modeling of large and critical assets in which all relevant structural details are included in the model, along with inspection-based condition data such as cracks, corrosion, and damage due to impact or collision.
Each component of the RB-FEA model is based on a standard FEA mesh that represents the geometry of the component, with associated material properties, loads, and boundary conditions. The characteristics of components are parameterized, which makes varying a component’s properties (geometry, density, stiffness, loads, etc.) simple. Components connect at ports, a design that allows a complete system to be modeled. The component-based nature of this approach makes it easy to modify a model by adding, removing, or replacing elements.
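The component-and-port idea can be sketched in a few lines of code. This is a hedged illustration of the assembly pattern only: a real RB-FEA component carries a mesh, material data, loads, and a reduced basis, all of which are omitted here, and the class names are invented for the example.

```python
# Sketch of parameterized components connected at ports to form a system.
class Component:
    def __init__(self, name, **params):
        self.name = name
        self.params = dict(params)  # e.g. thickness, stiffness, load
        self.ports = {}             # port name -> name of connected component

    def set_param(self, key, value):
        # Parameterization makes varying a property cheap: no remeshing here
        self.params[key] = value

class Assembly:
    def __init__(self):
        self.components = {}

    def add(self, comp):
        self.components[comp.name] = comp

    def connect(self, a, port_a, b, port_b):
        # Join two components at named ports
        self.components[a].ports[port_a] = b
        self.components[b].ports[port_b] = a

    def replace(self, name, new_comp):
        # Swap a component while preserving its connections
        new_comp.ports = self.components[name].ports
        self.components[name] = new_comp

asm = Assembly()
asm.add(Component("deck_girder", stiffness=2.1e11))
asm.add(Component("column_1", thickness=0.025))
asm.connect("deck_girder", "end_a", "column_1", "top")
```

The `replace` method mirrors the point made above: because connectivity lives at the ports, a component can be swapped (for example, with a corroded-thickness variant) without rebuilding the rest of the model.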
DNV GL and Akselos recently partnered to demonstrate the capabilities of the RB-FEA technology for offshore structures. Together, the two organizations are primarily focused on investigating hydrodynamic loading on floating structures (semisubmersibles, drillships, etc.) and fracture mechanics in a global model, amongst other topics.
Hydrodynamic loading analysis can require thousands of solves. Akselos’s RB-FEA makes this fast and efficient for large, detailed models. For each time step, wave pressures (generated by such software as WAMIT/WADAM) are mapped onto the hull, and the RB-FEA solve is performed in less than one second. This speedup in computational capability makes it possible to carry out time domain fatigue analysis of large offshore structures, which previously was not possible because of the vast amount of time needed to perform the calculations using conventional FEA software programs. Applying the capabilities of the RB-FEA method, actual sensor measurements of platform motions and wave data now can be used to perform detailed fatigue assessments.
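The time-domain loop described above can be shown schematically. In this sketch, the pressure mapping and the RB-FEA solve are simple stand-in functions, not real WAMIT/WADAM or Akselos calls, and the S-N constants are illustrative; a production analysis would also use rainflow cycle counting rather than a single overall stress range.

```python
# Schematic time-domain fatigue loop: map wave pressure, solve, accumulate damage.
import math

def wave_pressure(t):
    # Stand-in for hull pressures from a hydrodynamic code (e.g. WAMIT/WADAM)
    return 1.0e4 * math.sin(0.5 * t)  # Pa

def rb_fea_solve(pressure):
    # Stand-in for the sub-second RB-FEA solve; returns a hotspot stress (Pa)
    return 25.0 * pressure

def fatigue_damage(stress_range, m=3.0, a=1.0e12):
    # One-cycle damage from an illustrative S-N curve N = a * S^(-m), S in MPa
    cycles_to_failure = a * (stress_range / 1e6) ** (-m)
    return 1.0 / cycles_to_failure

# One solve per time step over a 600-step record
stresses = [rb_fea_solve(wave_pressure(t)) for t in range(600)]

# Crude single-range estimate; real analysis uses rainflow counting
stress_range = max(stresses) - min(stresses)
damage_per_cycle = fatigue_damage(stress_range)
```

The point of the speedup is the inner loop: with each solve under a second, a long measured motion record becomes tractable, which is what makes sensor-driven fatigue assessment possible.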
The RB-FEA global models, with highly refined local regions, enable evaluation of J-integrals and other fracture mechanics quantities within the context of a global (i.e. full-asset) solution. Parametric models enable crack geometry to be updated in seconds – speeding up analysis and studies.
A significant step toward improving maintenance is the capability for inspection results to be imported directly into the global model so crack growth assessments can be performed to optimize inspection intervals.
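Using an inspected crack size to plan the next inspection can be sketched with a Paris-law growth calculation. The material constants, geometry factor, and crack sizes below are illustrative assumptions, not values from the article or from any DNV GL standard.

```python
# Cycles for a crack to grow from inspected size a0 to critical size a_crit,
# via Paris law da/dN = C * (delta_K)^m, integrated numerically.
import math

def cycles_to_critical(a0, a_crit, delta_sigma,
                       C=1e-11, m=3.0, Y=1.12, steps=10_000):
    """a0, a_crit in meters; delta_sigma in MPa; C, m, Y are assumed constants."""
    da = (a_crit - a0) / steps
    a, n = a0, 0.0
    for _ in range(steps):
        delta_k = Y * delta_sigma * math.sqrt(math.pi * a)  # MPa*sqrt(m)
        n += da / (C * delta_k ** m)  # cycles spent growing this increment
        a += da
    return n

# A 2 mm crack found at inspection, 20 mm assumed critical, 50 MPa stress range
n_cycles = cycles_to_critical(0.002, 0.02, 50.0)
```

Dividing the resulting cycle count by the expected loading rate (with a safety factor) gives a defensible next-inspection date tied to the measured condition rather than a fixed calendar interval.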
Asset life extension – enabled by the digital twin and condition-based monitoring using sensor systems – is the key to capitalizing on the value of existing facilities that are operating near (or beyond) their original design life. The new paradigm of condition-based monitoring is a shift from the traditional prescriptive, time-based approach to inspections. Using a digital twin, it is possible to analyze sensor data and survey results to optimize inspection programs based on the actual state of the asset.
Knowing the true physical status of the asset allows the inspection program to target areas that merit closer inspection and to avoid spending time evaluating segments of the hull that exhibit no significant degradation. This, in the end, results in time and cost savings for the operator.
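The targeting logic above amounts to ranking segments by measured degradation and inspecting only those above a threshold. The segment names, wall-loss values, and threshold in this sketch are invented for illustration.

```python
# Condition-based inspection targeting: rank hull segments by measured
# degradation, inspect only those above a threshold, worst first.
segments = {
    "bow_plate":     {"wall_loss_pct": 1.2},
    "midship_frame": {"wall_loss_pct": 8.7},
    "column_brace":  {"wall_loss_pct": 12.4},
    "deck_plate":    {"wall_loss_pct": 0.6},
}

THRESHOLD = 5.0  # assumed wall-thickness loss (%) triggering inspection

to_inspect = sorted(
    (name for name, s in segments.items() if s["wall_loss_pct"] > THRESHOLD),
    key=lambda n: segments[n]["wall_loss_pct"],
    reverse=True,
)
```

Here only two of four segments would be inspected, which is the source of the time and cost savings described above.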
The additional capability of using a digital twin for real-time risk assessment has the potential to significantly improve safety on offshore facilities. History has shown that major accidents occur when multiple safeguards or barriers are breached. Experience with complex engineered systems demonstrates the need for an interactive approach to assessing and managing the real-time degradation of barriers during operations, one that accounts for rare but potentially catastrophic failures. Until now, companies have grappled with the degradation of critical barriers while simultaneously struggling with a large volume of data from multiple sources. Without a harmonized process for extracting and managing the vital data, it was not possible to derive value from the influx of information arriving in disparate streams and dissimilar formats. The digital twin addresses that problem by delivering a common data platform that enables the timely exchange of information among stakeholders to help prevent these types of accidents in the future.
The digital twin also helps companies achieve regulatory compliance by using data to illustrate that risks on a given asset are being managed.
The challenges of the oil and gas industry are changing, and the digital twin is the sort of disruption that will allow companies to contend with oil prices in a rapidly evolving operating environment.