Software technology evolution helps streamline offshore engineering
Offshore and maritime engineers have seen amazing leaps in technology over the past 30 years. In the 1980s, it took more than 10,000 man-hours to create the finite element models for a gravity-based structure in the North Sea. In those days, engineers would create every single node in the mesh manually.
Because of the limitations of software and hardware 30 years ago, engineers often produced over-conservative designs. The result was structures that used too much steel and concrete and were very expensive to build. Everything had to be defined in several different programs: one to analyze the results, one for the code check, and one for the fatigue check. It was a cumbersome and time-consuming way of working.
If a change was made in one program, the values in the other programs had to be redefined, which was of course a source of mistakes; every user makes input mistakes once in a while. In those days, all the various analyses then had to be run over again in the external programs. Today, the model is defined only once.
The five most important developments in offshore engineering over the last 30 years are:
- user experience (a graphical user interface, yet scriptable);
- a common understanding of a data-centric model (leading to efficient re-design);
- support for frequency-domain, deterministic, and time-domain analyses, depending on the needs of the engineer;
- the possibility to easily make both small and large design changes (concept modeling paired with powerful mesh editing and refinement);
- close interaction between hydrodynamic and structural analysis.
The rote work and frustration of offshore engineers in the 1980s is well illustrated by the process of making a finite element model of a gravity-based structure. Engineers had to key in the values for every single node, finite element, and load; with models of around 100,000 nodes, this obviously took a very long time. There was one advantage, though: engineers had extreme control of the finite element mesh used in the analysis. Another source of much frustration was that it was extremely difficult to make changes. If something had to change - for example, putting a new arc in the mesh - almost everything had to be redone, and that took an enormous amount of time.
In the late 1980s, engineers could finally visualize the numbers on a screen. In the 1990s, geometric modeling and meshing algorithms enabled users to define the mesh within the program. From then on, if somebody dropped by and wanted to change something, only that part had to be done over again.
Another breakthrough was the ability to automatically apply hydrodynamic loads to a structural model. A huge amount of labor-intensive work was eliminated while the data quality increased.
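To make the idea concrete, here is a minimal sketch of that kind of load transfer: pressures computed on a hydrodynamic panel model are mapped onto the nodes of a structural mesh. The Python below is purely illustrative; the function names, the nearest-panel lookup, and the fixed tributary area are assumptions made for this example, not the Sesam implementation.

```python
# Illustrative sketch only: map panel-model pressures onto structural nodes.
# Real tools use far more sophisticated interpolation and directionality.

def nearest_panel_pressure(node, panels):
    """Return the pressure of the panel whose centroid is closest to the node."""
    def dist2(a, b):
        return sum((a[i] - b[i]) ** 2 for i in range(3))
    closest = min(panels, key=lambda p: dist2(node, p["centroid"]))
    return closest["pressure"]

def transfer_loads(structural_nodes, panels, tributary_area):
    """Build a nodal load for each structural node: pressure times the
    (assumed constant) tributary area, in newtons."""
    return {node_id: nearest_panel_pressure(xyz, panels) * tributary_area
            for node_id, xyz in structural_nodes.items()}

# Example: two panels at different depths, three structural nodes.
panels = [
    {"centroid": (0.0, 0.0, -5.0), "pressure": 51.0e3},    # Pa
    {"centroid": (0.0, 0.0, -10.0), "pressure": 101.0e3},  # Pa
]
nodes = {1: (0.1, 0.0, -4.8), 2: (0.0, 0.2, -9.9), 3: (0.0, 0.0, -7.4)}
print(transfer_loads(nodes, panels, tributary_area=0.5))
```

The point of automating exactly this mapping step is that the hand transcription of thousands of pressures, and the input mistakes that came with it, disappears.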
Then, in the 2000s, DNV GL was one of the first to develop commercial software that automatically found and maintained the connectivity between objects - the company calls it concept modeling. This meant that new structural parts could be inserted, or existing ones moved, without remodeling. Another benefit is that the same model is used for a variety of analyses, which allows the model to grow over time. The panel model (the hydrodynamic analysis model) and the strength analysis model are now based on the same concept model, and much has been gained from that. Today, making changes in the mesh - editing it or refining it automatically - is simple; fine or coarse, the user or the client can make the call. The loads are transferred automatically.
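The core of the idea can be shown with a toy example: model objects hold shared references to the same topology, so a single design change propagates to everything connected to it. This is a simplified illustration of the principle only, not DNV GL's implementation, and the class names are invented for the sketch.

```python
# Toy illustration of concept-model connectivity: beams share Point
# objects, so moving one point updates every connected member.

class Point:
    def __init__(self, x, y, z):
        self.xyz = (x, y, z)

class Beam:
    def __init__(self, start: Point, end: Point):
        self.start, self.end = start, end   # shared references, not copies

    def length(self):
        return sum((a - b) ** 2
                   for a, b in zip(self.start.xyz, self.end.xyz)) ** 0.5

# Two beams share the corner point p2.
p1, p2, p3 = Point(0, 0, 0), Point(10, 0, 0), Point(10, 5, 0)
b1, b2 = Beam(p1, p2), Beam(p2, p3)

p2.xyz = (12, 0, 0)                  # one design change...
print(b1.length(), b2.length())      # ...and both connected beams follow
```

Because connectivity lives in the model rather than in the engineer's head, there is nothing to redefine by hand when the geometry moves.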
The greatest thing is that parameters can be changed and the consequences seen at once. Users can change the section properties, code check parameters, loads, or anything else; the program redesigns at the click of a mouse, and new code check results are available in a fraction of the time.
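A hedged sketch of that re-design loop is shown below: change one parameter, re-run the check, and see the consequence immediately. The utilization formula is a deliberately simplified stand-in for a real design-code check, and all names and values are assumptions for the example.

```python
# Simplified parametric code-check loop; not a real design code.

def code_check_utilization(axial_force_n, section_area_m2, yield_stress_pa=355e6):
    """Utilization = actual axial stress / yield stress; must stay below 1.0."""
    return (axial_force_n / section_area_m2) / yield_stress_pa

load = 2.0e6  # axial force in newtons
for area in (40e-4, 60e-4, 80e-4):  # candidate cross-section areas in m^2
    u = code_check_utilization(load, area)
    status = "OK" if u < 1.0 else "FAIL"
    print(f"A = {area * 1e4:.0f} cm^2 -> utilization {u:.2f} {status}")
```

In the 1980s workflow, each of those three candidates would have meant redefining inputs across several external programs and re-running everything by hand.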
Another area that has improved in leaps and bounds is automatic graphical reporting. It is a dream scenario: a report with images now takes only a couple of minutes to create, where engineers in the 1980s spent several hundred hours on the same task.
All in all, the amount of work has been reduced by 50-75%, if not more; for shell structures (structures built from surfaces and stiffeners), it can be 90%. Ships, barges, offshore floaters, and concrete gravity-based structures are typical shell structures, while jackets, offshore wind towers, and jackups are typical beam models. The trend, however, is that details of beam models are modeled using shell techniques to compute more precise results. The results are also much more accurate, of course, because it is easy to run multiple analyses.
The engineering is still based on the same principles of physics, but today engineers can run more “what if” scenarios thanks to modern personal computers with graphical capabilities and powerful processors.
In the future, the focus will be on engineering efficiency and smart reporting. This covers, among other things, usability (ease of use), fitness for purpose (completeness of functionality), openness (ease of importing data from other sources), speed (e.g. cloud solutions), and the ability to summarize the results from multiple analyses into one design report.
Looking back, there is satisfaction in seeing all that has been accomplished. Software development has come far in helping the offshore engineering community work more efficiently and with more confidence. It has been a great adventure.
Ole Jan Nekstad
Sesam Product Director
DNV GL - Software