In a recent Time article on efforts to humanize robotics, the author leads with the declaration, "Let me correct an impression you may have: robots are pretty much idiots."
While certainly an oversimplified generalization, the author's assertion nevertheless carries an underlying truth that we would do well to heed as our industry advances efforts to automate key elements of the drilling process: The need for a human to lead, built on core competencies, must never be devalued in an automated environment.
As a former military combat pilot and flight instructor, this author has a profound appreciation for the incalculable contribution to safety that automated control systems have brought to a high-risk industry.
The offshore drilling sector, likewise, has made enormous and sorely needed advances in automating pipe handling, tripping, connections, and other repetitive processes. These advances not only remove personnel from the "firing line" where injuries, or worse, are most likely to occur, but also help eliminate efficiency-robbing invisible lost time. Despite the tremendous HSE and efficiency benefits these and other advanced automation technologies bring to the table, we must not lose sight of the fact that no machine is perfect. As with human error, we frequently encounter system error, and the two often go hand-in-hand.
Likewise, it is important to keep in mind that humans write the control algorithms, humans build the architecture, and it falls on humans to be absolutely familiar with the capabilities, and more importantly, the limitations of the automated systems they engineer and oversee. The dilemma, of course, is overcoming the natural human tendency to become over-confident in the efficacy of these systems, by building an organization that maintains awareness of, recognizes, and is prepared to react to unplanned events.
Returning to the aviation analogy, airline safety experts, as well as the US National Transportation Safety Board (NTSB) and other federal agencies, identified over-reliance on automated systems as a major contributor in at least two deadly accidents in 2013.
According to the Associated Press, investigators specifically cited incorrect responses to in-flight warnings caused by pre-flight programming errors, as well as failure to recognize and react appropriately to the frequent computer mode changes that occur during the course of a flight. A Federal Aviation Administration (FAA) study ranks so-called "pilot mode awareness" - or, more correctly, the lack thereof - as one of the most common automation-related causal factors in accident and incident reports. Michael Barr, a former Air Force pilot turned safety investigator and instructor, perhaps best summarized how automation can lead to risky over-dependence, telling the AP that "once you see you're not needed, you tune out."
Consequently, it can be argued that the steady drive to automation strengthens, rather than diminishes, the need to foster and nurture an organizational culture that puts a premium on leadership skills, overall competency and, above all, human interaction. In other words, ever-advancing automation makes it imperative to instill a company-wide behavior at the task level that emphasizes total "crew awareness," rather than one that relies simply on setting an automated control mode, sitting back, and depending on the system to do its thing with no hiccups. It goes without saying that conditions during drilling, as with flying, are ever-changing. In these circumstances, unexpected events can happen that automation alone is unable to overcome. Just as the human pilot must be poised to react intuitively when a glitch in the aircraft's fly-by-wire system causes the plane to pitch nose down, no automated process can replace the core competencies that allow the human driller to respond instantly and instinctively when taking a kick.
Thus, the high-reliability and steadily automated world in which we operate accentuates more than ever the need for a competency-centric organization that promotes leadership and strong team behaviors - again at the task level - over unyielding allegiance to a plan. After all, a prerequisite of continuous improvement is for every member of an organization to learn from experience and past mistakes, a process that falls strictly within the human domain. Ours is an industry where multi-disciplined groups must constantly manage safety, efficiency and economic risks, making it essential that we cultivate a team behavior where no one operates wholly on autopilot.
The bottom line is that regardless of the level of sophistication, automated systems must be regarded as valuable aids to safe and efficient operations, not an unattended cure-all. The awareness and instinct that come with human leadership and team behavior must remain prominent in the automated environment. At the end of the day, competency will always provide the ideal antidote to complacency.
Yarko "JJ" Sos