Human element on autonomous vessels
Who’s responsible when nobody’s on board – and what needs to happen to transition to that point safely?
As more remotely controlled prototypes take to the sea, voices from the Maritime Autonomous Surface Ships Special Interest Group (MASSIG) and Human Element Special Interest Group (HESIG) have come together for Interactions to highlight the human element challenges ahead for crews working with autonomous vessels.
“We have little idea how safe operational, rather than prototype, uncrewed ships will be, especially when yards start competing on cost,” says IMarEST President-elect and former HESIG chair Martin Shaw.
“The safety argument has been based on comparing the actual global safety record of the existing industry with the safety of autonomous ships viewed in the same environment. The fact is that certain trades and areas contribute a large percentage of those accidents and, for capital cost reasons, those will be the last to change to uncrewed ships. Indeed, the safest ships may well be the first ones to go uncrewed.”
The blame game
Shaw believes too much is made of the percentage of accidents caused by human error aboard ships, and not enough focus is placed on the human error made by professionals onshore. “The rest of us have to start taking some responsibility. When you take humans off ships, 100% of all accidents will still be caused by human error - embodied within the ship by the designers, builders, owners, and yes indeed computer programmers - but there will be no-one on board to recover the situation or even blame.”
Shaw projects that the pace of development will be slow. “Talking to shipowners, there is little appetite for uncrewed ships and the extra cost involved. The commercialisation of autonomy in the long-haul sector is a long way off. Ports need to be ready and will need to spend large sums of money preparing for the berth-to-berth operation of autonomous ships. And deep-sea commercialisation means funding the replacement of 60,000-70,000 long-haul ships at a time when owners will be funding zero-carbon newbuilds.” Shaw suspects “a gradual evolution with each ‘chunk’ of automation being justified and assimilated is more likely to get investment rather than the ‘big bang’ totally uncrewed approach.”
HESIG co-chair Steve Palmer believes it’s easier to envision what will take place by looking to air and road transportation autonomy. “We’ll make the same shift as autonomous cars and driverless trucks on motorways. A hybrid state will continue for some time with a traditionally trained person using their knowledge and experience on board to monitor.”
Palmer likens this hybrid phase to air travel: “In aviation, an engineer used to sit behind the pilot or co-pilot, checking systems. But as technology and reliability advanced, that position wasn’t required as the pilot and co-pilot were able to intervene. We will have to be in that state. But the next generation won’t have the older skills – and the technology may not even allow for such intervention – and that’s when the real shift will take place.”
Re-skilling and trust in technology
“Technology is moving so fast,” says Palmer, “and we still train our staff in very traditional skills, as we have done for half a century or more. If you take a mobile phone engineer from 20 years ago and put them in today’s environment, they wouldn’t recognise much. It’s the same today and we don’t have a clear end point – so how can we train people?”
It takes a generation to grow up with the equipment and have that trust in technology. Palmer says it’s a frequent topic of conversation at SIG level: “A lot of technology is already very demanding for staff on board. Take a chief engineer who commenced their career over 30 years ago, when none of the current automation existed. They may rely more on the younger people coming through – and the specialists that come on board to repair or intervene – which is quite frustrating for these trained engineers and professionals.”
Cyber security and piracy
Crew working in waters frequented by pirates may look forward to operating remotely controlled vessels, but Palmer suggests there will still be piracy challenges to tackle onshore.
“The golden prize for any hacker would be to take control of an asset, with cargo worth hundreds of millions, even if they don’t know what the contents are,” says Palmer. “You’re also making it possible for the equivalent of a pirate attack to take place from a bedroom. But you do remove crew members being held captive and held to ransom from the equation.”
“Steering a ship through a piracy area, the captain will react to any threat he or she sees, or even call on a naval vessel to assist,” he says. But hackers who learn the algorithm governing an autonomous vessel, knowing for example that it will avoid a collision with another craft above all else, will be able to manipulate the vessel and send it where they want.
Reacting to the unimaginable is the part that’s impossible to automate. “Everything in automation is based on what we know and expect,” says Palmer, “but there will be scenarios based on something that nobody could imagine happening. Captain Sully was able to land a plane on the Hudson River,” he says. The equipment was never designed to do this, as the engineers never imagined such a scenario. “That’s always going to be that thing in the back of your mind – if the unimaginable happens…”
Training
MASSIG chair Gordon Meadow says the speed of adoption depends on reskilling and training. “From our perspective, to benefit fully, maritime must adopt, reskill and scale the use of both new and existing technology by becoming champions of human-centred digital transformation.
“Our sole focus should not be to replace humans in certain tasks, but to empower them with new skills to work alongside machines, performing shared tasks and using shared decision-making.
“The application of new technologies should be consistent with legal and regulatory requirements, but also aligned to ethical principles that ensure that implementation avoids unintended harm.”
AI or IA
Meadow highlights the debate taking place on the ethics of safety, with differences emerging between artificial intelligence (AI) and intelligent automation (IA). He says that MASSIG, and he personally, is very much led by intelligent automation, and that he has recently submitted a joint white paper on the matter with Dan Vincent, Head of Future Training for the Royal Navy at I2TEC.
Read about the IMarEST’s recent role in the revision to IMO’s Human Element checklist.
Tune into Gordon Meadow’s ‘Remote and autonomous vessels – putting the human back in the headlines’ from 2019 (members only)
Read Remote and Autonomous Shipping – putting the Human Back in the Headlines I and Remote and Autonomous Shipping – putting the Human Back in the Headlines II.