Accountability in an autonomous era
Neil Salter, chair of IMarEST’s Maritime Autonomous Surface Ships (MASS) Special Interest Group, and Ross Macfarlane from MASSPeople reflect on unresolved questions surrounding accountability in incidents involving autonomous vessels.
“Under maritime law, at the moment, it’s the master of the vessel that has ultimate responsibility. There can be others responsible, say if a member of the crew hasn’t been doing their job and that led to an incident, [but] it always comes back to the master, as they are the accountable figure on location,” opens Macfarlane.
For semi-autonomous vessels with a master on board, this system could stay in place, but where there is no master, Macfarlane suggests looking to the aviation industry.
“There is this role of an accountable manager. So, if an incident happens anywhere in the world, that person is responsible to the regulatory authorities.”
For vessels operating from a remote operations centre, Macfarlane says this role could lie with someone at the centre. However, a single vessel may not be operated by just one centre throughout its voyage.
“There have been other possibilities proposed, such as a three-fold accreditation process: the flag state, the remote operations centre, and then the vessel itself, which would say that your operation centre could control A, B, C types of vessel in X, Y, and Z countries.
“Say you were sailing from country X to country Z, but the operations centre used in country Y is not certified in country Z; somewhere along the line, you will have to hand over responsibility to a different centre that has permission to control that type of vessel in country Z waters,” explains Macfarlane.
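The certification and handover scheme Macfarlane outlines can be sketched in code. This is a purely illustrative model: the class names, the shape of the certification data, and the handover rule are all assumptions made for the sake of the example, not any real regulatory system or software.

```python
from dataclasses import dataclass, field


@dataclass
class OperationsCentre:
    """A hypothetical remote operations centre and the approvals it holds."""
    name: str
    # Maps a coastal state to the set of vessel types the centre is
    # certified to control in that state's waters (illustrative only).
    certifications: dict = field(default_factory=dict)

    def can_control(self, vessel_type: str, country: str) -> bool:
        return vessel_type in self.certifications.get(country, set())


def plan_handovers(route, vessel_type, centres):
    """For each country on the route, assign a certified centre,
    handing over control whenever the current centre lacks certification."""
    plan, current = [], None
    for country in route:
        if current and current.can_control(vessel_type, country):
            plan.append((country, current.name))
            continue
        # Hand over to the first centre certified for this leg.
        eligible = [c for c in centres if c.can_control(vessel_type, country)]
        if not eligible:
            raise ValueError(f"No centre certified for {vessel_type} in {country}")
        current = eligible[0]
        plan.append((country, current.name))
    return plan


centre_y = OperationsCentre("Centre Y", {"X": {"cargo"}, "Y": {"cargo"}})
centre_z = OperationsCentre("Centre Z", {"Z": {"cargo"}})

# Sailing from country X through Y to Z: Centre Y is not certified in Z,
# so responsibility is handed over to Centre Z for the final leg.
print(plan_handovers(["X", "Y", "Z"], "cargo", [centre_y, centre_z]))
# -> [('X', 'Centre Y'), ('Y', 'Centre Y'), ('Z', 'Centre Z')]
```

In practice the open questions are less about the lookup itself than about who maintains such a register and who is accountable at the moment of handover.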
Another complex issue surrounds responsibility for the algorithms that lie behind the automation. Salter says it could be that “insurance companies would seek to recover the outlay from the writers of the algorithms”.
Macfarlane notes that the responsibility of algorithm writers is “an ethical question”.
“Are they aware of what they’re signing up to when taking on that job? Or when you have a system of systems, is it the company that puts all the systems together who is responsible?” Macfarlane questions.
There could also be unintended or unexpected decisions. “There's going to be certain decisions that theoretically are logically sound, but [the system] doesn't have that depth of thought to weigh up the full consequences of its decision,” says Macfarlane, reflecting on variations of the ‘trolley problem’, which poses the ethical dilemma of whether it is better to sacrifice one person in order to save many others.
Greater transparency on what happened
“When an incident occurs, two questions are asked of the parties involved,” states Salter. “What information was available, and what did they do with that information? These are fundamental questions that won’t change.”
What could change, Salter says, is that we gain greater transparency on the circumstances leading up to the incident and exactly what decisions were made and when.
“The sort of information that [investigators] are going to have is hydrographic data, information on shipping lanes, the closest point of approach of other vessels, how the ship was operated, [and] how the algorithms processed all the information. I am sure there will be a legislative requirement that this sort of information will be documented and accessible in the vessel records.”
Salter notes that access to such detailed information will also help improve vessel operation and safety.
For example, should it be determined there is an issue with one of the underlying algorithms, “We could see an immediate block on any autonomous ship using that same algorithm until the issue is resolved,” Salter says, noting this is the approach adopted by the aviation industry.
If you want to explore these and other questions surrounding autonomous surface ships, you can join IMarEST’s MASS SIG or visit MASSPeople.org.
Main image: Thales AVS Halcyon unmanned surface vehicle in Plymouth, UK; credit: Shutterstock