The ocean data projects improving shipping safety and combating climate change
Data fuels our world, feeding the algorithms that power our connected lives. But when it comes to oceans, we are sometimes flying blind.
We may live on the blue planet, but its vastness and complex interactions make it hard to fully understand. Scientists are still perplexed by many ocean phenomena, and they know these data blind spots could leave us vulnerable as climate change continues to deliver shock after shock.
The past year has seen the ocean break temperature records every single day, and, according to one analysis of data from the EU’s Copernicus Climate Change Service, nearly 50 days beat the existing highs for the time of year by the largest margin in the satellite era.
This is part of an increasing and accelerating trend in ocean heat, one that should concern us all. Super-heated oceans are devastating in their short-term impacts on marine life, but the longer-term effects loom just as large: ice-sheet melting and deep-ocean warming are expected to keep fuelling sea level rise for centuries to come.
Understanding these processes is one reason why France-based Mercator Ocean International, recently made an intergovernmental organisation reporting to the UN, is building a ‘digital twin’ of the ocean.
This high-resolution, multi-dimensional virtual model of the ocean will encompass its physical, chemical, biological and socio-economic dimensions. It will be fed by continuous real-time and historical observations from thousands of sensors across the world’s oceans and numerous satellites, and powered by artificial intelligence (AI), machine learning and supercomputing.
This digital twin will simulate scenarios, monitor and analyse impacts and help design solutions, not just to protect marine ecosystems and mitigate climate change, but also to support the blue economy on which so many depend for food, livelihoods, energy and transportation.
Data capture: floats, buoys, and cables
Capturing the datasets to feed these data-hungry models requires vast networks of sensors. One of the most famous is the Argo programme of free-drifting profiling floats, which measure temperature and salinity in the upper 2,000m of the ocean.
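Those measurements are openly available, and a few lines of code are enough to start exploring them. The sketch below uses the open-source Python library argopy to pull profiles for an illustrative region and date range; the box bounds, dates and the box-mean statistic are assumptions chosen for demonstration, not anything specified by the Argo programme.

```python
# A minimal sketch of fetching Argo float profiles with the open-source
# argopy library (pip install argopy). Region bounds and dates are
# illustrative assumptions, not values from the article.
from argopy import DataFetcher

# Box format: [lon_min, lon_max, lat_min, lat_max,
#              pressure_min (dbar), pressure_max (dbar), start, end]
ds = (
    DataFetcher()
    .region([-75, -45, 20, 30, 0, 2000, "2023-01", "2023-06"])
    .to_xarray()
)

# Each point carries pressure (a proxy for depth), temperature and
# salinity; a quick sanity check is the box-mean temperature.
print(float(ds["TEMP"].mean()), "degC (box-mean temperature)")
```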
Researchers are also leveraging the million-plus kilometres of telecommunications cables that traverse the ocean. By incorporating environmental monitoring sensors into transoceanic cable systems, they plan to monitor ocean heat content, circulation and sea level rise, provide early warning for earthquakes and tsunamis, and detect seismic activity for studying Earth’s structure and related hazards.
This is important because ours is a working ocean, and data is essential to improving maritime safety and efficiency for shipping, offshore energy and aquaculture operators. Hydrosphere’s low-cost, lightweight Aquanode sensor, for example, which can be integrated into a wave buoy or retrofitted onto an existing platform, lets users view wave data in near real time, set alerts and create automated messages.
Operators can then factor current wave conditions into their everyday risk assessments, improving safety and reducing how often operations such as a pilot boarding a vessel, service personnel maintaining a wind farm or a ferry docking have to be abandoned because of wave conditions. In these busy working environments it’s essential to optimise uptime and minimise wasted trips, and the data to make the right operational call is what makes that possible.
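As a sketch of what that operational call might look like when automated, the snippet below checks a wave reading against simple go/no-go limits. The field names and thresholds are hypothetical illustrations, not Hydrosphere’s actual Aquanode interface.

```python
# A sketch of an automated go/no-go wave alert. The data structure,
# field names and limits are hypothetical assumptions for illustration.
from dataclasses import dataclass

@dataclass
class WaveReading:
    significant_height_m: float  # Hs: mean height of the largest third of waves
    peak_period_s: float         # period of the most energetic waves

def pilot_boarding_go(reading: WaveReading,
                      max_hs_m: float = 2.0,
                      max_period_s: float = 10.0) -> bool:
    """Return True if conditions are within the (assumed) operating limits."""
    return (reading.significant_height_m <= max_hs_m
            and reading.peak_period_s <= max_period_s)

# Latest buoy reading (stand-in values):
latest = WaveReading(significant_height_m=1.6, peak_period_s=8.5)
print("GO" if pilot_boarding_go(latest) else "ABORT: wave limits exceeded")
```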
“The challenge in the ocean is accessibility,” says Jeff Gibson, Sales Director at Hydrosphere, pointing out that solutions must be robust enough to survive all conditions. “A failure will be expensive, the loss of data longer than [for] an equivalent system onshore, and the financial costs much higher.”
Putting data to hard work
The good news is that technologies are converging to make maritime data collection, storage and analysis ever quicker, cheaper and more resilient. The cost of data storage has fallen dramatically, while the capacity of hard drives and cloud storage has grown. Even so, challenges remain in handling this mass of raw data, including the limited bandwidth for uploading and downloading data at sea.
“This is something that we have not resolved in the world of bioacoustics (the study of animal sounds), often resulting in the requirement to ship data to its destination,” explains Liz Ferguson, CEO and founder of California-based Ocean Science Analytics.
AI technologies can quickly unlock new insights from these datasets, although Ferguson says the ocean data community needs to get better at sharing them. Too often, she says, the outcome is “a static report which quickly becomes buried in a cavern of digital documents.”
“As a society, [we] do not prioritise digital data visualisation and interpretation tools for sharing information with the general public,” she says, adding that this work is also important for sharing data across multidisciplinary teams. “There is an abundance of potential for unlocking previously unevaluated environmental and ecological conditions in our oceans.”
While accessible, community-based programming languages like R and Python make it ever easier to democratise access to ocean data, a barrier remains between ocean scientists within a specific domain and the broader scientific community and general public. Despite funding constraints, Ferguson urges scientists to lean into innovation when it comes to science communication on such important topics.
“Too often ‘report’, ‘publication’, or ‘conference presentation’ is the box checked next to the outreach box,” Ferguson states. “There is so much potential with cloud storage of data products and data visualisation and analysis dashboards to continually engage a global audience.”
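As a flavour of how lightweight such a dashboard can be, the sketch below uses the open-source Plotly Dash library to serve an interactive chart from a hypothetical file of monthly acoustic detection counts; the file name and its columns are assumptions for illustration.

```python
# A minimal sketch of a public-facing data dashboard using Plotly Dash
# (pip install dash). detections.csv is hypothetical, with assumed
# columns: month, species, detections.
import pandas as pd
import plotly.express as px
from dash import Dash, dcc, html

df = pd.read_csv("detections.csv")

app = Dash(__name__)
app.layout = html.Div([
    html.H2("Passive acoustic detections by month"),
    dcc.Graph(figure=px.line(df, x="month", y="detections", color="species")),
])

if __name__ == "__main__":
    app.run(debug=True)  # on older Dash versions: app.run_server(debug=True)
```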
Deep learning in the deep blue
For Ferguson, who specialises in marine mammal bioacoustics, this is an exciting time to work in ocean data. “Within my discipline, there is an increase in accessible, long-term passive acoustic data through hydrophones on cabled arrays and in sanctuaries,” she says, adding that she’s excited by the potential of accessible deep learning methods and tools for acoustic data.
“Deep learning networks provide a means of efficiently and reliably processing raw audio to procure information on the spatiotemporal patterns (connecting time and space) of marine mammals within a region, but it is still early in the development of those methods for multiple species,” Ferguson concludes.
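To make that pipeline concrete, the sketch below shows the general shape of such a system in Python: raw hydrophone audio converted to a mel spectrogram with torchaudio, then scored by a small convolutional network. The architecture, sample rate and species count are illustrative assumptions, not a published detector.

```python
# A sketch of a spectrogram-based deep learning classifier for passive
# acoustic data. The network, sample rate and class count are illustrative
# assumptions, not any specific published marine mammal detector.
import torch
import torch.nn as nn
import torchaudio

N_SPECIES = 4  # hypothetical number of species / call types

# Convert raw audio to a mel spectrogram (a time-frequency image).
mel = torchaudio.transforms.MelSpectrogram(
    sample_rate=48_000, n_fft=1024, hop_length=512, n_mels=64
)

# A small CNN that scores each spectrogram against the species classes.
classifier = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, N_SPECIES),
)

# One 10-second hydrophone clip at 48 kHz (random stand-in for real audio):
clip = torch.randn(1, 1, 48_000 * 10)
spec = mel(clip)            # -> (batch, 1, n_mels, time_frames)
logits = classifier(spec)   # -> (batch, N_SPECIES) class scores
print(logits.shape)
```

In practice the network would be trained on labelled clips, and its per-clip detections aggregated over time and hydrophone location to recover the spatiotemporal patterns Ferguson describes.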
For more information about the topics mentioned in this article, join IMarEST’s Operational Oceanography Special Interest Group or Marine Mammals Special Interest Group.
Tell us what you think about this article by joining the discussion on IMarEST Connect.
Image: Aquanode sensor being used on the ocean; credit: Hydrosphere.