Bathymetric data volumes are increasing rapidly. New developments in swathe technology are producing denser survey data, and initiatives such as Crowdsourced Bathymetry and Seabed 2030 are widening the scope of collection. These are positive developments, but they present challenges around data volumes, processing, quality assurance and validation. Inevitably, most raw bathymetric datasets contain some amount of noise that must be dealt with. Although recent advances in swathe technology have reduced the noise present in collected data, it still exists and remains a problem.
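To illustrate the kind of noise a cleansing step must handle, the sketch below flags individual soundings that deviate sharply from their neighbours, using a simple median/MAD despike. This is a generic, hypothetical example only, not the UKHO's AI method; the window size and threshold are arbitrary assumptions.

```python
def flag_spikes(depths, window=5, threshold=3.5):
    """Flag soundings whose deviation from the local median exceeds
    `threshold` robust standard deviations (MAD-based). Illustrative
    sketch only; window and threshold are arbitrary assumptions."""
    flagged = []
    half = window // 2
    for i in range(len(depths)):
        lo, hi = max(0, i - half), min(len(depths), i + half + 1)
        neighbours = sorted(depths[lo:hi])
        median = neighbours[len(neighbours) // 2]
        # Median absolute deviation, scaled to approximate a standard deviation
        mad = sorted(abs(d - median) for d in neighbours)[len(neighbours) // 2]
        robust_sd = 1.4826 * mad or 1e-9  # avoid division by zero on flat data
        flagged.append(abs(depths[i] - median) / robust_sd > threshold)
    return flagged

# Example: one noisy return among otherwise plausible soundings (metres)
soundings = [52.1, 52.3, 52.2, 12.7, 52.4, 52.3, 52.5]
print(flag_spikes(soundings))
# → [False, False, False, True, False, False, False]
```

Rule-based filters like this catch isolated spikes but struggle with structured noise and risk removing genuine seabed features, which is part of the motivation for the machine-learning approaches the presentation describes.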

To meet these challenges, teams at the UKHO have spent several years researching and developing Artificial Intelligence (AI) methods for bathymetric data cleansing. It is hoped that these techniques will allow the UKHO to work with the increasing volumes of incoming data more quickly and efficiently, dramatically reducing the time and resources required to take raw data through to a bathymetric data product.

This presentation will cover the work done to date, our collaboration with industry, and plans for further development.

Andrew Talbot

Andy Talbot joined the UKHO in 2004 after nine years in the offshore survey industry. He works at the UKHO as a subject matter expert on bathymetry. His work involves the continuing development of procedures and best practice, ensuring the UKHO uses the latest developments in the hydrographic industry while keeping an eye on emerging technologies.

He is involved in key decisions on the collection and handling of bathymetric data, and provides training and advice both internally and externally, with a view to understanding and improving data quality.