PoS - Proceedings of Science
Volume 429 - The 6th International Workshop on Deep Learning in Computational Physics (DLCP2022) - Track 3. Machine Learning in Natural Sciences
Underwater biotope mapping: automatic processing of underwater video data
O.O. Iakushkin*, E. Pavlova, A. Lavrova, E. Pen, O. Sedova, V. Polovkov, N. Shabalin, T. Yana and F.H. Anna
Pre-published on: November 14, 2022
Published on: December 06, 2022
Abstract
The task of analysing the inhabitants of the underwater world arises in a wide range of applied problems: construction, fishing, and mining. Currently, this task is performed on an industrial scale through rigorous manual review by human experts in underwater life. In this work, we present a tool we have created that significantly reduces the time a person spends on video analysis. Our technology offloads the painstaking video review to AI, leaving experts only to verify the accuracy of the results. To achieve this, we have developed an observation pipeline that divides the video into frames; assesses their degree of noise and blurriness; performs corrections by increasing resolution; counts the animals in each frame; builds a report on the content of the video; and displays the obtained biotope data on a map. This dramatically reduces the time spent analysing underwater video data.
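The frame-level pipeline described above can be sketched roughly as follows; the Laplacian-variance blur measure, the threshold value, and the `enhance` and `count_animals` callables (standing in for the resolution-increase and detection networks) are illustrative assumptions, not the authors' actual components.

```python
# Minimal sketch of the frame-processing pipeline, assuming OpenCV for I/O
# and placeholder callables for the enhancement and detection networks.
import cv2
import numpy as np

BLUR_THRESHOLD = 100.0  # assumed cutoff on Laplacian variance


def frame_quality(frame: np.ndarray) -> float:
    """Estimate sharpness as the variance of the Laplacian (higher = sharper)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()


def process_video(path: str, enhance, count_animals):
    """Split a video into frames, enhance low-quality ones, and count animals."""
    report = []
    cap = cv2.VideoCapture(path)
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        sharpness = frame_quality(frame)
        if sharpness < BLUR_THRESHOLD:
            frame = enhance(frame)        # e.g. a super-resolution / denoising network
        counts = count_animals(frame)     # e.g. a detector returning {species: n}
        report.append({"frame": frame_idx, "sharpness": sharpness, "counts": counts})
        frame_idx += 1
    cap.release()
    return report
```

The per-frame records in `report` are what a summary document or map overlay would be built from; the expert then only reviews this report rather than the full video.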
We also considered the task of biotope mass calculation. To achieve this, we correlated the results of a few-shot learning segmentation model with point cloud data. This gave us the biotope's surface coverage area, from which we approximated its volume. Such an estimate is helpful for precise area mapping and surveillance.
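One simple way to turn a segmentation mask and aligned depth data into an area and volume estimate is sketched below; the pinhole-camera footprint approximation, the `focal_px` parameter, and the flat base plane are assumptions for illustration rather than the authors' exact procedure.

```python
# Rough area / volume estimate from a segmentation mask and per-pixel depth,
# under simple pinhole-camera assumptions.
import numpy as np


def coverage_area(mask: np.ndarray, depth: np.ndarray, focal_px: float) -> float:
    """Approximate the real-world area (m^2) covered by the segmented biotope.

    mask     : boolean HxW segmentation of the biotope in the frame
    depth    : HxW per-pixel distance to the surface in metres (from the point cloud)
    focal_px : camera focal length in pixels
    """
    # For a pinhole camera, one pixel at depth z covers roughly (z / f)^2 m^2.
    z = depth[mask]
    return float(np.sum((z / focal_px) ** 2))


def approx_volume(mask: np.ndarray, depth: np.ndarray,
                  focal_px: float, base_depth: float) -> float:
    """Approximate volume as coverage area times mean thickness above a base plane."""
    area = coverage_area(mask, depth, focal_px)
    thickness = np.clip(base_depth - depth[mask], 0.0, None)  # metres above the base
    return area * float(thickness.mean())
```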

Thus, this paper presents a system that allows detailed underwater biotope mapping through automatic processing of single-camera underwater video data. To achieve this, we combine a set of deep neural networks working in tandem into a single pipeline.
DOI: https://doi.org/10.22323/1.429.0024

Open Access
Copyright owned by the author(s) under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.