EDITORIAL article

Front. Mar. Sci., 14 August 2023
Sec. Ocean Observation
Volume 10 - 2023 | https://doi.org/10.3389/fmars.2023.1256183

Editorial: Optics and machine vision for marine observation

  • 1Department of Ocean Engineering, Ocean College, Zhejiang University, Zhoushan, Zhejiang, China
  • 2Shenzhen International Graduate School, Tsinghua University, Shenzhen, China
  • 3Department of Intelligent Mechatronics Engineering, Sejong University, Seoul, Republic of Korea
  • 4Department of Civil and Environmental Engineering, University of Houston, Houston, TX, United States

Aquatic ecosystems cover roughly 71% of the planet’s surface and harbor numerous life forms along with an abundance of organic and inorganic resources (Issac and Kandasubramanian, 2021). Scientists and researchers have long been enthralled by the immense and enigmatic expanse of marine ecosystems. The ocean’s intricate ecosystems, diverse marine life, and the profound impact they have on our planet make understanding and monitoring these environments crucial (Maximenko et al., 2019). Both anthropogenic and natural activities have increased significantly in recent years, causing ecological problems in the marine environment (Huang et al., 2023). These disturbances call for rapid monitoring and mitigation mechanisms to address and limit the resulting ecological damage. As a result, the scientific community has been compelled to explore numerous routes to push the limits of marine observation.

Underwater ecosystems have largely been shrouded in darkness due to light attenuation, hindering comprehensive observation and data collection. However, improvements in optics have fundamentally altered our capacity to perceive the underwater environment. Cutting-edge imaging technologies such as underwater cameras, spectrometers, and hyperspectral sensors have enabled high-resolution image capture, video recording, and spectral data acquisition (Song et al., 2021a; Shahani et al., 2021). By studying species’ behavior, distribution, and interactions, scientists can reveal hidden ecosystems and explore marine habitats in unprecedented detail.
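For readers unfamiliar with the scale of the problem, the short Python sketch below illustrates the commonly used Beer-Lambert style model of light decay with depth; the diffuse attenuation coefficient is a hypothetical value for moderately turbid coastal water and is not taken from any study in this Research Topic.

```python
import math

def irradiance_at_depth(surface_irradiance, k_d, depth_m):
    """Beer-Lambert style decay of downwelling irradiance with depth.

    surface_irradiance : irradiance just below the surface (arbitrary units)
    k_d                : diffuse attenuation coefficient (1/m), water-type dependent
    depth_m            : depth in meters
    """
    return surface_irradiance * math.exp(-k_d * depth_m)

# Hypothetical coefficient for moderately turbid coastal water (assumption).
K_D = 0.2  # 1/m
for z in (1, 5, 10, 20, 50):
    frac = irradiance_at_depth(1.0, K_D, z)
    print(f"depth {z:>2} m: {frac * 100:5.1f}% of surface light remains")
```

Even under this simplified model, only a few percent of surface light survives beyond a few tens of meters, which is why artificial illumination and image enhancement feature so prominently in underwater imaging work.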

Machine vision techniques used in conjunction with optics have made automated analysis of underwater imagery possible. Deep learning, a subset of machine learning that has revolutionized image processing and pattern recognition, now allows computers to extract complex features and accurately categorize marine organisms. The combination of machine vision systems, optics, and deep learning opens many new possibilities for marine surveillance, offering automation, data analysis, and real-time monitoring among other advantages. Marine species tracking and identification is one of the most notable applications (Chuang et al., 2016). Deep learning algorithms can rapidly analyze massive volumes of underwater imagery and automatically identify and classify aquatic organisms with high accuracy (Song et al., 2020; Song et al., 2021b). These developments are essential for following migration patterns, evaluating the health of marine populations, and spotting potential threats to biodiversity. By reducing the time-consuming and labor-intensive process of manual identification, machine vision and deep learning accelerate research efforts and enable scientists to make informed decisions about conservation measures and policy-making.
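As a purely illustrative sketch of this classification workflow, and not a reproduction of any contributed method, the following Python example fine-tunes an ImageNet-pretrained convolutional network (using PyTorch and torchvision) on a hypothetical folder of labeled underwater images; the dataset path and number of classes are placeholders.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Hypothetical folder of labeled underwater images, one subfolder per species (assumption).
DATA_DIR = "underwater_images/train"
NUM_CLASSES = 10  # placeholder

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
dataset = datasets.ImageFolder(DATA_DIR, transform=transform)
loader = DataLoader(dataset, batch_size=32, shuffle=True)

# Start from an ImageNet-pretrained backbone and replace the classification head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:  # a single epoch shown for brevity
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```

Transfer learning of this kind is a common starting point precisely because labeled underwater imagery is scarce and expensive to annotate.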

In conjunction with machine vision algorithms, remote sensing systems can monitor changes in ocean currents, sea surface temperature, and the spread of harmful algal blooms (Son et al., 2015). Such real-time measurements are crucial for studying climate patterns, forecasting weather events, and reducing the potential impact of natural disasters on coastal communities. The integration of optics, machine vision, and deep learning also facilitates the monitoring of human activities and their effects on marine habitats: machine vision systems can detect potential pollution, illicit fishing, and habitat destruction (Mehdi et al., 2022; Yasir et al., 2023).
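To make the remote sensing idea concrete, the sketch below computes a generic normalized-difference index from two spectral bands and thresholds it to flag candidate bloom pixels; the band arrays, index choice, and threshold are all placeholders for illustration rather than a validated retrieval algorithm.

```python
import numpy as np

def normalized_difference(band_a, band_b):
    """Generic pixel-wise normalized-difference index: (a - b) / (a + b)."""
    band_a = band_a.astype(np.float64)
    band_b = band_b.astype(np.float64)
    return (band_a - band_b) / np.clip(band_a + band_b, 1e-9, None)

# Placeholder reflectance rasters; in practice these would come from a satellite
# product (e.g. two bands straddling the chlorophyll red edge).
nir_like = np.random.rand(512, 512)   # synthetic data, assumption for illustration
red_like = np.random.rand(512, 512)   # synthetic data, assumption for illustration

index = normalized_difference(nir_like, red_like)
bloom_mask = index > 0.1              # hypothetical threshold, site-dependent
print(f"flagged pixels: {bloom_mask.mean() * 100:.1f}%")
```

Operational systems replace the synthetic arrays with calibrated satellite reflectances and often feed such indices, alongside raw bands, into learned classifiers.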

Understanding and maintaining a close watch on the dynamics and health of the oceans depends heavily on marine observation. We can employ machine vision, which focuses on creating algorithms and systems for interpreting visual data, and optics, which deals with the study and manipulation of light, to better observe and understand marine ecosystems. To this end, the Research Topic “Optics and machine vision for marine observation” aims to explore the intersection of optics, machine vision, and deep learning technologies and their applications in making marine observation more effective. It provides a collection of recent findings, developments, and innovative strategies related to underwater sensors, imaging systems, computer vision algorithms, and data analysis techniques that leverage optics and machine vision technologies for various aspects of marine observation. The Research Topic explores the transformative potential of optics and machine vision and their applications to advancing marine observation systems. It comprises 24 articles, collectively representing the contributions of 118 authors (Table 1).

Table 1 Summary of chapters published in this Research Topic.

A wide range of research underpins the development and implementation of optics and machine vision for marine observation, including optical sensors and monitoring systems, image processing, deep learning techniques, deep-sea illumination, and spectral image analysis. Several researchers address underwater monitoring methods based on optical fiber sensing for real-time study of environmental parameters (Liu et al.) and water quality observation based on multi-sensor fusion for early warning of starfish outbreaks (Li et al.). Two studies discuss underwater sensor networks and protocols for efficient data collection in the exploration of underwater resources (Ahmad et al.; Bharany et al.). Numerous papers deliver improved deep learning techniques and applications for underwater object detection, while several studies concentrate on underwater image enhancement in support of algorithm development, validation, and verification. Multiple papers explore deep learning for underwater object detection (fish classes, and organic and inorganic submarine objects: Yan et al.; Khan et al.; hydrothermal plume detection: Wang et al.) and image segmentation (fish: Kim and Park; Haider et al.; Chen, J. et al.). One study proposes an advanced trajectory tracking mechanism for underwater fish, including multi-object detection (Hao et al.). Another proposes and assesses a starvation grading model for fish based on image processing and a CNN that can benefit the fisheries sector (Zheng et al.). For aerial monitoring of coastal areas, a paper presents a CNN-based technique for detecting small objects (Gao et al.). Papers based on spectral technologies address a range of topics, including deep-sea illumination to compensate for light attenuation (Quan et al.), the effects of turbidity on spectral imaging (Song et al.), and spectral-imaging-based deep-sea mineral exploration (Yang, G. et al.). In marine observation, remote sensing provides valuable insights into the state of the marine environment; two papers contribute to ocean remote sensing using hyperspectral imaging and CNNs for the detection of ships (Yasir et al.) and the classification of oil spills (Yang, J. et al.). Several contributions in underwater image processing include image restoration (Ali and Mahmood; color restoration: Hu et al.) and image enhancement (Lai et al.; Zhao et al.; Chen, T. et al.; Deng et al.), a classical baseline for which is sketched below.
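As a point of reference for the enhancement work cited above, the following minimal Python sketch applies a classical baseline (gray-world white balance followed by CLAHE on the lightness channel). It is not the method of any paper in this Research Topic, and the file names are hypothetical.

```python
import cv2
import numpy as np

def simple_underwater_enhance(path_in, path_out):
    """Baseline enhancement: gray-world white balance, then CLAHE on the L channel.

    A classical baseline for illustration only, not a method from the Research Topic.
    """
    img = cv2.imread(path_in)  # BGR image
    if img is None:
        raise FileNotFoundError(path_in)

    # Gray-world white balance: scale each channel so its mean matches the global mean.
    img = img.astype(np.float32)
    means = img.reshape(-1, 3).mean(axis=0)
    img *= means.mean() / np.clip(means, 1e-6, None)
    img = np.clip(img, 0, 255).astype(np.uint8)

    # Contrast-limited adaptive histogram equalization on the L channel (LAB space).
    lab = cv2.cvtColor(img, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    lab = cv2.merge((clahe.apply(l), a, b))
    cv2.imwrite(path_out, cv2.cvtColor(lab, cv2.COLOR_LAB2BGR))

# Hypothetical file names for illustration.
simple_underwater_enhance("raw_underwater.jpg", "enhanced_underwater.jpg")
```

The contributed enhancement and restoration papers go well beyond such baselines, but a simple pipeline like this is often the yardstick against which learned methods are compared.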

The ability to monitor and understand the marine environment has changed dramatically as a result of merging optics and machine vision technologies with marine research. These developments have given scientists the tools they need to address the urgent ecological problems caused by both natural events and human activity. With these powerful tools, researchers can gain valuable insight into marine ecosystems, facilitating well-informed decision-making and effective mitigation measures. As a result, the limits of scientific understanding in marine science continue to be pushed, advancing our comprehension of this complex field to unprecedented heights.

Author contributions

HS: Data curation, Investigation, Supervision, Writing – original draft, Writing – review & editing. SM: Data curation, Investigation, Writing – original draft, Writing – review & editing. MW: Data curation, Investigation, Writing – original draft, Writing – review & editing. RL: Data curation, Investigation, Writing – original draft, Writing – review & editing. RN: Data curation, Investigation, Writing – original draft, Writing – review & editing. SX: Data curation, Investigation, Writing – original draft, Writing – review & editing.

Acknowledgments

We extend our appreciation to the researchers and scientists who provided this Research Topic with their insightful knowledge. We also express our sincere gratitude to the devoted reviewers for their meticulous consideration and constructive criticism. Their efforts were crucial in determining the success of this Research Topic.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

Chuang M.-C., Hwang J.-N., Ye J.-H., Huang S.-C., Williams K. (2016). Underwater fish tracking for moving cameras based on deformable multiple kernels. IEEE Trans. Syst. Man Cybern.: Syst. 47, 2467–2477.

Huang H., Cai H., Qureshi J. U., Mehdi S. R., Song H., Liu C., et al. (2023). Proceeding the categorization of microplastics through deep learning-based image segmentation. Sci. Total Environ. 896, 165308. doi: 10.1016/j.scitotenv.2023.165308

Issac M. N., Kandasubramanian B. (2021). Effect of microplastics in water and aquatic systems. Environ. Sci. Pollut. Res. 28, 19544–19562. doi: 10.1007/s11356-021-13184-2

Maximenko N., Corradi P., Law K. L., Van Sebille E., Garaba S. P., Lampitt R. S., et al. (2019). Toward the integrated marine debris observing system. Front. Mar. Sci. 6, 447. doi: 10.3389/fmars.2019.00447

Mehdi S. R., Raza K., Huang H., Naqvi R. A., Ali A., Song H. (2022). Combining deep learning with single-spectrum UV imaging for rapid detection of HNSs spills. Remote Sens. 14 (3), 576. doi: 10.3390/rs14030576

Shahani K., Song H., Mehdi S. R., Sharma A., Tunio G., Qureshi J., et al. (2021). Design and testing of an underwater microscope with variable objective lens for the study of benthic communities. J. Mar. Sci. Appl. 20, 170–178. doi: 10.1007/s11804-020-00185-9

Son Y. B., Choi B.-J., Kim Y. H., Park Y.-G. (2015). Tracing floating green algae blooms in the Yellow Sea and the East China Sea using GOCI satellite data and Lagrangian transport simulations. Remote Sens. Environ. 156, 21–33. doi: 10.1016/j.rse.2014.09.024

Song H., Mehdi S. R., Huang H., Shahani K., Zhang Y., Raza K., et al. (2020). Classification of freshwater zooplankton by pre-trained convolutional neural network in underwater microscopy. Int. J. Adv. Comput. Sci. Appl. 11 (7), 252–258. doi: 10.14569/IJACSA.2020.0110733

Song H., Mehdi S. R., Wu C., Li Z., Gong H., Ali A., et al. (2021a). Underwater spectral imaging system based on liquid crystal tunable filter. J. Mar. Sci. Eng. 9 (11), 1206. doi: 10.3390/jmse9111206

Song H., Mehdi S. R., Zhang Y., Shentu Y., Wan Q., Wang W., et al. (2021b). Development of coral investigation system based on semantic segmentation of single-channel images. Sensors 21 (5), 1848. doi: 10.3390/s21051848

Yasir M., Zhan L., Liu S., Wan J., Hossain Md S., Colak A. T. I., et al. (2023). Instance segmentation ship detection based on improved Yolov7 using complex background SAR images. Front. Mar. Sci. 10, 1113669. doi: 10.3389/fmars.2023.1113669

Keywords: underwater optics, underwater imaging, image enhancement, image processing, machine learning, marine observation, object recognition

Citation: Song H, Mehdi SR, Wang M, Liao R, Naqvi RA and Xie S (2023) Editorial: Optics and machine vision for marine observation. Front. Mar. Sci. 10:1256183. doi: 10.3389/fmars.2023.1256183

Received: 10 July 2023; Accepted: 02 August 2023;
Published: 14 August 2023.

Edited and Reviewed by:

Oliver Zielinski, Leibniz Institute for Baltic Sea Research (IOW), Germany

Copyright © 2023 Song, Mehdi, Wang, Liao, Naqvi and Xie. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Hong Song, hongsong@zju.edu.cn
