
Technology Report, provisionally accepted. The full text will be published soon.

Front. Mar. Sci. | doi: 10.3389/fmars.2019.00521

Next-Generation Optical Sensing Technologies for Exploring Ocean Worlds - NASA FluidCam, MiDAR, and NeMO-Net

 Ved Chirayath1* and Alan Li1
  • 1Ames Research Center, United States

We highlight three emerging NASA optical technologies that enhance our ability to remotely sense, analyze, and explore ocean worlds – FluidCam and fluid lensing, MiDAR, and NeMO-Net.
Fluid lensing is the first remote sensing technology capable of imaging through ocean waves in 3D at sub-centimeter resolutions without distortion. Fluid lensing and the purpose-built FluidCam CubeSat instruments have been used to provide refraction-corrected 3D multispectral imagery of shallow marine systems from unmanned aerial vehicles (UAVs). Results from repeat 2013 and 2016 airborne fluid lensing campaigns over coral reefs in American Samoa demonstrate a promising new tool for monitoring fine-scale ecological dynamics in shallow aquatic systems tens of square kilometers in area.
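Fluid lensing corrects for the time-varying refraction of a wavy ocean surface; as a much simpler illustration of the underlying physics, the sketch below applies Snell's law at an idealized flat air-water interface. The function names, the flat-surface assumption, and the near-nadir apparent-depth approximation are ours for illustration, not part of the FluidCam pipeline:

```python
import numpy as np

N_WATER = 1.33  # approximate refractive index of seawater

def refract_ray(incidence_deg, n1=1.0, n2=N_WATER):
    """Snell's law: return the angle (degrees) of a ray refracted from
    air (n1) into water (n2), given its angle of incidence in air."""
    theta1 = np.radians(incidence_deg)
    sin_theta2 = n1 * np.sin(theta1) / n2
    return np.degrees(np.arcsin(sin_theta2))

def apparent_depth(true_depth, n=N_WATER):
    """At near-nadir viewing through a flat interface, the bottom appears
    shallower by roughly the refractive-index ratio."""
    return true_depth / n
```

A ray arriving at 30° bends to about 22° below the surface, and a bottom at 1.33 m appears to sit at about 1 m; fluid lensing must solve this geometry for a surface whose slope changes with every passing wave.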
MiDAR is a recently patented active multispectral remote sensing and optical communications instrument that evolved from FluidCam. MiDAR is being tested on UAVs and autonomous underwater vehicles (AUVs) to remotely sense living and nonliving structures in light-limited and analog planetary science environments. MiDAR illuminates targets with high-intensity narrowband structured optical radiation to measure an object’s spectral reflectance while simultaneously transmitting data. MiDAR is capable of remotely sensing reflectance at fine spatial and temporal scales, with a signal-to-noise ratio (SNR) 10–10³ times higher than passive airborne and spaceborne remote sensing systems, enabling high-frame-rate multispectral sensing across the ultraviolet, visible, and near-infrared spectrum. Preliminary results from a 2018 mission to Guam show encouraging applications of MiDAR to imaging coral from airborne and underwater platforms while transmitting data across the air-water interface.
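The SNR advantage of active narrowband illumination can be sketched with a shot-noise-limited model: boosting in-band signal photons while rejecting broadband background raises SNR roughly as the square root of the photon budget. The photon counts below are illustrative assumptions, not MiDAR measurements:

```python
import math

def shot_noise_snr(signal_photons, background_photons=0.0, read_noise=0.0):
    """Shot-noise-limited SNR: signal over the root of total noise variance
    (signal shot noise + background shot noise + read noise)."""
    noise = math.sqrt(signal_photons + background_photons + read_noise ** 2)
    return signal_photons / noise

# Passive sensing: a dim in-band signal against a broadband background.
passive = shot_noise_snr(1e3, background_photons=1e4)

# Active narrowband illumination: the transmitter raises in-band photons,
# and narrowband filtering rejects most out-of-band background light.
active = shot_noise_snr(1e5, background_photons=1e3)
```

With these illustrative numbers the active system's SNR is roughly 30 times higher, consistent in spirit with the 10–10³ range reported for MiDAR.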
Finally, we share NeMO-Net, the Neural Multi-Modal Observation & Training Network for Global Coral Reef Assessment. NeMO-Net is a machine learning technology under development that exploits high-resolution data from FluidCam and MiDAR to augment low-resolution airborne and satellite remote sensing. NeMO-Net is intended to harmonize the growing diversity of 2D and 3D remote sensing and in situ data into a single open-source platform for assessing shallow marine ecosystems globally, using active learning for citizen science-based training. Preliminary results from four-class coral classification show an accuracy of 94.4%.
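The abstract does not detail NeMO-Net's model, so the sketch below is only a toy stand-in showing the shape of a four-class patch-classification pipeline: extract per-band statistics from a multispectral image patch, then assign the nearest class. The class names, features, and nearest-centroid classifier are illustrative assumptions, not NeMO-Net's actual architecture:

```python
import numpy as np

CLASSES = ["coral", "algae", "sand", "rock"]  # illustrative four classes

def patch_features(patch):
    """Per-band mean and standard deviation of an (H, W, bands) patch."""
    return np.concatenate([patch.mean(axis=(0, 1)), patch.std(axis=(0, 1))])

class NearestCentroid:
    """Toy stand-in for a learned classifier: one centroid per class."""

    def fit(self, X, y):
        self.centroids = np.array(
            [X[y == k].mean(axis=0) for k in range(len(CLASSES))]
        )
        return self

    def predict(self, X):
        # Distance from each feature vector to each class centroid.
        d = np.linalg.norm(X[:, None, :] - self.centroids[None], axis=2)
        return d.argmin(axis=1)
```

A real system would replace the hand-built features and centroid model with a learned network trained on labeled patches, including labels gathered through citizen science, but the fit/predict structure is the same.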
Together, these maturing technologies offer scalable, practical, and cost-effective innovations that address current observational and technological challenges in optical sensing of marine systems.

Keywords: remote sensing, coral reefs, UAVs, machine learning, active sensing, fluid lensing, MiDAR

Received: 23 Dec 2018; Accepted: 12 Aug 2019.

Edited by:

Leonard Pace, Schmidt Ocean Institute, United States

Reviewed by:

Christoph Waldmann, University of Bremen, Germany
Steven G. Ackleson, United States Naval Research Laboratory, United States  

Copyright: © 2019 Chirayath and Li. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Dr. Ved Chirayath, Ames Research Center, Mountain View, United States, ved.chirayath@nasa.gov