NASA NeMO-Net - A Neural Multimodal Observation and Training Network for Marine Ecosystem Mapping at Diverse Spatiotemporal Scales

Ved Chirayath, Alan Li, Juan Torres-Perez, Michal Segal-Rozenhaimer, Jarrett Van Den Bergh

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution (peer-reviewed)

Abstract

We present NeMO-Net, the first open-source fully convolutional neural network (FCNN) and interactive learning and training software aimed at assessing the present and past dynamics of shallow marine systems through habitat mapping into geomorphological (9 classes) and biological (22 classes) categories. Shallow marine systems, particularly coral reefs, are under significant pressure from climate change, ocean acidification, and other anthropogenic stressors, leading to rapid, often devastating changes in these fragile and diverse ecosystems. Historically, remote sensing of shallow marine habitats has been limited to meter-scale imagery due to the optical effects of ocean wave distortion, refraction, and optical attenuation. NeMO-Net combines 3D cm-scale distortion-free imagery captured using NASA's airborne FluidCam and fluid lensing remote sensing technology with low-resolution airborne and spaceborne datasets of varying spatial resolutions, spectral spaces, calibrations, and temporal cadence in a supercomputer-based deep learning framework. NeMO-Net augments and improves the benthic habitat classification accuracy of low-resolution datasets across large geographic and temporal scales using high-resolution training data from FluidCam. NeMO-Net's FCNN uses ResNet and RefineNet to perform semantic segmentation and cloud masking of remote sensing imagery of shallow marine systems from drones, manned aircraft, and satellites, including FluidCam, WorldView, Planet, Sentinel, and Landsat. Deep Laplacian Pyramid Super-Resolution Networks (LapSRN) alongside Domain Adversarial Neural Networks (DANNs) are used to augment low-resolution imagery with high-resolution drone-based datasets as well as recognize domain-invariant features across multiple instruments to achieve high classification accuracies, ameliorating inter-sensor spatial, spectral, and temporal heterogeneities.
An online citizen science application allows users to provide interactive training data for NeMO-Net in 2D and 3D, fully integrated within an active learning framework. Preliminary results from a test case in Fiji demonstrate 9-class classification accuracy exceeding 84%.
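The abstract's FCNN produces per-pixel habitat labels via semantic segmentation and must reconcile predictions made at different sensor resolutions. The core output step can be illustrated with a toy sketch below (pure Python; hypothetical class names and data, not the NeMO-Net implementation, which uses ResNet/RefineNet): a softmax-then-argmax over per-pixel class logits yields the label map, and a nearest-neighbour upsample shows, in the crudest possible form, the resolution-alignment problem that LapSRN addresses far more capably.

```python
# Toy sketch of the final stage of an FCNN habitat classifier:
# per-pixel class logits -> softmax -> argmax label map, plus a
# nearest-neighbour upsample to align a coarse prediction with a
# finer grid. Hypothetical class names and data; NOT the NeMO-Net code.
import math

# Three of the nine geomorphological classes, for brevity (hypothetical labels)
GEOMORPHIC_CLASSES = ["reef_crest", "lagoon", "fore_reef"]

def softmax(logits):
    """Numerically stable softmax over one pixel's class logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def segment(logit_map):
    """Per-pixel argmax over class logits -> H x W label map."""
    return [[max(range(len(px)), key=lambda k: px[k]) for px in row]
            for row in logit_map]

def upsample_nearest(label_map, factor):
    """Nearest-neighbour upsampling of a coarse label map by `factor`."""
    out = []
    for row in label_map:
        wide = [v for v in row for _ in range(factor)]
        out.extend(list(wide) for _ in range(factor))
    return out

# A 2x2 "image" with 3-class logits per pixel
logits = [
    [[2.0, 0.1, -1.0], [0.0, 3.0, 0.5]],
    [[-0.5, 0.2, 1.5], [1.0, 1.0, 2.0]],
]
labels = segment(logits)            # [[0, 1], [2, 2]]
hi_res = upsample_nearest(labels, 2)  # 4x4 label map on the finer grid
```

In the real system the logits come from a deep ResNet/RefineNet decoder and the resolution gap is bridged by learned LapSRN super-resolution rather than nearest-neighbour replication; this sketch only makes the per-pixel classification output concrete.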

Original language: English
Title of host publication: 2020 IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2020 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3633-3636
Number of pages: 4
ISBN (Electronic): 9781728163741
DOIs
State: Published - 26 Sep 2020
Externally published: Yes
Event: 2020 IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2020 - Virtual, Waikoloa, United States
Duration: 26 Sep 2020 - 2 Oct 2020

Publication series

Name: International Geoscience and Remote Sensing Symposium (IGARSS)

Conference

Conference: 2020 IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2020
Country/Territory: United States
City: Virtual, Waikoloa
Period: 26/09/20 - 2/10/20

Keywords

  • Multimodal remote sensing
  • coral reefs
  • fluid lensing
  • machine learning
  • neural networks
