
Introducing FathomNet: New open-source image database unlocks the power of AI for ocean exploration

A new collaborative effort between MBARI and other research institutions is leveraging the power of artificial intelligence and machine learning to accelerate efforts to study the ocean.

In order to manage impacts from climate change and other threats, researchers urgently need to learn more about the ocean’s inhabitants, ecosystems, and processes. As scientists and engineers develop advanced robotics that can visualize marine life and environments to monitor changes in the ocean’s health, they face a fundamental problem: The collection of images, video, and other visual data vastly exceeds researchers’ capacity for analysis.

FathomNet is an open-source image database that uses state-of-the-art data processing algorithms to help work through the backlog of visual data. Applying artificial intelligence and machine learning will alleviate the bottleneck in analyzing underwater imagery and accelerate important research around ocean health.

FathomNet is an open-source image database for understanding our ocean and its inhabitants

“A big ocean needs big data. Researchers are collecting large quantities of visual data to observe life in the ocean. How can we possibly process all this information without automation? Machine learning provides a pathway forward; however, these approaches rely on massive datasets for training. FathomNet has been built to fill this gap,” said MBARI Principal Engineer Kakani Katija.

Project co-founders Katija, Katy Croff Bell (Ocean Discovery League), and Ben Woodward (CVision AI), along with members of the extended FathomNet team, detailed the development of this new image database in a recent research publication in Scientific Reports.

Recent advances in machine learning enable fast, sophisticated analysis of visual data, but the use of artificial intelligence in ocean research has been limited by the lack of a standard set of existing images that could be used to train the machines to recognize and catalog underwater objects and life. FathomNet addresses this need by aggregating images from multiple sources to create a publicly available, expertly curated underwater image training database.

“In the past five years, machine learning has revolutionized the landscape of automated visual analysis, driven largely by massive collections of labeled data. ImageNet and Microsoft COCO are benchmark datasets for terrestrial applications that machine-learning and computer-vision researchers flock to, but we haven’t even begun to scratch the surface of machine-learning capabilities for underwater visual analysis,” said Ben Woodward, co-founder and CEO of CVision AI and a co-founder of FathomNet. “With FathomNet, we aim to provide a rich, interesting benchmark to engage the machine-learning community in a new domain.”
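To make that comparison concrete, the sketch below shows one way a COCO-format export of expertly labeled underwater images could be used to fine-tune an off-the-shelf detector. The file paths, class count, and choice of torchvision's Faster R-CNN are illustrative assumptions, not details of the FathomNet project itself.

```python
# Minimal sketch, assuming labeled underwater images exported in COCO
# format to a local directory; the paths, class count, and use of
# torchvision's Faster R-CNN are assumptions, not FathomNet specifics.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 1 + 2243  # background + the concepts we want to recognize

# Load COCO-format annotations produced from the curated database.
dataset = torchvision.datasets.CocoDetection(
    root="fathomnet_images/",        # hypothetical image directory
    annFile="fathomnet_train.json",  # hypothetical COCO annotation file
)

# Start from a detector pretrained on terrestrial imagery (COCO) and
# replace its classification head so it predicts underwater concepts.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)

optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9)
# ...standard fine-tuning loop over `dataset` goes here...
```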

FathomNet leverages expertly labeled visual data from contributors around the world, including video from MBARI that has been annotated in detail by research technicians in our Video Lab. Image: © 2022 MBARI

Over the past 35 years, MBARI has recorded nearly 28,000 hours of deep-sea video and collected more than 1 million deep-sea images. This trove of visual data has been annotated in detail by research technicians in MBARI’s Video Lab. MBARI’s video archive includes approximately 8.2 million annotations that record observations of animals, habitats, and objects. This rich dataset is an invaluable resource for researchers at the institute and collaborators around the world.

FathomNet incorporates a subset of MBARI’s dataset, as well as assets from National Geographic and NOAA.

The National Geographic Society’s Exploration Technology Lab has been deploying versions of its autonomous benthic lander platform, the Deep Sea Camera System, since 2010, collecting more than 1,000 hours of video data from locations in all ocean basins and in a variety of marine habitats. These videos have subsequently been ingested into CVision AI’s cloud-based collaborative analysis platform and annotated by subject-matter specialists at the University of Hawaii and OceansTurn.

National Oceanic and Atmospheric Administration (NOAA) Ocean Exploration began collecting video data with a dual remotely operated vehicle system aboard NOAA Ship Okeanos Explorer in 2010. More than 271 terabytes are archived and publicly accessible from the NOAA National Centers for Environmental Information (NCEI). NOAA Ocean Exploration originally crowd-sourced annotations through volunteer participating scientists, and began supporting expert taxonomists in 2015 to more thoroughly annotate collected video.

“FathomNet is a great example of how collaboration and community science can foster breakthroughs in how we learn about the ocean. With data from MBARI and the other collaborators as the backbone, we hope FathomNet can help accelerate ocean research at a time when understanding the ocean is more important than ever,” said Lonny Lundsten, a senior research technician in MBARI’s Video Lab, co-author, and FathomNet team member.

Machine-learning algorithms trained using FathomNet data can help identify marine life in underwater images and video. The label on each box shows the organism’s identification number, its likely identity, and the algorithm’s confidence score. Image: © 2020 MBARI

Because FathomNet is an open-source, web-based resource, other institutions can contribute to it and use it in place of traditional, resource-intensive efforts to process and analyze visual data. MBARI launched a pilot program to use FathomNet-trained machine-learning models to annotate video captured by remotely operated vehicles (ROVs). Using AI algorithms reduced human effort by 81 percent and increased the labeling rate tenfold.
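As a rough illustration of that pilot workflow, the sketch below uses a FathomNet-trained detector to pre-annotate sampled ROV video frames so human annotators review machine proposals rather than labeling from scratch. The checkpoint name, video file, frame-sampling rate, and 0.5 confidence threshold are all assumptions, not details of MBARI's pipeline.

```python
# Minimal sketch, assuming a FathomNet-trained torchvision detector
# saved locally: pre-annotate ROV video frames for later human review.
import cv2
import torch
import torchvision.transforms.functional as F

model = torch.load("fathomnet_detector.pt")  # hypothetical checkpoint
model.eval()

video = cv2.VideoCapture("rov_dive.mp4")  # hypothetical dive footage
proposals = []
frame_index = 0
while True:
    ok, frame = video.read()
    if not ok:
        break
    if frame_index % 30 == 0:  # sample roughly one frame per second
        tensor = F.to_tensor(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        with torch.no_grad():
            detections = model([tensor])[0]
        for box, label, score in zip(
            detections["boxes"], detections["labels"], detections["scores"]
        ):
            if score >= 0.5:  # keep confident detections for human review
                proposals.append(
                    (frame_index, label.item(), score.item(), box.tolist())
                )
    frame_index += 1
video.release()
```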

Machine-learning models trained with FathomNet data also have the potential for revolutionizing ocean exploration and monitoring. For example, outfitting robotic vehicles with cameras and improved machine-learning algorithms can eventually enable automated search and tracking of marine animals and other underwater objects.

“Four years ago, we envisioned using machine learning to analyze thousands of hours of ocean video, but at the time, it wasn’t possible primarily due to a lack of annotated images. FathomNet will now make that vision a reality, unlocking discoveries and enabling tools that explorers, scientists, and the public can use to accelerate the pace of ocean discovery,” said Katy Croff Bell, founder and president of the Ocean Discovery League and a FathomNet co-founder.

As of September 2022, FathomNet contained 84,454 images, representing 175,875 localizations from 81 separate collections for 2,243 concepts, with additional contributions ongoing. FathomNet aims to obtain 1,000 independent observations for more than 200,000 animal species in diverse poses and imaging conditions—eventually more than 200 million total observations. For FathomNet to reach its intended goals, significant community engagement—including high-quality contributions across a wide range of groups and individuals—and broad utilization of the database will be needed.

“While FathomNet is a web-based platform built on an API where people can download labeled data to train novel algorithms, we also want it to serve as a community where ocean explorers and enthusiasts from all backgrounds can contribute their knowledge and expertise and help solve challenges related to ocean visual data that are impossible without widespread engagement,” said Katija.
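As a small example of that API-driven workflow, the sketch below queries FathomNet for labeled images of a single concept using the project's open-source Python client (pip install fathomnet). The specific module, function, and field names follow that client's documented interface but should be read as assumptions here, and the concept name is just an example query.

```python
# Minimal sketch, assuming the open-source fathomnet Python client;
# the exact call and record fields below are assumptions based on
# that client's documentation. "Bathochordaeus" is an example concept.
from fathomnet.api import images

# Ask the FathomNet API for every image labeled with the concept.
records = images.find_by_concept("Bathochordaeus")

print(f"{len(records)} images labeled Bathochordaeus")
for record in records[:3]:
    # Each record carries the image URL plus its expert bounding-box
    # localizations, which can feed directly into model training.
    print(record.url, len(record.boundingBoxes or []))
```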

To join the FathomNet community, visit fathomnet.org and follow @FathomNet on Twitter.

Seed funding for FathomNet was provided by National Geographic Society (#518018), National Oceanic and Atmospheric Administration (NA18OAR4170105), and MBARI through generous support from the David and Lucile Packard Foundation. Additional funding support has been provided by National Geographic Society (NGS-86951T-21) and the National Science Foundation (OTIC #1812535 & Convergence Accelerator #2137977).

Original journal article:

Katija, K., E. Orenstein, B. Schlining, L. Lundsten, K. Barnard, G. Sainz, O. Boulais, M. Cromwell, E. Butler, B. Woodward, and K. Croff Bell (2022). FathomNet: A global image database for enabling artificial intelligence in the ocean. Scientific Reports, 12: 15914. doi.org/10.1038/s41598-022-19939-2

For additional information or images relating to this article, please send an email to pressroom@mbari.org

Collaborators

Erin Butler – CVision AI Inc.
Benjamin Woodward – CVision AI Inc.
Megan Cromwell – National Centers for Environmental Information, Stennis Space Center, NOAA
Katy Croff Bell – Ocean Discovery League
Oceane Boulais – Southeast Fisheries Science Center, NOAA