Assessing the determinants of larval fish strike rates using computer vision

Shir Bar*, Liraz Levy, Shai Avidan, Roi Holzman

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Measuring behaviors that affect fitness is a critical task in the study of ecology and evolution. Behaviors such as feeding, fleeing predators, fighting conspecifics and mating are critical to an individual's fitness but are often unpredictable in space and time and can be rare in natural and experimental systems. Sparsely occurring behaviors are therefore difficult to quantify, and extracting them from video streams is extremely time-consuming. In this study, we use a case study of larval fish feeding, a sparse behavior that is critical to larval survival, to demonstrate how an AI-assisted system can be applied to overcome the problem of quantifying sparsely occurring events. We deployed our system in aquaculture rearing ponds to directly estimate the strike rate of larval fish outside the laboratory for the first time, and to assess the effects of environmental factors on these rates. Our analysis pipeline far surpassed manual annotation, both in time efficiency and in its ability to retrieve feeding strikes. We found that strike rates were similar and low across age groups, irrespective of pH and oxygen levels. However, strike rates increased significantly with increasing temperature. Our system allowed probing into the biology of a sparsely occurring behavior with unprecedented efficiency. However, it also revealed that analyzing rare behaviors requires further development of research methodologies suited for low sample sizes and highly imbalanced data.
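The abstract highlights that feeding strikes are rare, leaving the resulting video data highly imbalanced between strike and non-strike clips. The sketch below is a minimal, hypothetical illustration of one common way to handle such imbalance in a clip-level strike/no-strike classifier: class-weighted cross-entropy in PyTorch. The model architecture, feature dimension, and class counts are assumptions made only for illustration and do not represent the authors' actual pipeline.

```python
import torch
import torch.nn as nn

# Hypothetical setup: each clip is a pre-extracted feature vector
# (e.g., from some video backbone); label 1 = feeding strike, 0 = background.
class ClipClassifier(nn.Module):
    def __init__(self, feat_dim=512):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(feat_dim, 128),
            nn.ReLU(),
            nn.Linear(128, 2),
        )

    def forward(self, x):
        return self.head(x)

# Class weights inversely proportional to class frequency: strikes are rare,
# so misclassifying a strike clip is penalized more heavily.
n_background, n_strike = 9900, 100          # illustrative counts only
weights = torch.tensor([1.0 / n_background, 1.0 / n_strike])
weights = weights / weights.sum()

model = ClipClassifier()
criterion = nn.CrossEntropyLoss(weight=weights)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# One illustrative training step on dummy data.
features = torch.randn(32, 512)             # a batch of clip features
labels = torch.randint(0, 2, (32,))         # dummy strike / no-strike labels
optimizer.zero_grad()
loss = criterion(model(features), labels)
loss.backward()
optimizer.step()
```

Weighting the loss is only one option; resampling rare clips or threshold tuning on the classifier output are common alternatives for the kind of imbalance the abstract describes.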

Original language: English
Article number: 102195
Journal: Ecological Informatics
Volume: 77
DOIs
State: Published - Nov 2023

Keywords

  • Action classification
  • Automated video analysis
  • Feeding behavior
  • Machine vision
