Abstract
Insects represent nearly half of all known organisms, and nocturnal insects are particularly challenging to monitor. Computer vision tools for automated monitoring have the potential to revolutionize insect study and conservation. The advancement of light traps with camera-based monitoring systems for insects necessitates effective and flexible pipelines for analysing the recorded images. In this paper, we present a flexible and fast processing pipeline designed to analyse these recordings by detecting, tracking and classifying nocturnal insects at the taxonomic ranks of order and suborder, as well as at the resolution of individual moth species. The pipeline consists of four adaptable steps. The first step detects insects in the camera trap images. An order and suborder classifier with anomaly detection is proposed to filter out dark, blurry or partly visible insects that are unlikely to be classified correctly. A simple track-by-detection algorithm is proposed to track the classified insects by combining feature-embedding, distance and area costs. We evaluated the computational speed and power consumption of different edge computing devices (Raspberry Pis and the NVIDIA Jetson Nano) and compared various time-lapse strategies with tracking. The smallest difference was found between 2-minute time-lapse intervals and tracking at 0.5 frames per second; however, for insects with fewer than one detection per night, the Pearson correlation decreases. Shifting from tracking to time-lapse monitoring would reduce the number of recorded images and allow real-time edge processing of images on a camera trap with a Raspberry Pi. The Jetson Nano is the most energy-efficient solution, capable of real-time tracking at nearly 0.5 fps. Our processing pipeline was applied to more than 3.4 million images recorded at 0.5 frames per second by 12 light camera traps placed in diverse habitats, including bogs, heaths and forests, during one full season.
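To illustrate the track-by-detection matching described above, the following is a minimal sketch, assuming each detection carries a feature embedding, a bounding-box centre and an area; the weights, normalisation and names (match_cost, assign) are hypothetical and not the paper's actual implementation.

```python
# Hypothetical sketch: combine embedding, distance and area costs, then assign
# detections to tracks with the Hungarian algorithm. Weights are illustrative.
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_cost(track, det, img_diag=1.0, w_emb=0.5, w_dist=0.3, w_area=0.2):
    """Weighted cost from embedding dissimilarity, centre distance and area difference."""
    # Cosine dissimilarity between feature embeddings (0 = identical).
    emb_cost = 1.0 - np.dot(track["emb"], det["emb"]) / (
        np.linalg.norm(track["emb"]) * np.linalg.norm(det["emb"]) + 1e-9)
    # Centre distance, normalised by the image diagonal (assumed normalisation).
    dist_cost = np.linalg.norm(
        np.asarray(track["center"]) - np.asarray(det["center"])) / img_diag
    # Relative difference in bounding-box area.
    area_cost = abs(track["area"] - det["area"]) / max(track["area"], det["area"])
    return w_emb * emb_cost + w_dist * dist_cost + w_area * area_cost

def assign(tracks, detections, img_diag, max_cost=1.0):
    """Optimal one-to-one assignment of detections to existing tracks."""
    cost = np.array([[match_cost(t, d, img_diag) for d in detections] for t in tracks])
    rows, cols = linear_sum_assignment(cost)
    # Discard matches whose combined cost exceeds the threshold (track ends / new track).
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= max_cost]
```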
Publisher
Cold Spring Harbor Laboratory