METHODS article
Front. Ecol. Evol.
Sec. Conservation and Restoration Ecology
BugNet: a rapid and scalable pipeline for automated insect monitoring using hierarchical data
Provisionally accepted
- 1 University of Wisconsin-Madison, Madison, United States
- 2 University of Nevada, Reno, Reno, United States
Despite the importance of monitoring insect diversity to ecological and conservation questions, we lack technologies sufficient to monitor insects at scale. Research into automated biodiversity monitoring through camera traps has produced a number of machine learning approaches for insect monitoring, but these tools suffer from a lack of training data and face challenges classifying insects in highly diverse systems where the majority of species are unknown to science. To address these challenges, we developed BugNet, an automated pipeline for aggregating insect image data from online databases and training hierarchical classification models, and tested a large-scale insect detection model on GBIF and field images. We show that this system can be used to rapidly create and validate classification models with high accuracy on both internet and field images. Furthermore, we show that incorporating hierarchical taxonomic data into classification models improves their ability to handle unknown taxa. These systems are an important step towards a generalized and scalable insect detection platform. While not capable of monitoring every dimension of insect diversity, BugNet can be used to accurately classify insects from camera trap images and can be scaled to meet the data needs of larger ecological and conservation questions.
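One way to picture the benefit of hierarchical labels for unknown taxa is confidence-based backoff: when species-level confidence is low, probability mass is aggregated up the taxonomy (species to genus to family) and the coarsest rank that clears a threshold is reported. The sketch below is purely illustrative; the taxonomy, threshold, and backoff strategy are assumptions for exposition, not the authors' BugNet implementation.

```python
# Illustrative sketch of hierarchical label backoff. The toy taxonomy,
# the 0.7 threshold, and the backoff rule are assumptions, not BugNet's
# actual method.

# Toy taxonomy: species -> (genus, family)
TAXONOMY = {
    "Danaus plexippus": ("Danaus", "Nymphalidae"),
    "Danaus gilippus": ("Danaus", "Nymphalidae"),
    "Pieris rapae": ("Pieris", "Pieridae"),
}

def hierarchical_prediction(species_probs, threshold=0.7):
    """Return (rank, label, confidence), backing off species -> genus -> family."""
    best_species = max(species_probs, key=species_probs.get)
    if species_probs[best_species] >= threshold:
        return ("species", best_species, species_probs[best_species])

    # Aggregate species probabilities at each coarser taxonomic rank.
    for rank_idx, rank in ((0, "genus"), (1, "family")):
        rank_probs = {}
        for sp, p in species_probs.items():
            label = TAXONOMY[sp][rank_idx]
            rank_probs[label] = rank_probs.get(label, 0.0) + p
        best = max(rank_probs, key=rank_probs.get)
        if rank_probs[best] >= threshold:
            return (rank, best, rank_probs[best])
    return ("unknown", None, 0.0)

# An image of an unfamiliar Danaus species splits its probability between
# the two known congeners; backoff still yields a confident genus call.
print(hierarchical_prediction(
    {"Danaus plexippus": 0.40, "Danaus gilippus": 0.45, "Pieris rapae": 0.15}
))
```

Under this scheme a model that cannot separate two congeners, or that encounters an undescribed species resembling known ones, still returns a usable genus- or family-level record instead of a spurious species label.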
Keywords: biodiversity monitoring, computer vision, data pipeline, insect diversity, tropical forests
Received: 20 Nov 2025; Accepted: 30 Jan 2026.
Copyright: © 2026 Grele and Richards. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence: Ari Grele
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.