Original Research Article
Front. Behav. Neurosci.
Sec. Individual and Social Behaviors
This article is part of the Research Topic: Ethological Neuroscience
Smart Lids for deep multi-animal phenotyping in standard home cages
Provisionally accepted
1. Olden Labs, San Francisco, United States
2. Harvard University, Cambridge, United States
3. NextIT doo, Sarajevo, Bosnia and Herzegovina
The reproducibility crisis and translational gap in preclinical research underscore the need for more accurate and reliable methods of health monitoring in animal models. Manual testing is labor-intensive, low-throughput, prone to human bias, and often stressful for the animals. Although many smart cages have been introduced, they have seen limited adoption due to low throughput (being limited to single animals), low data density (only a few metrics), high cost, the need for new space or infrastructure in the vivarium, high complexity of use, or a combination of these factors. And although technologies for video-based single-animal tracking have matured, no existing technology enables robust and accurate multi-animal tracking in standard home cages. To solve these problems, we built a new type of assay device: the Smart Lid. Smart Lids mount to existing racks above standard home cages and stream video and audio data, turning regular racks into high-throughput monitoring platforms. To solve the multi-animal tracking problem, we developed a new computer vision pipeline (MOT, the Multi-Organism Tracker) along with a new ear tag purpose-designed for computer vision tracking. MOT achieves over 97% accuracy in multi-animal tracking while maintaining an affordable runtime cost (less than $100 per month). The pipeline returns 21 health-related metrics covering activity, feeding, drinking, rearing, climbing, fighting, cage position, social interaction, and sleeping, with additional metrics under development.
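To make the multi-animal tracking problem concrete: the core difficulty is maintaining each animal's identity across video frames. The abstract does not describe MOT's internals, so the sketch below is not the authors' pipeline; it is a minimal, hypothetical illustration of the simplest identity-association strategy (greedy nearest-centroid matching between frames), the kind of baseline that visual ear tags and a learned pipeline like MOT are designed to improve upon. The class name, distance threshold, and centroid representation are all assumptions for illustration only.

```python
import math
from dataclasses import dataclass


@dataclass
class Track:
    """One animal's identity: a persistent ID and its last known centroid."""
    track_id: int
    centroid: tuple  # (x, y) in pixels


class NearestNeighborTracker:
    """Toy multi-animal tracker: each detection in a new frame is matched
    to the nearest unmatched existing track within `max_dist` pixels;
    detections with no nearby track start a new identity."""

    def __init__(self, max_dist=50.0):
        self.max_dist = max_dist
        self.tracks = []
        self._next_id = 0

    def update(self, detections):
        """Associate a frame's detected centroids with track IDs.
        Returns one ID per detection, in input order."""
        assigned = []
        unmatched = list(self.tracks)  # each track matches at most once per frame
        for det in detections:
            best, best_d = None, self.max_dist
            for tr in unmatched:
                d = math.dist(det, tr.centroid)
                if d < best_d:
                    best, best_d = tr, d
            if best is not None:
                best.centroid = det      # update position, keep identity
                unmatched.remove(best)
                assigned.append(best.track_id)
            else:
                tr = Track(self._next_id, det)  # unseen animal: new identity
                self._next_id += 1
                self.tracks.append(tr)
                assigned.append(tr.track_id)
        return assigned
```

A baseline like this loses identities whenever animals cross paths or huddle, which is exactly the failure mode that motivates appearance cues such as the purpose-designed ear tags described in the article.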
Keywords: 3R, AI, Animal Welfare, Computer Vision, home-cage monitoring, Multi-animal tracking, Phenomics, phenotyping
Received: 01 Sep 2025; Accepted: 08 Dec 2025.
Copyright: © 2025 Florea, Kaca, Delalić, Alimsijah, Weber, Selmanović, Galindez, Marquez, Balmaceda, Delalic, Bekkaye, Bakija, Kurtagić-Pašalić, Agić, Anderson and Wagers. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence: Michael Florea
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.
