Machine Vision Tools for Quantitatively Measuring Animal Behavior in Large Scale Experiments
1 HHMI Janelia Farm Research Campus, United States
2 University of California, San Diego, Computer Science, United States
We are developing automatic, machine vision-based tools for quantitatively measuring animal behavior. In this talk, we describe their use in measuring differences in locomotor and social behavior in our neural activation screen of over 2,000 transgenic Drosophila lines from the Rubin GAL4 collection. We also present their use in measuring the effects of different social rearing conditions on behavior. Our experimental rig is a high-throughput version of the Fly Bowl (Simon & Dickinson, 2010), a chamber designed to allow flies to perform a wide variety of locomotor and social behaviors while remaining amenable to automatic video-based tracking. We collect video of 20 flies exploring and interacting in a Fly Bowl at a rate of 75 transgenic lines (plus controls) per week, resulting in 400 fifteen-minute videos per week. We use a modified version of the Ctrax multiple fly tracking software (Branson et al., 2009) to automatically and accurately track the flies' positions in every frame of each video. We have developed the Janelia Automatic Animal Behavior Annotator (JAABA) to automatically classify the flies' behavior in each frame into categories such as walking, stopping, chasing, and jumping. JAABA is an interactive machine-learning-based system that allows biologists to create new automatic behavior detectors efficiently, accurately, and painlessly, without needing to understand the underlying machine learning algorithms. Through JAABA, the user labels frames of their choosing in videos of their choosing, trains the classifier, examines visualizations of its performance, labels more frames in more videos, and repeats. The JAABA interface was designed with the immense variety of behavior found in a large-scale screen in mind: it allows the user to create diverse training data sets that yield robust behavior definitions closely matching the biologist's intuition. Using this system, we developed a set of locomotion and social behavior detectors that worked across over 2,000 behaviorally diverse Drosophila genotypes from the Rubin GAL4 collection in our neural activation screen. In addition, a biologist can easily create a new behavior detector, apply it to all the transgenic lines in our data set, and find all lines that differ in that particular behavior. Our goal with this data set and these computer science tools is to develop a large, extensible corpus of Drosophila behavior definitions, leading toward a biologically relevant, quantitative language for describing Drosophila behavior.
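For illustration, the iterative label-train-review loop described above can be sketched in code. This is not JAABA's actual implementation (JAABA is a GUI application, and the abstract does not specify its classifier internals); the sketch below uses scikit-learn on synthetic per-frame trajectory features, and every feature, constant, and threshold in it is invented for the example.

# A minimal, hypothetical sketch of a JAABA-style interactive labeling loop.
# NOT JAABA's code: synthetic data, scikit-learn classifier, invented features.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Stand-in for per-frame features derived from fly trajectories
# (e.g. speed, angular velocity, distance to nearest fly).
n_frames, n_features = 5000, 8
X = rng.normal(size=(n_frames, n_features))
# Hidden "true" behavior (e.g. walking), used here only to simulate
# the labels a biologist would provide interactively.
y_true = (X[:, 0] + 0.5 * X[:, 1] > 0.8).astype(int)

labeled = np.zeros(n_frames, dtype=bool)
labeled[rng.choice(n_frames, size=50, replace=False)] = True  # initial labels

clf = GradientBoostingClassifier(n_estimators=100)
for round_ in range(5):
    # 1. Train on the frames the user has labeled so far.
    clf.fit(X[labeled], y_true[labeled])
    # 2. Predict on every frame; the GUI would visualize these predictions.
    proba = clf.predict_proba(X)[:, 1]
    # 3. The user reviews the least-confident frames and labels 50 more.
    uncertainty = np.abs(proba - 0.5)           # small = uncertain
    candidates = np.argsort(uncertainty)        # most uncertain first
    new = [i for i in candidates if not labeled[i]][:50]
    labeled[new] = True
    acc = (clf.predict(X) == y_true).mean()
    print(f"round {round_}: {labeled.sum()} labeled frames, accuracy {acc:.3f}")

The design point the loop illustrates is that the human stays in the loop: each round, the classifier is retrained on all labels collected so far and its least-confident frames are surfaced for the user to label next, which is how a modest amount of labeling can yield detectors that generalize across behaviorally diverse lines.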
Acknowledgements
Reza Azanchi, Ulrike Heberlein, Christen Mirth, Roian Egnor, Gerry Rubin, Wyatt Korff, Lowell Umayam, Mary Philips, Sonia Roberts, Todd Laverty, James McMahon, Ben Arthur, Jin-yang Liu, Tanya Tabachnik, Janelia Fly Olympiad Team, Janelia Fly Facility, Janelia ID&F, Janelia Scientific Computing, Pietro Perona, and Michael Dickinson contributed to this work.
Keywords: automatic, behavior analysis, computer vision, Drosophila, machine learning, quantitative, tracking
Conference: Tenth International Congress of Neuroethology, College Park, Maryland, United States, 5 Aug - 10 Aug, 2012.
Presentation Type: Invited Symposium
Topic: Social Behavior
Citation: Kabra M, Robie A, Rivera-Alba M, Hirokawa J, Branson S and Branson K (2012). Machine Vision Tools for Quantitatively Measuring Animal Behavior in Large Scale Experiments. Conference Abstract: Tenth International Congress of Neuroethology. doi: 10.3389/conf.fnbeh.2012.27.00041
Copyright: The abstracts in this collection have not been subject to any Frontiers peer review or checks, and are not endorsed by Frontiers. They are made available through the Frontiers publishing platform as a service to conference organizers and presenters. The copyright in the individual abstracts is owned by the author of each abstract or his/her employer unless otherwise stated. Each abstract, as well as the collection of abstracts, is published under a Creative Commons CC-BY 4.0 (attribution) licence (https://creativecommons.org/licenses/by/4.0/) and may thus be reproduced, translated, adapted and be the subject of derivative works provided the authors and Frontiers are attributed. For Frontiers’ terms and conditions please see https://www.frontiersin.org/legal/terms-and-conditions.
Received: 30 Apr 2012; Published Online: 07 Jul 2012.
* Correspondence: Dr. Kristin Branson, HHMI Janelia Farm Research Campus, Ashburn, VA, 20147, United States, bransonk@janelia.hhmi.org