Ethoflow: computer vision and artificial intelligence-based software for automatic behavior analysis

Published: July 24, 2020, 9:02 p.m.

Link to bioRxiv paper: http://biorxiv.org/cgi/content/short/2020.07.23.218255v1?rss=1

Authors: Bernardes, R. C., Lima, M. A. P., Guedes, R. N. C., Martins, G. F.

Abstract: Manual monitoring of animal behavior is time-consuming and prone to bias. An alternative to these limitations is the use of computational resources in behavioral assessments, such as tracking systems, which enable accurate, long-term evaluations. There is a demand for robust software that handles analysis in heterogeneous environments (such as field conditions) and evaluates multiple individuals in groups while maintaining their identities. The Ethoflow software was developed using computer vision and artificial intelligence (AI) tools to automatically monitor various behavioral parameters. A state-of-the-art object detection algorithm based on instance segmentation was implemented, allowing behavior monitoring in the field under heterogeneous conditions. Moreover, a convolutional neural network was implemented to assess complex behaviors, expanding the possibilities of animal behavior analyses. The heuristics used to automatically generate training data for the AI models are described, and the models trained with these datasets exhibited high accuracy in detecting individuals in heterogeneous environments and in assessing complex behavior. Ethoflow was employed for kinematic assessments and to detect trophallaxis in social bees. The software runs on the Linux, Microsoft Windows, and iOS operating systems with an intuitive graphical interface. In the Ethoflow pipeline, AI processing is separated from the other modules, which allows kinematic measurements on an ordinary computer and the assessment of complex behavior on machines with graphics processing units (GPUs). Thus, Ethoflow is a useful support tool for applications in biology and related fields.
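The abstract does not include implementation details, so the two sketches below are only rough illustrations of the kind of pipeline described, not Ethoflow's actual code. The first assumes a pretrained Mask R-CNN from torchvision (in practice a model fine-tuned on annotated frames of the study animals would be needed) and shows how instance segmentation can reduce each detected individual to a centroid that a kinematics module could consume.

# Minimal sketch (assumption: PyTorch/torchvision, frames read with OpenCV);
# not Ethoflow's published detector.
import cv2
import torch
import torchvision

# Pretrained COCO weights as a placeholder; a fine-tuned model would be used in practice.
model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def detect_centroids(frame_bgr, score_threshold=0.7):
    """Return (x, y) centroids of individuals detected in one video frame."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    tensor = torch.from_numpy(rgb).permute(2, 0, 1).float() / 255.0
    with torch.no_grad():
        pred = model([tensor])[0]          # dict with "masks", "scores", ...
    centroids = []
    for mask, score in zip(pred["masks"], pred["scores"]):
        if score < score_threshold:
            continue
        ys, xs = torch.nonzero(mask[0] > 0.5, as_tuple=True)
        if len(xs) == 0:
            continue
        centroids.append((xs.float().mean().item(), ys.float().mean().item()))
    return centroids

The second sketch illustrates the kind of convolutional network that could classify a complex behavior (e.g., a crop of an interacting pair labeled "trophallaxis" vs. "other"). The layer sizes and two-class setup are purely illustrative assumptions, not the architecture reported in the paper.

# Hypothetical behavior classifier sketch; architecture sizes are assumptions.
import torch
import torch.nn as nn

class BehaviorCNN(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):                  # x: (batch, 3, H, W) image crops
        return self.classifier(self.features(x).flatten(1))

# Example: score a batch of four 64x64 crops -> logits of shape (4, 2)
logits = BehaviorCNN()(torch.rand(4, 3, 64, 64))

Splitting the pipeline this way mirrors the modular design mentioned in the abstract: the lightweight kinematic stage can run on an ordinary computer, while the GPU-hungry detection and classification stages run on machines with GPUs.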