Knowing where the nose is
© Datta. 2017
Published: 15 May 2017
Improvements in imaging technology and the development of powerful machine learning algorithms are revolutionizing the study of animal behavior in the laboratory. These innovations promise to reveal both global and local features of action relevant to understanding how the brain functions. A study in BMC Biology describes one such tool called OptiMouse, which is an open source platform that uses video to capture key features of mouse behavior, including information relevant to olfactory investigation.
See research article: https://doi.org/10.1186/s12915-017-0377-3
Traditionally, studies of rodent behavior have been performed using focused lenses: for conceptual and/or technical reasons, high-dimensional behavior is collapsed into a much smaller number of dimensions, which are usually hand-selected based upon the hypothesis the researcher wishes to test. For example, the locomotor exploration of an open field by a mouse—a simple yet incredibly rich pattern of behavior—is typically reduced to a single metric capturing the number of times the mouse enters the center of the arena. The recent availability of cheap, high-resolution video cameras, powerful computing hardware, and sophisticated statistical techniques adapted from fields such as machine vision and machine learning is enabling a dramatic shift towards more quantitative and objective methods of behavioral analysis. Over the past 5 years, significant improvements have been made in camera resolution, feature extraction, animal tracking, supervised behavioral identification, and unsupervised identification of behavioral modes or motifs, and many of these methods have been packaged into end-to-end pipelines in which rodents are imaged on the front end and a dizzying array of parameters describing behavior is spit out the back end [3–6].
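To make concrete how drastic this dimensionality reduction is, the classic open-field metric can be computed in a few lines once a trajectory has been extracted from video. The sketch below is illustrative only—the function name and the definition of the central zone are assumptions of this example, not taken from any particular package:

```python
import numpy as np

def count_center_entries(xy, arena_size, center_frac=0.5):
    """Count entries into the central zone of a square arena.

    xy          : (T, 2) array of tracked positions, same units as arena_size
    arena_size  : side length of the square arena
    center_frac : fraction of the side length defining the central zone
    """
    margin = arena_size * (1.0 - center_frac) / 2.0
    lo, hi = margin, arena_size - margin
    in_center = np.all((xy >= lo) & (xy <= hi), axis=1)
    # An "entry" is a transition from outside the central zone to inside it
    entries = np.sum(~in_center[:-1] & in_center[1:])
    if in_center[0]:
        entries += 1  # the trajectory starts inside the zone
    return int(entries)

# Toy trajectory in a 40 cm arena: wall -> center -> wall -> center
xy = np.array([[2.0, 2.0], [20.0, 20.0], [38.0, 2.0], [20.0, 20.0]])
print(count_center_entries(xy, arena_size=40.0))  # 2
```

An entire session of rich locomotor behavior collapses into this single integer, which is precisely the loss of information that the newer quantitative approaches aim to avoid.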
However, these pipelines have two important constraints that often limit their usefulness. First, there is limited generalization. The code that converts video images into data relies upon a set of parameters that typically are specific to a particular camera, lighting condition, and arena. If these parameters are hard-coded (as they often are), analysis of video obtained under experimental conditions that differ from those used to build the analytical pipeline can fail, limiting the types of hypotheses one can test. Second, there is a lack of transparency: often these parameters are hidden so far under the hood that it is not clear why some videos are easily handled while others remain refractory to analysis. Addressing these problems is crucial given the parallel advances being made in methods for probing the structure and function of the nervous system, including gene editing, pharmaco- and optogenetics, and high-density neural recordings; understanding how manipulating a gene or neural circuit influences behavior—or how patterns of neural activity might be correlated with patterns of action—will necessarily require the generation and analysis of large-scale behavioral data, largely in the form of video.
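The generalization problem described above comes down to whether imaging-dependent parameters are exposed or buried. As a minimal illustration—not OptiMouse's actual detection code; all names and threshold values here are hypothetical—a background-subtraction tracker can take its parameters as explicit arguments, so the same code can be retuned for a new camera, lighting condition, or arena rather than failing silently with hard-coded constants:

```python
import numpy as np

def track_frame(frame, background, diff_thresh=30, min_pixels=50):
    """Locate the animal in one grayscale frame by background subtraction.

    The imaging-dependent parameters (diff_thresh, min_pixels) are explicit
    arguments rather than hard-coded constants, so the caller can adapt them
    to a new camera, lighting condition, or arena.
    """
    diff = np.abs(frame.astype(int) - background.astype(int))
    mask = diff > diff_thresh            # pixels that changed vs. background
    if mask.sum() < min_pixels:          # too few pixels: detection failed
        return None                      # fail loudly, not with a bogus position
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())  # centroid of animal pixels

# Synthetic example: dark arena with a bright 10x10 "mouse" blob
background = np.zeros((100, 100), dtype=np.uint8)
frame = background.copy()
frame[40:50, 60:70] = 200
print(track_frame(frame, background))  # (64.5, 44.5)
```

Exposing parameters this way also addresses the transparency problem: when a video fails, the user can see which threshold was violated instead of guessing at values hidden under the hood.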
Beyond providing a platform for rapid and transparent behavioral analysis, OptiMouse explicitly measures the position of the mouse’s nose with respect to the mouse’s body and arena. Olfaction is an essential sense used to forage for food in the wild, to avoid potentially deadly conflicts with conspecifics or predators, and to obtain suitable mates. Because olfaction is an active sense—optimal sensory interrogation requires the nose to be actively positioned by the mouse, followed by rapid inhalation to facilitate odor sampling—understanding the position of a mouse’s nose is crucial for understanding how the mouse processes odor information. Indeed, a large subset of the mouse’s behavior in a given arena appears to be some sort of rearing or sniffing behavior, as if its body dynamics were disproportionately devoted to probing the olfactory world. However, nose tracking is notoriously difficult for most automated behavioral classification software, in part because the context in which the nose is found in the video is constantly evolving. As a consequence of this limitation, we lack an understanding of both how basic odor sampling is accomplished by rodents and how neural activity in olfactory centers might be altered as a consequence of active sampling.
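To see why nose tracking is harder than body tracking, consider the naive approach of taking the extreme point of the animal's silhouette along the body's principal axis. The sketch below is a deliberate simplification of my own, not the OptiMouse algorithm: by symmetry it cannot distinguish the nose from the tail base, which is exactly the ambiguity that dedicated nose-detection methods must resolve with additional cues such as contour shape or motion history.

```python
import numpy as np

def estimate_nose(mask):
    """Rough candidate nose position from a binary mouse silhouette.

    Fits the principal axis of the body pixels (via PCA on their coordinates)
    and returns the silhouette pixel whose projection lies farthest from the
    centroid along that axis. Note the limitation: the result may be either
    end of the body, so a real detector needs extra cues to pick the nose.
    """
    ys, xs = np.nonzero(mask)
    pts = np.column_stack([xs, ys]).astype(float)
    centered = pts - pts.mean(axis=0)
    # Principal body axis = covariance eigenvector with the largest eigenvalue
    eigvals, eigvecs = np.linalg.eigh(np.cov(centered.T))
    axis = eigvecs[:, np.argmax(eigvals)]
    # Candidate endpoint: farthest projection along the body axis
    proj = centered @ axis
    tip = pts[np.argmax(np.abs(proj))]
    return float(tip[0]), float(tip[1])

# Elongated horizontal blob: the estimate lands at one end of the long axis
mask = np.zeros((50, 50), dtype=bool)
mask[20:25, 10:40] = True
print(estimate_nose(mask))  # an endpoint of the body's long axis
```

Even this toy case exposes the core difficulty: a static silhouette underdetermines head direction, so robust nose detection must integrate information across frames or across body features.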
Given the robustness of the image processing framework within OptiMouse, one could even imagine using this tool in the future to explore olfactory sampling in complex environments, such as those including multiple mice. By design, OptiMouse is modular and integrates seamlessly with Matlab data processing code, allowing it to be updated by a user community over time. Although machine vision approaches to characterizing behavior are currently challenged by complex or dynamic environments, as tools for segmenting objects in video data improve, the capabilities of OptiMouse can be augmented to enable ever-more sophisticated measurements of mouse behavior.
We thank Jeff Markowitz and Julia Nguyen for helpful comments on this manuscript.
WFG and SRD wrote the manuscript. Both authors read and approved the final manuscript.
The authors declare that they have no competing interests.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
1. Crawley JN. What’s wrong with my mouse? Hoboken: Wiley; 2007. http://doi.wiley.com/10.1002/0470119055.
2. Skinner BF. The experimental analysis of behavior. Am Sci. 1957;45:343–71.
3. Egnor SER, Branson K. Computational analysis of behavior. Annu Rev Neurosci. 2016;39:217–36.
4. Anderson DJ, Perona P. Toward a science of computational ethology. Neuron. 2014;84:18–31.
5. Machado AS, Darmohray DM, Fayad J, Marques HG, Carey MR. A quantitative framework for whole-body coordination reveals specific deficits in freely walking ataxic mice. Elife. 2015;4:e07892.
6. Spink AJ, Tegelenbosch RAJ, Buma MOS, Noldus LPJJ. The EthoVision video tracking system—a tool for behavioral phenotyping of transgenic mice. Physiol Behav. 2001;73:731–44.
7. Ben-Shaul Y. OptiMouse: a comprehensive open source program for reliable detection and analysis of mouse body and nose positions. BMC Biol. 2017. https://doi.org/10.1186/s12915-017-0377-3.
8. Verhagen JV, Wesson DW, Netoff TI, White JA, Wachowiak M. Sniffing controls an adaptive filter of sensory input to the olfactory bulb. Nat Neurosci. 2007;10:631–9.
9. Wiltschko AB, Johnson MJ, Iurilli G, Peterson RE, Katon JM, Pashkovski SL, et al. Mapping sub-second structure in mouse behavior. Neuron. 2015;88:1121–35.
10. Kramer DL, Weary DM. Exploration versus exploitation: a field study of time allocation to environmental tracking by foraging chipmunks. Anim Behav. 1991;41:443–9.