Fig. 3 | BMC Biology

From: An automated, low-latency environment for studying the neural basis of behavior in freely moving rats

Operation of the RIFF. a Rats interact with the RIFF through movement and nose pokes. Their location is identified online using the video camera and a dedicated computer. The experimental environment reacts to rat actions by changing its state and providing different types of feedback: sounds, rewards, or punishments. b The RIFF collects multiple data types: behavioral events (e.g., nose pokes, rewards, punishments); task states; neuronal activity; analog sensor signals (microphone, motion sensors); and the animal's location tracked by the ceiling-mounted camera. c An illustration of a sequence of interactions between a rat and the RIFF during the LD task. (1) The rat moved to the center of the arena to initialize the trial. (2) Once it crossed into the central area (a circle with a radius of 30 cm), the RIFF started sound presentation from both loudspeakers of a randomly selected interaction area. (3) Initially, in the example shown here, the rat approached a wrong port; a second sound presentation caused it to move towards the correct target port (4) and to receive the reward (5). d Data post-processing. Posture features are extracted using a custom-trained DNN. The Kilosort2 program, used with a custom wrapper, performs largely automatic spike detection and sorting. e All data types are synchronized on a single time axis and f can be visualized offline using custom visualizer software. The visualizer can be used to browse through all synchronized data types down to the level of the raw neural signals.
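The trial sequence in panel c can be sketched as a minimal state machine: the environment waits for the rat to enter the 30 cm central area, presents a sound, and then either rewards a poke at the correct port or repeats the sound after a poke at a wrong one. This is a hedged illustration only; the class, method names, and event strings below are hypothetical and are not taken from the RIFF codebase.

```python
import math

CENTER_RADIUS_CM = 30.0  # radius of the trial-initiation area (panel c, step 2)


class LDTaskSketch:
    """Minimal sketch of one LD-task trial loop as described in panel c."""

    def __init__(self, target_port):
        self.state = "await_init"        # waiting for the rat to reach the center
        self.target_port = target_port   # port of the selected interaction area

    def on_position(self, x_cm, y_cm):
        """Called on each tracked position (arena center at the origin)."""
        # Step 2: crossing into the central area starts sound presentation.
        if self.state == "await_init" and math.hypot(x_cm, y_cm) <= CENTER_RADIUS_CM:
            self.state = "sound_on"
            return "play_sound"          # sound from both loudspeakers of the area
        return None

    def on_nose_poke(self, port):
        """Called on each nose-poke event at a port."""
        if self.state != "sound_on":
            return None
        if port == self.target_port:
            # Step 5: correct port ends the trial with a reward.
            self.state = "await_init"
            return "reward"
        # Step 3: a wrong port triggers a second sound presentation.
        return "play_sound"
```

A trial then plays out as a sequence of callbacks, for example:

```python
task = LDTaskSketch(target_port=3)
task.on_position(100.0, 0.0)   # outside the center: no event
task.on_position(10.0, 5.0)    # enters the center: "play_sound"
task.on_nose_poke(1)           # wrong port: "play_sound" again
task.on_nose_poke(3)           # correct port: "reward"
```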
