Lights, neurons, action: Reconstructing movies from mouse brain activity
By April Cashin-Garbutt
Imagine a world where we could visualise dreams and imagination. It might sound like something from a sci-fi story, but researchers at the Sainsbury Wellcome Centre have taken us a step closer to this futuristic reality. Led by Joel Bauer, Senior Research Fellow in the Clopath and Margrie Labs, the team has successfully reconstructed videos purely from the brain activity of mice. This exciting work could help shed light on the intricate workings of how the brain processes visual information and open new avenues for exploring how different species perceive the world.
Building on fMRI research in humans
Over recent years, there has been a growing interest in decoding visual representations in humans. Images and movies have been played to individuals in fMRI machines and researchers around the world have tried to decode the brain’s representations of visual information on a pixel level.
New research at SWC builds on this approach but instead uses single-cell recordings in mice, which offer the potential to provide a more precise measure of the brain’s representations. This technique has enabled the team to create high-quality reconstructions of videos played to mice, based solely on the neural activity in their visual cortex.
“We wanted to have a better way of investigating visual phenomena. The current methods of understanding what populations of neurons are representing are not very generalisable to situations which haven’t been specifically tested for. And so, we wanted to develop a method that can capture what is being represented and compare that to reality,” explained Dr Bauer, lead author of the study.
By looking for deviations between representations and reality, the new method offers the potential to understand how visual processing phenomena modulate representations in the brain and extract information to help us understand the world.
Reconstructing videos from brain activity of mice
To reconstruct videos from single-cell recordings, Dr Bauer utilised the data and the winning dynamic neural encoding model from the 2023 Sensorium Competition. This model predicts the activity of individual neurons from the movies they are presented with, along with a couple of behavioural parameters such as running speed and pupil diameter.
Rather than using the model to predict how the neurons would respond to the original movie, the team started from a blank screen and compared the neurons' recorded activity with the activity the model predicted for that blank stimulus. An algorithm called gradient descent then iteratively updated the pixels of the blank movie to shrink this difference, refining the movie until it closely resembled the visual stimulus presented to the mouse.
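The optimisation loop described above can be sketched in miniature. This is a toy illustration, not the study's actual pipeline: the trained Sensorium encoder is replaced here by a hypothetical fixed linear map from pixels to neural responses, and a single frame stands in for a full movie. The principle is the same: start from a blank stimulus and use gradient descent on the pixels to make the model's predicted activity match the recorded activity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the trained encoding model: a fixed linear
# map from pixels to neural responses (the real Sensorium model is a
# deep dynamic network conditioned on behaviour).
n_neurons, n_pixels = 200, 64
W = rng.standard_normal((n_neurons, n_pixels)) / np.sqrt(n_pixels)

def predict_activity(frame):
    """Model-predicted response of each neuron to a stimulus frame."""
    return W @ frame

# "Recorded" activity evoked by a ground-truth frame we try to recover.
true_frame = rng.standard_normal(n_pixels)
recorded = predict_activity(true_frame)

# Start from a blank frame and iteratively update its pixels so the
# predicted activity approaches the recorded activity.
frame = np.zeros(n_pixels)
lr = 0.05
for _ in range(500):
    error = predict_activity(frame) - recorded  # prediction error
    grad = W.T @ error                          # gradient of 0.5*||error||^2
    frame -= lr * grad                          # gradient-descent step

print(np.corrcoef(frame, true_frame)[0, 1])  # approaches 1.0
```

With a linear model this reduces to least squares, so the reconstruction converges to the true frame; with a deep encoder the same loop runs, but the gradient is obtained by backpropagating through the network.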
“Using this approach, we were able to achieve high-quality reconstructions of 10-second video clips. The accuracy of the reconstructions improved with the inclusion of more neurons, demonstrating the importance of comprehensive neural data,” commented Dr Bauer.
Testing the reliability of the reconstructions
To quantify the reliability of the reconstructions, the team used pixel correlation. This involved correlating each pixel of the movie between the original version and the reconstructed version.
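One way to read this metric, sketched below under the assumption that both movies are arrays of shape (frames, height, width): for each pixel location, compute the Pearson correlation between its intensity trace over time in the original movie and in the reconstruction. The function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def pixel_correlation(original, reconstructed):
    """Per-pixel Pearson correlation across time between two movies,
    each shaped (frames, height, width)."""
    o = original - original.mean(axis=0)       # centre each pixel's trace
    r = reconstructed - reconstructed.mean(axis=0)
    num = (o * r).sum(axis=0)
    den = np.sqrt((o ** 2).sum(axis=0) * (r ** 2).sum(axis=0))
    return num / den                           # one value per pixel

# Toy check: a reconstruction that is a lightly noised copy of the
# original should correlate strongly at every pixel.
rng = np.random.default_rng(1)
movie = rng.standard_normal((100, 8, 8))
recon = movie + 0.1 * rng.standard_normal((100, 8, 8))
corr_map = pixel_correlation(movie, recon)
print(corr_map.mean())  # close to 1 for a faithful reconstruction
```

Averaging the resulting correlation map gives a single reliability score, while the map itself shows which regions of the visual scene were reconstructed well.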
They found few deviations over time, and next plan to focus on improving the spatial resolution and coverage of the visual reconstructions. This will involve collecting data that can support higher-resolution reconstructions covering a larger portion of the visual scene.
Looking ahead
In terms of next steps, the team plan to use the technique to uncover new insights into the brain’s visual processing capabilities. Specifically, they are interested in understanding how visual representations deviate from ground truth.
“We don’t have a perfect representation of the world in our heads. The processing pipeline skews and warps our representation in a way that modifies information. This deviation is not necessarily an error but a feature, reflecting how our minds interpret and augment sensory information. We want to explore how this happens in the brain,” concluded Dr Bauer.
Find out more
• Read the paper in eLife
• Find out more about the Sensorium Competition
• Learn more about research in the Clopath Lab and Margrie Lab