People perceive gravity through visual cues
People adjust their bodily movements in response to gravity by perceiving visual cues from their surroundings rather than by feeling changes in weight and balance.
In a recent study published in Frontiers in Neuroscience, a group of researchers used virtual reality (VR)-based experiments to reach this fascinating conclusion.
Gravity is the unseen force that dominates our entire lives. It's what makes walking uphill so difficult and what eventually makes parts of our body sag downward.
But exactly how do people account for this invisible influence while moving through the world?
PhD student Desiderio Cano Porras of the Sheba Medical Center, Israel, and colleagues found that our ability to anticipate the influence of gravity relies on visual cues, which help us walk safely and effectively uphill and downhill.
To determine the influence of vision and gravity on how we move, the researchers recruited a group of 16 young, healthy adults for a VR experiment.
The researchers designed a VR environment that simulated level, uphill, and downhill walking.
Participants were immersed in a large-scale virtual reality system in which they walked on a real treadmill that was tilted upward, tilted downward, or kept level.
Throughout the experiment, the VR visual environment either matched or didn't match the physical cues that the participants experienced on the treadmill.
Using this setup, the researchers were able to decouple the visual and physical cues we all rely on when anticipating going uphill or downhill.
So, when participants saw a downhill slope in the VR scenery, they positioned their bodies to begin "braking" for the descent, even though the treadmill was actually flat or tilted upward.
They also found the reverse: people prepared for the extra "exertion" of going uphill in the VR environment even though the treadmill was flat or tilted downward.
The researchers showed that purely visual cues caused people to adjust their movements to compensate for predicted gravity-based changes.
However, while participants initially relied on their vision, they quickly adapted to the real-life treadmill conditions using something called a "sensory reweighting mechanism" that reprioritized body-based cues over visual ones.
In this way, the participants were able to overcome the sensory mismatch and keep walking.
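To make the idea of sensory reweighting more concrete, here is a minimal, hypothetical sketch of how a walker's slope estimate could be formed as a weighted average of a visual cue and a body-based cue, with the weight shifting toward the body-based cue when the two persistently disagree. This is an illustrative assumption, not the model used in the study; all names, thresholds, and the update rule are made up for the example.

```python
# Hypothetical illustration of sensory reweighting: a slope estimate is a
# weighted average of a visual cue and a body-based (proprioceptive) cue.
# When the cues persistently disagree, weight shifts toward the body cue.
# Purely illustrative; not the model reported in the study.

def reweight(visual_slope, body_slope, w_visual, shift_rate=0.2):
    """Return (combined slope estimate, updated visual weight)."""
    estimate = w_visual * visual_slope + (1 - w_visual) * body_slope
    mismatch = abs(visual_slope - body_slope)
    if mismatch > 1.0:  # degrees; arbitrary threshold for "cues disagree"
        w_visual = max(0.1, w_visual - shift_rate * w_visual)
    return estimate, w_visual

# Example: the VR scenery shows a 10-degree downhill slope while the
# treadmill stays level. Vision is trusted at first, then down-weighted.
w = 0.8
for step in range(5):
    est, w = reweight(visual_slope=-10.0, body_slope=0.0, w_visual=w)
    print(f"step {step}: estimated slope {est:+.1f} deg, visual weight {w:.2f}")
```

In this toy example the estimated slope starts strongly downhill (driven by vision) and drifts back toward level as the body-based cue is reprioritized, mirroring the pattern of initial visual "braking" followed by adaptation described above.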
"Our findings highlight multisensory interactions: the human brain usually gets information about forces from "touch" senses; however, it generates behaviour in response to gravity by "seeing" it first, without initially "feeling" it," says Dr Plotnik.
Dr Plotnik also sees the study as an exciting application of new and emerging VR technology: "Many new digital technologies, in particular virtual reality, allow a high level of human-technology interaction and immersion."
The research is a step towards the broader goal of understanding the intricate pathways that people use to decide how and when to move their bodies, but there is still work to be done.
Dr Plotnik states that "This study is only a 'snapshot' of a specific task involving transitioning to uphill or downhill walking. In the future, we will explore the neuronal mechanisms involved and potential clinical implications for diagnosis and treatment."