Feb 19 2018
A new pair of “4D goggles” developed by researchers at UC San Diego and San Diego State University enables wearers to be physically “touched” when they see a looming object on screen, such as an approaching spacecraft.
The device grew out of a study in which the neuroscientists mapped the brain regions that combine the sight and touch of a looming object, work aimed at advancing their understanding of the neural and perceptual mechanisms of multisensory integration.
The researchers said, however, that the device has a more practical purpose for the rest of us: entertainment content such as music, movies, virtual reality, and games could be synchronized with it to deliver immersive multisensory effects near the face and so heighten the sense of presence.
The results of the study have been reported in a paper published online February 6 in the journal Human Brain Mapping by Ching-fu Chen and Ruey-Song Huang, neuroscientists at UC San Diego’s Institute for Neural Computation, and Martin Sereno, a former professor at UC San Diego and the former chair of neuroimaging at University College London, now at San Diego State University.
We perceive and interact with the world around us through multiple senses in daily life. Though an approaching object may generate visual, auditory, and tactile signals in an observer, these must be picked apart from the rest of the world, originally colorfully described by William James as a ‘blooming buzzing confusion.’ To detect and avoid impending threats, it is essential to integrate and analyze multisensory looming signals across space and time and to determine whether they originate from the same sources.
Ruey-Song Huang, Lead Author
In the researchers’ experiments, subjects rated the subjective synchrony between a ball approaching them (simulated in virtual reality) and an air puff delivered to the same side of the face. When the air puff arrived near the onset of the ball’s movement (a delay of about 100 milliseconds), it was perceived as completely out of sync with the approaching ball. With a delay of 800 to 1,000 milliseconds, however, the two stimuli were perceived as one (in sync), as if an object had passed close to the face and created a little wind.
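The timing relationship reported above can be summarized in a short sketch. This is only an illustration of the two delay conditions described in the article; the cutoff values between the labeled ranges are assumptions for the sake of a runnable example, not boundaries reported by the study.

```python
def perceived_sync(delay_ms: float) -> str:
    """Rough perceptual label for an air puff delivered delay_ms after
    the onset of the looming ball's movement, per the two conditions
    described in the study (boundaries are illustrative assumptions)."""
    if 800 <= delay_ms <= 1000:
        # Puff coincides with when the ball would "reach" the face.
        return "in sync"
    if delay_ms <= 100:
        # Puff near motion onset feels unrelated to the ball.
        return "out of sync"
    # Intermediate delays were not characterized in this summary.
    return "intermediate"

print(perceived_sync(900))   # in sync
print(perceived_sync(50))    # out of sync
```

Intuitively, a puff feels like part of the visual event only when it arrives at the moment the looming ball would make contact, not when the ball merely starts moving.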
In similar experiments using functional magnetic resonance imaging (fMRI), visual-only, tactile-only, tactile-visual in-sync, and tactile-visual out-of-sync stimuli were delivered to either side of the subject’s face in randomized events.
The researchers reported in their paper that over a dozen brain regions responded more strongly to lateralized multisensory stimuli than to lateralized unisensory stimuli, and that when the multisensory stimuli were in perceptual sync, the responses were further enhanced.
The study was funded by the National Institutes of Health (R01 MH081990), a UC San Diego Frontiers of Innovation Scholars Program Project Fellowship, a Royal Society Wolfson Research Merit Award (UK), and Wellcome Trust (UK).