I'm not sure how many people know about this audio experiment but it really blew my mind a few years ago when I first heard it. I DIDN'T MAKE IT, JUST SO YOU KNOW.
Now this experiment is scarily real, and it shows just how much potential sound has in a VR environment. If you close your eyes, put on headphones, and listen to that video, your brain is tricked over and over into feeling different sensations. It genuinely feels like, when you open your eyes, you'll be standing in a barbershop.
So, what would happen if you combined this with Oculus Rift?
First of all, since the Rift tracks your head movement, the sound no longer has to move along with your head. Try it in the video: turn your head to the right, and it will either feel like your barber has slid across the floor to stay in the same spot relative to your head, or like your head didn't move at all. That's because the 'position' of the sound hasn't changed relative to your ears.
In a VR simulation, when you move your head, Unity (or whatever engine is being used) picks up that movement from the tracking data and repositions the sound relative to your ears, so the barber stays put in the world instead of following your head around.
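Conceptually, all the engine has to do is re-express each source's fixed world position in your head's coordinate frame every frame. Here's a minimal sketch of that idea in plain Python (not Unity's actual API; the function name and the yaw-only, top-down convention are just my own for illustration):

```python
import math

def source_in_head_frame(source_xz, head_xz, head_yaw):
    """Re-express a fixed world-space sound source relative to the listener's head.

    source_xz, head_xz: (x, z) world positions, seen from above (x = east, z = north).
    head_yaw: how far the head has turned to the right, in radians (0 = facing north).
    Returns (right, forward): the source position in head-local coordinates,
    which is what a binaural spatializer actually cares about.
    """
    dx = source_xz[0] - head_xz[0]
    dz = source_xz[1] - head_xz[1]
    # Undo the head rotation: a source that stays put in the world
    # sweeps the opposite way around you when you turn your head.
    right = dx * math.cos(head_yaw) - dz * math.sin(head_yaw)
    forward = dx * math.sin(head_yaw) + dz * math.cos(head_yaw)
    return right, forward

# The "barber" stays fixed one metre to the listener's right.
barber, head = (1.0, 0.0), (0.0, 0.0)
print(source_in_head_frame(barber, head, 0.0))           # ~(1, 0) -> at the right ear
print(source_in_head_frame(barber, head, math.pi / 2))   # ~(0, 1) -> now straight ahead
```

When the second call comes out as roughly (0, 1), that's the barber ending up straight in front of you after you turn 90 degrees to the right, which is exactly the behaviour the plain video can't give you.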
Second of all, since sight and hearing are arguably our two most important senses, the illusion becomes far more convincing when not one but two of our primary sensory inputs are fooled effectively.
All of our games so far have some version of this effect, but if this much effort were put into the sound design, the result would truly be scary...
What do you think?
Oh, the brutal wait until Christmas. Downloading leagues of demos to make the wait less agonising.
The Virtual Barbershop is something that popped into my head as well when I heard about the Oculus Rift. I also recall Palmer and a bunch of other people talking about it on a panel once. They pointed out that at some point the industry stopped focusing on 3D sound. I hope that with the Rift and VR, people start paying attention to it again.

It's hard to tell how difficult it will be to simulate proper 3D sound. The problem is very similar to realistic rendering: instead of simulating photons, we are simulating sound waves. For realistic 3D sound we need to consider how sound bounces around the environment and interacts with different materials, which would take quite a bit of computing power if done with a brute-force method. My guess is that it's enough to fake the effect in some way while still having a rough approximation of sound waves bouncing. With rendering it's fairly easy to tell whether a fake gives a good result, since you can see it; with sound it's a bit harder to judge, but on the other hand I guess you don't have to be as accurate.
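To give a feel for what "faking it" might look like, here's a toy sketch (my own illustration, not how any particular middleware actually does it) that skips wave simulation entirely and only reproduces the two strongest localisation cues, the interaural time and level differences:

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
HEAD_RADIUS = 0.09       # rough half-width of a human head, in metres
SAMPLE_RATE = 44100

def fake_binaural(mono, azimuth_rad):
    """Cheap binaural approximation: per-ear delay (ITD) and gain (ILD).

    mono: 1-D numpy array of samples.
    azimuth_rad: source direction in the front hemisphere,
                 0 = straight ahead, +pi/2 = hard right.
    Returns an (N, 2) stereo array. No wave simulation, no room, no HRTF --
    just the two cues that already fool the ear a surprising amount.
    """
    # Interaural time difference: sound reaches the far ear slightly later.
    itd = HEAD_RADIUS * np.sin(azimuth_rad) / SPEED_OF_SOUND       # seconds
    delay = int(round(abs(itd) * SAMPLE_RATE))                     # samples

    # Interaural level difference: the far ear is shadowed by the head.
    near_gain, far_gain = 1.0, 0.6 + 0.4 * np.cos(azimuth_rad)

    near = mono * near_gain
    far = np.concatenate([np.zeros(delay), mono * far_gain])[:len(mono)]

    # Positive azimuth = source on the right, so the right ear is the "near" one.
    if azimuth_rad >= 0:
        return np.stack([far, near], axis=1)    # columns: (left, right)
    return np.stack([near, far], axis=1)

# Example: a short burst of noise placed 60 degrees to the right.
noise = np.random.randn(SAMPLE_RATE // 2)
stereo = fake_binaural(noise, np.deg2rad(60))
```

Something this crude already moves a sound convincingly left and right; the things a brute-force simulation would actually buy you are elevation, distance, and the sound of the room itself.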
The Unity demo sounded pretty good to me. It would be nice to know a bit more about how their system works. In the demo video in the description I can hear the difference when the robot arm is low versus high. I would like to see a demo with a complex 3D environment: how would this system handle different rooms, corridors, etc.?
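My guess (pure speculation on my part, not how their system necessarily works) is that rooms and corridors get handled with similar shortcuts: test whether something solid sits between the listener and the source, and if it does, drop the level and muffle the highs rather than simulating real diffraction. A rough sketch of that idea, with made-up names:

```python
import numpy as np

def apply_occlusion(samples, occluded, cutoff=0.15, gain=0.4):
    """Very rough occlusion: quieter and duller when a wall is in the way.

    samples: 1-D float numpy array.
    occluded: result of some line-of-sight test (e.g. a raycast) between
              the listener and the source -- assumed to happen elsewhere.
    cutoff: one-pole low-pass coefficient in (0, 1]; smaller = more muffled.
    """
    if not occluded:
        return samples
    out = np.empty_like(samples)
    acc = 0.0
    for i, s in enumerate(samples):      # simple one-pole low-pass filter
        acc += cutoff * (s - acc)
        out[i] = acc
    return out * gain

# Example: a noise burst, heard through a wall.
noise = np.random.randn(22050)
through_wall = apply_occlusion(noise, occluded=True)
```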
Edit: Also, if you get a nice tingling sensation while listening to the Virtual Barbershop, try googling ASMR, or Autonomous Sensory Meridian Response. There are a lot of ASMR videos on YouTube. Here is an example: http://www.youtube.com/watch?v=ccXTDTTveEY