Hacking Our Senses

wildermuthn
Honored Guest
The Oculus Rift has created a virtual sense of sight: not perfectly, perhaps not even 'well' relative to the human retina, but well enough. People step into the game because our sense of sight has been successfully virtualized.

But there are other senses to deal with. Hearing is already taken care of: headphones and stereo sound work just fine, and the Rift plus headphones makes for a truly immersive experience. But there are two other senses that we know are a big problem: touch and movement. Lack of movement causes nausea. Lack of touch keeps us from truly entering into our virtual realities.

But the brain is smart. We can hack the mind's senses.

"Touch to touch sensory substitution is where information from touch receptors of one region can be used to perceive touch in another. For example, in one experiment by Bach-y-Rita, the touch perception was restored in a patient who lost peripheral sensation from leprosy. For example, this leprosy patient was equipped with a glove containing artificial contact sensors coupled to skin sensory receptors on the forehead (which was stimulated). After training and acclimation, the patient was able to experience data from the glove as if it was originating in the fingertips while ignoring the sensations in the forehead. After two days of training one of the leprosy subjects reported "the wonderful sensation of touching his wife, which he had been unable to experience for 20 years."

http://en.wikipedia.org/wiki/Sensory_substitution#Tactile.E2.80.93tactile_substitution_to_restore_pe...


The sensation of touch in VR may not require full-body suits that literally recreate the sensation of touch. A crude but effective implementation might take only an armband that remaps full-body touch onto a small section of skin. The same could be done for a sense of movement. Yes, it would only be an approximation, but it might be just good enough to fool our minds.
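As a toy illustration of the armband idea (everything here is my own assumption, not any real device): the core of such a scheme is just a projection from body-surface coordinates onto a small grid of actuators.

```python
# Hypothetical sketch: remap virtual full-body contact points onto a
# small armband actuator grid. Coordinates, grid size, and function
# names are all assumptions for illustration, not real hardware.

def remap_touch(contact, grid_w=8, grid_h=4):
    """Map a normalized body-surface coordinate (u, v in [0, 1])
    to the index of the nearest armband actuator."""
    u, v = contact
    col = min(int(u * grid_w), grid_w - 1)
    row = min(int(v * grid_h), grid_h - 1)
    return row * grid_w + col

# A touch near the left shoulder (u=0.2, v=0.1) lands on actuator 1:
print(remap_touch((0.2, 0.1)))  # → 1
```

The interesting part, per the Bach-y-Rita result quoted above, is that the brain, not the mapping, does the hard work: after training, the wearer would hopefully feel the touch "out there" on the virtual body rather than on the arm.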

But why stop with touch and movement? Why not give human beings entirely new senses?

"See with your tongue. Navigate with your skin. Fly by the seat of your pants (literally). How researchers can tap the plasticity of the brain to hack our 5 senses — and build a few new ones."

http://www.wired.com/wired/archive/15.04/esp.html


These experiments in creating new human senses have, to my knowledge, been restricted to the real world. But what better way to create new senses than through virtual reality, where everything in the virtual world has already been measured and calculated?

What senses would you like to have? VR can give them to you. And the big question for me is this: do you get to keep your newfound superpowered senses when you take the Rift off?

FlyingFox
Not applicable
Give me echolocation: make a sound and it will echo through the rooms/caves/trees, mapping the entire thing and giving you a way to navigate through pitch-black areas.
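A rough sketch of how a game might fake this (my own assumption of one possible approach, not anything shipping): cast rays from the player into the scene and turn each hit distance into a round-trip echo delay using the speed of sound.

```python
# Hypothetical sketch: convert raycast hit distances into echo delays
# for a simulated echolocation ping. The function name and the idea of
# driving it from raycasts are assumptions for illustration.

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def echo_delays(hit_distances):
    """Given distances (meters) to surfaces hit by rays cast from the
    player, return the round-trip echo delay of each in seconds."""
    return [2.0 * d / SPEED_OF_SOUND for d in hit_distances]

# A wall 10 m away echoes back after ~58 ms; one 34.3 m away after 200 ms:
delays = echo_delays([10.0, 34.3])
print([round(t, 3) for t in delays])  # → [0.058, 0.2]
```

The engine would then play back an attenuated copy of the ping at each delay, panned toward the ray's direction, and the pitch-black cave starts to "sound" like a shape.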

everygamer
Explorer
Honestly, I think computer-human interfaces are the only way we are going to get a truly immersive experience. That is decades away, and really requires finding a non-invasive way to pass information to the human brain without implants. Until we have non-invasive technology that can send signals to the brain and trick it into seeing, hearing, feeling, tasting, and touching, immersion will never be perfect.

Of all our senses, sight and sound may be the easiest to tackle first. Noise-cancelling headphones with a decent level of 3D sound handle hearing, and once VR headsets like the Oculus get high enough resolution to remove the screen-door effect and let us see small details at a distance (say, the bottom line on an eye test) with fidelity, we can trick these two senses.
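For a rough sense of the numbers behind that fidelity claim (a standard back-of-the-envelope figure, not from this thread): 20/20 acuity resolves about one arcminute of detail, which works out to roughly 60 pixels per degree of field of view.

```python
# Back-of-the-envelope sketch: horizontal pixels per eye needed to match
# 20/20 acuity. The 60 px/degree figure follows from 1 arcminute of
# resolvable detail; the linear approximation is a simplification.

def pixels_needed(fov_deg, pixels_per_degree=60):
    """Approximate horizontal pixel count per eye needed for 20/20
    acuity across a given horizontal field of view, in degrees."""
    return fov_deg * pixels_per_degree

# A 90-degree-per-eye headset would need about 5400 horizontal pixels:
print(pixels_needed(90))  # → 5400
```

Compared to the first Rift dev kit's 640 horizontal pixels per eye, that gives a feel for how far displays had to go before the eye-test line becomes readable.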

The hardest senses to trick are going to be touch (using our hands to feel objects) and kinesthesia, the physical awareness of our own bodies. In a VR environment we might let someone put their hands on a wall and, with bulky equipment, simulate the feeling of touching that wall, but nothing will stop your physical hands from passing through the space where the virtual wall is. So even if we trick touch, we will not trick kinesthesia. The only way to trick kinesthesia is to block our mind's awareness of our body completely and replace it with false information.

So true immersion will not happen until we can override our body's signals to the mind and replace them with false ones. Until then, sight and sound will remain the most likely targets for immersion technologies, especially in consumer electronics.

As for additional senses, I read about an interesting experiment years ago where someone wore a belt that would vibrate on the side facing true north. After using the belt for an extended period, the person found they had subconsciously mapped their environment: they could point toward home from anywhere because they had such a strong understanding of where they were in relation to other places.
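The control logic for a belt like that is almost trivially simple (a sketch under my own assumptions: eight evenly spaced motors, motor 0 at the front, indices increasing clockwise), which makes the brain's side of the bargain all the more impressive.

```python
# Hypothetical sketch of a north-pointing vibration belt: pick which of
# n evenly spaced motors lies closest to true north given the wearer's
# compass heading. Motor layout and function name are assumptions.

def north_motor(heading_deg, n_motors=8):
    """Return the index of the motor pointing toward true north.
    heading_deg is the wearer's compass heading (0 = facing north);
    motor 0 is at the front, indices increase clockwise."""
    # North lies at -heading relative to the wearer's front.
    bearing = (-heading_deg) % 360
    return round(bearing / (360 / n_motors)) % n_motors

print(north_motor(0))   # facing north: front motor        → 0
print(north_motor(90))  # facing east: north is to the left → 6
```

All the interesting work happens after the loop: the wearer stops noticing the buzzing and simply starts knowing which way is north.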

http://www.wired.com/wired/archive/15.04/esp.html

Our brains can adapt to new information very quickly, and it is interesting how they do it.

KBK
Protege
Intelligence... is not inherent - it is a point in understanding. Q: When does a fire become self sustaining?

geekmaster
Protege
"FlyingFox" wrote:
Give me echolocation: make a sound and it will echo through the rooms/caves/trees, mapping the entire thing and giving you a way to navigate through pitch-black areas.

I gave you echolocation four days ago:
viewtopic.php?f=26&t=1670&p=21825#p21825

And some more progress at sensory hacking. Direct neural interfaces may not be that far in the future:
http://www.youtube.com/watch?feature=player_embedded&v=CR_LBcZg_84
This talk is about research that let a monkey control a robotic arm as well as a virtual arm, including receiving sensory feedback from the virtual arm.
A potential path to a direct neural interface for VR?

They are doing this stuff with humans too.

A robotic hand that can feel:
http://www.youtube.com/watch?feature=player_embedded&v=X85Lpuczy3E

It's possible to hack your brain, forcing you to reveal information that you’d rather keep secret:
http://www.extremetech.com/extreme/134682-hackers-backdoor-the-human-brain-successfully-extract-sens...

We can reconstruct visual images from fMRI data:
https://sites.google.com/site/gallantlabucb/publications/nishimoto-et-al-2011

We can record what people are dreaming about:
http://www.nature.com/news/scientists-read-dreams-1.11625

Cracked retinal coding, now a functional prosthetic eye:
http://www.ted.com/talks/sheila_nirenberg_a_prosthetic_eye_to_treat_blindness.html

Now decoding speech signals in the brain:
http://www.nature.com/news/voicegrams-transform-brain-activity-into-words-1.9945

I have been waiting most of my life for a Direct Neural Interface. Until then, my Rift and my Hydra will be my "Poor Man's" gateway to VR.
And here is another key to DNI (Direct Neural Interface):
"mind meld" here we come!

And bionic contact lenses are not far off either:
http://en.wikipedia.org/wiki/Bionic_contact_lens
Once perfected, I see no obvious reason not to insert them just below the surface as an intra-corneal implant.

seeingwithsound
Explorer
This does not address the touch and movement problem, but perhaps this
sensory substitution app can work with the Oculus Rift and Android?
http://www.seeingwithsound.com/android.htm
https://play.google.com/store/apps/details?id=vOICe.vOICe
The optional tactile graphics mode might also be interesting to experience
in combination with a headset.

For some background info,

Sensory hijack: rewiring brains to see with sound
http://www.newscientist.com/article/mg20727731.500-sensory-hijack-rewiring-brains-to-see-with-sound....
Seeing with Sound - The vOICe http://www.seeingwithsound.com
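For readers curious how The vOICe's mapping works, here is a toy sketch based on its published description (left-to-right scan over the image, with vertical position mapped to pitch and brightness mapped to loudness); the function name, frequency range, and exponential spacing details are my own simplifications.

```python
# Toy sketch of a vOICe-style image-to-sound mapping: each image column
# becomes one time slice of the soundscape; each pixel's row sets a tone
# frequency and its brightness sets that tone's amplitude. All specific
# parameters here are illustrative assumptions.

def column_to_tone_params(column, f_low=500.0, f_high=5000.0):
    """For one image column (list of brightness values in 0..1, top row
    first), return (frequency_hz, amplitude) pairs, with frequencies
    exponentially spaced so the top row maps to the highest pitch."""
    n = len(column)
    params = []
    for row, brightness in enumerate(column):
        # Fraction of the way from the bottom (low pitch) to the top.
        frac = 1.0 - row / max(n - 1, 1)
        freq = f_low * (f_high / f_low) ** frac
        params.append((freq, brightness))
    return params

# A 3-pixel column that is bright at the top and dark at the bottom:
for freq, amp in column_to_tone_params([1.0, 0.5, 0.0]):
    print(round(freq), amp)
```

Summing the resulting tones per column, and sweeping across columns about once per second, yields the characteristic soundscape that trained users learn to "see."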

geekmaster
Protege
"seeingwithsound" wrote:
... Sensory hijack: rewiring brains to see with sound
http://www.newscientist.com/article/mg20727731.500-sensory-hijack-rewiring-brains-to-see-with-sound....
More information (and enlightening videos) about seeing with sound using the "vOICe" as featured in that article were posted in another thread here:
viewtopic.php?f=25&t=1317&p=15227#p15227

seeingwithsound
Explorer
"geekmaster" wrote:
More information (and enlightening videos) about seeing with sound using the "vOICe" as featured in that article were posted in another thread here:
viewtopic.php?f=25&t=1317&p=15227#p15227


Thanks! Ah yes, some nostalgia. 🙂 You have a good memory, one of the snapshots of my old Compuserve website is at http://web.archive.org/web/20010123211300/http://ourworld.compuserve.com/homepages/Peter_Meijer/ (July 2000).

Nowadays I am on the lookout for (future) mass-market HMDs that I can use as a platform for affordable sensory substitution, for researchers and for blind end users, be it Oculus Rift, Google Glass, Recon Jet, you name it. I recently ran The vOICe for Android on a Rikomagic TV stick with HDMI output. That is of little use at present for lack of camera input to the TV stick (so I just made it time out after half a minute), but I suppose it could readily connect to an Oculus Rift setup along the lines of http://www.oculushut.com/blog/oculus-rift-on-android-devices/

Regards
Seeing with Sound - The vOICe http://www.seeingwithsound.com

jwilkins
Explorer
This thread reminded me of one of the most beautiful manga ever created.

[image: yokohama-kaidashi-kikou-87824.jpg]
[image: yokohama-kaidashi-kikou-87825.jpg]

It might be a spoiler to see these two images, but I felt tears well up in my eyes when I looked them up again, so they are probably spoiler-proof.

Basically, Alpha the android is able to interface with machines by holding a connecting cable in her mouth. In these panels she uses that ability to pilot an old jet boat out to sea and scuttle it.

Somehow the artist is able to get across to me Alpha's surprise at having become one with the boat. I find this very moving.
(╯°□°)╯︵┻━┻