Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
Yashimitzu
Honored Guest
11 years ago

Eye-Sensor

Hey, I just registered to share an idea I had a few days ago.

How about putting an eye-movement sensor into the Oculus?

You could look in a direction without even having to move your head.
(No shit, Sherlock?!) Well, have you ever had a real feeling of depth of field?
You look at your hands (imagine how great this would be with an arm/hand detection glove)
and watch them come into sharp focus while everything in the background goes blurry.
(I hope you understand what I have in mind...)

Imagine this with the playable teaser from Silent Hill.
You could put your hand on the walls (with this arm/hand detection glove, which applies pressure to your hands because it detects the walls), sneak up on them, and feel the power of depth of field in its true form.

It's just an idea, and extremely hard to make possible.
But I hope it gives someone's imagination a push.
(All the game creators would have the possibility to make much more atmospheric games.)
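A rough sketch of what a renderer would do with a gaze point (every name and constant here is made up for illustration): focus at whatever depth the gaze lands on, then give every other pixel a blur radius from the thin-lens circle-of-confusion formula.

```python
# Hypothetical sketch: turn a gaze point into a per-pixel depth-of-field
# blur radius. Aperture and focal length are placeholder values.

def coc_radius(depth, focus_depth, aperture=0.35, focal_len=0.05):
    """Circle-of-confusion (blur) radius for a pixel at `depth` when
    the eye is focused at `focus_depth` (both in meters)."""
    if depth <= focal_len or focus_depth <= focal_len:
        return 0.0
    return abs(aperture * focal_len * (depth - focus_depth)
               / (depth * (focus_depth - focal_len)))

def blur_map(depth_buffer, gaze_x, gaze_y):
    """Focus at the depth under the gaze point, then compute a blur
    radius for every pixel in the depth buffer."""
    focus = depth_buffer[gaze_y][gaze_x]
    return [[coc_radius(d, focus) for d in row] for row in depth_buffer]

depths = [[1.0, 1.0, 5.0],
          [1.0, 1.0, 5.0]]          # toy depth buffer: near hands, far wall
blur = blur_map(depths, gaze_x=0, gaze_y=0)   # looking at the near object
# near pixels stay sharp (radius 0), far pixels get a larger blur radius
```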

Greetings, Yashimitzu

6 Replies

  • Gerald
    Expert Protege
    Great ideas, but also old ideas that have been discussed to death. Search for foveated rendering.

    Sony has an implementation working outside of VR, Microsoft bought a company that already uses it in HMDs (Fove), and I bet Oculus is not ignorant of the topic either. Give it some time until we see workable solutions, but don't expect DOF to appear any time soon; the chances for eye controls are better, though.
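For reference, the core of foveated rendering is a tiny per-tile decision: shade at full resolution near the gaze point and progressively coarser toward the periphery. A hedged Python sketch (the eccentricity thresholds are illustrative, not from any shipping headset):

```python
import math

# Hypothetical sketch of foveated rendering's core decision: pick a
# shading rate per screen tile from its angular distance to the gaze
# point. Thresholds and the degrees-per-pixel factor are made up.

def shading_rate(tile_center, gaze, deg_per_px=0.05):
    """Return the fraction of full resolution to shade this tile at."""
    dx = tile_center[0] - gaze[0]
    dy = tile_center[1] - gaze[1]
    ecc_deg = math.hypot(dx, dy) * deg_per_px   # eccentricity in degrees
    if ecc_deg < 5.0:       # fovea: full detail
        return 1.0
    elif ecc_deg < 15.0:    # near periphery: half resolution
        return 0.5
    else:                   # far periphery: quarter resolution
        return 0.25

# The tile the player looks at is shaded at full rate; a far corner isn't.
foveal = shading_rate((960, 540), gaze=(960, 540))
corner = shading_rate((0, 0), gaze=(960, 540))
```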
  • Could be really nice; I would even spend tons of money for this :)
  • In multiplayer games, actually detecting eye positions could greatly increase immersion.

    Think about actually seeing another player look you in the eyes. Though half-naked elves in WoW might become an element of embarrassment. 8-)
  • tyhuber
    Honored Guest
    Eye tracking is actually very useful as a diagnostic criterion for many brain disorders and injuries, so there has been considerable research and a few predicate devices which could serve as a basis of modification for use with the Oculus Rift. If you are interested, I'll include a brief overview of why eye tracking is medically relevant at the end of this post.

    There are two main issues that must be overcome for eye tracking to become feasible to combine with the Oculus Rift. I'll really only be focusing on one eye tracking technique here, videonystagmography (VNG), but these issues are relevant to every eye tracking method I know of. The reason for the focus on this particular technique is that I am far more familiar with it than with anything else. I actually worked on the design of a VNG device which got FDA approval within the last six months, so I have a pretty good view of where the industry is right now. If anyone notices something missing from this analysis, please let me know. Same thing if these issues are easily overcome or mitigated using a different eye tracking technique, or if there is a method I overlooked that avoids the issues altogether. I also want to say I am by no means an expert in this field, just a lowly engineer doing his best in a relatively new field that only has a total of about 4-5 different devices FDA approved for use in the US.
    Issue 1:
    Most eye tracking methods work by using a camera and a video feed of the subject's eye to determine the pupil position. In VNG, an IR camera is placed so the video stream points from around the temple area towards the nose. Since the camera starts out perpendicular to the eye, its view must be reflected 90 degrees using a semi-transparent mirror to get the front view of the eye. Other methods must do something similar.
    • The Oculus Rift lenses make this a lot more complicated.

    • If you try to look at the eye directly through the lens, the video is distorted, which makes analysis a whole lot more complicated.

    • Another option is to have a small hole in the lens for the camera to pass through, avoiding any distortion, but this creates a noticeably different portion of the image the patient sees, which is obviously less than ideal.

    • The last option I know of is to capture the video in that small gap between the lens and the eye so it doesn't have to pass through the lens at all. Since the lenses sit so close to the eye in the Rift, it is very tough to get the camera to encompass the entire eye without moving the lens a lot farther out than any normal position.
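To make the through-the-lens problem concrete: before the pupil can be found, each eye-camera pixel would have to be mapped back through a model of the lens. A minimal Python sketch using a two-term radial polynomial (the coefficients k1/k2 are placeholders; real values would come from calibrating against the actual optics, and a production pipeline would iterate this inversion to convergence):

```python
# Hypothetical sketch of undoing lens distortion before pupil detection.
# k1/k2 are placeholder coefficients, not calibrated Rift values.

def undistort(xn, yn, k1=-0.25, k2=0.05):
    """Approximately invert a two-term radial distortion model.
    (xn, yn) are normalized image coordinates with the lens axis at
    the origin; dividing by the distortion scale is a one-step
    approximation of the true inverse."""
    r2 = xn * xn + yn * yn
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return xn / scale, yn / scale

# The optical center is a fixed point; off-axis points move outward
# when undoing barrel distortion (negative k1).
center = undistort(0.0, 0.0)
edge = undistort(0.5, 0.0)
```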

    Issue 2:
    Most current eye trackers only track pupil position in two dimensions (horizontal and vertical) because this is sufficient for whatever they need to do, but the pupil actually moves in all three dimensions.
    • The curvature of your eye means your pupil is closest to the camera when you are looking straight ahead and moves farther away whenever you rotate your eye to look in a certain direction.
    • For the precision necessary to actually make eye tracking worthwhile for the Rift, this depth does in fact need to be measured.

    • To figure out that depth, you would need to either measure the subject's eye shape accurately in 3D prior to any eye tracking, or use two or three separate camera feeds per eye to triangulate the pupil position in all three dimensions.
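The multi-camera option boils down to classic ray triangulation: each calibrated camera yields a 3D ray toward the pupil, and the pupil is estimated as the midpoint of the closest-approach segment between the rays. A self-contained Python sketch (camera positions and directions are made up for illustration):

```python
# Hypothetical sketch: triangulate a pupil position from two camera rays
# by finding the midpoint of their closest-approach segment.

def sub(a, b): return [a[i] - b[i] for i in range(3)]
def add(a, b): return [a[i] + b[i] for i in range(3)]
def dot(a, b): return sum(a[i] * b[i] for i in range(3))
def scale(a, s): return [x * s for x in a]

def triangulate(o1, d1, o2, d2):
    """Midpoint of the closest-approach segment between ray o1 + t*d1
    and ray o2 + s*d2 (directions need not be unit length)."""
    w = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b            # zero only for parallel rays
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    p1 = add(o1, scale(d1, t))
    p2 = add(o2, scale(d2, s))
    return scale(add(p1, p2), 0.5)

# Two cameras near the temples, both seeing a pupil at (0, 0, 1):
pupil = triangulate([-0.03, 0, 0], [0.03, 0, 1],
                    [ 0.03, 0, 0], [-0.03, 0, 1])
```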


    Finally, keep in mind that for VNG eye tracking systems at least, a single device will run you anywhere from $35,000-$50,000. They are medical devices really only used for the diagnosis of balance disorders and traumatic brain injuries, so the lengthy regulatory process for medical devices does contribute to this price, but the point is that eye tracking isn't easy. It's damn hard enough without having to work around the design of the Rift. I bet someone has at least a semi-functional eye-tracking Oculus Rift prototype somewhere, but I haven't heard any specifics. I tried using the VNG system I worked on with the Rift and failed miserably.
    I have actually seen an approved patent for an eye-tracking VR helmet used in the military for concussion diagnosis, and I'd be willing to bet the Air Force has something similar that drone pilots use. I don't think anyone outside the military is going to get their hands on these for quite a while, though.

    Why eye tracking is so useful in the medical field:
    Right now the diagnosis of many traumatic brain injuries (TBI) and balance disorders is a subjective process which depends very much on the opinion of the doctor performing the assessment. For concussions, a form of mild TBI, the patient's visual, cognitive, and balance performance is assessed to make a diagnosis. A lot of these criteria are subjective, so diagnoses can vary wildly from doctor to doctor. An example of a subjective diagnostic criterion actually in use is the patient's level of confusion. There is no direct way to quantify something like this, so one doctor may think a patient is highly confused and disoriented while another sees the same symptoms and calls it mild confusion. Measurable criteria would obviously be much better, and visual tracking offers metrics for this.

    To smoothly track a moving visual stimulus, you are actually constantly, subconsciously predicting where it will be 100-150 ms in the future to account for your brain's sensory processing lag. That is how long your brain needs to process visual data and combine it with other sensory inputs to form the coherent world view allowing you to read this right now. Since 50% of all cranial nerves relate to vision, increased gaze error actually correlates with the extent of TBIs and brain disorders due to disruption of this prediction process. Because the distance between where a patient is actually looking and where the target actually is can be quantified with eye tracking technology, it is probably the most promising criterion for assessing brain injuries. This error has also been shown to vary in patients with many other types of brain disorders, like ADHD or schizophrenia, so it is actually a really promising assessment of brain function in general. So rest assured there is a lot of money to be made with eye tracking technology, and I have no doubt that within a year or two this technology will be available with the Rift.
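The metric described above is easy to state concretely: sample the target and gaze positions at the same timestamps during a pursuit test and report the RMS distance between them. A hedged Python sketch (what counts as an "impaired" score would come from clinical data, not from this toy example):

```python
import math

# Hypothetical sketch of the gaze-error metric: during a smooth-pursuit
# test, log target and gaze positions (in degrees of visual angle) at
# matched timestamps and report the root-mean-square distance.

def rms_gaze_error(target_positions, gaze_positions):
    """RMS distance between paired (x, y) target and gaze samples."""
    errs = [(tx - gx) ** 2 + (ty - gy) ** 2
            for (tx, ty), (gx, gy) in zip(target_positions, gaze_positions)]
    return math.sqrt(sum(errs) / len(errs))

target = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]    # target sweeps rightward
gaze_ok = [(0.0, 0.0), (1.1, 0.0), (2.0, 0.1)]   # tracks the target closely
gaze_bad = [(0.0, 0.0), (0.2, 0.5), (0.9, 0.8)]  # lags and wanders
```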
  • Mrob76o
    Honored Guest
    These guys are doing exactly what we need, only not for VR.

    http://www.tobii.com/eyex

    Check out the plugin video for Unreal Engine 4, especially at 4:22 with depth of field. It looks awesome. If they can take this tech and put it inside a Rift... it would be pretty cool.

    https://www.youtube.com/watch?v=0ISrY1aMSLs#t=265

    They have a Dev kit for only $139.

    I have no idea on latency or things like that but it looks like a good start. Hope Oculus takes a look at this.
  • Toron
    Honored Guest
    I also think eye tracking is a key feature that should be in the next generation of VR devices. It could take the place of the PC mouse; it's for sure one of the fastest ways to navigate menus at the moment.
    After these and a couple more years of development, brainwave readers would be good to go. At the moment they can only read a couple of simple commands, after quite some calibration and with a big timing problem.
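Gaze-as-mouse menu navigation usually needs a dwell timer so you don't "click" everything you glance at (the Midas-touch problem). A small Python sketch of the idea, with made-up timings and layout:

```python
# Hypothetical sketch of dwell-based gaze selection: a menu item is
# "clicked" only after the gaze stays inside its rectangle for a full
# dwell period. The dwell time and layout are illustrative.

DWELL_MS = 600

def gaze_menu_select(samples, items, sample_ms=100):
    """samples: (x, y) gaze points taken every sample_ms milliseconds.
    items: dict of name -> (x0, y0, x1, y1) rectangles.
    Returns the first item dwelt on for DWELL_MS, or None."""
    dwell = {name: 0 for name in items}
    for (x, y) in samples:
        for name, (x0, y0, x1, y1) in items.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                dwell[name] += sample_ms
                if dwell[name] >= DWELL_MS:
                    return name
            else:
                dwell[name] = 0   # gaze left the item: reset its timer
    return None

menu = {"start": (0, 0, 100, 50), "quit": (0, 60, 100, 110)}
looks = [(50, 25)] * 6 + [(50, 80)] * 2   # stare at "start", glance at "quit"
chosen = gaze_menu_select(looks, menu)    # a brief glance never triggers
```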