Forum Discussion
Cybershaman
11 years ago · Honored Guest
Reverse AR (Augmented Reality): bringing the real into VR
I was trying to figure out a way for a user to see their keyboard while wearing a Rift. Would this simply be Augmented Reality in reverse? Instead of digital elements being projected into the "real" world, as in AR, it would be the opposite: projecting real-world elements into the VR environment.

By placing colored dots at the corners of a user's keyboard and adding a small head-mounted cam, the system could project the area within the dots (the keyboard, or really any object) into the VR environment when the user tilts their head down. Perhaps the corners/edges of the image could be grabbed within a setup program so the image could be stretched and moved to match, as closely as possible, what the user would see if they weren't wearing the device, in order to aid proprioception. It would be like always having a floating keyboard just below your direct line of sight. Perhaps it would only appear when you looked down. You could even have a variable fade-in setting so that it only appears once a certain line-of-sight depression angle is reached. The system could dither the edges of the keyboard image so it isn't so stark, or apply other color/image-processing options (grayscale, colored borders, saturation, brightness, etc.) just like a typical cam application would.

The possibilities are endless. Maybe dots wouldn't be needed at all? Maybe the setup program could look for any "keyboard-like" object(s) within a certain area and ask if that's your keyboard? Perhaps not just a keyboard but any amount of desk space a user would like to "see" within VR, even a computer monitor: just drag out an area you would like to have visible in VR. Perhaps even have it in stereo with two cams? We would be adding weight, of course, but with the size of cameras these days I'm sure it would be negligible.
Are there any other issues that I may have overlooked? This isn’t really earth shattering but input controller, specifically keyboards, issues come up a lot in conversation with my friends.
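The setup the post describes — four colored dots marking the keyboard corners, a warp that undoes the camera's perspective, and a fade-in driven by head pitch — can be sketched with a little linear algebra. This is only an illustrative sketch: the dot-detection step is omitted, and the 20°/40° fade angles are made-up defaults, not anything from the thread.

```python
import numpy as np

def find_homography(src, dst):
    """Solve the 3x3 projective transform mapping 4 source points
    (detected dot centres, in camera pixels) onto 4 destination
    points (the flat overlay rectangle), via the DLT method."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of A (smallest singular value).
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def warp_point(H, pt):
    """Apply the homography to one point (projective division included)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

def fade_alpha(pitch_deg, start=20.0, full=40.0):
    """Overlay opacity as the user looks down: 0 above `start` degrees
    of depression, ramping linearly to 1 at `full` degrees."""
    return float(np.clip((pitch_deg - start) / (full - start), 0.0, 1.0))
```

In a real overlay, `H` would feed an image-warping routine (e.g. a GPU quad with projective texture coordinates), and `fade_alpha` would drive the overlay's opacity each frame from the HMD's pitch.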
6 Replies
- MrKaktus (Explorer): I've done that with hands:
https://www.youtube.com/watch?v=QtrPuYeh_NY
You can see hand usage from 0:41, and another interesting part at 2:02.
Made it as "wolfensteinVR" to add some fun :)
- Cybershaman (Honored Guest): Very cool! :) I wonder if you could wear skin-tight gloves with a high-contrast pattern on them so the recognition software could "see" them better. Is it just looking for anything with your skin tone and then projecting that into the scene? Love the interaction with the VR environment! :) I guess seeing your keyboard in VR wouldn't be too hard. I'm just curious how the sense of proprioception would hold up. Like, would you see the keyboard but then just be pawing at it because it would be too hard to tell where your hands were? Very cool stuff. Very happy to see this coming to fruition after waiting over 20 years since the halcyon days of VR being "just around the corner!" ;)
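Skin-tone segmentation of the kind guessed at here is classically done with a fixed chroma threshold rather than per-user calibration. A toy numpy sketch, with the commonly cited Cb/Cr bounds used as rough defaults (the threshold values are an assumption, not anything from this thread, and MrKaktus clarifies below that his filter actually works on depth, not color):

```python
import numpy as np

def skin_mask(rgb, cb_range=(77, 127), cr_range=(133, 173)):
    """Fixed-range skin detector in YCbCr chroma space.
    `rgb` is an (H, W, 3) array of 0-255 values; pixels whose Cb and
    Cr both fall in the given ranges are flagged as skin-colored."""
    rgb = np.asarray(rgb, dtype=float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Standard RGB -> Cb/Cr conversion (BT.601 coefficients).
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return ((cb_range[0] <= cb) & (cb <= cb_range[1]) &
            (cr_range[0] <= cr) & (cr <= cr_range[1]))
```

Such fixed-range detectors are cheap but brittle under lighting changes, which is one reason a high-contrast glove pattern (or a depth camera) is more robust.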
- MrKaktus (Explorer): The hands are reconstructed as a 3D mesh from a depth camera (and textured using color from a color camera).
Currently I have a custom filter that removes everything except the hands. It takes the HMD orientation into account.
The filter is based on depth discontinuity, so it will properly display not only your hands but also anything you hold in them.
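A depth-discontinuity filter of this kind can be approximated with a flood fill that refuses to cross large depth jumps: everything connected to a seed point at a similar depth is kept, and the occluding contour where depth jumps to the background becomes the segmentation boundary. A minimal sketch, assuming a depth image in metres and a made-up `max_jump` threshold (the actual filter in the post is not published):

```python
import numpy as np
from collections import deque

def segment_by_depth(depth, seed, max_jump=0.03):
    """Flood-fill from `seed`, keeping pixels whose depth differs from a
    4-neighbour by less than `max_jump` metres. A larger jump marks an
    occluding contour, e.g. the silhouette of a hand or a held object."""
    h, w = depth.shape
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    q = deque([seed])
    while q:
        r, c = q.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < h and 0 <= nc < w and not mask[nr, nc]:
                if abs(depth[nr, nc] - depth[r, c]) < max_jump:
                    mask[nr, nc] = True
                    q.append((nr, nc))
    return mask
```

Because the criterion is purely geometric, a phone or prop held in the hand stays inside the segmented region, which matches the behaviour described above.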
I've run a few tests and was able (barely, but still) to read text messages from my smartphone in VR ;].
The keyboard should be visible as well if you just look down and touch it, but there are technical limitations.
The prototype depth sensor I'm using measures depth with IR. Those IR beams are reflected in all directions when they hit the glossy plastic surface of my black keyboard. This results in disappearing keys, as the IR beam does not return to the sensor and the samples are discarded as bad. With better depth-camera hardware this could look much better, without losing samples.
- Cybershaman (Honored Guest): Very nice, Mr. Kaktus. :) So what you're basically telling me is...that I need to get myself a dev kit asap, right? ;) Seriously, though, it would be fun to mess around with one of these. I've just been holding back because I get distracted easily and want to make sure that if I'm going to invest in something, I'll give it the time it deserves. :) I'm very curious whether I can get the clarity required to see and use a keyboard effectively in VR. That's why I'm wondering if, by specifying everything within the limits of four dots that make up a quadrangle, it would be easier to display it all without the, I'm not even sure what to call it...Reality Phase Distortion? Like when the lines on your palm or the edges of your hand, and especially the fingers, disappear. I really like what you're working on, though. I think it would be fascinating to be able to pick up any object and have it appear in, and possibly even affect/interact with, the VR environment. Wouldn't it be great to have a prop weapon to actually use in your Castle Wolfenstein game? ;) Very good stuff. I look forward to seeing more of this in the future! :)
- mptp (Explorer): I did this a while ago, but rather than using a camera and tracking markers on the keyboard, I used a LEAP and tracked my fingers, then used the known finger positions and known keyboard characteristics (which could feasibly be sourced from manufacturers) to solve for the keyboard position.
Once we have good finger tracking (LEAP isn't quite accurate enough for this, and it fails whenever your hands get close to things, which makes it of limited use for this application), this kind of technique will be really easy to do. You could also use it to track your mouse quite easily by getting the absolute mouse position at each click and moving the virtual mouse based on Input.mouseDelta.
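The keyboard-solving step described here can be posed as rigid registration: each keypress pairs a known key centre in the keyboard's local layout with the 3D fingertip position reported by the tracker at that moment, and a Kabsch/Procrustes fit recovers the keyboard's rotation and translation. A sketch under those assumptions — the four-key `KEY_LAYOUT` is invented for illustration, not a real manufacturer layout, and this is not mptp's actual code:

```python
import numpy as np

# Hypothetical key centres (metres, keyboard-local XY plane). Real layouts
# would come from the manufacturer, as the post suggests.
KEY_LAYOUT = {'q': (0.000, 0.000), 'p': (0.171, 0.000),
              'z': (0.029, 0.038), 'm': (0.143, 0.038)}

def solve_keyboard_pose(pressed, tips):
    """Kabsch fit: find the rotation R and translation t mapping
    keyboard-local key centres onto the 3D fingertip positions logged
    at each keypress, minimising squared error."""
    local = np.array([KEY_LAYOUT[k] + (0.0,) for k in pressed])  # lift to 3D
    world = np.asarray(tips, dtype=float)
    cl, cw = local.mean(axis=0), world.mean(axis=0)
    H = (local - cl).T @ (world - cw)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cw - R @ cl
    return R, t
```

Each additional keypress adds one correspondence, so the pose estimate can be refined continuously as the user types, which is what makes the approach attractive even with a noisy tracker.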
https://www.youtube.com/watch?v=ckAGpmf21a8
- MrKaktus (Explorer): That's a cool idea, tracking pressed keys relative to hand position :).
The drawback is that you need a model of the given keyboard, which means it's a cool demo, but there's no way it could be used in a production app because of the variety of keyboards :/. The thing I don't like about LEAP is that it loses tracking of your hands from time to time, which results in "jumping" of the mesh representing them (which is extra annoying, and can also be observed in your YT video). The other thing is that you need to use a hand mesh, which introduces a huge immersion breaker, as it doesn't feel like my own hands. Or, like in the latest LEAP demos, you see your own hands, but it's just a depth visualization without color.
I'm now working on improving my reconstruction method to remove aliasing on the edges of the hands and to integrate it with PBR lighting. I hope to be able to show a much better hands demo in a few months.