Forum Discussion
Porkhe_Pigatov
13 years ago · Honored Guest
Some observations
Hi there! Just passing on my observations after spending a few days integrating the Rift into our game.
Observation one: the current Unity plugin implementation briefly threw me for a loop. I'm of the opinion that any controller or peripheral should try and follow Unity's input manager model as closely as possible, since that would make a proper integration much easier. The groundwork I had laid in preparation for integrating the Rift assumed I would be treating the Rift sensor data as controller input axes, polling the HMD for the orientation every frame, and in general just treating whatever Rift object/prefab was supplied as just a straightforward camera replacement. It took me a little while to figure it out, but I got it working in the end.
(We're using Mecanim and IK with a full body avatar system, and our entire motion control scheme was based around polling new peripherals for positional and/or rotational input and either applying those directly to the affected bones or using them to drive IK effectors. In this case, the expected implementation assumed that the HMD would only be supplying its look rotation and the skeletal IK would have taken care of orienting and positioning the cameras.)
Observation two: Rotational drift isn't that noticeable without an avatar body or a pair of hands in the scene, but once you have it hooked up to a body, you can see quite a bit of drift in the Y axis that accumulates over time. I have to hit the B button to reset it about once every couple of minutes. Any advice on minimizing or mitigating that would be welcome.
Observation three: Some visual elements that our art director loves are broken, like realtime shadows and skyboxes. Light cookies are also semi-broken. This...is not good. I really don't want to have to fork things all over the place to ensure that things look good on both monitors and the Rift, 'cause things are complicated enough as it is. Advice and any news of pending fixes/reworks are most welcome.
Observation four: Unity is...a bit awkward when it comes to trying to launch games on the Rift. Sometimes the "-adapter N" command line trick works, sometimes it doesn't, and I end up having to clone the Rift with my primary display to make it work reliably. This brought on a sudden attack of curiosity about how to handle this sort of issue on the user side of things--does anyone here have any practical tips for making Unity standalone builds play nice without having to explain to them how to screw around with their display settings and whatnot?
Observation five: With a full body avatar system, the lack of position tracking for the HMD becomes a noticeable issue at certain points in the game. In our case, the avatar's wearing a headlamp that's turned on by reaching up to your head and pulling the trigger on a Hydra controller, and sometimes the head/hand relationship isn't 1:1 at all. It's sort of dealable with, but it would be way better if it wasn't an issue in the first place. This isn't so much a Unity integration thing as something general that came up while I was integrating it, and I wanted to mention it before I forgot. So if there's anyplace where I can cast my vote for an add-on position tracker or to have positional tracking added to the consumer version, point me there so I can toss my 2 cents in. :)
14 Replies
Replies have been turned off for this discussion
- MrGeddings (Explorer): Good insights. One thing to note is that the skyboxes that come with Unity won't work that well with the Rift; I think you have to supply third-party ones or your own.
And the drift hopefully will be fixed down the road, as the magnetometer is not yet implemented in the SDK.
Good thoughts though :-). Hopefully they'll add Unity and SDK updates over time to improve such things :-).
- Pingles (Explorer): Thanks for posting your concerns. While I am currently working on my project in Unity, I have not yet purchased the Pro. Seeing how fast both Unity and Oculus react to your concerns will help me make a final decision on engines.
- Korda (Honored Guest): But I love shadows :( I remember the Vireio driver having trouble with Skyrim's shadows; I wonder if the cause is similar.
I think you could get around the skybox problem by:
1) Apply the skybox textures to a box with the skybox material on it
2) Set a Rift camera inside it
3) Put it in a far away place
4) Change your main camera's clear flags to Don't Clear
5) Change your skybox camera's depth to -1
I'm not an expert and I don't have my Rift yet, so I can't test it, but maybe that will help?
edit: just read a post where cybereality mentions bugs with deferred rendering and shadows: "Anyway, there are still some major issues with deferred lighting (shadows, etc.) so it's recommended you use forward rendering until we can work out the bugs." So if you are using deferred rendering, maybe try switching to forward? link: https://developer.oculusvr.com/forums/viewtopic.php?f=37&t=51&p=1402&hilit=unity+shadows#p1402
- Porkhe_Pigatov (Honored Guest): Yeah, I was gonna work around the skybox thing by going towards a procedural or at least hemispherical sky solution. Never really liked static skyboxes much.
Shadows, though...I'm seriously torn. On one hand, I'm thinking "meh, we'll deal" because Unity's realtime shadows have a fairly severe impact on framerate, but on the other hand, they're a lot more practical than baked lightmaps for some of the gameplay challenges we've encountered so far. That, and the AD is pretty set on using realtime shadows where possible.
I'll have a play with both rendering modes in Unity, but forward rendering apparently only supports realtime shadows from one directional light, and has a few other potential issues that I need to double-check. It might turn out to be only a partial solution at best. We'll see. :)
- virror (Explorer): The skybox issue is a known problem and there is already a post in this section about it; have a look at how the Tuscany demo handled it.
Regarding the shadows, if it's possible, maybe you should hold off deciding on that for now? Hopefully Unity is working on a solution for it...
Drift compensation is already in the works, as I understand it.
- CaliberMengsk (Explorer): As far as the drift and having to reset it every couple of minutes, why not force it when the player isn't looking around much? I'm not sure how the reset works since I don't have mine--whether it resets the Oculus to the body, or the body to the Oculus--but you could still track that orientation, force the update, then apply the original offset. That way the player's vision doesn't change, but the body is again offset to the player.
You could also do this orientation reset as just a timer if it were programmed well enough that there was no visual flicker or anything.
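The offset idea above can be sketched engine-agnostically. Here's a minimal, hedged Python illustration (the names `YawRecenter`, `wrap_deg`, and so on are invented for this sketch; a real Unity version would work with quaternions rather than a single yaw angle):

```python
def wrap_deg(a):
    """Wrap an angle in degrees to the (-180, 180] range."""
    return (a + 180.0) % 360.0 - 180.0

class YawRecenter:
    """Camera yaw is modeled as sensor_yaw + offset. When the hardware
    reset zeroes the sensor, folding the last reading into the offset
    keeps the rendered view unchanged while the body/sensor relationship
    is re-aligned underneath the player."""
    def __init__(self):
        self.offset = 0.0

    def camera_yaw(self, sensor_yaw):
        return wrap_deg(sensor_yaw + self.offset)

    def recenter(self, sensor_yaw):
        # Capture the reading the hardware reset is about to zero out.
        self.offset = wrap_deg(self.offset + sensor_yaw)

rc = YawRecenter()
before = rc.camera_yaw(37.5)   # view with 37.5 deg of accumulated reading
rc.recenter(37.5)              # timer-triggered reset fires here
after = rc.camera_yaw(0.0)     # sensor now reads 0 after its reset
assert abs(before - after) < 1e-9   # no visible jump for the player
```

Whether the real reset zeroes the Oculus to the body or the other way around, the same bookkeeping applies: capture, reset, re-apply.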
As I said, though, I don't have my Rift yet, so I can't test anything or give a great suggestion.
That also said, you could write a script that reads the data and outputs it however you want, to simplify things.
- cybereality (Grand Champion): The default Unity skybox isn't working with the Rift. This is a known issue and noted in the documentation. The work-around is to create your own skybox; you can copy the one used in the Tuscany demo.
In terms of shadows, it sounds like you might be using deferred rendering. This is another known issue. At the moment deferred rendering is not supported. The work-around is to use forward rendering instead.
The Y drift sounds strange to me. There should not be any drift in the pitch. Currently there are issues with yaw drift (is that what you meant?). In that case it's something we are working on and should be fixed soon.
- Porkhe_Pigatov (Honored Guest): @virror: The game has to work on both a regular display and the Rift with the same art assets, so if the Tuscany skybox doesn't look good on a normal display, then it's not a practical solution. I'll look at it and find out. :)
Making hardware work with software is usually the hardware developer's problem, so I'm not certain that Unity's going to come up with a solution. We'll just have to deal with it the best we can until it all gets sorted out somehow.
@CaliberMengsk: The reset apparently just zeroes out the Rift's orientation relative to its parent. If you're looking at the screen when the drift builds up, you can tell something's wrong because your body eventually ends up walking in a different direction than you expect, but a script would have a bit of difficulty reliably distinguishing drift from legitimate player input, I think. Granted, I'm on my first cup of coffee right now and thus not at my intellectual peak, but it's definitely something I'll look into--it's one less button for the player to push, and that's always a good thing. :)
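One rough way a script could try to tell drift from legitimate input is to auto-recenter only after the head has been nearly still for a few seconds. A hedged, engine-agnostic Python sketch (the class name and the thresholds are made up for illustration, not anything from the SDK):

```python
class StillnessGate:
    """Report True once the yaw reading has stayed within tolerance_deg
    of a reference value for hold_secs of accumulated time; any larger
    movement restarts the stillness window."""
    def __init__(self, tolerance_deg=2.0, hold_secs=3.0):
        self.tolerance = tolerance_deg
        self.hold = hold_secs
        self.ref_yaw = None
        self.still_time = 0.0

    def update(self, yaw_deg, dt):
        if self.ref_yaw is None or abs(yaw_deg - self.ref_yaw) > self.tolerance:
            self.ref_yaw = yaw_deg      # moved: restart the stillness window
            self.still_time = 0.0
            return False
        self.still_time += dt
        return self.still_time >= self.hold

gate = StillnessGate()
# 60 frames of the player actively turning (1 deg/frame): never fires.
fired = any(gate.update(float(yaw), 1/60) for yaw in range(0, 60))
assert not fired
# ~4 seconds of near-stillness: fires, so a silent recenter would be safe.
fired = any(gate.update(10.0, 1/60) for _ in range(240))
assert fired
```

Slow deliberate head turns near the drift rate would still fool this, so it's a heuristic to reduce button presses, not eliminate them.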
I wrote an input wrapper last night to abstract out all the different peripherals, and that smoothed things out considerably from a game-logic standpoint.
- Porkhe_Pigatov (Honored Guest): @cybereality: Yes, the rotational drift is around the Y (up) axis--it manifests as your body appearing to gradually walk in the wrong direction relative to the direction your head is looking. X is pitch, Z is roll.
As for forward vs deferred, all I can say is that making it work with deferred rendering is important. Going from "looks bad on the Rift and great on everything else" to "looks merely OK on all displays" isn't a workaround, it's trading one set of problems for another. :lol:
We can work around this short-term by falling back to lightmaps and light probes and still maintain most of the visual quality we want, but there are some parts of the game where realtime shadows from multiple light sources are important.
- cybereality (Grand Champion): @ChristopherRoe: Yes, we are both talking about the same thing. It's just that when referring to tracking in VR, the standard terminology is "yaw", though you are also technically correct. This will be fixed with our magnetometer-based yaw correction, which will be available soon.
The issue with deferred rendering is not an issue with our integration; it's a problem with Unity itself. Basically, if you adjust the camera's projection matrix manually (which is needed to do the proper stereo render), it causes all sorts of issues, like with the shadows. There are lots of threads about this on Unity's forums, and I am not sure if or when there will be a solution.
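For context on why the projection matrix has to be adjusted at all: each eye needs an off-center (asymmetric) frustum rather than a plain symmetric field-of-view camera. A rough, engine-agnostic Python sketch of the matrix involved (function names and numbers are illustrative, not Oculus SDK or Unity API):

```python
import math

def frustum(left, right, bottom, top, near, far):
    """OpenGL-style perspective projection matrix (row-major); supports an
    asymmetric (off-center) frustum, unlike a plain FOV camera."""
    return [
        [2*near/(right-left), 0.0, (right+left)/(right-left), 0.0],
        [0.0, 2*near/(top-bottom), (top+bottom)/(top-bottom), 0.0],
        [0.0, 0.0, -(far+near)/(far-near), -2*far*near/(far-near)],
        [0.0, 0.0, -1.0, 0.0],
    ]

def eye_projection(fov_y_deg, aspect, near, far, eye_shift):
    """Shift the frustum horizontally by eye_shift (near-plane units) to
    get a per-eye off-axis projection, as used for HMD stereo rendering."""
    top = near * math.tan(math.radians(fov_y_deg) / 2)
    half_w = top * aspect
    return frustum(-half_w + eye_shift, half_w + eye_shift, -top, top, near, far)

left_eye  = eye_projection(90.0, 0.8, 0.1, 1000.0, eye_shift=+0.01)
right_eye = eye_projection(90.0, 0.8, 0.1, 1000.0, eye_shift=-0.01)
# The off-center term m[0][2] is 0 for a normal camera but nonzero and
# mirrored between the eyes; per this thread, manually setting a matrix
# like this is what trips up Unity's deferred shadow pipeline.
assert abs(left_eye[0][2] + right_eye[0][2]) < 1e-12
assert left_eye[0][2] != 0.0
```

In Unity terms this corresponds to overriding the camera's projection matrix per eye, which is exactly the manual adjustment described above.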