
OVRHands Prefab doesn't show when testing in a Unity Project.

SriAmin
Explorer

Hey there,

 

So I need to implement hand tracking in a Unity project. I first tried it in a base project and everything worked: I imported the Oculus Integration package, applied the OVRCameraRig, placed the OVRHandPrefab under the camera rig's hand tracking anchors, and everything showed up perfectly.

 

But when I try the same process in the actual Unity project I need to work on, everything builds and runs, except that when I put the headset on and move my hands, nothing shows up: no mesh, no texture. I'm not sure why it doesn't work in this project when it worked before.

 

I was able to log the position of the hands, and it doesn't move; it's just stuck at one coordinate, and I'm not sure why. If anyone can help me figure out why the hands don't show up, that would be wonderful, and if there's anything else I can log to get more information, I'll do that.
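For reference, here is a minimal sketch of the kind of diagnostic logging I mean, assuming the standard `OVRHand` component from the Oculus Integration package (the `HandDebugLogger` class name is mine, just for illustration):

```csharp
using UnityEngine;

// Hypothetical helper: attach to the same GameObject as OVRHand
// (e.g. the OVRHandPrefab under LeftHandAnchor / RightHandAnchor).
public class HandDebugLogger : MonoBehaviour
{
    private OVRHand hand;

    void Start()
    {
        hand = GetComponent<OVRHand>();
    }

    void Update()
    {
        if (hand == null) return;

        // IsTracked is false when the runtime isn't delivering hand data,
        // which would also explain a position frozen at one coordinate.
        Debug.Log($"{name}: tracked={hand.IsTracked} " +
                  $"confidence={hand.HandConfidence} " +
                  $"pos={transform.position}");
    }
}
```

If `IsTracked` stays false the whole time, the problem is upstream of the prefab (runtime or permissions) rather than in the scene setup.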

 

Edit: I just found out that a friend who tested the same code on his own computer can see the hands, so now I'm just confused as to why it doesn't work on my machine.

1 REPLY

Respawner69
Explorer

The default OpenXR runtime needs to be set to Oculus, not SteamVR. The Oculus app has a bug where it's sometimes unable to switch the runtime itself. Try this tool: https://github.com/WaGi-Coding/OpenXR-Runtime-Switcher
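If you want to verify which runtime is currently active before (or instead of) using the tool, the OpenXR loader on Windows reads the active runtime's manifest path from the registry, so a read-only check from a command prompt looks like this (standard key from the OpenXR loader design, nothing here modifies anything):

```
reg query "HKLM\SOFTWARE\Khronos\OpenXR\1" /v ActiveRuntime
```

If the `ActiveRuntime` value points at a SteamVR .json manifest rather than an Oculus one, that matches the symptom described above: the app builds and runs, but hand tracking data never arrives.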