Meta XR Plugin - Hand Tracking Testing in Editor
Hi, I have downloaded the Meta XR plugin from this link https://developer.oculus.com/downloads/package/unreal-engine-5-integration/46.0/ and added it to a fresh Unreal Engine 5.0.3 project. I then created a very basic pawn containing a camera, two motion controllers, and two OculusXR hand components. I set everything up exactly the same way I would in 4.26/4.27 with the OculusVR plugin and the Oculus hand component. When I test in the editor with VR Preview, I cannot see my hands.

The OculusVR plugin and the Oculus hand component made it easy to test in the editor without having to launch on the device. Is this no longer possible with the Meta XR plugin? Am I missing something obvious, or are there required extra steps? Thanks in advance, Simone
Controller snaps to Hand Wrist (Simultaneous tracking of Controllers and Hands)

Hi everyone, I have an issue: I enabled simultaneous tracking for both controllers and hands, and everything mostly works, except the controller is stuck to the hand all the time. I'm using Unity 2022.3.52f1 and the Meta XR All-In-One SDK version 71.0.0.

About the setup: both the controllers and hand tracking have the Show State "Always", since I want them tracked and in the scene constantly. Under the Camera Rig, "Simultaneous Hands and Controllers" is ticked, as is "Launch simultaneous Hands and Controllers mode" further below. Body Tracking is, by the way, NOT enabled.

Now, about the problem: when I hold the controller in my hand, it shows both hand and controller, which is fine of course. But when I let go of the controller, the virtual controller snaps to the wrist of my tracked hand and no longer tracks my real controller. When I pick it up again, the controller tracks like nothing happened. This happens with both hands. I couldn't achieve the effect shown in this video, where both hands and controllers are constantly tracked and visible, with no snapping to the wrist, and the controller stays where I put it down: https://youtu.be/ShIeLRZbUEA?si=UWLEchRaYYu9JS7M&t=348

I really hope you know what to do here, thanks in advance for the answers! Perhaps someone from Meta? GoodGameGomez
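One workaround idea, sketched below under loose assumptions: stop driving the controller visual from the tracked controller pose once the hand is being tracked on its own, so the model stays parked at its last held pose instead of snapping to the wrist. This is NOT an SDK setting; `controllerVisual` and `trackingSpace` are assumed scene references, and using hand-tracking confidence as a proxy for "controller was put down" is a crude heuristic that would need tuning.

```csharp
using UnityEngine;

// Hypothetical workaround sketch, not an official Meta XR feature:
// when the hand is tracked with high confidence (rough proxy for the
// controller having been put down), stop applying the tracked controller
// pose and leave the controller model parked where it was.
public class ParkControllerWhenReleased : MonoBehaviour
{
    public OVRHand hand;                 // OVRHand for the same side
    public Transform controllerVisual;   // assumed: the controller model
    public Transform trackingSpace;      // assumed: OVRCameraRig TrackingSpace
    public OVRInput.Controller controller = OVRInput.Controller.LTouch;

    void Update()
    {
        bool handFree = hand != null && hand.IsTracked && hand.IsDataHighConfidence;
        if (!handFree)
        {
            // Controller presumably in hand: follow its tracked pose.
            Vector3 localPos = OVRInput.GetLocalControllerPosition(controller);
            Quaternion localRot = OVRInput.GetLocalControllerRotation(controller);
            controllerVisual.position = trackingSpace.TransformPoint(localPos);
            controllerVisual.rotation = trackingSpace.rotation * localRot;
        }
        // else: leave controllerVisual where it is (parked).
    }
}
```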
More than one OVRCameraRig

Hello all, I'm trying to develop a VR-based multiplayer app using the Oculus Quest and PUN2. My question is: is it possible to implement more than one OVRCameraRig on the Oculus Quest? The app works perfectly in single player, but the hands are not getting synced in the multiplayer scene.
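The usual pattern is a single local OVRCameraRig, with remote players represented by plain networked avatars rather than additional rigs. A minimal sketch of syncing hand poses over PUN2, assuming one PhotonView per avatar and placeholder `leftHand`/`rightHand` transforms (on the owner these would mirror the rig's hand anchors; on remote copies they are just avatar bones):

```csharp
using Photon.Pun;
using UnityEngine;

// Minimal PUN2 sketch: the owning client streams its hand anchor poses;
// remote clients apply them to their avatar's hand transforms. Avoids
// needing a second OVRCameraRig for remote players.
public class HandPoseSync : MonoBehaviourPun, IPunObservable
{
    public Transform leftHand;   // owner: mirrors LeftHandAnchor; remote: avatar hand
    public Transform rightHand;  // owner: mirrors RightHandAnchor; remote: avatar hand

    public void OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info)
    {
        if (stream.IsWriting)
        {
            // Owner writes its current hand poses.
            stream.SendNext(leftHand.position);
            stream.SendNext(leftHand.rotation);
            stream.SendNext(rightHand.position);
            stream.SendNext(rightHand.rotation);
        }
        else
        {
            // Remote copies apply the received poses.
            leftHand.position  = (Vector3)stream.ReceiveNext();
            leftHand.rotation  = (Quaternion)stream.ReceiveNext();
            rightHand.position = (Vector3)stream.ReceiveNext();
            rightHand.rotation = (Quaternion)stream.ReceiveNext();
        }
    }
}
```

In practice you would also interpolate received poses rather than snapping, but the serialization shape above is the core of it.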
Cannot open HandsTrainSample because Engine modules are out of date

I wanted to try out hand tracking with UE4, so I cloned the engine source from the Oculus GitHub and built the engine following the instructions on the Oculus website. I cloned 1e09d2f (the latest commit) from branch 4.25, which is release "oculus-4.25.3-release-1.51.0-v19.0", and built the engine using Visual Studio 2019. It worked fine: I could create a new project, and I could open the HandSample project, which works great with the Oculus Quest and Link.

But I couldn't open HandsTrainSample. I built the engine and clicked Debug / Start a new instance in Visual Studio, which opened the editor. Then I chose the sample, but it wouldn't open. I also tried opening the .uproject directly, but it just says "The following modules are missing or built with a different engine version: HandsTrainSample. Would you like to rebuild them now?" If I click Yes, I get a Missing Modules error with the message "Engine modules are out of date, and cannot be compiled while the engine is running. Please build through your IDE.", and the project won't open.

I guess I have to find and fix the errors in VS, but I don't even know where to get the .sln file for this project, which I need in order to see the errors in VS. I tried Generate Visual Studio project files from HandsTrainSample.uproject, but no .sln file was generated in the same directory, even though I got no error. This is my first time building the engine from source, so I might be misunderstanding something. I hope someone can help me with it.
OpenXR hand tracking + Depth API - is it possible?

Is it possible to have Unity's OpenXR plugin, the Oculus OpenXR plugin, and the Oculus XR plugin in the same project somehow? I would like to use default OpenXR for hand tracking (so it's easier to port to other XR-compatible headsets), and then add Oculus XR plugin features as a progressive enhancement. For example, using Unity XR Hands: https://docs.unity3d.com/Packages/com.unity.xr.hands@1.1/manual/index.html and the Meta Depth API as an enhancement for Quest headsets: https://github.com/oculus-samples/Unity-DepthAPI Currently they seem not to be compatible, or am I missing something?
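For the portable side of that plan, here is a sketch of reading a joint pose through the Unity XR Hands package (`com.unity.xr.hands`), which sits on top of whatever OpenXR runtime provides the hand-tracking extension and so stays plugin-agnostic:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Sketch: locate the running XRHandSubsystem and read the right index
// fingertip pose each frame via Unity XR Hands (plugin-agnostic OpenXR).
public class IndexTipReader : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        if (m_Subsystem == null)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count > 0) m_Subsystem = subsystems[0];
            else return; // no hand subsystem running yet
        }

        XRHand right = m_Subsystem.rightHand;
        if (!right.isTracked) return;

        XRHandJoint tip = right.GetJoint(XRHandJointID.IndexTip);
        if (tip.TryGetPose(out Pose pose))
            Debug.Log($"Right index tip at {pose.position}");
    }
}
```

The Quest-specific Depth API enhancement would then live in separate, conditionally enabled components rather than in this code path.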
Issue with VRC.Quest.Input.8

I am currently attempting to submit my app to the Meta Store App Lab and need some advice on the following issue. VRC.Quest.Input.1 suggests that in-app menus should be activated with the menu button. I implemented my menu so that the in-app menu appears when you do the pinch gesture with your left palm (hand tracking). However, the app was rejected because VRC.Quest.Input.8 says that for apps supporting hand tracking, the system gestures are reserved and should not be used. I may have misunderstood something. Could you please advise me on how to open my hand menu when I am not supposed to use the system gesture?
Hand skeleton data startBoneId issue

Hi, I'm trying to get hand skeleton position data from the Quest 3. I'm following https://developer.oculus.com/documentation/unity/unity-handtracking/ and this video: https://www.youtube.com/watch?v=DR60_TCkmAY&t=305s But when I call handSkeleton.GetCurrentStartBoneId(), even though I assigned the hand skeleton (OVRHandPrefab) to the script explicitly, the start bone id it returns is FullBody_Start. The weird thing is that the end bone id is Hand_End, as expected. Has anyone else struggled with this problem? Any suggestion or solution?
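One way to diagnose this, sketched below: log the skeleton type alongside the start/end ids, and walk the `Bones` list directly instead of assuming the id range begins at a particular enum value. In recent SDK versions the `OVRSkeleton.BoneId` enum was reorganized to cover hand and body skeletons together, which is a plausible (unconfirmed) reason the start id can surface as `FullBody_Start`; iterating `Bones` sidesteps that dependency.

```csharp
using UnityEngine;

// Diagnostic sketch: dump the skeleton type, reported bone-id range,
// and each bone's id and position, without assuming where the enum
// range starts.
public class SkeletonBoneDump : MonoBehaviour
{
    public OVRSkeleton handSkeleton;   // assigned from OVRHandPrefab

    void Update()
    {
        if (handSkeleton == null || !handSkeleton.IsInitialized) return;

        Debug.Log($"type={handSkeleton.GetSkeletonType()} " +
                  $"start={handSkeleton.GetCurrentStartBoneId()} " +
                  $"end={handSkeleton.GetCurrentEndBoneId()}");

        // Iterating Bones avoids depending on the start id at all.
        foreach (var bone in handSkeleton.Bones)
            Debug.Log($"{bone.Id}: {bone.Transform.position}");
    }
}
```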
Skeleton physical colliders pose not updated during hand tracking

I ran the HandsInteractionTrainScene on the Quest with Unity and found that the skeleton physics colliders are not updated with the hand tracking pose. I only turned on Render Physics Capsules and enabled Hand Tracking Support in the OVRCameraRig. Does anyone know what's going on? You can see the footage in the attached zip file.
Hand anchors stuck

Hi. I've noticed since updating to the recent Oculus Integration that LeftHandAnchor and RightHandAnchor are stuck at 0,0,0, even when I can see the synthetic hands moving around in the editor. They move and snap to the correct position once I either grab an object or pinch my fingers together. It's like they are stuck in an idle mode until some form of interaction is triggered. Is this a known bug? And is there any way to sort of "force update" the hand anchor transforms?
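A workaround sketch (not a fix for the anchor behaviour itself): read the pose from the hand components instead of the anchors, since the skeleton and pointer pose keep updating while the anchors appear idle. `followTarget` here is a placeholder object you want to drive, and treating `Bones[0]` as the wrist root is an assumption about the bone ordering.

```csharp
using UnityEngine;

// Workaround sketch: drive a target transform from the hand's skeleton
// root (or pointer pose as a fallback) rather than from the hand anchors.
public class FollowHandRoot : MonoBehaviour
{
    public OVRHand hand;          // e.g. the left OVRHand component
    public OVRSkeleton skeleton;  // the same object's OVRSkeleton
    public Transform followTarget;

    void LateUpdate()
    {
        if (hand == null || !hand.IsTracked) return;

        if (skeleton != null && skeleton.IsInitialized && skeleton.Bones.Count > 0)
        {
            // Assumption: the first bone is the root (wrist) of the hand.
            var root = skeleton.Bones[0].Transform;
            followTarget.SetPositionAndRotation(root.position, root.rotation);
        }
        else if (hand.PointerPose != null)
        {
            // Fallback: the pointer pose also updates during tracking.
            followTarget.SetPositionAndRotation(
                hand.PointerPose.position, hand.PointerPose.rotation);
        }
    }
}
```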
Oculus Link + Windows Build + Hand Tracking + Unity

Hello. I know that you can have hand tracking inside the Unity editor, but is it possible to: 1) build the app (just for the Windows platform) with hand tracking support in the project, 2) connect the Quest via Oculus Link, and 3) run the app and have hand tracking? I have been testing and experimenting, and it doesn't seem to work. Interestingly, if I move my hand outside the guardian area, it detects it. I guess it is not supported yet, or never will be. Does anyone know anything about this?
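When debugging this kind of editor-vs-build discrepancy, it helps to separate "hands not rendered" from "no tracking data arriving at all". A small diagnostic sketch that logs the hand-tracking state once per second in the running build (the `OVRHand` references are assumed to be wired to the rig's hand components):

```csharp
using UnityEngine;

// Diagnostic sketch: periodically log whether hand-tracking data is
// coming through over Link in a standalone Windows build.
public class HandTrackingProbe : MonoBehaviour
{
    public OVRHand leftHand;
    public OVRHand rightHand;
    float m_NextLog;

    void Update()
    {
        if (Time.time < m_NextLog) return;
        m_NextLog = Time.time + 1f;

        Log("L", leftHand);
        Log("R", rightHand);
    }

    void Log(string label, OVRHand hand)
    {
        if (hand == null) return;
        Debug.Log($"{label}: tracked={hand.IsTracked} " +
                  $"valid={hand.IsDataValid} " +
                  $"indexPinch={hand.GetFingerIsPinching(OVRHand.HandFinger.Index)}");
    }
}
```

If `IsTracked` stays false in the build but true in the editor, the limitation is in the runtime/Link path rather than in your scene setup.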