Kiosk mode or "at least" a way to disable system menu gestures?
We are developing applications that will not be released on the Meta store. These include utilities and tools for other teams to use, as well as games intended for dedicated contests where users are required to remain within the app (and again... I'm not talking about apps available on the store, but apps that our final clients will use in their HQ or physical stores to show off their products). From my understanding, many other developers are making similar requests. Is there a way to hide the system menus in hand-tracking mode? Additionally, are there plans to introduce a 'kiosk' mode?

Quest 3 lost all tracking
Brand new Quest 3 just arrived; I configured it and everything was going pretty well. I was just logging into a game when all tracking stopped working. A window saying "Tracking lost" showed up, but I couldn't click anything since the controllers had stopped working too; hand tracking wasn't working either. I tried cleaning the cameras with a very soft towel, rebooting, resetting to factory settings, everything I could. The screen is either completely black, black with a dot, or on the starting screen asking me to turn on the controllers, but the view just moves randomly and isn't following the headset movements at all. There is a white dot blinking on the left controller and the controllers keep vibrating every few seconds.
How to enable handtracking 2.0 on Unity?
I've updated to the latest OVR Plugin but can't see anything new. Some of my apps rely heavily on quick hand tracking and could really benefit from it. Also, is there a link to a direct contact form for development support? I can't seem to find it either. So, how do I enable hand tracking 2.0 in Unity? Many thanks!
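For context on what "enabling" looked like when Hand Tracking 2.0 first shipped: it was opted into per-app through the Android manifest rather than a toggle in the OVR Plugin inspector. A hedged sketch of the manifest fragment from Meta's guidance at the time (verify the exact names against the current docs, since newer SDK versions may handle this automatically):

```xml
<!-- Sketch of the AndroidManifest.xml entries Meta documented for
     hand tracking; confirm against current Meta Quest docs. -->
<uses-permission android:name="com.oculus.permission.HAND_TRACKING" />
<uses-feature android:name="oculus.software.handtracking" android:required="false" />
<application>
    <!-- Opt this app in to Hand Tracking 2.0 -->
    <meta-data android:name="com.oculus.handtracking.version" android:value="V2.0" />
</application>
```

In Unity this would go into a custom `AndroidManifest.xml` (or a manifest patch), since the generated manifest is rebuilt on export.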
The Complete List of Sample Unity VR Projects

Hey guys, I wanted to put together a list of my favorite sample projects that you can grab and learn from. In my opinion, these projects are pure goldmines: they don't just showcase design principles around specific features but also provide direct examples of how to use them, which is especially important right now for something like a hackathon. For an even larger collection of Meta samples, see the GitHub list of all Meta sample repos here: https://github.com/orgs/oculus-samples/repositories?type=all

Let's start with our first category, the interaction samples.

Interaction Samples

Meta XR All-In-One (Interaction SDK) Sample
Links:
https://github.com/oculus-samples/Unity-InteractionSDK-Samples
https://assetstore.unity.com/packages/tools/integration/meta-xr-all-in-one-sdk-269657
Description: A comprehensive demo from Meta's XR Interaction SDK featuring core VR interactions like poking, grabbing, raycasting, UI, and locomotion, all working together. Perfect for understanding how to integrate both hands and controllers in one system.

First Hand
Link: https://github.com/oculus-samples/Unity-FirstHand
Description: A full VR game demo focused on hand-tracked interactions. It showcases a complete Unity experience using the Interaction SDK with hand tracking as the main input and controller fallback.

XR Interaction Toolkit Examples
Link: https://github.com/Unity-Technologies/XR-Interaction-Toolkit-Examples
Description: Unity's official XR Interaction Toolkit samples showing how to implement locomotion, selection, grabbing, and UI interactions. A solid starting point for setting up an XR Origin and interactor/interactable components.

Move Fast
Link: https://github.com/oculus-samples/Unity-MoveFast
Description: A fast-paced VR fitness demo using hand tracking and the Interaction SDK. The sample shows how to build an energetic workout game with responsive, punch-based interactions.

Whisperer
Link: https://github.com/oculus-samples/voicesdk-samples-whisperer
Description: A voice-controlled VR experience demonstrating the Meta Voice SDK. Use voice commands as part of gameplay to learn how to integrate real-time voice recognition into your own projects.

Tilt Brush (Open Brush)
Link: https://github.com/icosa-foundation/open-brush
Description: An open-source continuation of Google's Tilt Brush. Lets users paint and sculpt in 3D space; an excellent reference for creative VR tools and spatial drawing.

Multiplayer & Social Samples

VR Multiplayer Sample (Unity XRI)
Link: https://docs.unity3d.com/Packages/com.unity.template.vr-multiplayer@2.0/manual/index.html
Description: Unity's official multiplayer VR template featuring a prebuilt scene, avatars, and networking setup using Netcode for GameObjects. Great for learning multi-user interactions in VR.

Mixed Reality Multiplayer (XR Multiplayer) Sample
Link: https://docs.unity3d.com/Packages/com.unity.template.mr-multiplayer@1.0/manual/index.html
Description: A tabletop MR multiplayer demo that includes avatars, voice chat, and shared AR/VR spaces. Features games like balloon slingshot and chess while teaching MR networking and colocation concepts.

Tiny Golf
Link: https://github.com/Meta-Horizon-Start-Program/Tiny-Golf
Description: A free-to-play multiplayer mini-golf VR game created for the Meta Start program. Demonstrates basic physics, scoring, and networked multiplayer.

Ultimate Glove Ball
Link: https://github.com/oculus-samples/Unity-UltimateGloveBall
Description: A VR e-sport showcase demonstrating multiplayer, avatars, voice, and in-app purchases. Integrates Photon networking and Oculus social APIs, making it a great reference for social competitive games.

Spirit Sling
Link: https://github.com/oculus-samples/Unity-SpiritSling
Description: A social MR tabletop game letting players place a shared game board in real space and invite friends to join. Highlights the Avatars SDK and MR colocated play.

Decommissioned
Link: https://github.com/oculus-samples/Unity-Decommissioned
Description: A social-deduction VR game inspired by titles like Among Us. Shows how to handle multiplayer lobbies, Oculus invites, and social APIs in a networked Unity project.

Mixed Reality (MR) Samples

A World Beyond (Presence Platform Demo)
Link: https://github.com/oculus-samples/Unity-TheWorldBeyond
Description: A full MR showcase combining Scene Understanding, Passthrough, hand tracking, voice input, and spatial audio. A must-see for developers building immersive MR scenes blending real and virtual spaces.

Phanto (MR Reference App)
Links:
https://github.com/oculus-samples/Unity-Phanto
https://developers.meta.com/horizon/blog/phanto-unreal-showcase/
Description: An MR reference app focused on environmental awareness. Uses the Scene Mesh and MR APIs to blend gameplay with real-world geometry.

Unity Discover (featuring Drone Rage and others)
Links:
https://www.meta.com/en-gb/experiences/discover/7041851792509764/
https://github.com/oculus-samples/Unity-Discover
Description: A collection of MR showcase mini-experiences like Drone Rage. Demonstrates MR features including Passthrough, Spatial Anchors, and Shared Anchors in various game prototypes.

MR Motifs
Link: https://github.com/oculus-samples/Unity-MRMotifs
Description: A library of MR "motifs": small, reusable templates showcasing mechanics such as passthrough transitions, colocated multiplayer, and instant content placement.

Cryptic Cabinet
Link: https://github.com/oculus-samples/Unity-CrypticCabinet
Description: A short MR escape-room experience that adapts to your room's layout. Demonstrates interactive storytelling in mixed reality using environmental awareness.

Passthrough Camera API Samples
Link: https://github.com/oculus-samples/Unity-PassthroughCameraApiSamples
Description: A sample project demonstrating how to access and process Quest's Passthrough camera feed for effects, object detection, and image manipulation.

Tools and Utilities

Asset Streaming
Link: https://github.com/oculus-samples/Unity-AssetStreaming
Description: An open-world streaming sample that shows how to dynamically load content using Addressables and LOD systems; ideal for maintaining performance in large VR environments.

Shader Prewarmer
Link: https://github.com/oculus-samples/Unity-ShaderPrewarmer
Description: A utility sample that preloads shader variants at startup to eliminate hitches or stutters when shaders first compile; an important optimization for smooth VR performance.

Complete Game Showcase

Northstar
Link: https://github.com/oculus-samples/Unity-NorthStar
Description: A complete VR game showcasing advanced interaction and visual techniques for VR, featuring rope physics, narrative storytelling, lip sync, and more.

Quest 3 seemingly "random" recentering with palm up gesture
Just doing a sanity check here to see if other developers are experiencing issues with the Quest doing weird things recently in regard to hand interactions and the Meta reorientation button. Since around July this year I've been experiencing an undesired automatic system recenter whenever the hands are palm up facing the camera. At first I thought it was a random thing, or due to testing the device with PCVR Link, because this "auto" recentering is not 100% consistent; it seemed random until recently. I ignored it up till now because I was too busy to address it. Recently, though, I've seen a few others posting about it, but not enough to confirm whether it's just our implementations or a Meta OS thing. The more I test, the more I realize it is some fundamental change in one of the many recent Meta OS updates. On top of all this, it doesn't recenter properly, at least not in my case. I am making a training app where I transition the user through lessons with some freedom to move if they're able, but it is designed for schools with potentially small, confined spaces. I also have my own reorient button and algorithm that users can use to reorient themselves if they need to face a specific direction to fit their space, so that the lesson items are always in front of them. It works perfectly. However, if they use the Meta button, it will not always reorient them correctly; it's slightly off. Additionally, there is a problem with seemingly random activation throughout my app whenever the user's hand happens to turn palm up. I just so happen to have a "palm" menu that activates by, you guessed it, a palm-up gesture, so this annoying random recentering is more noticeable than usual. I am using hand interactions exclusively with the floor tracking origin type: Unity 2022.3.19f, Quest 3, with Meta's All-In-One SDK v64. Just found this: Oculus hand tracking palm menu is automatically re... - Meta Community Forums (thread 1243651)

Hand tracking system gestures, possible solution?
Hi! I am currently developing for Unreal, and I find that hand tracking will be perfect for my use. The issue is that of all the gestures we can use, the thumb and index pinch is by far the most accessible gesture and the easiest for the headset to recognize accurately, and it is system reserved. Even if I close my fist and then pinch, it still sometimes recognizes the pinch as a system gesture. It would be unreasonable to request access to system gesture behavior, for obvious reasons; apps and programs shouldn't have that access to begin with. I'm not sure what the best solution would be here, as I'm probably not alone in feeling a bit hamstrung trying to use such an awesome feature as hand tracking while being constantly thrown to the menu.

Some proposed solutions:
- Make it possible to change the behavior in the headset menu so you would have to hold the palm-up pinch for X amount of time before triggering the system menu: 1, 2, 5, or 10 seconds, for instance, maybe with a ring closing around the Meta icon to indicate how far you are from opening the menu.
- Make it possible to change the gesture from a thumb and index pinch to something less used, like a thumb and pinky pinch, or require both hands to perform the system pinch.
- Personally, I would love to be able to relegate that system menu to the power button, so that within an application the power button would bring up the system menu (then long-press to switch off the headset).

Any feedback or possible workarounds would be much appreciated. Best regards
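The hold-to-confirm idea in the first suggestion is easy to prototype on the app side for your own in-app menus while waiting for a system-level option. A minimal, engine-agnostic C++ sketch (the `PinchHoldDetector` class, the threshold, and the progress callback are illustrative assumptions, not any Meta or Unreal SDK API):

```cpp
#include <functional>

// Tracks how long a pinch has been held and fires a callback only after a
// configurable hold time, so a brief accidental pinch does nothing.
class PinchHoldDetector {
public:
    PinchHoldDetector(float holdSeconds, std::function<void()> onTriggered)
        : holdSeconds_(holdSeconds), onTriggered_(std::move(onTriggered)) {}

    // Call once per frame with the current pinch state and frame delta time.
    void Update(bool isPinching, float deltaSeconds) {
        if (!isPinching) {          // pinch released: reset and re-arm
            heldSeconds_ = 0.0f;
            fired_ = false;
            return;
        }
        heldSeconds_ += deltaSeconds;
        if (!fired_ && heldSeconds_ >= holdSeconds_) {
            fired_ = true;          // fire exactly once per continuous hold
            onTriggered_();
        }
    }

    // 0..1 progress, e.g. to drive a ring closing around an icon.
    float Progress() const {
        float p = heldSeconds_ / holdSeconds_;
        return p > 1.0f ? 1.0f : p;
    }

private:
    float holdSeconds_;
    float heldSeconds_ = 0.0f;
    bool fired_ = false;
    std::function<void()> onTriggered_;
};
```

Feeding per-frame pinch state into `Update` means a quick accidental pinch never reaches the threshold, while `Progress` gives you the value for the closing-ring visual.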
Hand Tracking menu ruins game play

The first thing people do with hand tracking is look at their hands. The second thing they do is touch their fingers together. Then Quest shuts down the game, because that's the hand gesture Meta chose as an 'escape' key. I encourage players to see and feel their hands in the experience because it is so much more enjoyable and immersive; that's literally the entire point of mixed reality. This menu punishes all that fun with a distracting, overly sensitive button that apparently cannot be disabled. But can it be delayed? Ideally, the icon would not appear until after touching (and holding) thumb and finger together for 2 seconds, and only then become active (similar to holding the controller's menu button down to reset the view). I understand Quest "needs" an escape gesture, but not one that constantly interrupts everything. Is anyone else dealing with this? Has anyone found another solution or workaround?

Enable hand and controller tracking at the same time.
Hi, I have an Oculus Quest Pro and am working on a Unity project that needs hand tracking plus controller tracking for a physical object, but I can't enable hand and controller tracking at the same time. So I wonder: is this possible, or is there any other way to track a physical object using Oculus?

How to force Avatar Hand to blend as Interaction Rig's hand when grabbing object
I'm trying to get the Avatar hand (from the Avatar SDK) to take the shape of the interaction rig's hand when grabbing an object. For example, in the sample scenes, if I put the avatar in the TouchGrabExamples scene, when I grab an object the interaction hand stops at the collider position while the avatar's hand goes all the way through it. Does anyone have an idea how to solve this?

Handtracking in PC SDK
Hi. Is it possible to do hand tracking with the PC SDK? Reading through the documentation didn't help. We build simulators, and our product currently uses Leap Motion to do the tracking and interact with virtual buttons. From what I understood, this is only doable in Unity; am I wrong? Can someone confirm whether it is doable outside of Unity/Unreal Engine? Thanks.