Kiosk mode or "at least" a way to disable system menu gestures?
We are developing applications that will not be released on the Meta store. These include utilities and tools for other teams to use, as well as games intended for dedicated contests where users are required to remain within the app (and again... I'm not talking about apps available on the store, but apps that our final clients will use in their HQ or physical stores to show off their products). Based on my understanding, many other developers are making similar requests. Is there a possibility to hide the system menus for the 'hand tracking' mode? Additionally, are there plans to introduce a 'kiosk' mode?

FIX THE ISSUE THAT RESULTS IN CONTROLLERS FAILING ON A REGULAR BASIS
Why is Meta unable to address the serious problem with controllers failing, bricking, or showing the purple lights of death? I've replaced mine 3x in a month and keep getting refurbished junk. What is Meta doing about this? It's costing me time, money, and a ton of frustration, with terrible support and constant downtime. I paid $500 for this thing but have maybe 3 full months of play time over 10 months. How is this OK?

Unintended automatic Recenter when using hand tracking with Quest Link
Location of the issue: Quest Link home screen and applications using Quest Link

Symptoms:
- First, put on the Quest headset while it's in sleep mode.
- The proximity sensor then wakes the Quest from sleep mode and launches the app.
- At this point, if you use hand tracking to turn your palm towards yourself, it unintentionally triggers an automatic Recenter.

Additional information:
- Once this behavior occurs, it won't happen again until the Quest next enters sleep mode.
- This issue occurs not only in Quest Link apps but also on the Quest Link home screen.
- The Recenter seems to trigger the moment the Meta button appears on the wrist.
- We checked the log output using ADB, but couldn't find any relevant error information at the moment this issue occurred.

This Recenter significantly disrupts the user experience when demonstrating apps that rely solely on hand tracking. When users move their hands, it unintentionally triggers a Recenter, disorienting the user's perspective and severely compromising the experience. If anyone has encountered a similar issue and found a solution, we would greatly appreciate your input.

video: https://drive.google.com/file/d/1lxnDHhZV89W7NyO-GNWtaZdqp3jz4og0/view?usp=sharing

Hand Tracking menu ruins game play
The first thing people do with hand tracking is look at their hands. The second thing they do is touch their fingers. Then Quest shuts down the game, because that's the hand gesture Meta chose as an 'escape' key. I encourage players to see and feel their hands in the experience because it is so much more enjoyable and immersive; that is literally the entire point of mixed reality. This menu punishes all that fun with a distracting, overly sensitive button that apparently cannot be disabled. But can it be delayed? Ideally, the icon would not appear until after touching (and holding) thumb and finger together for 2 seconds, and only then become active (similar to holding the controller's menu button down to reset the view). I understand Quest "needs" an escape gesture, but not if it constantly interrupts everything. Is anyone else dealing with this? Have you found another solution or workaround?
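The delayed-activation idea from the post above can be sketched in plain, engine-agnostic C#. To be clear, this is a hypothetical illustration of the proposed behavior, not a real system setting: the OS menu gesture itself cannot currently be reconfigured by apps, and the 2-second threshold is the poster's suggestion.

```csharp
using System;

// Hypothetical debounce: the menu gesture only "fires" after the pinch
// has been held continuously for holdTime seconds.
double holdTime = 2.0;   // the 2-second hold the post proposes
double heldFor = 0.0;

bool PinchHeld(bool pinching, double dt)
{
    heldFor = pinching ? heldFor + dt : 0.0;  // releasing resets the timer
    return heldFor >= holdTime;
}

// A brief accidental pinch (0.5 s at 60 fps) never activates the menu.
bool fired = false;
for (int i = 0; i < 30; i++) fired |= PinchHeld(true, 1.0 / 60);
Console.WriteLine(fired ? "menu fired" : "menu suppressed");  // prints "menu suppressed"
```

The same hold-to-activate pattern is what the controller's menu-button-to-recenter already uses, which is why the post suggests it.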
Proper way to set up simultaneous hand & controller input for Quest 3?
Hi. This topic might be redundant with the topic of this thread, but with the SDK updated to v62, the useful official documentation provided in the previous thread was deleted, and I can't find an alternative document. Also, simultaneous hand & controller input has been released as an official feature rather than an experimental one, so I started a new thread. I was able to make simultaneous input work on Quest Pro with SDK v57, but I haven't been successful with v62 and Quest 3 yet. (I sold my Quest Pro!) I confirmed that the feature itself works properly on Quest 3 as well with the demo published on the App Store, Interaction SDK Samples, but it seems that this sample project is not included in the samples for the v62 SDK. Am I missing something? If anyone has any cautions, useful information, or links regarding this matter, please share your knowledge.

How to add constraints to a grabbable object so it moves slower or needs more force to move?
Hi, I'm making a lever/driving-handle interaction. I want to add constraints that reduce the motion of the grabbable object (so a larger force is needed to move the lever). Is there any built-in way I can achieve this? I went through "https://developer.oculus.com/documentation/unity/unity-isdk-grabbable/#one-grab-transformers", but it only adds specific angle/axis limits. Thanks in advance. Big_Flex
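On the lever question above: as far as I can tell, the built-in one-grab transformers only expose pose limits, not resistance, so a common workaround is a custom transformer that transfers only a fraction of the hand's motion to the lever each frame. Stripped of Unity/SDK types, the core math looks like this (the resistance constant and angle limits are made-up tuning values):

```csharp
using System;

// The hand moved `input` degrees this frame; the lever follows only a
// fraction of that, so a "stiff" lever needs more hand travel to move.
double MoveLever(double angle, double input, double resistance, double min, double max)
{
    double next = angle + input / resistance;   // higher resistance -> less motion
    return Math.Clamp(next, min, max);          // keep the existing angle limits
}

// A resistance of 4 means the hand must sweep 4x the lever's travel.
Console.WriteLine(MoveLever(0.0, 20.0, 4.0, -45.0, 45.0));  // prints 5
```

In the Interaction SDK this per-frame update would live in a custom transformer applied to the grabbable, in place of the stock one-grab rotate transformer.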
Grab Objects with weight Unity Meta Interaction
Hi, I am new to using the SDK, and I would like to know if there is a way to make picked-up objects move according to their weight. For example, if I pick up a heavy object, the tracked hand should take more time to reach the position of my real hand than if I were holding a lighter object. I have looked for multiple solutions but have not been able to achieve anything yet. I appreciate any help!
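For the weight question above, a usual trick is to have the held object chase the hand with a mass-dependent smoothing rate instead of snapping to it, so heavy objects visibly lag. Here is the core math without any Unity/SDK types; the rate constant `5.0` is an arbitrary tuning value, not an SDK parameter:

```csharp
using System;

// Frame-rate-independent exponential follow: heavier mass -> slower chase.
double Follow(double current, double target, double mass, double dt)
{
    double rate = 5.0 / mass;                  // responsiveness falls with mass
    double t = 1.0 - Math.Exp(-rate * dt);     // fraction of the gap closed this frame
    return current + (target - current) * t;
}

// Simulate half a second at 60 fps chasing a hand held at position 1.0.
double lightPos = 0.0, heavyPos = 0.0;
for (int i = 0; i < 30; i++)
{
    lightPos = Follow(lightPos, 1.0, 1.0, 1.0 / 60);  // 1 kg closes in quickly
    heavyPos = Follow(heavyPos, 1.0, 8.0, 1.0 / 60);  // 8 kg lags well behind
}
Console.WriteLine($"light {lightPos:F2}, heavy {heavyPos:F2}");
```

In Unity this update would run per frame on the grabbed object's position toward the hand anchor, for instance by moving a kinematic Rigidbody with `MovePosition` so physics stays consistent.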
Trigger room setup / space setup from app
Hello community, I'm currently working on a demo for mixed reality. For my demo I need space setup / room setup to be configured correctly for occlusion to work, but I couldn't find in the documentation how to force room setup to be done before launching the app, verify that space setup is present, or launch space setup from the app, like the Meta demo "Space Encounters" does. Is there any method or documentation for this? Thank you all for future answers.
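On the room-setup question above: in the Unity Meta XR SDK of that era, `OVRSceneManager` can both detect a missing scene model and ask the system to launch Space Setup. The following is only a minimal sketch, assuming the `NoSceneModelToLoad` event and `RequestSceneCapture()` API as shipped around v62; check your SDK version's docs, since this component was later superseded by MRUK.

```csharp
using UnityEngine;

// Attach next to an OVRSceneManager. If the device has no room/space setup
// yet, this asks the system to launch Space Setup before the app relies on it.
public class SpaceSetupGate : MonoBehaviour
{
    [SerializeField] private OVRSceneManager sceneManager;

    private void OnEnable()
    {
        sceneManager.NoSceneModelToLoad += OnNoSceneModel;
    }

    private void OnDisable()
    {
        sceneManager.NoSceneModelToLoad -= OnNoSceneModel;
    }

    private void OnNoSceneModel()
    {
        // Launches the system Space Setup flow; the app resumes when it returns.
        sceneManager.RequestSceneCapture();
    }
}
```

Subscribing to `SceneModelLoadedSuccessfully` on the same manager is the natural place to then enable occlusion-dependent content.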
Handtracking fails after switching OVRCameraRigs
Hand tracking works perfectly fine when open palms are facing the headset, but stops immediately when grabbing or rotating the hands. However, the Oculus button still shows up, and hand tracking works normally when pausing the app. I'm switching between OVRCameraRigs by instantiating two prefabs; both have hand tracking and controller tracking enabled. The first one I set up myself, and the second one is from the Vuplex webview example scene for the Quest. After switching back from the Vuplex OVRCameraRig to mine and unloading the scene with the Vuplex components, the tracking fails, and switching back to controllers isn't detected either. At this point I unload everything and make sure that all persistent objects are destroyed before loading the first scene non-asynchronously. But the issue still persists. I've also removed the OVRManager from all OVRCameraRigs and added an OVRManager to an empty GameObject that is instantiated in the first scene.