Technical Questions for an LBS Game: Disabling System Gestures, Spatial Mapping & Remote Control API
Hello, I'm looking to create a multi-user, large-scale, location-based (offline) game and have a few questions:

1. Is there a way to disable system-level gestures so that players can't accidentally exit the application and return to the home screen during gameplay?
2. Is there a method for scanning a large physical space (approximately 10x10 meters) to generate a persistent, shareable map file?
3. Is it possible to enable or provide some kind of control API? We need an interface that lets a central controller remotely start and stop the application on all devices, as well as manage the installation and updating of game content.
Hand tracking update root scale not working

I'm trying to use hand tracking in my app, and no matter what, the hand scale stays at one, even with a friend's hands that are much smaller. After some investigation and a lot of debugging, I found that the hand scale is calculated on the first frame of the application and sits at around 1.1 before it gets switched back to 1 forever. A "solution" I found is to switch off the Update Root Scale parameter on my hands and then scale them myself based on that initial value, but according to the documentation, the root scale is supposed to be updated during runtime. (The documentation is pretty sparse on everything, though, and it never details how the scale is supposed to be measured.) Has anyone managed to get the root scale to update for their hand tracking? If so, could you share some insight with me?
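The workaround described above (untick Update Root Scale on the skeleton and apply the reported scale manually) can be sketched roughly like this in Unity, assuming the Oculus Integration's OVRHand component; the component name ManualHandScale and the exact update policy are my own choices, not an official recipe:

```csharp
using UnityEngine;

// Hypothetical sketch: with "Update Root Scale" disabled on the OVRSkeleton,
// read the scale that OVRHand reports and apply it to the hand root ourselves.
// Attach to the hand root object and assign the OVRHand in the Inspector.
public class ManualHandScale : MonoBehaviour
{
    [SerializeField] private OVRHand hand; // assigned in the Inspector

    void Update()
    {
        // HandScale is the user's hand size relative to the reference hand model.
        // Apply it every frame while tracked, since the automatic root-scale
        // update appears stuck at 1 after the first frame.
        if (hand != null && hand.IsTracked)
        {
            transform.localScale = Vector3.one * hand.HandScale;
        }
    }
}
```

Note that this only papers over the reported behavior; if the underlying HandScale value itself is frozen after the first frame on your SDK version, this sketch can only reproduce that initial value.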
Hands Model controlled by controllers (no hand tracking)

Looking for instructions for the new Oculus Quest Meta XR SDK (All-in-One) on how to add a hand model instead of a controller model, for the classic game mode of controlling hands with a controller (press the grip and your hand clenches into a fist, etc.). YouTube is full of videos on how to do this using the XR Interaction Toolkit, but there is no similar instruction for the Meta XR SDK. Unity Engine
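The "grip closes the fist" behavior above is usually done by reading the controller's grip and trigger axes and feeding them into an Animator on the hand model. A minimal sketch using OVRInput (which the Meta XR All-in-One SDK ships): the Animator parameter names "Grip" and "Trigger" are assumptions here and must match whatever float parameters your hand Animator actually exposes:

```csharp
using UnityEngine;

// Sketch: drive a hand model's pose from controller input instead of
// hand tracking. Attach one instance to each hand model and pick the
// matching controller (LTouch / RTouch) in the Inspector.
public class ControllerHandPose : MonoBehaviour
{
    [SerializeField] private OVRInput.Controller controller = OVRInput.Controller.RTouch;
    [SerializeField] private Animator handAnimator; // Animator with "Grip"/"Trigger" float params (assumed names)

    void Update()
    {
        // Both axes return 0..1: squeezing the grip closes the fist,
        // the index trigger curls the index finger.
        float grip = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, controller);
        float trigger = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, controller);
        handAnimator.SetFloat("Grip", grip);
        handAnimator.SetFloat("Trigger", trigger);
    }
}
```

The Animator then blends between an open-hand pose and a fist (or pointing) pose based on those floats, which is how the classic Oculus sample hands work.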
Starting Oculus Link directly in PC-Desktop View

I want to use Oculus/Quest Link directly for development and keep my headset on for extended periods. Since it keeps disconnecting for various reasons, I have to restart Quest Link quite often. Is there any way to open my PC's Desktop View automatically when Quest Link starts? It's very annoying to have to activate it manually every time. Also, why is there no hand tracking in Quest Link? You changed the name, now activate the features!
Hand Tracking over Link still not working in Unity Editor

Hi, I have spent the last two years developing for the Quest 2 and recently got the Quest 3. It's a great device and I'm super happy with it. There is just one big problem standing between me and total developer bliss. How is it possible that we still don't have a stable, robust hand tracking implementation across the whole development pipeline in 2023? I'm confused as to how Meta envisions developers taking full advantage of their (really good) hand tracking tech when there are consistent inconsistencies: fumbling around, trying seven different versions of all of the little SDKs, components, etc. Can someone please advise me on how to set up a simple Unity scene using the standard Oculus Integration where I can just click "Play" in the editor and get hand tracking working over my Link cable? So far I have gone through five different Unity versions from 2021 to 2023, even more Oculus Integration SDK versions (v50-57), and three different headsets (Quest 1, 2 and 3). Nothing worked. The only way I have managed to get hand tracking working in the editor via Link is to use a long-deprecated version of the Oculus Integration, where I had to manually select "Oculus -> Tools -> OVR Utilities Plugin -> Set OVRPlugin to Legacy LibOVR+VRAPI" from within Unity. Three things. First, this option is no longer available in the later Oculus Integration versions. Second, selecting it explicitly disables building for the Quest 2 and 3, so you would have to switch back and forth between the old LibOVR+VRAPI and the newer OpenXR integration just to get hand tracking working in the editor. Really? Third, we as developers cannot reasonably be expected to stick to this legacy API, as none of the newer mixed reality features, like scene understanding and spatial anchors, are supported in the old version. Hence I want to ask my question one more time: how does Meta expect people to develop for their platform?
Please let me know if you have an answer to this dilemma, I am grateful for any pointers! Note: I am explicitly talking about hand tracking through the Unity Editor using Link; in standalone Android builds it works fine and it's amazing to use! Best, Max