Kiosk mode or "at least" a way to disable system menu gestures?
We are developing applications that will not be released on the Meta store. These include utilities and tools for other teams to use, as well as games intended for dedicated contests where users are required to remain within the app (and again... I'm not talking about apps available on the store, but apps that our final clients will use in their HQs or physical stores to show off their products). From what I understand, many other developers are making similar requests. Is there a way to hide the system menus in hand-tracking mode? And are there plans to introduce a 'kiosk' mode?

v77 Grab Hand Pose Translation/Rotation
Hi, I'm using v77 of the Interaction SDK in Unreal 5.5.4 for hand tracking, and I'm having issues with a HandGrabPose not following the grab location/rotation when using rotation and location transformers. I am using an object set up with multiple grabbable pieces. My blueprint hierarchy is essentially:

Mesh
-- GrabbableComponent1
-- SubMesh
---- GrabbableComponent2
------ HandGrabPose
GrabTransformer1 (for the main mesh, with a free transformer)
GrabTransformer2 (for the sub-mesh, with a translate constraint applied)

The actual interactions of the meshes work without issue: I can grab the main mesh, and grab the sub-mesh to move it with its translation constraint applied. My HandGrabPose component has the following properties set:

The initial posing is fine and the hand moves to the intended pose position, but as the sub-mesh is moved, the hand pose stays in its original location rather than following the sub-mesh as I would like.

Looking at the source for hand poses, I can see that the hand pose override is set via a call in OculusInteractionPrebuilts/Rig/IsdkHandPoseRigModifier to HandMeshComponent->SetHandPoseOverride once the grab state changes to Selected. But since this is only called on the initial grab, the required RootPoseOffset is never updated when the grabbed component is translated or rotated. This is not an issue for components with a single grabbable, where the entire object moves and the offset remains constant; in my case with sub-objects, though, the pose offset does change, yet the hand remains in its initial location.

Has anyone run into a similar issue and could recommend a solution or workaround? I'd be happy to provide more information on my setup if needed. Thanks!

FIX THE ISSUE THAT RESULTS IN CONTROLLERS FAILING ON A REGULAR BASIS
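Regarding the 'v77 Grab Hand Pose Translation/Rotation' question above: one workaround is to cache the hand pose relative to the grabbed sub-mesh at grab time and recompose the world pose every frame while the state is Selected, rather than relying on the one-shot SetHandPoseOverride call. The Python below is only a translation-only sketch of that math with made-up names; the real fix would live in Unreal C++/Blueprint and compose full FTransforms, including rotation.

```python
# Sketch: cache the hand pose relative to the sub-mesh, recompose per frame.
# `Transform`, `on_grab`, and `update_hand_pose` are illustrative names,
# not Interaction SDK API. Translation only, for clarity.

from dataclasses import dataclass

@dataclass
class Transform:
    x: float
    y: float
    z: float

    def __add__(self, other):
        return Transform(self.x + other.x, self.y + other.y, self.z + other.z)

    def __sub__(self, other):
        return Transform(self.x - other.x, self.y - other.y, self.z - other.z)

def on_grab(hand_pose_world, submesh_world):
    # At grab time: store the hand pose relative to the sub-mesh.
    return hand_pose_world - submesh_world

def update_hand_pose(submesh_world, local_offset):
    # Every frame while Selected: recompose the world pose from the
    # sub-mesh's *current* transform, so the hand follows it.
    return submesh_world + local_offset

offset = on_grab(Transform(1.0, 2.0, 0.5), Transform(1.0, 1.0, 0.0))
moved = update_hand_pose(Transform(3.0, 1.0, 0.0), offset)  # sub-mesh translated
```

The same recomposition would need to run on tick for as long as the grab state stays Selected, which is exactly the step the one-shot override skips.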
Why is Meta unable to address the serious problem of controllers failing, bricking, or showing the purple light of death? I've replaced mine three times in a month and keep getting refurbished junk. What is Meta doing about this? It's costing me time, money, and a ton of frustration, with terrible support and constant downtime. I paid $500 for this thing but have had maybe three full months of play time over ten months. How is this OK?

Quest 3 seemingly "random" recentering with palm up gesture
Just doing a sanity check here to see if other developers are experiencing any issues with the headset doing weird things recently with regard to hand interactions and the Meta reorientation button.

Since around July this year I've been experiencing an undesired automatic system recenter whenever the hands are palm up, facing the camera. At first I thought it was a random thing, or due to testing the device over PCVR Link, because this "auto" recentering is not 100% consistent - it seemed random until recently. I ignored it up till now because I was too busy to address it.

Recently, though, I've seen a few others posting about it, but not enough to confirm whether it's just our implementations or a Meta OS thing. The more I test, the more I'm convinced it is some fundamental change in one of the many recent Meta OS updates.

On top of all this, it doesn't recenter properly - at least not in my case. I am making a training app that transitions the user through lessons with some freedom to move if they're able, but it is designed for schools with potentially small, confined spaces. I also have my own reorient button and algorithm that users can trigger if they need to face a specific direction to fit their space, so that the lesson items are always in front of them. It works perfectly. However, if they use the Meta button it will not always reorient them correctly - it's slightly off.

Additionally, there is the seemingly random activation throughout my app whenever the user's hand happens to turn palm up. I happen to have a "palm" menu that activates via - you guessed it - a palm-up gesture, so this random recentering is more noticeable than usual.

I am using hand interactions exclusively, with the Floor tracking origin type - Unity 2022.3.19f, Quest 3, with Meta's All-In-One SDK v64.

Just found this... Oculus hand tracking palm menu is automatically re... - Meta Community Forums

Technical Questions for an LBS Game: Disabling System Gestures, Spatial Mapping & Remote Control API
Hello, I'm looking to create a multi-user, large-scale, location-based (offline) game and have a few questions:

1. Is there a way to disable system-level gestures, to prevent players from accidentally exiting the application and returning to the home screen during gameplay?
2. Is there a method for scanning a large physical space (approximately 10x10 meters) to generate a persistent and shareable map file?
3. Is it possible to enable or provide some control API? We need an interface that allows a central controller to remotely start and stop the application on all devices, as well as manage the installation and updating of game content.

Use Take Recorder to record animations from tracked hands
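On question 3 of the 'Technical Questions for an LBS Game' post above: one commonly used approach for headsets in developer mode is plain ADB, which lets a central machine start/stop apps and sideload builds on every connected device. The sketch below only builds the command lines; the package name, activity, and serial numbers are placeholders, not values from the post.

```python
# Hedged sketch: construct adb command lines for fleet control.
# `am start -n`, `am force-stop`, and `adb install -r` are standard
# Android tooling; everything named com.example.* is a placeholder.

def adb(serial, *args):
    """Build an adb command line targeting one device by serial number."""
    return ["adb", "-s", serial, *args]

def start_app(serial, package, activity):
    # `am start -n package/activity` launches an installed app.
    return adb(serial, "shell", "am", "start", "-n", f"{package}/{activity}")

def stop_app(serial, package):
    # `am force-stop` kills the app on that device.
    return adb(serial, "shell", "am", "force-stop", package)

def install_build(serial, apk_path):
    # `adb install -r` reinstalls the APK while keeping app data.
    return adb(serial, "install", "-r", apk_path)

# Example: broadcast a start command to a fleet
# (serials would come from `adb devices` over USB or Wi-Fi ADB).
fleet = ["PLACEHOLDER_SERIAL_1", "PLACEHOLDER_SERIAL_2"]
cmds = [start_app(s, "com.example.lbsgame", "com.example.lbsgame.MainActivity")
        for s in fleet]
# A controller would then execute each command, e.g. subprocess.run(cmd, check=True).
```

This covers remote start/stop and updates; it does not answer questions 1 and 2, which depend on platform features rather than tooling.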
Hi everyone, I have hand tracking activated and am trying to record the hand components using the Take Recorder plugin. My goal is to capture hand animations which I can later apply to an NPC without having to manually animate hand poses.

I was able to record animations from the official HandSample when using the controllers to manipulate the hand mesh. But when switching to hand tracking, the hand poses are not recorded; instead, only the rotation and position of the hand component are stored in the sequence.

I hope I managed to express what I am trying to achieve in a comprehensible manner. Has anyone tried to record hand animations from tracked hands using Take Recorder before and got it running? Or has an idea what I am missing? Thanks for your help!

[Solved]

Unreal - Use Take Recorder with Live Link for Hand Tracking as Motion Capture?
Hi everyone, I'm trying to record an animation in Unreal using the hand tracking of the Quest 3 and 2 as motion capture, and once recorded I need to export the animation for future use. When I record with the Take Recorder, the only things that move are the rotation and location of the Motion Controller components. I searched for a Live Link connection but found only the tracking for body and face with the Quest 2.

Is there a function for recording the Skeletal Mesh movements of the hand with the Take Recorder? (I found this post similar to my question but didn't know how to record the bone transforms.) Is there a Live Link connection for hand tracking? I'm using the Meta XR plug-in and have already set it for hand tracking only.

Unintended automatic Recenter when using hand tracking with Quest Link
Location of the issue: Quest Link home screen and applications using Quest Link

Symptoms:
- First, put on the Quest headset while it's in sleep mode.
- The proximity sensor then wakes the Quest from sleep mode and launches the app.
- At this point, if you use hand tracking to turn your palm towards yourself, it unintentionally triggers an automatic Recenter.

Additional information:
- Once this behavior occurs, it won't happen again until the Quest next enters sleep mode.
- This issue occurs not only in Quest Link apps but also on the Quest Link home screen.
- The Recenter seems to trigger the moment the Meta button appears on the wrist.
- We checked the log output using ADB, but couldn't find any relevant error information at the moment the issue occurred.

This Recenter significantly disrupts the user experience when demonstrating apps that rely solely on hand tracking. When users move their hands, it unintentionally triggers a Recenter, disorienting the user's perspective and severely compromising the experience.

If anyone has encountered a similar issue and found a solution, we would greatly appreciate your input.

video: https://drive.google.com/file/d/1lxnDHhZV89W7NyO-GNWtaZdqp3jz4og0/view?usp=sharing

Hand Tracking menu ruins game play
The first thing people do with hand tracking is look at their hands. The second thing they do is touch their fingers. Then Quest shuts down the game, because that's the hand gesture Meta chose as an 'escape' key.

I encourage players to see and feel their hands in the experience because it is so much more enjoyable and immersive - literally the entire point of mixed reality. This menu punishes all that fun with a distracting, overly sensitive button that apparently cannot be disabled. But can it be delayed? Ideally, the icon would not appear until after touching (and holding) thumb and finger together for 2 seconds, and only then become active (similar to holding the controller's menu button down to reset the view).

I understand Quest "needs" an escape gesture, but not if it constantly interrupts everything. Anyone else dealing with this? Found another solution or workaround?

TransformRecognizerActiveState is never activating
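For the 'Hand Tracking menu ruins game play' post above: the system gesture itself can't be changed from inside an app, but the proposed delay is easy to prototype for an app's own pinch-driven UI. A minimal, language-agnostic sketch of a dwell gate, with an illustrative 2-second threshold:

```python
# Sketch: only fire a pinch-triggered menu after a sustained hold.
# The dwell time is illustrative, not a value from any SDK.

DWELL_SECONDS = 2.0

class PinchDwellGate:
    """Debounce a pinch so a menu appears only after a continuous hold."""

    def __init__(self, dwell=DWELL_SECONDS):
        self.dwell = dwell
        self.held_for = 0.0

    def update(self, is_pinching, dt):
        # Accumulate hold time while pinching; any release resets the timer.
        self.held_for = self.held_for + dt if is_pinching else 0.0
        return self.held_for >= self.dwell

gate = PinchDwellGate()
# Eight updates of 0.25 s each: the gate opens only on the final update.
fired = [gate.update(True, 0.25) for _ in range(8)]
```

An accidental finger brush then never reaches the threshold, while a deliberate two-second hold still opens the menu.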
I haven't been able to get a Transform Recognizer Active State to register as active. I have no issues with Shape Recognizer Active State. Are there any additional steps required to get a TransformRecognizerActiveState to work? I have also examined it with an Active State Debug Tree UI, which confirms that the shape recognizer is activating but the transform recognizer is not. Here is my component setup:

Can anyone see or guess what I might be doing wrong?

[Solved]
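For the TransformRecognizerActiveState question above: one way to narrow this down, in addition to the Active State Debug Tree UI, is to poll each recognizer's active flag every frame and log rising/falling edges, which makes it obvious whether the transform recognizer ever fires at all. The real check would be a small C# MonoBehaviour querying the Interaction SDK; the Python below is just the polling/logging idea with simulated inputs.

```python
# Conceptual debugging aid: log activation edges for a set of active states.
# `states` maps a name to a callable returning that state's activity per frame;
# in Unity this would instead read each component's IsActive.

def watch_active_states(states, frames):
    """Poll each state per frame and record (frame, name, transition) edges."""
    prev = {name: False for name in states}
    log = []
    for frame in range(frames):
        for name, is_active in states.items():
            active = is_active(frame)
            if active != prev[name]:
                log.append((frame, name, "activated" if active else "deactivated"))
                prev[name] = active
    return log

# Simulated version of the symptom in the post: the shape recognizer fires,
# the transform recognizer never does.
log = watch_active_states(
    {"ShapeRecognizer": lambda f: f >= 2, "TransformRecognizer": lambda f: False},
    frames=5,
)
```

If the log shows the transform recognizer never activates even during an obvious wrist rotation, the problem is in its configuration or its input wiring rather than in whatever consumes the state.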