Game design for post-stroke patients, need help hacking the Interaction SDK
Hi everyone, I'm currently working on a Unity game designed for post-stroke patients with hemiplegia undergoing motor rehabilitation. I want to use Hand Tracking so players can grab different objects. The problem is that the impaired hand often has very reduced finger mobility, so an actual grabbing motion is out of the question. So I would like to know how to:

1. Trigger fake hand motions to simulate grabbing on one hand with a simple event (and ignore actual hand motion except for position).
2. Even better: simulate a grab from a different object that I could control independently for animations etc. (I also need this because eventually the game will have to work with a motorized rehabilitation glove that will be tracked via Passthrough, since default hand tracking doesn't track it at all.)

I'm currently using the basic Interaction building block and the sample-scene prefabs for the objects (Chess piece prefab: Touch Hand Interactable + Grabbable components). If you have any leads on how to approach this problem, I'll be very grateful!

Joseph, student gamedev

VR Preview does not work when MetaXR plugin is enabled in Unreal Engine
This issue seems to have started after updating Meta Horizon Link to version 83.0.0.333.349. At least with version 83.0.0.311.349, developer-specific features (such as passthrough and hand tracking) did not work, but VR Preview itself was functioning correctly.

Current behavior: When I press VR Preview, the Unreal Engine preview window enters a non-VR state (the camera is fixed at floor level and does not move), and nothing happens on the Meta Quest headset connected via Air Link. On the headset side, the Link waiting screen (the gray environment) continues to be displayed indefinitely.

Is anyone else experiencing the same issue? Any information or insights would be greatly appreciated.

Environment:
Unreal Engine 5.5
Meta XR Plugin: 78
Meta Horizon Link: 83.0.0.333.349

Stylized passthrough: How can I retexture walls?
The Meta Horizon documentation on Scenes gives this image as an example of a basic stylized passthrough. This looks to me like a screenshot of a stylized hallway. How can such an effect be accomplished in Kotlin, without using Unity or Unreal? Can this effect also be achieved on Quest 2, or only on Quest 3(s)? The article mentions that Assisted scene capture (available on Quest 3(s) only) shouldn't be used to create such an effect.

Stereoscopic panel - Undocumented or non-existent?
According to Meta's design documentation on Panels:

"Panels are the rectangular surfaces that display 2D (and sometimes stereoscopic) app content in Quest."

So panels can apparently contain stereoscopic app content. However, I could find no documentation on how to accomplish that, and in the examples I checked this feature wasn't used either. Is the official documentation mistaken about the possibility of displaying stereoscopic app content? If not, how is that accomplished? A self-contained example would be appreciated.

I cannot use VisemeDriver Component
I would like to estimate visemes from audio in Unity. I configured my project according to the documentation, but FaceExpression.AreVisemesValid is always false, and as a result the VisemeDriver component does not work. This issue also occurs in the Face Tracking sample scene included with the Meta Movement SDK.

Unity version: 6000.2.10f1
Meta Core SDK version: 81.0.0
Meta Movement SDK version: 81.0.0

How can I use the VisemeDriver? Please let me know if there is a workaround or solution.

Introduction and assistance please!
Hello everyone, first-time poster, so this might be a bit of a long one. On Jan 1st I began a self-taught journey into Meta Horizons. I am ex-British military and a teacher wanting to create learning environments for my colleagues using VR technology, specifically in the field of therapy. I have been building a small world and have made what I consider to be moderate gains given my lack of knowledge and skill, but I am enjoying the process, so that's the main thing I suppose.

I have been relying quite heavily on AI, using Copilot, Grok, ChatGPT and even the built-in AI in the editor, but I am increasingly finding that they aren't seeing what I am seeing, so their suggestions (albeit stated with huge confidence on their part!) are often incorrect (unless it's just me not understanding them and the editor). One of my primary issues is that they claim I am not in the Desktop Editor but in fact in Desktop Studio. I have tried repeatedly to follow the AI's instructions on correcting this, but I just end up back in the same place, which is in what I believe to be the Desktop Editor and what they believe to be Desktop Studio. I think this disconnect is stopping me from advancing as quickly as I could.

So my first question is: can someone tell me if I am using the right software? I have downloaded and am using the Worlds Desktop Editor (that's the title of the thumbnail with the "O" on the black background). The AI is telling me I am in Studio and not the Editor, and that's why its suggestions aren't marrying up with whatever software I am actually working with.

Secondly, can I trust AI to continue to help me with this, or do I need a different approach?

Thirdly, I have an NPC created and I want to attach a text-to-speech audio file to give the NPC dialogue that can be interacted with, i.e. on the click of the trigger the NPC starts speaking.

It would be great to hear people's thoughts, and I am more than happy to help out with the challenges I have encountered so far and overcome.

Thanks in advance - Ant

Conflicting Information in the Horizon OS SBC (Shader Binary Cache) Documentation?
In the documentation regarding building a shader binary cache per platform (link), the documentation states:

"Using this feature, once one user starts the app and manually builds the SBC, all other users with the same device and software (Horizon OS, graphics driver, and app) will be able to avoid the shader generation process by downloading a copy of a pre-computed SBC."

However, later on the same page, it states there is an automation in place to launch the apps and perform scripted prewarming logic if requested:

"The system automatically identifies and processes Oculus OS builds and app versions that require shader cache assets. It generates and uploads these assets to the store backend and automatically installs them during an app install or update."

Does this feature support both of those setups? If I am not scripting any custom warmup logic, will shader binary caches still be shared between users with identical setups? I.e., if I simply play the release candidate on the target OS version/hardware, will my SBC be automatically uploaded, or are SBCs only distributed when a scripted warmup sequence is present? Few details are provided regarding SBCs from other users being uploaded, so I'm curious whether this is an inaccuracy or not.

Thanks, excited to see features like this in Horizon OS. Very important for the first-time user experience.

How to disable controller's auto-sleep?
Hello, I'm working on a project (PCVR) that continually reads coordinates from Quest Pro controllers (via their integrated cameras); everything works fine on my side. My issue is that a controller automatically turns off (auto-sleep) after a few minutes if no movement is detected, so reading the controller's coordinates breaks. How can I disable the controllers' auto-sleep? Thank you.

Is buying from the shop with in-game tokens different than using Meta credits?
Hello, I set up the shop gizmo in my world so players can buy perk upgrades (auto-consumed items). My problem is that when I change the price from Meta credits to in-game tokens, it suddenly stops working correctly. I wanted to know if the event triggered is different when buying items with tokens versus Meta credits. Thank you for your help.
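For anyone debugging a similar setup, one way to sidestep the "which event fires for which currency" question is to funnel both purchase paths into a single handler and log the currency while testing. The TypeScript sketch below is SDK-agnostic: the `PurchaseEvent` shape, the `currency` field, and the `onPurchaseComplete` function are illustrative assumptions for this sketch, not the real Horizon Worlds API — you would wire your actual shop event listeners into the single handler instead.

```typescript
// Hypothetical, SDK-agnostic sketch (not the Horizon Worlds API):
// normalize both purchase paths into one handler so the perk-granting
// logic doesn't care which currency triggered the event.

type Currency = "meta_credits" | "in_world_tokens";

interface PurchaseEvent {
  playerId: string;
  itemSku: string;
  currency: Currency;
}

// Perks granted so far, keyed by player id.
const grantedPerks = new Map<string, string[]>();

function grantPerk(playerId: string, itemSku: string): void {
  const perks = grantedPerks.get(playerId) ?? [];
  perks.push(itemSku);
  grantedPerks.set(playerId, perks);
}

// Single entry point: route every purchase event here, regardless of
// which listener (credits or tokens) produced it.
function onPurchaseComplete(evt: PurchaseEvent): void {
  // Logging the currency while testing lets you confirm whether the
  // token path actually reaches the same code path as credits.
  console.log(`purchase: ${evt.playerId} bought ${evt.itemSku} with ${evt.currency}`);
  grantPerk(evt.playerId, evt.itemSku);
}

// Simulated events standing in for the two shop configurations.
onPurchaseComplete({ playerId: "p1", itemSku: "perk_speed", currency: "meta_credits" });
onPurchaseComplete({ playerId: "p1", itemSku: "perk_shield", currency: "in_world_tokens" });
```

If only the token-priced configuration misbehaves, instrumentation like this narrows the problem to the event wiring for that currency rather than to the perk-granting logic itself.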