Eye tracking only works the first time Unreal Engine is opened

I use VR Preview to debug my eye-tracking project, and it does work with my Blueprints the first time I open the project: when I print the state of the eye-tracking data, it is true. But once I end that VR Preview and start another preview, the eye-tracking state is false and of course I can't receive any gaze data. If I reopen the project, the whole cycle repeats. In other words, every time I want to debug the eye-tracking system I have to restart Unreal Engine. Why does it only work on the first preview after the project is opened? I'm on Unreal Engine 5.1.1 with Integration v54, using Quest Link (streaming over cable). I copied my project to another computer and it has the same problem.
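
For anyone trying to reproduce this, here is a minimal sketch of a diagnostic actor that logs the same state every frame through the engine's generic EyeTracker interface. It is not the original poster's Blueprint: the class name AEyeGazeDebugActor is made up for illustration, and it assumes the "EyeTracker" module is listed in the project's Build.cs dependencies.

```cpp
// EyeGazeDebugActor.h -- hypothetical diagnostic actor (name made up for this sketch).
// Assumes the "EyeTracker" module is added to the project's Build.cs dependencies.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "EyeTrackerFunctionLibrary.h"
#include "EyeTrackerTypes.h"
#include "EyeGazeDebugActor.generated.h"

UCLASS()
class AEyeGazeDebugActor : public AActor
{
    GENERATED_BODY()

public:
    AEyeGazeDebugActor()
    {
        PrimaryActorTick.bCanEverTick = true;
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        // Same state the Blueprint prints: is an eye tracker connected,
        // and is it currently delivering gaze data?
        const bool bConnected = UEyeTrackerFunctionLibrary::IsEyeTrackerConnected();

        FEyeTrackerGazeData Gaze;
        const bool bHasGaze = UEyeTrackerFunctionLibrary::GetGazeData(Gaze);

        UE_LOG(LogTemp, Log, TEXT("EyeTracker connected=%d, gaze valid=%d, dir=%s"),
               bConnected ? 1 : 0, bHasGaze ? 1 : 0, *Gaze.GazeDirection.ToString());
    }
};
```

Dropping an instance of such an actor into the level and comparing the log output between the first and second VR Preview should show whether the tracker reports as disconnected or merely stops delivering gaze frames.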

Unreal Engine 5.0.3 Platform SDK

Hey, is there any place to access older versions of the Platform SDK for Unreal Engine 5? We just released our VR title, Kid Pilot, on Steam and are beginning the setup to release it on the Meta Store. However, we're working with Unreal Engine 5.0.3: we have the Meta XR plugin for this engine version, but not the Platform SDK, and on the downloads page the oldest version is 56.0, which targets UE 5.2.1. Any help towards finding where the older versions of the SDK can be found would be greatly appreciated.

Passthrough not displaying meshes

Hey, so I've been banging my head against the wall for the past couple of hours trying to get passthrough to work properly. I can get the camera feed to show up using this example https://github.com/Ayushanbhore/Unreal-Quest3-PassthroughSample, but when using the OculusXRPassthroughLayer with LayerPlacement set to Underlay, meshes and objects just won't show up. Does anyone have a solution for this, or is it working for you? I would really like to get started developing mixed reality games.

Best Practices for Quest 3 - no documentation and no update

Hi! I have decided to develop for Meta Quest 3, but I notice that the Meta documentation is often old and rarely updated. It still refers to the Oculus plugin and treats the Quest 2 as its most modern reference device. This is a problem when we, as developers, try to set up a project seriously and want to respect all of Meta's directives, because procedures that are sometimes two years old end up creating more confusion. With the introduction of OpenXR, Unreal Engine 5.3.2, the new Meta headsets, and a plugin no longer called Oculus but Meta XR, it is no longer clear what should be done, especially before developing an app for Quest 3. Some examples:

1) Is it necessary to set a target device indicating Meta Quest 3 in Advanced APK Packaging?
2) Is the Get Device Type node useful for Quest 3, or can it be ignored? (A rough C++ equivalent is sketched after this list.)
3) The configuration Meta recommends for the Oculus plugin settings no longer applies to the Meta XR plugin. What are the best settings for this latest plugin, with reference to the various standalone HMDs?
4) The Meta documentation often refers to Unreal Engine 4, while we are now at version 5.3.2, and there is a huge difference between the two versions. Is what was written for Unreal Engine 4 considered obsolete? And if so, why is it still part of the documentation available to developers on the Meta website?

I would like to know where I can find updated and reliable information and guides, because the lack of support is one of the first and biggest obstacles to creating apps and software for Meta products. Thank you.
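
Regarding point 2, the Blueprint node has a C++ counterpart in the Meta XR plugin. The sketch below assumes the plugin still exposes UOculusXRFunctionLibrary::GetDeviceType() as the renamed successor of the old UOculusFunctionLibrary call; the header name and return enum may differ between plugin releases, so treat it as a starting point only.

```cpp
// Sketch only: assumes the Meta XR plugin exposes UOculusXRFunctionLibrary::GetDeviceType(),
// the renamed successor of the old Oculus plugin node. Verify against your plugin version.
#include "CoreMinimal.h"
#include "OculusXRFunctionLibrary.h"

void LogHeadsetType()
{
    const EOculusXRDeviceType Device = UOculusXRFunctionLibrary::GetDeviceType();

    // Log the raw enum value rather than guessing at enumerator names,
    // since those have changed between plugin releases.
    UE_LOG(LogTemp, Log, TEXT("OculusXR device type (raw enum value): %d"),
           static_cast<int32>(Device));
}
```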

UE5 Shared Spaces + Shared Spatial Anchors

Hey there, I'm currently facing a challenge while attempting to combine the SharedSpaces and SharedSpatialAnchors templates. The issue arises when trying to integrate the framework of SharedSpaces into the SharedSpatialAnchors template, or vice versa. Specifically, I'm unable to get the SharedSpaces framework to function properly within the SharedSpatialAnchors template; conversely, when I try to incorporate the SharedSpatialAnchors functionality into the SharedSpaces template, I'm unable to get the Passthrough feature to work. Additionally, I've encountered intermittent compile errors with the OculusPlatform plugin (OVR), particularly when compiling for Android; strangely enough, I don't encounter any problems when starting the engine itself. I've been working on resolving this for quite some time, but I always seem to hit a roadblock. I would greatly appreciate any assistance you can provide. Thank you in advance for your help! ~formulated with the help of ChatGPT

Error on Live Link | Face, Body, Eye Tracking

Hello, hope all is well. I am trying to get Live Link working on the Meta Quest Pro in Unreal 5.2, and have tried 5.1 as well. I keep running into these errors in the Live Link window after connecting the Meta MovementSDK Live Link source. Live Link errors:

Trying to add frame data that is not formatted properly to role 'LiveLinkAnimationRole' with subject 'Eye'.
Trying to add frame data that is not formatted properly to role 'LiveLinkAnimationRole' with subject 'Body'.
Trying to add frame data that is not formatted properly to role 'LiveLinkBasicRole' with subject 'Face'.

(My Quest Pro is connected correctly and I can run the Movement Sample without any issues using the official 5.2/5.1 release and the MetaXR plugin, and I have also built the GitHub version of UE 5.2.)
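
As a debugging aid, the subjects the Live Link client currently sees can be listed from C++ through the standard modular-feature interface. This is a minimal sketch: the helper name DumpLiveLinkSubjects is made up, and it assumes the "LiveLinkInterface" module is added to the project's Build.cs.

```cpp
// Sketch: list the Live Link subjects the client currently sees, to check whether
// 'Eye', 'Body' and 'Face' arrive at all. Assumes "LiveLinkInterface" is added to
// the module's Build.cs dependencies; DumpLiveLinkSubjects is a made-up helper name.
#include "CoreMinimal.h"
#include "Features/IModularFeatures.h"
#include "ILiveLinkClient.h"
#include "LiveLinkTypes.h"

void DumpLiveLinkSubjects()
{
    IModularFeatures& Features = IModularFeatures::Get();
    if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        UE_LOG(LogTemp, Warning, TEXT("No Live Link client available"));
        return;
    }

    ILiveLinkClient& Client =
        Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    // Include disabled and virtual subjects so nothing is hidden from the dump.
    const TArray<FLiveLinkSubjectKey> Subjects = Client.GetSubjects(true, true);
    for (const FLiveLinkSubjectKey& Key : Subjects)
    {
        UE_LOG(LogTemp, Log, TEXT("Live Link subject: %s"), *Key.SubjectName.Name.ToString());
    }
}
```

Comparing this dump against the roles named in the errors can help confirm whether the subjects register under the role the source claims.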

Oculus Lip Sync plugin for UE5 Installation not working

Good day, I downloaded the Oculus Lipsync plugin and followed the instructions given in the documentation. As described there, I copied "OVRLipSync" into "UE4.27/Engine/Plugins" and also into "UE5.1/Engine/Plugins". But when I open a project in either version, the plugin throws an error saying it was compiled for a different engine version. I also tried copying the plugin directly into the project folder, with the same error. Please help me sort out this issue. Thanks,

Tracked Keyboard SDK for UE5

Is there a way to get the Tracked Keyboard SDK working in Unreal Engine 5? I didn't know if there was an alpha or beta plugin available that I could try out with the Logitech K375s. If not, is there a way I could extend the plugin on GitHub and add this capability myself? I tried using passthrough by itself, but it's so blurry that it is not really suited to typing a paragraph. The Keyboard SDK would be perfect, but I can only find Unity or native examples.

About Quest Pro: how to use face capture data in Unreal Engine 5

The Quest Pro has eye-expression and facial-expression capture, but I couldn't find in the documentation how to get this data inside Unreal Engine. How can I get this data and the corresponding expressions for development in Unreal Engine? I may need some help.
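
Face data typically surfaces in Unreal as Live Link subjects (compare the 'Face' subject in the Live Link thread above). Below is a hedged sketch that evaluates a basic-role subject and prints its curve names and weights; the subject name "Face" and the helper name DumpFaceCurves are assumptions and depend on how the Movement SDK source names its subjects.

```cpp
// Sketch: read the float curves of a Live Link basic-role subject (e.g. face
// expression weights). The subject name "Face" is an assumption; the helper name
// DumpFaceCurves is made up. Assumes "LiveLinkInterface" is in Build.cs.
#include "CoreMinimal.h"
#include "Features/IModularFeatures.h"
#include "ILiveLinkClient.h"
#include "LiveLinkTypes.h"
#include "Roles/LiveLinkBasicRole.h"

void DumpFaceCurves()
{
    IModularFeatures& Features = IModularFeatures::Get();
    if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        return; // No Live Link client running.
    }

    ILiveLinkClient& Client =
        Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    const FName FaceSubject(TEXT("Face")); // assumed subject name
    FLiveLinkSubjectFrameData Frame;
    if (!Client.EvaluateFrame_AnyThread(FaceSubject, ULiveLinkBasicRole::StaticClass(), Frame))
    {
        return; // No frame available for this subject right now.
    }

    FLiveLinkBaseStaticData* StaticData = Frame.StaticData.Cast<FLiveLinkBaseStaticData>();
    FLiveLinkBaseFrameData* FrameData = Frame.FrameData.Cast<FLiveLinkBaseFrameData>();
    if (!StaticData || !FrameData)
    {
        return;
    }

    // Curve names (static data) and values (frame data) line up index-for-index.
    const int32 Num = FMath::Min(StaticData->PropertyNames.Num(), FrameData->PropertyValues.Num());
    for (int32 i = 0; i < Num; ++i)
    {
        UE_LOG(LogTemp, Log, TEXT("%s = %f"),
               *StaticData->PropertyNames[i].ToString(), FrameData->PropertyValues[i]);
    }
}
```

The printed names would then map onto whatever expression set the Movement SDK exposes; driving a face rig from them is usually done through a Live Link-enabled Anim Blueprint rather than raw C++.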