Updated Meta Horizon Link app to 85.0.0.239.552 and now USB Link breaks within a minute or so
Everything was working fine for months, but after updating to 85.0.0.239.552 my USB Link is broken. After a minute or so, the video craps out (images below, video attached). This is not while playing games or developing in Unreal or Unity; this is simply connecting my headset over a USB-C cable and going into PC Link. I go into PC Link, wait about a minute, and it craps out. After this happens I can exit out of Link. When I disconnect and reconnect the USB cable, the headset registers the connection and pops up the USB debug window, but if I go to Quick Settings, the Link button says no PC is connected. Link functionality is broken. The ONLY way to reset Link is to restart the headset. Then I can go back into PC Link, wait a minute or so, and it craps out again. This has completely stopped all development on my project.

The position of the Vignette in OVRVignette is off-center from the field of view
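Some background that may help with the off-center vignette question below: each eye on a headset typically renders through an asymmetric frustum, so the optical center is not at the viewport center, and a vignette has to be positioned with a per-eye offset (presumably what the scale-and-offset shader array is for). A minimal sketch of that offset math in plain Python; this is illustrative only, not the OVRVignette implementation, and the FOV numbers are made up:

```python
import math

def vignette_center_ndc(fov_deg):
    """Given per-eye FOV half-angles in degrees (left, right, up, down),
    return where the optical axis lands in normalized device
    coordinates (-1..1). For a symmetric FOV this is (0, 0); for a
    typical asymmetric HMD eye frustum it is shifted, so a vignette
    drawn at the viewport center will look off-center, differently
    for each eye."""
    left, right, up, down = (math.tan(math.radians(a)) for a in fov_deg)
    # The viewport spans [-tan(left), tan(right)] horizontally and
    # [-tan(down), tan(up)] vertically; map the ray at x=0, y=0 into NDC.
    cx = (left - right) / (left + right)
    cy = (down - up) / (down + up)
    return cx, cy

# Symmetric frustum: center is exactly the middle of the viewport
print(vignette_center_ndc((45, 45, 45, 45)))  # (0.0, 0.0)

# Asymmetric frustum: the center is horizontally offset
print(vignette_center_ndc((50, 40, 45, 45)))  # roughly (0.17, 0.0)
```

If the per-eye offset baked into `_OpaqueScaleAndOffset0` disagrees with the actual projection (for example after an SDK or runtime update changes the reported FOV), this kind of mismatch would produce exactly the symptom described: a vignette that is off-center by a different amount in each eye.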
When using OVRVignette, the position where the field of view is narrowed is off-center relative to the field of view. Beyond being off-center, a different area is obscured in each eye, making it very difficult to see.

_OpaqueMaterial.SetVectorArray(_ShaderScaleAndOffset0Property, _OpaqueScaleAndOffset0);

I think the call above sets a shader variable that determines the vignette's position, but I don't know anything beyond that. Does anyone else have the same symptoms, or know of a solution?

Unity 6000.2.15f1
Meta XR Core SDK 85.0.0
Meta Quest 3
Meta Quest Link 85.0.0.239.552

Accept Button Stuck on Loading
Hey everyone! For three days now, I've been trying to accept an invite to an organisation, but the accept button is stuck on loading. I've tried switching devices, refreshing the page, restarting the device, declining the invite and getting sent a new one, and anything else that came to mind, but nothing has worked. I tried accepting the exact same invite on my sister's account, and it let her accept it straight away. Has anyone got any advice on how to fix this?

Item insight sales
Is this updated in real time, or does it take a while to update? My sales are showing as 0 (zero) on the dashboard. I've been making and publishing avatar items and sold some clothing items to a user who requested some to be made. I know they have bought some items because I've seen them wear the items. Apologies if this is an open discussion elsewhere; I couldn't find an answer, so I thought I would ask. Thank you.

Conflicting Information in the Horizon OS SBC (Shader Binary Cache) Documentation?
In the documentation on building a shader binary cache per platform (link), the documentation states:

"Using this feature, once one user starts the app and manually builds the SBC, all other users with the same device and software (Horizon OS, graphics driver, and app) will be able to avoid the shader generation process by downloading a copy of a pre-computed SBC."

However, later on the same page, it describes an automation that launches the app and performs scripted prewarming logic if requested:

"The system automatically identifies and processes Oculus OS builds and app versions that require shader cache assets. It generates and uploads these assets to the store backend and automatically installs them during an app install or update."

Does this feature support both of those setups? If I am not scripting any custom warmup logic, will shader binary caches still be shared between users with identical setups? That is, if I simply play the release candidate on the target OS version and hardware, will my SBC be automatically uploaded, or are SBCs only distributed when a scripted warmup sequence is present? Few details are provided about SBCs from other users being uploaded, so I'm curious whether this is an inaccuracy or not. Thanks; excited to see features like this in Horizon OS. Very important for the first-time user experience.

Unity – Room creation working on one device but not on another
I'm currently working on a project using MRUK. It uses room scanning to instantiate walls, floors, ceilings, and so on, and the user can place objects in their room. When I test on my device, everything works fine. However, on my friend's device it doesn't work: the objects don't move and just float in the air. After some debugging, we found that the room was not being created, so there is nowhere for the objects to be placed.

I added an effect mesh and immersive debug tools to understand what was happening on my friend's device. The room is not being created, and the console shows logs like "Room not found," along with other debug messages I added to check whether any room data was being returned from the device. We tried reverting to earlier versions where we are sure everything was working, but the issue persists on his device (mine works fine). This makes me think it could be related to permissions. However, after checking the settings, the app does have permission to access spatial data. We also tried clearing all scanned rooms and spatial data from his device, and even reinstalling the app, but nothing worked.

Some additional details that might be relevant:
- We are using SideQuest to install the builds
- In the latest versions, we added multiplayer using Photon
- We tried scanning the room before opening the app, and also triggering a scan from within the app using a button

[Bug / Feedback] Ray-Ban Display: Video feed missing during video calls after latest update
Before the last Ray-Ban Display update, I was able to see the other person's video feed directly on the glasses during a video call, and I could also share my own camera. Now that the transcription feature has been introduced, the video feed no longer appears during calls. This really changes the experience; it's just not as impressive or immersive anymore. Has anyone else noticed this? Is there a way to get the video back while keeping the transcription, or is this a known limitation of the new update?

Unauthorized MessengerLiteForiOS/LightSpeed Session Persistence & Loom v3 Privacy Bypass
I am seeking urgent assistance from the developer community to identify the origin of a targeted exploit involving an unauthorized internal developer environment.

The Issue: Forensic logs (December 2025) confirm my account is being accessed via a MessengerLiteForiOS framework, a project publicly discontinued years ago. This session is running on iOS 26.1 with build string FBAV/537.1.0.47.110.

The Evidence of Targeting: The sessions originated as early as February 2025, listing iOS 26.0 as the operating system. Since Apple did not release iOS 26 to the public until September 2025, this confirms the account was accessed through a future-dated, internal developer sandbox months before the software existed for consumers.

Technical Markers:
- Egress Device ID: 0D01F55A-0303-4EB9-A533-85FA26A10850
- User Agent: LightSpeed [FBAN/MessengerLiteForiOS;FBAV/537.1.0.47.110;FBBV/846660078;FBDV/iPhone14,7]
- Forensic Hash: My local app containers (SHA-256: 1eec2c2032...) have been modified to include the Loom v3 tracing engine, which facilitates background sensor capture without triggering iOS privacy dots.

My Question: How can a standard user account be bound to a 'Partner-tier' developer token that allows this level of persistence? Does anyone know how to identify the Entity or Partner App ID that owns this LightSpeed session? Standard 'log out of all sessions' commands are failing to revoke this specific internal token. I am seeking a way to force a server-side revocation of all Long-Lived Tokens associated with this internal build.

Meta Quest Unity Real Hands Building Block not showing real hands
Hi all! I'm somewhat new to VR development, especially in mixed reality. I am trying to use Meta's Real Hands building block, but I can't seem to get it to work. I have a very basic scene with some of the fundamental building blocks (camera rig, passthrough, passthrough camera access, interaction rig), along with the Real Hands building block and a single cube. When I build the project to my Quest (Meta Quest 3) and move my hands in front of the cube, I can only see the virtual hands; the occlusion does not work to show my real hands (i.e. it behaves the same as it did before I added the Real Hands building block). Why is this, and how can I fix it?

Unity Version: 6000.3.4f1
Meta Quest Packages: Meta XR Core SDK (85.0.0), Meta MR Utility Kit (85.0.0), Meta XR Interaction SDK (85.0.0)

Steps to Replicate:
1. Create a new empty scene
2. Add the following building blocks: Camera Rig, Passthrough, Passthrough Camera Access, Interactions Rig, Real Hands
3. Add a cube at (0, 0, 3)
4. Build the project and deploy to the Quest
5. Wave your hands in front of the cube; only virtual hands are visible, not real hands

How to make real-world objects appear in front of a virtual environment in Mixed Reality?
I am trying to create a Mixed Reality application in Unity where a virtual environment surrounds the user's real room, so that the user feels like they are in a different environment while still being inside their real space. I experimented with the SpaceMap sample from the MR Utility Kit: https://github.com/oculus-samples/Unity-MRUtilityKitSample/tree/main/Assets/MRUKSamples/SpaceMap

This sample does almost exactly what I want. However, I am facing the following issue: real-world objects, people, and even my own body appear behind the virtual environment objects, so they become invisible. In other words, the virtual environment fully occludes the real world. What I would like to achieve instead is:

- Real-world objects, people, and my body should appear in front of the virtual environment based on depth.
- The room surfaces (walls, ceiling, and floor) should not occlude the virtual environment.

So effectively, real dynamic objects should appear in front of the virtual world, while static room geometry should not block it. From my research, it seems possible to do something like this using opacity or passthrough techniques, but that's not what I want; I would like the result to feel natural and realistic, without transparency effects. Since this is my first MR application, I might be missing something obvious. If anyone has experience with this or can point me in the right direction (depth APIs, occlusion techniques, or examples), I would really appreciate the help.
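One common way to frame the effect described above is per-pixel depth-based compositing: show the passthrough (real) image wherever the environment depth sensor reports something closer than the virtual geometry, but first discard depth samples that coincide with the scanned static room surfaces, so walls and floor cannot punch through the virtual world. On Quest this is the territory of the Depth API and occlusion features, but the core per-pixel decision is simple. A toy sketch in plain Python; all names, the tolerance value, and the depth numbers are illustrative assumptions, not the Meta SDK API:

```python
# Toy per-pixel compositing decision for "dynamic real objects occlude
# the virtual environment, but static room geometry does not".
# Depths are in meters; None means "no depth sample at this pixel".

def composite_pixel(real_depth, room_depth, virtual_depth, eps=0.15):
    """Return 'real' if the passthrough pixel should be shown,
    otherwise 'virtual'.

    real_depth:    depth from the environment depth sensor
    room_depth:    depth of the static room mesh (walls/floor/ceiling)
                   rendered from the same viewpoint
    virtual_depth: depth of the virtual environment at this pixel
    eps:           tolerance for deciding a sample lies on the room surface
    """
    if real_depth is None:
        return "virtual"  # no depth info: fall back to the virtual scene
    # Samples that coincide with the scanned room surfaces are treated
    # as static geometry and never allowed to occlude the virtual world.
    if room_depth is not None and abs(real_depth - room_depth) < eps:
        return "virtual"
    # Anything else (a person, your hands, furniture moved into the room)
    # occludes the virtual environment only if it is closer.
    return "real" if real_depth < virtual_depth else "virtual"

# A hand 0.5 m away, wall at 3 m, virtual environment drawn at 10 m:
print(composite_pixel(0.5, 3.0, 10.0))   # real
# The wall itself matches the room mesh, so the virtual world wins:
print(composite_pixel(3.0, 3.0, 10.0))   # virtual
```

In practice this decision would live in a shader or be handled by the SDK's occlusion support rather than in CPU code, and the room-surface rejection is the part that distinguishes this from plain depth occlusion: without it, the walls would occlude the virtual environment just like a person would.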