The position of the Vignette in OVRVignette is off-center from the field of view.
When using OVRVignette, the position where the field of view is narrowed is off-center from the field of view. Worse, a different area is obscured for each eye, making it very difficult to see.

_OpaqueMaterial.SetVectorArray(_ShaderScaleAndOffset0Property, _OpaqueScaleAndOffset0);

I think the line above sets the shader variables that determine the vignette's position, but I don't know anything beyond that. Does anyone else have the same symptoms or know of a solution?

- Unity 6000.2.15f1
- Meta XR Core SDK 85.0.0
- Meta Quest 3
- Meta Quest Link 85.0.0.239.552
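For anyone trying to debug the same thing, here is a minimal diagnostic sketch. It assumes the vignette material exposes its per-eye values as a vector array, as the SetVectorArray call above suggests; the shader property name "_ScaleAndOffset0" is inferred from the C# field name and may differ in your SDK version.

```csharp
using UnityEngine;

// Diagnostic sketch: dump the scale/offset vectors OVRVignette writes into
// its material, to see whether the two eyes receive mismatched values.
// "_ScaleAndOffset0" is an assumption inferred from the C# field name in
// the snippet above; check OVRVignette's shader for the real property name.
public class VignetteOffsetProbe : MonoBehaviour
{
    [SerializeField] private Material vignetteMaterial; // assign the vignette material instance

    private void Update()
    {
        if (vignetteMaterial == null) return;

        Vector4[] values = vignetteMaterial.GetVectorArray("_ScaleAndOffset0");
        if (values == null) return;

        // The expectation is one entry per eye; large differences between
        // entries would explain each eye seeing a different vignette.
        for (int i = 0; i < values.Length; i++)
        {
            Vector4 v = values[i];
            Debug.Log($"Eye {i}: scale=({v.x}, {v.y}) offset=({v.z}, {v.w})");
        }
    }
}
```

If the two entries differ noticeably, the bug is in the values OVRVignette computes per eye rather than in the shader itself.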
Blurry Textures vs Mip Mapping

I'm struggling to get my models to appear sharp on Quest - they are fine in the Meta XR Simulator. The problem seems to be linked to mip mapping. If I uncheck Generate Mip Maps on my textures, they appear sharp, but then I get some odd flickering that looks like an aliasing artifact. I tried disabling anti-aliasing to no avail. Maybe it is possible to see it here:

If I enable mip mapping, the texture only appears sharp when it is literally 5 cm from the VR camera. I have tried playing with Mipmap Limit Groups, to no avail. My textures are 4K, and I have tried setting the max size to different values. Has anyone dealt with similar issues?
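One common mitigation (not an official Meta fix) is to keep mipmaps but raise anisotropic filtering, since the headset views surfaces at much steeper angles than a desktop preview. A minimal editor-side sketch, assuming default texture importers:

```csharp
using UnityEditor;
using UnityEngine;

// Editor-side sketch of one common mitigation: keep mipmaps to avoid the
// shimmering, but raise anisotropic filtering and use trilinear filtering
// so surfaces viewed at a distance or at a glancing angle stay sharp.
// Place this script in an Editor folder and reimport the textures.
public class SharpTexturePostprocessor : AssetPostprocessor
{
    private void OnPreprocessTexture()
    {
        var importer = (TextureImporter)assetImporter;
        importer.mipmapEnabled = true;              // keep mips: no aliasing flicker
        importer.filterMode = FilterMode.Trilinear; // smooth blending between mip levels
        importer.anisoLevel = 8;                    // sharper sampling at glancing angles
    }
}
```

Note that the project's quality settings can override per-texture aniso levels; check Project Settings > Quality > Anisotropic Textures as well.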
Conflicting Information in the Horizon OS SBC (Shader Binary Cache) Documentation?

The documentation on building a shader binary cache per platform (link) states:

"Using this feature, once one user starts the app and manually builds the SBC, all other users with the same device and software (Horizon OS, graphics driver, and app) will be able to avoid the shader generation process by downloading a copy of a pre-computed SBC."

However, later on the same page, it states there is an automation in place to launch apps and perform scripted prewarming logic if requested:

"The system automatically identifies and processes Oculus OS builds and app versions that require shader cache assets. It generates and uploads these assets to the store backend and automatically installs them during an app install or update."

Does this feature support both of those setups? If I am not scripting any custom warmup logic, will shader binary caches still be shared between users with identical setups? That is, if I simply play the release candidate on the target OS version/hardware, will my SBC be automatically uploaded, or are SBCs only distributed when a scripted warmup sequence is present? Few details are provided about SBCs from other users being uploaded, so I'm curious whether this is an inaccuracy. Thanks - excited to see features like this in Horizon OS. Very important for the first-time user experience.
Unity – Room creation working on one device but not on another.

I'm currently working on a project using MRUK. It uses room scanning to instantiate walls, floors, ceilings, and so on, and the user can place objects in their room. When I test on my device, everything works fine. However, on my friend's device, it doesn't work: the objects do not move and just float in the air.

After some debugging, we found that the room was not being created, so there is nowhere for the objects to be placed. I added an effect mesh and immersive debug tools to understand what was happening on my friend's device. The room is not being created, and the console shows logs like "Room not found," along with other debug messages I added to check whether any room data was being returned from the device.

We tried reverting to earlier versions where we are sure everything was working, but the issue persists on his device (mine works fine). This makes me think it could be related to permissions. However, after checking the settings, the app does have permission to access spatial data. We also tried clearing all scanned rooms and spatial data from his device, and even reinstalling the app, but nothing worked.

Some additional details that might be relevant:
- We are using SideQuest to install the builds
- In the latest versions, we added multiplayer using Photon
- We tried scanning the room before opening the app, and also triggering a scan from within the app using a button
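Since permissions are the prime suspect, a minimal runtime check might help rule them out. This sketch assumes the relevant gate is the Android scene permission com.oculus.permission.USE_SCENE, which controls access to room data on Quest:

```csharp
using UnityEngine;
using UnityEngine.Android;

// Minimal runtime check (assumption: the issue is the Android scene
// permission, which gates MRUK room data on Quest). Logs the current
// state and requests the permission if it is missing.
public class ScenePermissionCheck : MonoBehaviour
{
    private const string ScenePermission = "com.oculus.permission.USE_SCENE";

    private void Start()
    {
        if (!Permission.HasUserAuthorizedPermission(ScenePermission))
        {
            Debug.LogWarning("Scene permission not granted; MRUK will find no rooms.");
            Permission.RequestUserPermission(ScenePermission);
        }
        else
        {
            Debug.Log("Scene permission granted.");
        }
    }
}
```

If the permission is granted and rooms still come back empty, the problem is more likely the device's space setup data itself.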
🚀 Social VR Accelerator Orientation

Build and launch a Social VR game in 4 weeks. This isn't just a class, it's an accelerator. You'll start with a Unity template, get weekly mentor support, test with real players, and launch on the Meta Horizon Store by Week 4. To get started, all you need is beginner experience with C#, Unity, or the Horizon World Editor, plus 1–4 hours per day on weekdays. Apply as a team (designer + developer) or solo. This invite is open to Meta Horizon Start Members.

In this session, we'll walk through how the 4 weeks work, what's expected each week, and how mentorship and playtesting are set up. You'll get access to your Unity starter template, set up your dev environment, and meet the other teams.

What we'll cover:
- Program timeline and weekly milestones
- Mentor and check-in structure
- Starter template walkthrough
- Team introductions
- How to get help along the way

By the end of orientation, you'll know exactly what you're building toward and how to hit the ground running in Week 1. Not yet a Meta Horizon Start Member? Apply today! Join on Zoom
[BeatGesture] A Hand Tracking VR Rhythm Game

Hi everyone, I've been working on a small VR rhythm game called BeatGesture. It uses hand tracking and gestures instead of controllers, matching timing, hand shapes, and direction. If you've played other VR rhythm games, the core structure might feel familiar, but I wanted to explore what happens when you go all-in on hands as the primary input. The goal was to push gesture-based interaction further, rather than relying on controller-driven mechanics. It feels less like hitting notes and more like performing choreography. If you're interested, here's the store page: https://www.meta.com/en-gb/experiences/beatgesture/35694765943455719/ Thanks!
Can Meta XR SDK build for Windows PC VR, or is it Quest-only?
I've been developing a VR game for Quest Pro/Quest 3 for the past year using Unity 6 (6000.0.40f1) with the Meta XR SDK (Core 74.0.1, All-in-One 74.0.2, Interaction SDK 74.0.2, Essentials 74.0.1).

Current situation:
- Standalone Quest APK builds work functionally
- Performance is poor (~40-50 FPS, pixelated visuals)
- Unity Editor with Quest Link runs beautifully (80+ FPS, crisp visuals)

What I need to know: Can I build a Windows platform executable with the Meta XR SDK and run it as a PC VR app via Quest Link? Or is the Meta XR SDK strictly for standalone Quest Android builds?

What I've done:
- Optimized the standalone build with my own code logic and texture compression, with logging disabled (my game has a server logging feature)
- Saw limited improvement due to an asset-heavy project

Why I'm asking: Since Quest Link (Editor to Quest) performs so well, I'm wondering if I can build a Windows .exe that runs the same way - using my PC's GPU while the Quest acts as a display/input device.

Constraints:
- I need the Meta XR SDK specifically for eye tracking (Quest Pro)
- Not using OpenXR due to reported conflicts with the Meta XR SDK

Has anyone successfully built Windows PC VR apps using the Meta XR SDK? Or is the SDK Android-only, requiring a switch to OpenXR/SteamVR for PC builds? Any guidance or documentation links appreciated!

Other questions:
- Does eye tracking work over Quest Link with Windows builds?
- Are there specific build settings or plugins needed?
- Any performance differences vs standalone?
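Setting the build-target question aside, if both a standalone Android build and a Windows build end up being maintained from the same project, Unity's platform defines let one codebase carry both. A minimal sketch; the resolution-scale values are illustrative only, not recommendations:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: one codebase, two targets. Unity's built-in platform defines
// let the same project produce a standalone Android (Quest) build and a
// Windows build, with different rendering budgets for each.
public class PlatformRenderSetup : MonoBehaviour
{
    private void Start()
    {
#if UNITY_ANDROID && !UNITY_EDITOR
        // Standalone Quest: the mobile GPU is the bottleneck.
        XRSettings.eyeTextureResolutionScale = 1.0f;
#elif UNITY_STANDALONE_WIN
        // PC VR over Link: the desktop GPU can afford supersampling.
        XRSettings.eyeTextureResolutionScale = 1.5f;
#endif
    }
}
```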
Meta Quest Unity Real Hands Building Block not showing real hands

Hi all! I'm somewhat new to VR development, especially in mixed reality. I am trying to use Meta's Real Hands building block, but I can't seem to get it to work. I have a very basic scene with some of the fundamental building blocks (camera rig, passthrough, passthrough camera access, interaction rig), along with the Real Hands building block and a single cube. When I build the project to my Quest (Meta Quest 3) and move my hands in front of the cube, I can only see the virtual hands - the occlusion does not work to show my real hands (i.e. it works the same as it did before I added the Real Hands building block). Why is this, and how can I fix it?

Unity Version: 6000.3.4f1
Meta Quest Packages: Meta XR Core SDK (85.0.0), Meta MR Utility Kit (85.0.0), Meta XR Interaction SDK (85.0.0)

Steps to Replicate:
1. Create a new empty scene
2. Add the following building blocks: Camera Rig, Passthrough, Passthrough Camera Access, Interactions Rig, Real Hands
3. Add a cube at (0, 0, 3)
4. Build the project and deploy to the Quest
5. Wave your hands in front of the cube - only virtual hands are visible, not real hands
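Before digging into the occlusion setup itself, it may be worth confirming that hand tracking is actually delivering data, since the Real Hands block can only mask passthrough where tracked hands exist. A small probe sketch, assuming an OVRHand component from the camera rig is assigned:

```csharp
using UnityEngine;

// Diagnostic sketch: logs tracking state and confidence for one hand so
// hand tracking itself can be ruled out as the cause. Attach to any
// object and assign an OVRHand from the camera rig's hand anchors.
public class HandTrackingProbe : MonoBehaviour
{
    [SerializeField] private OVRHand hand;

    private void Update()
    {
        if (hand == null) return;
        Debug.Log($"Hand tracked: {hand.IsTracked}, " +
                  $"confidence: {hand.HandConfidence}, " +
                  $"data valid: {hand.IsDataValid}");
    }
}
```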
How to make real-world objects appear in front of a virtual environment in Mixed Reality?

I am trying to create a Mixed Reality application in Unity where a virtual environment surrounds the user's real room, so that the user feels like they are in a different environment while still being inside their real space. I experimented with the SpaceMap sample from the MR Utility Kit: https://github.com/oculus-samples/Unity-MRUtilityKitSample/tree/main/Assets/MRUKSamples/SpaceMap

This sample does almost exactly what I want. However, I am facing the following issue: real-world objects, people, and even my own body appear behind the virtual environment objects, so they become invisible. In other words, the virtual environment fully occludes the real world.

What I would like to achieve instead is:
- Real-world objects, people, and my body should appear in front of the virtual environment based on depth.
- The room surfaces (walls, ceiling, and floor) should not occlude the virtual environment.

So effectively: real dynamic objects should appear in front of the virtual world, while static room geometry should not block it.

From my research, it seems possible to do something like this using opacity or passthrough techniques, but that's not what I want; I would like the result to feel natural and realistic, without transparent effects. Since this is my first MR application, I might be missing something obvious. If anyone has experience with this or can point me in the right direction (depth APIs, occlusion techniques, or examples), I would really appreciate the help.
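For the dynamic-object half of this (people and your body occluding the virtual world), the direction to look in is Meta's Depth API. A sketch of enabling it, with two caveats: the names below come from the Meta XR Core SDK's EnvironmentDepth namespace and should be verified against your SDK version, and your virtual-environment materials must use occlusion-capable shaders for it to take effect:

```csharp
using UnityEngine;
using Meta.XR.EnvironmentDepth;

// Sketch: per-pixel occlusion of virtual content by real geometry via
// Meta's Depth API. Names are from the EnvironmentDepth namespace in the
// Meta XR Core SDK; verify against your installed version.
public class DepthOcclusionSetup : MonoBehaviour
{
    private void Start()
    {
        if (!EnvironmentDepthManager.IsSupported)
        {
            Debug.LogWarning("Environment depth not supported on this device.");
            return;
        }

        var depthManager = gameObject.AddComponent<EnvironmentDepthManager>();
        depthManager.OcclusionShadersMode = OcclusionShadersMode.SoftOcclusion;
        depthManager.RemoveHands = false; // keep hands occluding the virtual world
    }
}
```

The static-room half (walls not occluding) is the opposite requirement, so per-pixel depth occlusion alone won't give you both; the room surfaces would likely need to be excluded or masked separately.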
Spatial Anchors won't stay after Recenter + Workaround

I have a minimal reproducible project, but it cannot be uploaded here. Unity 6000.3.9 and Quest 3 v2.1.1034, SDK v84.

We have a scene 0 that loads scene 1, in which the CameraRig lives. With that setup, after launching the APK I place some anchors, then holding the Meta button (Recenter) moves the anchors away. After that first recenter everything works normally and any new anchor stays in place. This does not happen if scene 0 is not included and we launch directly into the CameraRig scene.

Workarounds:
- Removing scene 0 (that's our bootstrapper so we can't remove it, but this fix might work for you)
- Adding a CameraRig (OVRManager) in scene 0 (so far this seems to work)

If any developer from Meta is available, I do have a project that can reproduce the bug. Thanks in advance.
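For anyone hitting the same drift, a sketch of a recenter hook: OVRDisplay exposes a RecenteredPose event, which gives a place to re-localize or reload anchors after the tracking origin changes. ReloadAnchors here is a hypothetical callback you would implement with your own anchor-loading code:

```csharp
using UnityEngine;

// Workaround sketch (assumption: anchors drift because the tracking
// origin changes on the first recenter before anchors are localized).
// Subscribes to the recenter event and provides a hook to re-query or
// re-localize saved anchors afterwards.
public class RecenterAnchorRefresher : MonoBehaviour
{
    private void OnEnable()
    {
        if (OVRManager.display != null)
            OVRManager.display.RecenteredPose += OnRecenter;
    }

    private void OnDisable()
    {
        if (OVRManager.display != null)
            OVRManager.display.RecenteredPose -= OnRecenter;
    }

    private void OnRecenter()
    {
        Debug.Log("HMD recentered; refreshing spatial anchors.");
        // ReloadAnchors(); // hypothetical: re-load/localize your saved anchors here
    }
}
```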