Connect Ethernet cable to Quest 3
Can I somehow enable Ethernet support on my Quest 3? I want to connect it to my router to get the lowest possible latency in PC VR. I bought an Ethernet-to-USB-C adapter, but when I connect it to the headset it doesn't see the cable. The same adapter works when I connect it to my phone; it only fails on my Quest 3. What can I do about it?
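One way to narrow this down is to check whether the headset enumerates the adapter as a network interface at all. A minimal Node/TypeScript sketch over ADB, assuming developer mode is enabled and adb is on the PATH; the eth0 name is the usual Android convention, not something Quest guarantees:

```ts
import { execSync } from "node:child_process";

// List the headset's network interfaces over ADB. If the USB-Ethernet
// adapter is enumerated at all, an extra interface (commonly eth0)
// should show up alongside wlan0 while the adapter is plugged in.
const links = execSync("adb shell ip link show").toString();
console.log(links);

if (/\beth\d\b/.test(links)) {
  console.log("Adapter enumerated; check 'adb shell ip addr' for a DHCP lease.");
} else {
  console.log("No ethernet interface found; the adapter may be unsupported or underpowered.");
}
```

If no interface appears even though the same adapter works on a phone, a powered USB-C hub is a common thing to try, since the headset's port supplies limited power to peripherals.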
What's your problem?
Hey creator forum reader! I'm doing some research and could use your help. I make educational Horizon creator content on YouTube, but before working on something new, I want to hear where people actually struggle, so I know I'm helping solve the right problems. I'm looking to have 10 conversations with creators, specifically anyone who has dabbled as a world creator but, for whatever reason, still hasn't published their world, OR anyone who HAS published but has yet to reach their first 1,000 visits.

Who am I: My name is Mitch. I've been working in the VR industry for nearly a decade, from Oculus sales floor rep to working with the dev team on Meta Workrooms (RIP). I love VR and have made a living in this industry trail-blazed by Meta, even up to where we are now, taking on mobile gaming! Even though the Meta Horizon editor is still so early, I've already got a full year's experience, from a fresh start, to my first world publish, to 200,000+ impressions on my first world in 7 days. I want to help those a few steps behind me navigate the creator labyrinth faster than I did.

If you fit the description above, or know anyone who does, please join my Discord and I'll reach out: https://discord.gg/nRkM7bHCf9 If not, then please disregard, and best of luck on your creator journey 🎨 Godspeed.
WebGPU Compute into WebXR on Quest
Does anyone know the expected date for Meta Quest's WebGPU-WebXR layer? I just purchased a Meta Quest 3, to complement my Quest 2, for WebXR development *with* WebGPU (a compute-shader-only voxel/SDF engine), and found that Meta's browser doesn't support WebGPU in WebXR, a stable Chromium 134 feature. Surprising, since the Quest 3 is a "flagship of XR" device (in terms of sales/popularity/development). Reference check here: https://immersive-web.github.io/webxr-samples/webgpu/ I've web-searched extensively but haven't found a workaround, a flag to set, or anything to do other than the suggestion to copy WebGPU output into a WebGL context (wasting bandwidth/VRAM on copying the XR canvas?). Am I missing anything? Thanks!
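For anyone hitting the same wall, a quick feature-detection sketch (TypeScript, runnable in the headset browser's console or any page) shows which piece is missing. XRGPUBinding is the entry point defined by the immersive-web WebXR/WebGPU binding proposal, so its absence is the tell-tale sign; this is a sketch of the draft spec shape, not a guarantee of how Meta's browser will eventually expose it:

```ts
async function probeWebGpuXr(): Promise<void> {
  // WebGPU itself, independent of XR (stable in recent Chromium).
  const hasWebGpu = "gpu" in navigator;

  // The WebXR/WebGPU layer from the immersive-web proposal is exposed
  // through an XRGPUBinding constructor; without it, WebGPU content
  // must be copied through WebGL to reach an XR session.
  const hasXrGpuBinding = "XRGPUBinding" in globalThis;

  // Plain WebXR support, independent of the graphics API.
  const hasImmersiveVr =
    (await navigator.xr?.isSessionSupported("immersive-vr")) ?? false;

  console.log({ hasWebGpu, hasXrGpuBinding, hasImmersiveVr });
}

probeWebGpuXr();
```

If hasWebGpu and hasImmersiveVr come back true but hasXrGpuBinding is false, the WebGL copy path really is the only route until the binding ships.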
Meta Interaction SDK: picked-up items going through objects with collision
I'm a newbie in VR and mixed reality development, and I'm trying to create a game in Unreal Engine. I've encountered an issue: if I grab an item and move it through, e.g., a virtual table with a collision mesh, it doesn't react at all. I figured out that the Interaction SDK turns physics off in its ISDKGrabTransformerComponent, but when I changed that, the item still goes through, just with a bit of stuttering and teleporting. I think the problem is with the implementation, because it snaps the grabbed object to my hand position, which I totally understand. My question is: is there a way to bypass this? Maybe something where, if the grabbed item collides with an object, it keeps its position and stays there until the overlap ends? I'm still not sure how to implement this. If there is a tutorial or anything, let me know; I need it for my school project.
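The "hold at the last valid position until the path is clear" idea can be sketched engine-agnostically: each frame, try to move the grabbed object toward the hand, but clamp the move where a sweep reports a blocking hit. In Unreal this maps to moving the mesh with sweep enabled (e.g. SetWorldLocation with bSweep set to true) instead of teleporting it. The TypeScript sketch below shows only the clamping logic, with sweepFirstHit as a hypothetical stand-in for whatever collision query the engine provides:

```ts
interface Vec3 { x: number; y: number; z: number; }

// Placeholder for the engine's sweep query: returns the fraction [0..1]
// of the move that can be completed before hitting blocking geometry,
// or null if the path is clear. (Hypothetical, engine-provided.)
type SweepFn = (from: Vec3, to: Vec3) => number | null;

function lerp(a: Vec3, b: Vec3, t: number): Vec3 {
  return {
    x: a.x + (b.x - a.x) * t,
    y: a.y + (b.y - a.y) * t,
    z: a.z + (b.z - a.z) * t,
  };
}

// Each frame: follow the hand, but stop at the first blocking hit
// instead of snapping through it. The object naturally resumes
// following once the swept path is clear again.
function updateGrabbedObject(current: Vec3, hand: Vec3, sweepFirstHit: SweepFn): Vec3 {
  const hitFraction = sweepFirstHit(current, hand);
  if (hitFraction === null) return hand;   // path clear: snap to hand
  const margin = 0.001;                    // keep a hair away from the surface
  return lerp(current, hand, Math.max(0, hitFraction - margin));
}
```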
Meta Quest Unity Real Hands building block not showing real hands
Hi all! I'm somewhat new to VR development, especially in mixed reality. I am trying to use Meta's Real Hands building block, but I can't seem to get it to work. I have a very basic scene with some of the fundamental building blocks (camera rig, passthrough, passthrough camera access, interaction rig), along with the Real Hands building block and a single cube. When I build the project to my Quest (Meta Quest 3) and move my hands in front of the cube, I can only see the virtual hands; the occlusion does not work to show my real hands (i.e., it behaves the same as it did before I added the Real Hands building block). Why is this, and how can I fix it?

Unity version: 6000.3.4f1
Meta Quest packages: Meta XR Core SDK (85.0.0), Meta MR Utility Kit (85.0.0), Meta XR Interaction SDK (85.0.0)

Steps to replicate:
1. Create a new empty scene.
2. Add the following building blocks: Camera Rig, Passthrough, Passthrough Camera Access, Interactions Rig, Real Hands.
3. Add a cube at (0, 0, 3).
4. Build the project and deploy to the Quest.
5. Wave your hands in front of the cube: only virtual hands are visible, not real hands.
How to make real-world objects appear in front of a virtual environment in Mixed Reality?
I am trying to create a mixed reality application in Unity where a virtual environment surrounds the user's real room, so that the user feels like they are in a different environment while still being inside their real space. I experimented with the SpaceMap sample from the MR Utility Kit: https://github.com/oculus-samples/Unity-MRUtilityKitSample/tree/main/Assets/MRUKSamples/SpaceMap This sample does almost exactly what I want. However, I am facing the following issue: real-world objects, people, and even my own body appear behind the virtual environment objects, so they become invisible. In other words, the virtual environment fully occludes the real world.

What I would like to achieve instead:
- Real-world objects, people, and my body should appear in front of the virtual environment, based on depth.
- However, the room surfaces (walls, ceiling, and floor) should not occlude the virtual environment.

So effectively: real dynamic objects should appear in front of the virtual world, while static room geometry should not block it. From my research, it seems possible to do something like this using opacity or passthrough techniques, but that's not what I want; I would like the result to feel natural and realistic, without transparency effects. Since this is my first MR application, I might be missing something obvious. If anyone has experience with this or can point me in the right direction (depth APIs, occlusion techniques, or examples), I would really appreciate the help.
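The usual shape of a solution here is per-pixel depth-based occlusion: compare the virtual fragment's depth against the real-world depth from the headset's depth sensing (Meta ships this as the Depth API), but suppress occlusion wherever the real depth matches the scanned room mesh, so walls and floor don't hide the environment. A TypeScript sketch of just the per-pixel decision, assuming all three depth inputs are available in the same units:

```ts
// Per-pixel occlusion decision for "dynamic real objects occlude,
// static room geometry doesn't". All depths are metric distances
// from the eye for the same pixel; where each one comes from is
// platform-specific (e.g. Meta's Depth API and the MRUK room mesh).
function showPassthroughPixel(
  realDepth: number,      // depth-sensor estimate of the real world
  virtualDepth: number,   // depth of the rendered virtual environment
  roomMeshDepth: number,  // depth of the scanned static room geometry
  epsilon = 0.05          // tolerance (m) for "this is just the wall"
): boolean {
  const isStaticRoomSurface = Math.abs(realDepth - roomMeshDepth) < epsilon;
  // A real object in front of the virtual surface, and not simply the
  // room shell itself, should win the depth test and show passthrough.
  return !isStaticRoomSurface && realDepth < virtualDepth;
}
```

In practice this logic lives in a shader with the depth textures bound, and the epsilon needs tuning, since sensor depth is noisy near surfaces.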
Meta Smart Glasses for Hockey Referees: Real-Time AI Help for Offsides, Icing, Penalties, Goals...
Hi Meta team and community, I'm Ron, a coach and on-ice referee with 15 years of experience in ice hockey. The game is incredibly fast-paced, with action happening all around the rink, making split-second calls extremely tough even for seasoned officials.

My idea: use Ray-Ban Meta smart glasses (with the powerful camera and Meta AI) as a hands-free assistant for referees. The glasses could provide subtle audio alerts or light AR visual cues for:
- Offsides and icing detection
- Too many men on the ice
- Hand passes and high sticks
- Minor penalties (tripping, cross-checking, etc.)
- Goal confirmation plus automatic scorer/assist identification

This would keep refs' eyes on the play at all times: no more looking away or missing peripheral action.

Why this matters:
- Dramatically improves accuracy, consistency, and safety in one of the fastest sports on earth.
- Reduces controversial calls and game-stopping reviews.
- Starts as a training/review tool (POV recordings plus AI analysis) and evolves into live assistance.
- The tech built for hockey's extreme demands would transfer perfectly to soccer, basketball, football, and everyday AR uses.

The NHL is already giving officials Apple Watches for timing alerts, the IIHF has experimented with eye-tracking glasses for referee development, and refs in other sports are already using Ray-Ban Meta glasses for POV filming. Meta's AI glasses (especially the new Display models with in-lens overlays) and developer toolkit are perfectly positioned to lead this next step!

What do you think? Has anyone worked on similar sports-officiating ideas? Kudos if you support this; I'd love feedback and to hear from Meta developers or other hockey refs! Thanks, Ron
Channel invite limits before launch
Hi, I have a question regarding channel invitation limits. From what I understand:
- Each app can configure up to 5 channels for URL-based invites.
- Each channel can contain up to 200 users.

If that's correct, does that mean the maximum number of users we can invite to private channels is 1,000 total (5 × 200)? We're planning to run tests before launch and may need to invite more than 1,000 users. Is there any way to:
- Increase this limit?
- Add more channels?
- Invite more users before launch?

Any clarification would be greatly appreciated. Thank you!
Quest 3 constant restart loop, need full OTA firmware
Hello, I have a Quest 3 system which I use constantly. It started failing and went into a constant restart loop, and a factory reset does not work. The bootloader works and ADB sideload works, so recovery is possible, but I need the full OTA firmware to fix it, and I need to find where to download it. Thank you.
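For reference, once a full OTA package is in hand (Meta doesn't publish an official download page, so the file itself remains the open question), the sideload step is the standard Android recovery flow. A small Node/TypeScript sketch of the process; the filename is a hypothetical placeholder, and "Apply update from ADB" is the usual Android recovery menu entry:

```ts
import { execSync } from "node:child_process";
import * as readline from "node:readline/promises";
import { stdin, stdout } from "node:process";

// Hypothetical placeholder; substitute the actual full-OTA package.
const otaPath = "quest3-full-ota.zip";

// Step 1: reboot into recovery (the adb connection drops here).
execSync("adb reboot recovery", { stdio: "inherit" });

// Step 2: wait for the user to arm sideload mode on the headset.
const rl = readline.createInterface({ input: stdin, output: stdout });
await rl.question('Select "Apply update from ADB" in recovery, then press Enter... ');
rl.close();

// Step 3: stream the OTA package; adb reports progress until done.
execSync(`adb sideload ${otaPath}`, { stdio: "inherit" });
```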