MQDH Won’t Connect Quest 3 at Final Cable Step Despite USB and ADB Working
Hello everyone, I'm encountering a frustrating issue with Meta Quest Developer Hub (MQDH) v5.7.2 on macOS while trying to set up a Quest 3 running firmware V78.

ISSUE DESCRIPTION
I enabled Developer Mode on the Quest 3 via the mobile app. In the headset, I selected "Always allow USB access." SideQuest and ADB (adb devices) both detect the Quest 3 successfully. However, when I run "Set Up New Device" in MQDH, it stalls at the final step asking me to plug in the cable, even after doing exactly that. No prompt appears inside the headset; MQDH hangs without progressing. I tried reinstalling MQDH and rebooting both the Mac and the headset; the problem persists.

WHAT I'VE TRIED
- Verified Developer Mode is enabled and the USB Connection Dialog permission is granted.
- Confirmed the USB-C cable works (SideQuest and ADB both connect).
- Rebooted the Quest and the Mac multiple times.
- Uninstalled and reinstalled MQDH, cleared caches/settings.
- Used both the bundled Quest cable and a known-good USB-C data cable (tested a USB 3.2 cable and a Thunderbolt cable with the Mac mini's USB-C/Thunderbolt port).
- adb works fine.

REQUESTED HELP
- Could someone advise on potential causes of MQDH stalling even though ADB and SideQuest work?
- Has anyone else experienced this on a Quest 3 running firmware V78 with MQDH 5.7.2?
- Are there any logs or advanced diagnostics I can capture to pinpoint the blockage?
- If anyone has successfully connected MQDH after the V78 update, what steps resolved it?

FULL ENVIRONMENT INFO
- Headset firmware: Quest 3 V78
- MQDH version: 5.7.2 (macOS)
- OS: macOS (latest version)
- USB cable: verified data-capable USB-C
- ADB output: adb devices correctly lists the headset (adb cmd / MacDroid)
- MQDH behavior: stuck at the final prompt, no progress

NEXT STEPS
I'm considering a factory reset and strictly following the community-suggested flow (skip Air Link until the first USB prompt), but I'm not sure whether firmware V78 changed this behavior.
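Not MQDH-specific, but relevant to the "logs or advanced diagnostics" question above: before suspecting MQDH it can help to script a sanity check on what adb itself reports, since a device stuck in the `unauthorized` state would point at the USB permission dialog rather than at MQDH. A minimal sketch in Python that parses `adb devices` output (the serial number below is made up):

```python
# Sketch: parse the output of `adb devices` to confirm the headset is
# visible and in the "device" state (vs. "unauthorized" or "offline").
# In real use, feed it subprocess.run(["adb", "devices"], ...).stdout.

def parse_adb_devices(output: str) -> dict:
    """Map device serial -> state ('device', 'unauthorized', 'offline')."""
    devices = {}
    for line in output.strip().splitlines()[1:]:  # skip "List of devices attached"
        parts = line.split()
        if len(parts) >= 2:
            devices[parts[0]] = parts[1]
    return devices

sample = """List of devices attached
2G0YC1ZF8H0000\tdevice
"""
print(parse_adb_devices(sample))  # {'2G0YC1ZF8H0000': 'device'}
```

If the state shows `unauthorized`, the in-headset USB prompt is still pending, which would match the "no prompt appears" symptom.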
Thank you in advance!

Passthrough play mode doesn't work since Quest 3 update yesterday
I don't know how long the update was, but it was working before the update and now it's not. Normally I open Unity and, as long as I have Quest Link open on the headset and on my PC, clicking Play in Unity will launch the passthrough app preview on my headset. Now, when I go to launch Link on the headset, it begins the colourful loading screen animation and then after a moment says "headset/PC disconnected - please disconnect and reconnect your headset" (or similar), even though the desktop app shows the headset as connected and active, and the Quest home settings menu preview shows Link as active.

OVRBoundary.GetGeometry returns incorrect play area data when using Link
I am calling OVRBoundary.GetGeometry(OVRBoundary.BoundaryType.PlayArea) and receiving an array of four Vector3 as expected. Since the docs say these are in local space, I'm converting them into world space by transforming them by OVRCameraRig.trackerAnchor. This produces different results based on headset:

1. Correct on Rift S in Unity Editor (Windows)
2. Correct on Quest 3 in a build (Android)
3. Incorrect on Quest 3 + USB Link in Unity Editor (Windows)
4. Incorrect on Quest 3 + Air Link in Unity Editor (Windows)
5. Incorrect on Quest 3 + USB Link in a build (Windows)

The common denominator seems to be Link. In this case, while the actual dimensions seem correct, the OVRCameraRig.trackerAnchor has not been updated. It (and the OVRCameraRig, and its 'TrackingSpace' child) are all at identity, which places the PlayArea points around where I booted the game, and not where the center of the PlayArea should be. This can be reproduced by creating an empty scene, adding the OVRCameraRig, setting the tracking origin to 'Stage', and then visualising the PlayArea rectangle in some way. Adding a LineRenderer under the trackerAnchor and populating it directly with positions from GetGeometry shows the issue.

Unity 2022.3.22f1 / Windows 10
Meta All-in-one SDK v63.0.0
Unity Oculus XR Plugin 4.2.0
OpenXR Plugin 1.10.0
Configured to use the 'Oculus' plugin via Project Settings -> XR Plug-in Management.

New Game Idea - Configurable Game Board
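To make the symptom above concrete: the local-to-world conversion is just the anchor pose applied to each corner, so with the anchor stuck at identity the "world" corners are the raw local corners around the boot pose. A minimal, language-agnostic sketch in plain Python (yaw-plus-translation pose; the sample pose values are made up, not taken from the SDK):

```python
import math

def transform_point(point, yaw_deg, translation):
    """Apply an anchor pose (yaw rotation about the up axis, then a
    translation) to a local-space (x, y, z) point, mimicking a
    localToWorld transform for a play-area corner."""
    x, y, z = point
    c, s = math.cos(math.radians(yaw_deg)), math.sin(math.radians(yaw_deg))
    rx = c * x + s * z          # rotate in the horizontal plane
    rz = -s * x + c * z
    tx, ty, tz = translation
    return (rx + tx, y + ty, rz + tz)

# Four local-space play-area corners (2 m x 2 m, centered on the local origin)
corners = [(-1, 0, -1), (1, 0, -1), (1, 0, 1), (-1, 0, 1)]

# Identity anchor (the bug described above): the rectangle stays at the boot origin
assert [transform_point(p, 0, (0, 0, 0)) for p in corners] == corners

# A tracked anchor pose would move the rectangle to the real play-area center
moved = [transform_point(p, 90, (3, 0, 0)) for p in corners]
```

The fix therefore has to come from the anchor pose being populated under Link, not from the GetGeometry data itself, which (as noted above) already has the right dimensions.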
US Patent - Interactive Player Board Game

I was recently granted my first US Patent, on 11/05/2024, for the idea of a configurable game board with technology to check whether a rule of the game has been satisfied. I was wondering if anyone would be interested in developing this type of game for the Oculus. I see the possibility of having the user generate their own game board depending on the theme (exercise, eating, traveling, etc.), setting the number of spaces for their game board, and having the ability to make it a private or public game. The user can invite friends to play on the game board, and it can be played asynchronously. If the theme is exercise, then if they roll a 1, they need to do 10 jumping jacks. The Oculus will be able to see whether that task is completed successfully. If so, they can roll again and see what task they need to perform next. Each game can be considered a pod, and players can invite others to join. Each person that joins is a pea in the pod because of their similar interest in the theme. Certain pods can become quite big, with many peas. You can use the potential of these pods to generate income through ad revenue; the ads can target specific pods that may relate to the product. Please reach out to me if you think this is a viable idea that can be developed. Thanks!

Built Unity standalone app becoming laggy after enabling passthrough through airlink
Hi everyone, I have a problem with my built app becoming laggy after enabling passthrough over Air Link; I need the Depth API and passthrough mode. When I run the app with passthrough disabled in Air Link, it runs smoothly. Using the Oculus Debug Tool, I can see that my compositor latency is very high when passthrough is turned on. Any solution for this situation? I have connected my PC using an ethernet cable and use a dedicated 5 GHz router for my Quest 3.

Stage tracking space in Link Mode
Hi, I'm developing a PC VR app with the Oculus Integration for Unity. In this app, the virtual space should align to the physical space.

- The documentation states that one should use the "stage tracking space" when creating apps like this, to keep the view re-centering function from messing with the alignment: https://developer.oculus.com/documentation/unity/unity-ovrboundary/
- On the OVRManager, I've set "TrackingOriginType" to "Stage" and disabled the "AllowRecenter" option. In an Android build, this has the effect described in the documentation. But in the Editor, or in a Desktop build with the Quest 2 in Link mode, the view/alignment still resets when re-centering manually or after the headset has been taken off for a few seconds.
- Are those options only supported in Android Quest builds, or am I missing something? Would there be an alternative way to keep the virtual world aligned with the physical one?

PCVR/Link Development in Meta: A Comprehensive Technical Analysis
Introduction

This technical analysis examines the development landscape of PCVR (PC Virtual Reality) and Link technology within the Meta ecosystem, focusing on the Quest platform. We'll explore the underlying technologies, compare different approaches, and delve into the technical challenges and solutions in this rapidly evolving field.

Hardware Architecture

Meta Quest 2 and Quest 3 serve as the primary standalone devices for PCVR via Link:

1. SoC (System on Chip):
   - Quest 2: Qualcomm Snapdragon XR2 (based on Snapdragon 865)
     - CPU: Octa-core Kryo 585, up to 2.84 GHz
     - GPU: Adreno 650
   - Quest 3: Qualcomm Snapdragon XR2 Gen 2
     - CPU: Octa-core Kryo, up to 3.19 GHz
     - GPU: Adreno 740
2. Display:
   - Quest 2: Single fast-switch LCD, 1832x1920 per eye, 90/120 Hz
   - Quest 3: Dual high-resolution LCD, 2064x2208 per eye, up to 120 Hz
3. Tracking:
   - Inside-out tracking using multiple wide-angle cameras
   - 6DoF (Degrees of Freedom) for head and controller tracking

Comparative Analysis: Quest 3's improved SoC and display provide better performance and visual fidelity for PCVR streaming, potentially reducing encoding/decoding latency and allowing for higher-resolution streams.

Software Stack

The software infrastructure for PCVR development on Meta platforms involves several layers:

1. Runtime Environment:
   - Oculus Runtime: Manages device drivers and tracking systems, and provides APIs for developers
   - OpenXR: Cross-platform standard for VR development, supported by Meta
2. Development SDKs:
   - Oculus PC SDK: Native C++ SDK for low-level access to Oculus hardware
   - Oculus Integration for Unity/Unreal: High-level SDKs for popular game engines
3. Graphics APIs:
   - DirectX 11/12: Primary APIs for Windows-based PCVR development
   - Vulkan: Cross-platform graphics API, offering lower overhead

Comparative Analysis: While DirectX is more commonly used due to its long-standing presence in Windows development, Vulkan offers potential performance benefits, especially in multi-GPU scenarios or when targeting multiple platforms.

Link Technology Deep Dive

1. Oculus Link (Wired):
   - Protocol: Custom USB protocol over USB 3.0/3.1
   - Bandwidth: Up to 5 Gbps (USB 3.0) or 10 Gbps (USB 3.1)
   - Latency: Approximately 20-30 ms motion-to-photon latency
2. Air Link (Wireless):
   - Protocol: Modified Wi-Fi protocol optimized for VR streaming
   - Bandwidth: Depends on Wi-Fi capabilities (ideally Wi-Fi 6/6E)
   - Latency: 30-40 ms motion-to-photon latency under optimal conditions

Technical Challenges and Solutions:

1. Compression and Encoding:
   - Challenge: Balancing image quality, latency, and bandwidth usage
   - Solution: Dynamic encoding bitrate adjustment based on available bandwidth and scene complexity
   - Implementation: H.264 and HEVC (H.265) encoders with custom optimizations for VR
2. Latency Mitigation:
   - Technique: Asynchronous Timewarp (ATW) and Asynchronous Spacewarp (ASW)
   - ATW: Rotational reprojection to reduce perceived latency
   - ASW: Interpolation of intermediate frames to maintain smooth motion
3. Tracking Synchronization:
   - Challenge: Aligning PC-generated frames with headset tracking data
   - Solution: Predictive tracking algorithms and low-latency data transmission protocols

Performance Optimization Techniques

1. Fixed Foveated Rendering (FFR):
   - Technique: Rendering at lower resolution in peripheral vision areas
   - Implementation: Custom fragment shaders with multi-resolution rendering passes
2. Variable Rate Shading (VRS):
   - Technique: Adjusting shading rates across the rendered image
   - Support: Requires hardware support (available on newer GPUs)
3. Dynamic Resolution Scaling:
   - Technique: Adjusting render resolution based on performance headroom
   - Implementation: Frame-timing analysis and per-frame resolution adjustment

Comparative Analysis: FFR provides consistent performance benefits but may introduce visible artifacts. VRS offers more granular control but requires newer hardware. Dynamic resolution scaling provides a good balance but can result in noticeable quality fluctuations.

Developer Tools and Debugging

1. Oculus Debug Tool:
   - Features: Performance HUD, ASW controls, encoder settings
   - Use case: Fine-tuning and performance analysis
2. Oculus Mirror:
   - Functionality: Displays the headset view on the PC monitor
   - Application: Debugging visual issues and recording gameplay
3. Performance Profiling:
   - Tools: Oculus Performance Profiler, Unity Profiler, Unreal Insights
   - Metrics: Frame timing, GPU utilization, memory usage

Best Practices for PCVR Development

1. Optimize for Link streaming:
   - Reduce draw calls and polygon count
   - Implement level-of-detail (LOD) systems for complex scenes
2. Latency-aware design:
   - Implement predictive input systems
   - Design UIs and interactions with latency considerations
3. Cross-platform compatibility:
   - Use abstraction layers for platform-specific features
   - Implement scalable rendering pipelines for various hardware capabilities

Conclusion

PCVR development for Meta platforms involves a complex interplay of hardware capabilities, software optimizations, and VR-specific technologies. By leveraging the latest advancements in Link technology and implementing robust optimization techniques, developers can create high-fidelity, low-latency PCVR experiences that bridge the gap between standalone and PC-based virtual reality.
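As a concrete illustration of the dynamic resolution scaling technique described above (measure frame time, then adjust a render-scale factor against the frame budget), here is a minimal sketch in plain Python. The budget, step, and threshold values are illustrative assumptions, not the values any Meta runtime actually uses:

```python
def update_resolution_scale(scale, frame_time_ms, budget_ms=13.9,
                            step=0.05, lo=0.5, hi=1.0):
    """Adjust the render-resolution scale factor from the last frame time.

    budget_ms: target frame budget (~13.9 ms corresponds to 72 Hz).
    Returns the new scale, clamped to [lo, hi].
    """
    if frame_time_ms > budget_ms:           # over budget: render fewer pixels
        scale -= step
    elif frame_time_ms < 0.85 * budget_ms:  # comfortable headroom: scale back up
        scale += step
    return max(lo, min(hi, scale))

# Simulate a GPU spike followed by recovery
scale = 1.0
for frame_time in [12.0, 16.5, 17.0, 15.0, 11.0, 10.5]:
    scale = update_resolution_scale(scale, frame_time)
```

In practice a controller like this is usually smoothed over several frames (for example, a moving average of GPU time) so the scale does not oscillate, which is exactly the "noticeable quality fluctuations" trade-off noted in the comparative analysis above.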
This technical analysis provides a comprehensive overview of the current state of PCVR development for Meta platforms, highlighting the intricate balance between performance, quality, and latency that developers must navigate in this cutting-edge field.

Meta Quest Browser & BabylonJS = Slow FPS
I'm trying to learn about WebXR and I'm finding that my library of choice isn't playing well in the Meta Browser, locking at about 30 FPS. Now, I see 60 FPS on the desktop with Chrome, and I know that's because Chrome is locked to vsync. I can pass Chrome some parameters to unlock the FPS limiting; at that point, I'm seeing around 500 FPS. Chrome works great with Link, but I'm looking to work with the native Meta Browser inside the Quest. I've found other engines, such as the Wonderland Engine, demonstrate 72 FPS in the Quest with their app Escape Artists (https://esc.art). A-Frame applications show up in the browser at 90 FPS. So my question is: is there some sort of parameter or command the WebGL lib should be passing to the browser to speed it up? Or is this likely some sort of bug in BabylonJS that is locking it to 30 FPS, with the browser not applying any artificial rate limiting?

Hand Tracking over Link still not working in Unity Editor
Hi, I have spent the last two years developing for the Quest 2, and recently got the Quest 3. It's a great device and I'm super happy with it. There is just one big problem standing between me and total developer bliss.

How is it possible that we still don't have a stable, robust hand tracking implementation across the whole development pipeline in 2023? I'm confused as to how Meta envisions developers taking full advantage of their (really good) hand tracking tech if there are consistent inconsistencies, fumbling around, trying seven different versions of all of the little SDKs, components, etc.

Can someone please advise me on how to achieve a simple Unity scene using the standard Oculus Integration, where I can just click "Play" in the editor and get hand tracking working over my Link cable? So far I have gone through five different Unity versions from 2021-2023, even more Oculus Integration SDK versions (v50-57), and three different headsets (Quest 1, 2 and 3). Nothing worked.

The only way I have managed to get hand tracking working in the editor via Link is to use a long-deprecated version of the Oculus Integration, where I had to manually select "Oculus -> Tools -> OVR Utilities Plugin -> Set OVRPlugin to Legacy LibOVR+VRAPI" from within Unity. Three things. First, this option is no longer available in the later Oculus Integration versions. Second, selecting this option explicitly disables the possibility of building for the Quest 2 and 3, so you'd have to switch back and forth between the old LibOVR+VRAPI and the newer OpenXR integration just to get hand tracking working in the editor. Really? Third, we as developers cannot reasonably be expected to stick to this legacy API, as all of the newer mixed reality features, like scene understanding, spatial anchors, etc., are not supported in the old version. Hence I want to ask my question one more time: how does Meta expect people to develop for their platform?
Please let me know if you have an answer to this dilemma, I am grateful for any pointers!

Note: I am explicitly talking about hand tracking through the Unity Editor using Link; in standalone Android builds it works fine and it's amazing to use!

Note 2: This is not a new problem; it's documented across many forum posts both here and on the Unity forums. I have also found fully-fledged projects that warn of this exact error, e.g. https://github.com/kuff/medslr#how-to-launch-the-project.

Note 3 - further information: The legacy backend of OVRPlugin was removed in v51, which is stated here: https://developer.oculus.com/downloads/package/unity-integration/50.0 under "OVRPlugin".

Best, Max