Passthrough camera data is not available on Quest 3 when developing with Unity
It is currently not possible to access the real-time passthrough video feed from the Quest 3 in Unity. Many developers across the internet consider this a crucial feature: with access to the camera frames, we could build incredible mixed reality apps with machine-learning object recognition. Nevertheless, the real-time video captured by the Quest 3 cannot be accessed in Unity. I made a Reddit post where people are requesting this feature. Please, we need access to the real-time passthrough video in Unity and other engines.

[Feature Request] Edit in Play Mode for Oculus Quest in Unity
I had been building an application for the Go until recently, but after trying the Quest I'm seriously considering changing platforms because of the 6DoF tracking. My Go development process so far has looked like this: I use a Rift hooked up to my PC for most of my Unity development (because of the perk of editing in Play Mode), then build to the Quest and make sure the build is bug-free before pushing it to my main branch. What I'm wondering is whether we'll ever be able to hook the Quest directly up to Unity instead of needing the Rift for that purpose. Do other people want this feature?

Buffer-based Haptics are broken for Quest 2
Haptic controller vibrations are a crucial tool for creating game immersion. While simple "buzzes" (a constant vibration amplitude for a specified duration) can be serviceable, modern controllers let developers play custom waveforms, such as sawtooth waves, sine waves, and even AudioClips. This adds texture and nuance to the effect, and is the superior way to play haptics in 2022. After much experimenting, it appears to me that buffer-based haptics are fully broken for Quest 2 controllers in the Oculus Integration for Unity. I have tried three approaches.

Using the generic Unity XR system:

```csharp
byte[] samples = new byte[1000]; // would be loaded with samples describing a custom vibration waveform
var handDevice = InputDevices.GetDeviceAtXRNode(hand == Hand.right ? XRNode.RightHand : XRNode.LeftHand);

// Pass the buffer to the controller if its haptic capabilities report buffer support
if (handDevice.TryGetHapticCapabilities(out var capabilities) && capabilities.supportsBuffer)
    handDevice.SendHapticBuffer(0, samples);
```

With Rift S and Quest 1 Touch controllers, the above code runs successfully. With Quest 2 and Touch 2 controllers, supportsBuffer is false on the capabilities struct, and the samples cannot be sent. I know it is incorrect that the Touch 2 controllers do not support the feature, because I have in a few instances managed to send a buffer to Touch 2 controllers using the method below.

Using OVRHaptics:

```csharp
var ovrHapticClip = new OVRHapticsClip(myAudioClipToTurnIntoVibration);
var channel = OVRHaptics.RightChannel;
channel.Queue(ovrHapticClip);
```

The OVRHaptics class has a function for sending a haptic buffer through a "channel" (controller). I can actually get this method to work in a test scene, but only after switching the OVR plugin to a legacy setting (Oculus/Tools/OpenXR/Switch to Legacy OVRPlugin (with LibOVR and VRAPI backends)). Outside that setting the function does nothing. In another project, setting this option and sending the haptics buffer locks the engine in an infinite loop. From what I can find online, the OVRHaptics class is slated for deprecation anyway, so it doesn't seem like a good solution.

Using OVRInput: My understanding is that OVRInput is the modern, sanctioned way of sending haptics to Oculus controllers without going through the generic Unity XR system. It offers a "simple buzz" (amplitude and duration parameters only) via OVRInput.SetControllerVibration, but it seems to lack any way to send a custom buffer, unlike the deprecated OVRHaptics.

I would love any advice on getting this feature to work. Either I'm wrong about some of my conclusions above, or the feature is fully broken at the moment; either way, I'd love to know. Thanks in advance for your help!

Lasertag! - an experiment in live scene understanding using DepthAPI
Hello new dev forum :) I'm working on a project that uses the DepthAPI to map your space in real time (instead of relying on room setup) to decrease setup friction, lower time-to-fun, and increase playspace area. Because the game scans as you play, it responds to opening/closing doors, moving furniture, and other changes to the environment. I'm also using depth for drawing light against the environment; it looks really nice in dimly lit areas. I'm currently working on meshing so I can use it with Unity's NPC pathfinding. I'll be posting updates in this thread. You can learn more and download the game at https://anagly.ph

External object tracking
Now that the Passthrough API is available, I was hoping I could use it to scan a QR code or AR tag. However, the camera layer is not available to us as developers, which I can understand, but it limits so much on the AR front. Why isn't there an option that asks "This app can see your environment. Do you want to continue?", rather than having it blocked by default? To get to the point: what would be another way to track an external object? For example, I have a box in my real-world environment and would like to track its position in-game. Are Bluetooth trackers an option? Any ideas are welcome; the main goal is to translate a real-world object's position into an in-game position. Greetings, Smoothy101

How to enable handtracking 2.0 on Unity?
I've updated to the latest OVRPlugin but can't see anything new. Some of my apps rely heavily on fast hand tracking and could really benefit from version 2.0. Also, is there a link to a direct contact form for development support? I can't seem to find one either. So, how do I enable Hand Tracking 2.0 in Unity? Many thanks!

Immersive Debugger spams error message
Hello community, I'm trying to use the Immersive Debugger in a Unity 6 project, but I can't get the following error message to go away: "In the render graph API, the output Render Texture must have a depth buffer. When you select a Render Texture in any camera's Output Texture property, the Depth Stencil Format property of the texture must be set to a value other than None." My guess is that the error is caused by the debugger itself. The error is not printed in the Unity log if I start the application with the simulator (but it still appears in the Immersive Debugger log). Does anyone have an idea what's wrong?

Petition To Allow Developers RGB Quest 3 Data For Development
Hi, I'm currently developing an application for the Quest 3 and need its depth and camera data. It already gives me depth, but I can't get camera data due to Meta's policy on "security". Many people have complained about this already, and I just want to put another post out there to spread the word.

Disable guardian at runtime in Mixed Reality Apps (Unity)
Hi! I'm in the process of creating a mixed reality app that integrates both passthrough functionality and the Scene API. Despite conducting a full room scan and walking around, the guardian system continues to restrict the app. I'm searching for a way to programmatically deactivate the guardian system, in either Unreal or Unity. Considering that Meta's "First Encounter" demo doesn't run into these boundary issues, I'm convinced there must be a way to resolve this. Thank you!
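For development and testing only (not something that can ship with an app), a workaround frequently cited in the community is to pause the guardian through an adb system property on a headset in developer mode. This is a hedged sketch: the property name below comes from community reports, is not an official API, and may change or stop working across OS versions.

```shell
# Community-reported property to pause the guardian on a developer-mode
# headset connected over adb. Resets on reboot; dev/testing use only.
adb shell setprop debug.oculus.guardian_pause 1

# Restore the guardian when done:
adb shell setprop debug.oculus.guardian_pause 0
```

For a shippable mixed reality app, the supported route would be whatever boundaryless/passthrough configuration the current Meta XR SDK exposes, so check the SDK release notes before relying on the adb property above.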