I am having limitations with the Meta Depth API Shader for AR
If I apply any other shader effect to an object in AR, I can't also use the depth shader on it, and then you see the object through my arm. What is the best way to combine two custom shaders? I want to use another shader that creates an invisible material which still receives shadows, and in the long run I will probably need to combine multiple shaders with the depth shader. Here is the link to the depth shader: https://github.com/oculus-samples/Unity-DepthAPI/tree/main
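For context, shaders in Unity aren't combined as separate assets; the usual approach is to merge the Depth API's occlusion macros into the other shader's passes. Below is a minimal Built-in Render Pipeline sketch of where those macros slot into a custom shader. The macro names follow the Unity-DepthAPI README, but the include path and the placeholder fragment logic are assumptions to verify against the repo:

```
// Sketch: an existing custom shader with the Depth API occlusion macros
// merged in. The include path below is an assumption -- check the repo
// for the correct file for your render pipeline (BiRP vs. URP).
Shader "Custom/OccludedShadowReceiver"
{
    SubShader
    {
        Tags { "RenderType"="Transparent" "Queue"="Transparent" }
        Pass
        {
            Blend SrcAlpha OneMinusSrcAlpha
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            // Occlusion variants from the Depth API (hard or soft edges)
            #pragma multi_compile _ HARD_OCCLUSION SOFT_OCCLUSION
            #include "UnityCG.cginc"
            #include "Packages/com.meta.xr.depthapi/Runtime/BiRP/EnvironmentOcclusionBiRP.cginc" // assumed path

            struct appdata { float4 vertex : POSITION; };

            struct v2f
            {
                float4 pos : SV_POSITION;
                META_DEPTH_VERTEX_OUTPUT(0) // adds the data the depth test needs (TEXCOORD0 is free here)
            };

            v2f vert (appdata v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                META_DEPTH_INITIALIZE_VERTEX_OUTPUT(o, v.vertex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // ... your existing effect / shadow-receiving logic goes here ...
                fixed4 col = fixed4(0, 0, 0, 0.5); // placeholder output
                // Attenuate the fragment by real-world depth so hands and arms occlude it
                META_DEPTH_OCCLUDE_OUTPUT_PREMULTIPLY(i, col, 0);
                return col;
            }
            ENDCG
        }
    }
}
```

The same three insertion points (a struct field, a vertex-stage init, a fragment-stage occlusion call) apply to the URP variants documented in the repo, so the pattern should extend to each additional shader you need to combine with occlusion.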
"Space Setup" Issues with v57 Quest developing with Unity OpenXR AR Foundation.

Hi! I'm going to outline my issue with the new Space Setup in my AR project. I noticed that neither the v57 Meta Quest update blog post nor the release notes mentioned that "Room Setup" was moved from being an experimental feature to a supported one called "Space Setup." Here's Dilmer's video showing Room Setup as an experimental feature. (It was posted July 13th, so pre-v56, but I just wanted to show how the feature was previously marked Experimental. The v56 notes also say nothing about moving it out of Experimental, yet it has changed since then.)

My Unity AR passthrough project uses AR Foundation's plane detection and the data from Space Setup to place objects on top of the generated planes. Example video of my project previously working.

After the v57 update, plane detection in my project stopped working, I'm assuming because it no longer receives the Space Setup data due to blocked permissions or an improper setup. I reconfigured my Space Setup, cleared it, and reconfigured it again, then restarted my Quest Pro and computer, to no avail. I also went into Settings and granted Spatial Data permission to a previous Unity OpenXR project, but there was no option for my current one, which is what I suspect is the crux of the issue. Previously I didn't have to explicitly grant my Unity build permission to use the data. My current project isn't showing up as an option to select (so I'm a bit stuck there), and I don't see a record of this change, or of others with this issue, so I'm unsure how to solve it.

Thank you for reading my long-winded essay, haha. I would appreciate any help. Let me know if I have overlooked something, maybe another permissions setting, etc.
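For what it's worth, v57 appears to coincide with Meta gating scene data behind an explicit spatial-data permission, and a build that never declares or requests that permission won't show up in the Settings permission list at all. A minimal sketch of requesting it at runtime follows; the permission string com.oculus.permission.USE_SCENE is taken from Meta's scene-permission documentation, but verify it against the current docs for your SDK version:

```csharp
using UnityEngine;
using UnityEngine.Android;

// Requests the Quest spatial-data (scene) permission at runtime so the app
// can read Space Setup data. The permission string is an assumption based on
// Meta's scene-permission docs; it must also be declared in the manifest:
// <uses-permission android:name="com.oculus.permission.USE_SCENE" />
public class ScenePermissionRequester : MonoBehaviour
{
    const string ScenePermission = "com.oculus.permission.USE_SCENE";

    void Start()
    {
        if (!Permission.HasUserAuthorizedPermission(ScenePermission))
        {
            // Shows the system dialog; once granted, the app should appear
            // under the headset's Spatial Data settings like older projects.
            Permission.RequestUserPermission(ScenePermission);
        }
    }
}
```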
Occlude virtual object by the real world. Meta quest pro Depth sensor.

Hello, is it possible in Unity with the Meta Quest Pro to occlude virtual objects with the real world, using passthrough? I'm using the Scene Model feature of the Quest Pro, but I'm having an issue: I place a plane over every wall, yet virtual objects are not occluded by the real world, so they draw over the passthrough view and I lose all the immersion. Is it possible to occlude a virtual object so that it sits behind real objects? I want my virtual objects to always be behind the real walls; in other words, I want to reconstruct the walls of my room virtually, all with passthrough active.
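The standard trick here is to give the wall planes a depth-only material: it writes to the depth buffer but draws no color, so virtual objects behind a wall are culled while the passthrough feed still shows through. A minimal Built-in Render Pipeline sketch (the shader name and queue offset are arbitrary choices):

```
// Depth-only "punch-through" occluder for scene planes: writes depth so
// virtual objects behind the plane are culled, but outputs no color,
// letting the passthrough layer show through where the plane sits.
Shader "Custom/PassthroughOccluder"
{
    SubShader
    {
        Tags { "Queue" = "Geometry-1" }   // draw before regular geometry
        Pass
        {
            ZWrite On      // fill the depth buffer
            ColorMask 0    // but write no color at all
        }
    }
}
```

Assign a material using this shader to every plane generated from the Scene Model, and anything rendered behind those planes should disappear behind the real walls.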
AR Passthrough sample scenes in Unity

Hey, the new sample scenes with passthrough AR don't seem to work on my Quest 2. I've built them from a clean project on Unity 2020.3.4f1, but I just can't get the passthrough overlay/underlay in my build. Any help appreciated. Did y'all manage to run this on your Quest 2? There's not much info on the web right now on this topic, because the update with the AR Passthrough API came out two days ago and is available on the Unity Asset Store through the Oculus Integration package.
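When these samples first shipped, passthrough was an experimental feature that had to be switched on both on the headset (the then-documented toggle was `adb shell setprop debug.oculus.experimentalEnabled 1`) and in the project via OVRManager's passthrough capability. A rough sketch of the runtime side, assuming the Oculus Integration's OVRManager and OVRPassthroughLayer components; the property names may differ across SDK versions:

```csharp
using UnityEngine;

// Minimal runtime setup for the (experimental-era) Passthrough API.
// Assumes the Oculus Integration's OVRManager and OVRPassthroughLayer;
// verify the property names against your SDK version.
public class PassthroughBootstrap : MonoBehaviour
{
    void Start()
    {
        // Passthrough must be enabled on OVRManager (also check
        // "Passthrough Capability Enabled" under Quest Features in the Inspector).
        var manager = FindObjectOfType<OVRManager>();
        manager.isInsightPassthroughEnabled = true;

        // An underlay layer renders passthrough behind the scene; the eye
        // camera's clear color needs zero alpha for it to show through.
        var layer = manager.gameObject.AddComponent<OVRPassthroughLayer>();
        layer.overlayType = OVROverlay.OverlayType.Underlay;

        var eyeCamera = Camera.main;
        eyeCamera.clearFlags = CameraClearFlags.SolidColor;
        eyeCamera.backgroundColor = new Color(0, 0, 0, 0); // transparent black
    }
}
```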
Render textures to each eye separately [OVR plugin]

Hi, I have streams from two cameras and I need to pass them to the left and right eye separately (an AR effect). To achieve that, I made a scene in Unity with two textures on which the camera streams are displayed. I added two OVRCameraRig objects to the scene, one representing each eye, and set Target Eye to "Left" on the first and "Right" on the other. When I run my application I can see that the Target Eye property works, but when I test it with the Oculus Rift goggles, these cameras are treated like separate displays: at any one time I can see video from only one of them, on the left or right eye, depending on which display is currently selected in Unity. What I need is to display the two streams simultaneously, one per eye, and I don't know how to achieve that. Another question is how to disable input tracking in the OVR plugin: the cameras will be attached to the goggles, so I don't need the whole scene to rotate when I move my head. Please, guys, I need your help here.
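One way to get both streams rendering at once is a single OVRCameraRig with its "Use Per Eye Cameras" option enabled, two quads on dedicated layers, and per-eye culling masks, plus disabling automatic XR camera tracking. A sketch under those assumptions; the layer names, the quad references, and the per-eye-cameras option are things to verify in your Oculus Integration version:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: show each camera stream to one eye only, using one OVRCameraRig
// with "Use Per Eye Cameras" enabled so the left/right eye cameras exist
// as separate Camera components. "LeftEyeOnly"/"RightEyeOnly" are assumed
// layers that must be defined under Tags & Layers.
public class PerEyeStreams : MonoBehaviour
{
    public Camera leftEyeCamera;   // from the rig's LeftEyeAnchor
    public Camera rightEyeCamera;  // from the rig's RightEyeAnchor
    public GameObject leftQuad;    // quad showing the left camera stream
    public GameObject rightQuad;   // quad showing the right camera stream

    void Start()
    {
        // Put each quad on its own layer...
        leftQuad.layer = LayerMask.NameToLayer("LeftEyeOnly");
        rightQuad.layer = LayerMask.NameToLayer("RightEyeOnly");

        // ...and make each eye camera cull the other eye's layer, so both
        // eyes render simultaneously but each sees only its own stream.
        leftEyeCamera.cullingMask &= ~(1 << LayerMask.NameToLayer("RightEyeOnly"));
        rightEyeCamera.cullingMask &= ~(1 << LayerMask.NameToLayer("LeftEyeOnly"));

        // Stop the XR system from moving these cameras with the headset,
        // since the physical cameras are rigidly mounted on the goggles.
        XRDevice.DisableAutoXRCameraTracking(leftEyeCamera, true);
        XRDevice.DisableAutoXRCameraTracking(rightEyeCamera, true);
    }
}
```

Driving both eyes from one rig avoids the two-rig "separate displays" behavior, since a single stereo render submits both eye buffers every frame.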