Suggestion: Allow apps to request permission to access camera feed(s)
The developer blog says: "We built Passthrough API with privacy in mind. Apps that use Passthrough API cannot access, view, or store images or videos of your physical environment from the Oculus Quest 2 sensors. This means raw images from device sensors are processed on-device."

That's all well and good, but what if an application has a legitimate reason to need access to that data? I can think of several such use cases:

- A camera app for taking pictures/recording video in 3D
- Mixed reality experiences that implement computer vision algorithms to react to things in the real world
- A more advanced version of Waltz of the Wizard's mixed reality sandbox that allows changes to be made to objects in the real world (well, at least to how they're displayed in the headset) using image processing filters
- Trippy effects that go beyond the pre-existing filters in the SDK

I'm sure there are many others as well. From my understanding, none of these use cases are currently possible on the Quest, even though there's no reason they shouldn't be with the hardware that's present. I couldn't even code it for my own experimentation unless I find some way to root the headset, which is annoying, as I spent $1,500 on a Quest Pro and shouldn't be prevented from using its hardware as I please.

So what I recommend is adding an API for accessing this raw image data, which would of course only work if the user grants permission to the app. Apps can already do this on smartphones with no issues, so I don't see how the Quest would be any different. The Oculus Store, of course, can also set policies as necessary, such as requiring apps to keep the image data on device whenever possible, and to never send it to a remote server without the user explicitly giving permission.

I'd have posted this in the SDK Feedback section, but for some reason it says I'm not allowed to start a thread there, so I'm posting it here instead.
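For what it's worth, the "apps can already do this on smartphones" comparison is concrete: Unity already ships a runtime permission flow for Android camera access, and a Quest camera API could in principle follow the same pattern. A minimal sketch using Unity's real UnityEngine.Android permission API; note that GetRawPassthroughFrame() is hypothetical and exists only to illustrate the feature being requested, since no such Quest API was available at the time of this post:

```csharp
// Sketch of the permission flow the post is asking for, built on Unity's
// real Android runtime-permission API (UnityEngine.Android).
// GetRawPassthroughFrame() is HYPOTHETICAL: no raw passthrough frame API
// existed on Quest when this was written.
using UnityEngine;
using UnityEngine.Android;

public class PassthroughPermissionExample : MonoBehaviour
{
    void Start()
    {
        if (!Permission.HasUserAuthorizedPermission(Permission.Camera))
        {
            var callbacks = new PermissionCallbacks();
            callbacks.PermissionGranted += _ => Debug.Log("Camera access granted");
            callbacks.PermissionDenied += _ => Debug.Log("Camera access denied");
            Permission.RequestUserPermission(Permission.Camera, callbacks);
        }
        // Hypothetical follow-up once permission is granted:
        // Texture2D frame = GetRawPassthroughFrame();
    }
}
```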
Disable guardian at runtime in Mixed Reality Apps (Unity)

Hi! I'm in the process of creating a mixed reality app that integrates both passthrough functionality and the Scene API. Despite conducting a full room scan and navigating around, the guardian system continues to restrict the app. I'm searching for a way to programmatically deactivate the guardian system, either in Unreal or Unity. Considering that Meta's "First Encounter" demo doesn't run into these boundary issues, I'm convinced there must be a way to resolve this. Thank you!
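A minimal sketch of one approach, assuming a recent Meta XR Core SDK (roughly v65 or later) where boundary visibility suppression is exposed; the property name and the project-setup requirement ("Boundary Visibility" capability enabled, plus an active passthrough layer) may differ in your SDK version, so treat this as a starting point rather than a confirmed recipe:

```csharp
// Sketch, assuming a recent Meta XR Core SDK where boundary visibility
// can be suppressed. Suppression generally only takes effect while a
// passthrough layer is actively rendering, and the "Boundary Visibility"
// capability must be enabled in the project setup.
using UnityEngine;

public class GuardianSuppressor : MonoBehaviour
{
    [SerializeField] private OVRPassthroughLayer passthroughLayer;

    void Start()
    {
        // Passthrough must be visible for suppression to apply.
        passthroughLayer.enabled = true;
        OVRManager.instance.shouldBoundaryVisibilityBeSuppressed = true;
    }
}
```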
Quest 3: how to match virtual and physical room orientation properly (AR/XR)?

Hello, I use a Quest 3 with UE5, currently via SteamVR (new to Quest dev in UE). I use Steam Link to stream the picture to the Quest 3. I've never packaged an APK; I simply start the game from the UE editor in VR mode. I've noticed that every time I start/stop the game in the editor, the orientation (yaw) of my virtual space in Unreal changes somewhat randomly, probably depending on the initial headset position when I start the game.

I want to place a virtual object in my 3D scene and have it correspond to the same real-world location permanently, even after I shut down Unreal and the Quest headset and restart. Think of an AR way to place a virtual object in your room in a specific position. I already found ARPins, but couldn't get them to run (at least not when starting the game from the UE editor in VR mode; they seem to be overpowered for my case anyway).

Generally, I wonder why it is so hard to match virtual orientation to real-world orientation. The guardian/chaperone always matches the room perfectly, even after turning off the headset, so the headset must be aware of the physical room's position and orientation. Why is it such a hassle to match it in Unreal? Would be glad if someone could shed some light. Thank you!
MRUK Create a new Scene

The documentation (Docs Link) says: "Using Mixed Reality Utility Kit, the function MRUKRoom.IsPositionInRoom(Vector3 pos) can be used to check whether the camera rig pose is present in the loaded scene or not. If it is not, offer users to capture a new scene." How can I create a new room with MRUK?
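A sketch of the pattern the documentation seems to imply, assuming MRUK alongside the Meta XR Core SDK: if the camera is outside every captured room, launch the system Space Setup (scene capture) flow and then reload the scene into MRUK. The exact method names (OVRScene.RequestSpaceSetup, MRUK.Instance.LoadSceneFromDevice) are taken from recent SDK versions and may differ in yours:

```csharp
// Sketch, assuming MRUK + Meta XR Core SDK. If the user is outside the
// loaded room, trigger the system Space Setup flow to capture a new one,
// then reload the captured scene. API names may vary by SDK version.
using UnityEngine;
using Meta.XR.MRUtilityKit;

public class SceneCaptureCheck : MonoBehaviour
{
    public async void EnsureRoomForCamera(Vector3 cameraPos)
    {
        var room = MRUK.Instance.GetCurrentRoom();
        if (room == null || !room.IsPositionInRoom(cameraPos))
        {
            // Launches the headset's Space Setup (room capture) UI.
            await OVRScene.RequestSpaceSetup();
            // Pull the freshly captured room back into MRUK.
            await MRUK.Instance.LoadSceneFromDevice();
        }
    }
}
```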
Applying a LUT on Quest Pro color passthrough

Working on a mixed reality experience at the moment and trying to get a color LUT on the OVR layer to edit the mood of the color passthrough on Quest Pro. I've made several different LUTs, trying a range of different compression and build settings using Meta's documentation:
https://developer.oculus.com/documentation/unity/unity-passthrough-creating-color-luts/
https://developer.oculus.com/documentation/unity/unity-passthrough-neutral-color-luts/

Extra details:
Unity 2022.3.5f1
Graphics: Vulkan
Color Gamut: sRGB

I haven't been able to see the effect of the LUT come through on the color passthrough. I understand it's an experimental feature, so maybe a little untested, but I'm curious if anyone has had success making them work? Is there a simple step/setting I'm missing?
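A minimal sketch of the runtime side, assuming the experimental color LUT API (OVRPassthroughColorLut / SetColorLut) described in the linked docs; common gotchas reported with this feature include LUT textures that are compressed or have mipmaps enabled, and "Experimental Features" not being enabled on the headset, so check those import settings first:

```csharp
// Sketch, assuming the experimental color LUT API in the Meta XR SDK.
// The LUT texture typically needs to be readable and uncompressed with
// mipmaps disabled; exact requirements may vary by SDK version.
using UnityEngine;

public class PassthroughLutApplier : MonoBehaviour
{
    [SerializeField] private OVRPassthroughLayer passthroughLayer;
    [SerializeField] private Texture2D lutTexture; // e.g. a 3D LUT unwrapped to 2D

    void Start()
    {
        var lut = new OVRPassthroughColorLut(lutTexture);
        // The second argument blends between the original passthrough
        // colors (0) and the fully applied LUT (1).
        passthroughLayer.SetColorLut(lut, 1.0f);
    }
}
```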
Windows Volumetric Apps on Meta Quest announced for Developers

Microsoft announced at Build today that volumetric Windows apps are coming to Meta Quest. If I'm understanding it correctly, developers that currently have mixed reality Windows apps or 3D objects can extend interaction to the Meta Quest from their Windows machine or Windows 365 via a new volumetric API. The screenshots show several virtual application windows open, with a 3D Xbox controller model being handled by the user and dismantled virtually.

Sign-up for the developer preview: https://aka.ms/VolumetricApps
Video demo: https://x.com/SadlyItsBradley/status/1792974464424317307

Meta & Microsoft: The Future of Mixed Reality and Productivity (developer session at Microsoft Build 2024): https://www.youtube.com/watch?v=ev3vsv5fdrE
Creating Immersive 3D Solutions with Microsoft Mesh (developer session at Microsoft Build 2024): https://www.youtube.com/watch?v=Q5LUtwjobgA

Build Day 1: https://www.youtube.com/live/2bnayWpTpW8
Build Day 2: https://www.youtube.com/live/FwJ1Zz_DntY
Build Day 3: https://www.youtube.com/live/8Zy9QtZ6czE

Bring Your Mobile Apps to Meta Horizon OS developer sign-up: https://developers.facebook.com/m/spatial-app-framework/

Sources and further details:
https://blogs.windows.com/windowsdeveloper/2024/05/21/unlock-a-new-era-of-innovation-with-windows-copilot-runtime-and-copilot-pcs/
https://www.meta.com/en-gb/blog/quest/meta-horizon-os-open-hardware-ecosystem-asus-republic-gamers-lenovo-xbox/
https://www.roadtovr.com/microsoft-meta-windows-3d-apps-quest/
https://www.uploadvr.com/microsoft-windows-volumetric-apps-quest/
https://www.techopedia.com/news/microsoft-extends-windows-apps-in-3d-on-metas-quest-headset
https://www.digitaltrends.com/computing/microsoft-windows-volumetric-apps-coming-to-meta-quest-3
https://techcrunch.com/2024/05/21/microsofts-new-volumetric-apps-for-quest-headsets-extend-windows-apps-into-the-3d-space
https://www.theverge.com/2024/5/21/24161817/microsoft-windows-volumetric-apps-meta-quest-api
MR-to-VR transition

Hi there, I have a simple question for anyone who can answer it. Are there examples of apps that are live right now that transition from mixed reality to virtual reality? Is this something that is permitted within the bounds of the current Unity All-In-One SDK?

I have a client who wants to make a critical narrative moment out of starting an experience with a few objects in mixed reality, inviting you over to a certain vantage point outside of where you're currently standing but near you, and then switching to fully immersive VR once you enter the trigger point. Is this something that someone has done before? Is there a way to toggle between the passthrough of mixed reality and a fully rendered environment like the old days? Alternatively, has anyone faked this before by, e.g., messing with the passthrough visibility in scripting and then throwing up a skybox or something similar?

I'm just starting the initial R&D phase of this project and am just getting started with the Presence Platform, but I have been doing both Unity and Unreal-based Oculus development since the early days. Just trying to reach out to the community early on to get some insight into precedent as I start to dig into the SDK myself. Thanks!
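The "fake it" approach described above can be sketched quite directly, assuming OVRPassthroughLayer from the Meta XR SDK: disable the passthrough layer, switch the camera back to rendering a skybox, and activate the virtual environment when the player hits the trigger volume. The tag and field names here are illustrative, not from any sample:

```csharp
// Sketch of an MR-to-VR switch on a trigger volume, assuming
// OVRPassthroughLayer from the Meta XR SDK. Fading the passthrough
// layer's opacity first would give a softer transition.
using UnityEngine;

public class MrToVrSwitcher : MonoBehaviour
{
    [SerializeField] private OVRPassthroughLayer passthroughLayer;
    [SerializeField] private Camera centerEyeCamera;
    [SerializeField] private GameObject virtualEnvironment;

    private void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Player")) return;

        passthroughLayer.enabled = false;                     // stop compositing passthrough
        centerEyeCamera.clearFlags = CameraClearFlags.Skybox; // MR setups usually use SolidColor with zero alpha
        virtualEnvironment.SetActive(true);                   // reveal the fully rendered scene
    }
}
```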
[solved] AR Passthrough - Bouncing Ball Template - Dropped Balls Not Reacting to my Scene Setup

Expected: Squeeze the right-hand trigger to create and release a ball, and the ball bounces against your defined surfaces.

Problem: I've completed my room setup on a Quest Pro and followed the instructions for the Bouncing Ball scene sample (linked below), and it looks visually correct. However, the balls fall through my floor and scene-recognized objects, when I expect the 2D planes and 3D volumes created from them to stop the balls. Can anyone help me understand where I'm off, please? Thank you!

Solution: When using Scene in your project, you must enable its capability in the project config. Ref: https://developer.oculus.com/documentation/unity/unity-scene-bouncing-ball-sample/
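For reference, the Scene capability mentioned in the solution ultimately maps to an Android permission in the built APK's manifest; a sketch of what that entry looks like, assuming the permission name used by recent Quest OS versions (in Unity it is normally added for you by enabling Scene support in OVRManager's Quest Features rather than by hand-editing the manifest):

```xml
<!-- Sketch: Scene support on Quest requires this permission in
     AndroidManifest.xml. It is usually injected automatically when the
     Scene capability is enabled in the project config, which is why the
     sample's colliders silently do nothing when it is missing. -->
<uses-permission android:name="com.oculus.permission.USE_SCENE" />
```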
Where to get OVRSpatialAnchor when loading unbound anchors?

I'm able to create and re-load spatial anchors, but when I reload the unbound anchors I'm unclear on where to get the OVRSpatialAnchor from. The docs say to bind to an OVRSpatialAnchor after it is localized. In the sample it looks like this:

var spatialAnchor = Instantiate(_anchorPrefab, unboundAnchor.Pose.position, unboundAnchor.Pose.rotation);

I'm unclear on where the _anchorPrefab is coming from, though, as that is also an OVRSpatialAnchor. Is there a way to do this that doesn't use a prefab? When I create my original game object and anchor, I'm doing it this way:

var gameObject = GameObject.Instantiate(Resources.Load<GameObject>(gameObjectPath), worldObjectPose.position, worldObjectPose.rotation);
var ovrSpatialAnchor = gameObject.AddComponent<OVRSpatialAnchor>();
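A sketch of a prefab-free variant, assuming the OVRSpatialAnchor.UnboundAnchor API: create a plain GameObject at the unbound anchor's pose, add the component, and call BindTo on it immediately, before the component's own initialization starts creating a brand-new anchor. The helper name is mine, not from the SDK:

```csharp
// Sketch, assuming OVRSpatialAnchor.UnboundAnchor from the Meta XR SDK.
// No prefab needed: a fresh GameObject gets the component, and BindTo
// ties it to the loaded anchor before OVRSpatialAnchor would otherwise
// start creating a new one.
using UnityEngine;

public static class UnboundAnchorBinder
{
    public static OVRSpatialAnchor Bind(OVRSpatialAnchor.UnboundAnchor unboundAnchor)
    {
        var go = new GameObject($"Anchor {unboundAnchor.Uuid}");
        go.transform.SetPositionAndRotation(
            unboundAnchor.Pose.position, unboundAnchor.Pose.rotation);

        // BindTo must run right after AddComponent, in the same frame.
        var anchor = go.AddComponent<OVRSpatialAnchor>();
        unboundAnchor.BindTo(anchor);
        return anchor;
    }
}
```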
Oculus Spatial Anchor drifts in the Passthrough space

I'm trying to spawn a cube as a spatial anchor, but after spawning, it drifts when I move physically. I add OVRSpatialAnchor after instantiating, like this:

_cube.AddComponent<OVRSpatialAnchor>();

Please correct me if I have to check something in the build settings or some Oculus settings in Unity. Thanks, guys!
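One common cause, sketched below under the assumption that OVRSpatialAnchor's creation is asynchronous: the anchor is not actually tracked until the runtime finishes creating and localizing it, so content can appear to drift in the frames before that completes. Waiting on the component's Created/Localized state before trusting the pose is one fix to try:

```csharp
// Sketch, assuming OVRSpatialAnchor's async creation flow in the Meta XR
// SDK: wait until the runtime reports the anchor as created/localized
// before relying on it to hold the cube world-locked.
using System.Collections;
using UnityEngine;

public class AnchoredCubeSpawner : MonoBehaviour
{
    public IEnumerator SpawnAnchoredCube(GameObject cube)
    {
        var anchor = cube.AddComponent<OVRSpatialAnchor>();

        // Creation happens over several frames; the pose is only
        // reliable once the anchor reports Created.
        yield return new WaitUntil(() => anchor == null || anchor.Created);

        if (anchor != null && anchor.Localized)
            Debug.Log($"Anchor {anchor.Uuid} is tracked; cube should stay world-locked.");
    }
}
```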