10-31-2022 11:06 AM - edited 10-31-2022 07:29 PM
The developer blog says:
"We built Passthrough API with privacy in mind. Apps that use Passthrough API cannot access, view, or store images or videos of your physical environment from the Oculus Quest 2 sensors. This means raw images from device sensors are processed on-device."
That's all well and good, but what if an application has a legitimate reason to need access to that data? I can think of several such use cases:
From my understanding, none of these use cases are currently possible on the Quest, even though there's no reason they shouldn't be, given the hardware that's present. I can't even build this for my own experimentation unless I find some way to root the headset, which is frustrating: I spent $1,500 on a Quest Pro and shouldn't be prevented from using its hardware as I please.
So what I recommend is adding an API for accessing this raw image data, which would of course only work if the user grants the app permission. Apps can already do this on smartphones with no issues, so I don't see how the Quest would be any different. The Oculus Store can also set policies as necessary, such as requiring apps to keep the image data on device whenever possible and never send it to a remote server without the user's explicit permission.
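To show how ordinary this already is on phones, here's roughly what the runtime permission flow looks like in a plain Android app (Kotlin). The permission APIs below are the standard Android ones; the only made-up part is the `startPassthroughCapture()` placeholder, since no such Quest API exists today:

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import android.os.Bundle
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity
import androidx.core.content.ContextCompat

class PassthroughActivity : AppCompatActivity() {

    // System permission dialog; nothing runs until the user explicitly agrees.
    private val requestCamera =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            if (granted) startPassthroughCapture() else explainWhyCamerasAreNeeded()
        }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA) ==
            PackageManager.PERMISSION_GRANTED
        ) {
            startPassthroughCapture()
        } else {
            requestCamera.launch(Manifest.permission.CAMERA)
        }
    }

    private fun startPassthroughCapture() {
        // Placeholder: this is where a (currently nonexistent) Quest API for
        // raw passthrough frames would be initialized.
    }

    private fun explainWhyCamerasAreNeeded() {
        // Explain the use case and degrade gracefully if the user declines.
    }
}
```

A Quest permission gate wouldn't have to look any scarier than this from the app's side; the store policy layer sits on top of it.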
I'd have posted this in the SDK Feedback section, but for some reason it says I'm not allowed to start a thread there, so I'm posting it here instead.
06-11-2023 01:21 PM - edited 06-11-2023 01:21 PM
That's never been a compelling problem on smartphones. What makes this any different?
06-11-2023 01:23 PM
I see your elephant in the room and raise you the fact that these issues already exist in a multitude of systems; they're not exclusive to AR/VR. This is Meta's problem to solve: if they're going to sell AR devices, they need to provide a way for developers to build applications that actually use the hardware. It shouldn't be this complicated to get set up to develop an AR application. I don't think anyone here is against the way Meta has implemented the system to ensure security. The issue is more with how they provide information for developers to make use of that system to build AR applications. It just feels like they made it very complicated and partitioned in the name of security, but at the same time haven't done enough to give devs a launchpad to get going.
10-15-2023 12:22 PM
I agree. People keep suggesting the issue is privacy, but smartphones (as mentioned) have this exact same issue. In fact, my phone's camera is in much riskier situations than my Quest, which sits in my office most of the day. Ultimately, consumers trust Meta with the camera data because it's processed on device, so why can't the same apply to apps? Ask the user for permission, explain that the app will have access to the cameras while it's running, and if the user has an issue with that, they don't use the app.
There has to be a solution. Now that we can access both the scene mesh from the Q3's depth sensing and the depth data itself, there's a glimmer of hope. However, I believe the low resolution of the depth data does provide some privacy guarantees. Furthermore, there's no access to the colour texture map of the scene mesh itself, which is disappointing, as that could be useful for augmentations.
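To make that concrete, here's a rough sketch of what you can already do with just a low-resolution depth frame: unproject it into a coarse point cloud for occlusion or augmentation. How you actually pull the frame out of Meta's Depth API isn't shown, and the intrinsics (fx, fy, cx, cy) are assumed to be handed to you by the runtime:

```kotlin
// Minimal sketch: unproject a low-resolution depth image into a point cloud.
// Assumes depth values are in metres; the acquisition of the depth frame and
// its intrinsics from the runtime is outside this snippet.
data class Point3(val x: Float, val y: Float, val z: Float)

fun unprojectDepth(
    depth: FloatArray, width: Int, height: Int,
    fx: Float, fy: Float, cx: Float, cy: Float
): List<Point3> {
    val points = ArrayList<Point3>(width * height)
    for (v in 0 until height) {
        for (u in 0 until width) {
            val z = depth[v * width + u]
            if (z <= 0f || z.isNaN()) continue // invalid pixel / no depth return
            // Standard pinhole unprojection: pixel (u, v) + depth -> camera space.
            val x = (u - cx) * z / fx
            val y = (v - cy) * z / fy
            points.add(Point3(x, y, z))
        }
    }
    return points
}
```

At the resolutions the depth sensor provides, this recovers room-scale geometry but nothing close to identifiable imagery, which is exactly the privacy trade-off being argued about here.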
10-17-2023 02:00 AM
Totally on board with the sentiments shared here. Meta should reconsider the passthrough camera restrictions. Instead of a blanket ban, why not have app-specific permissions? Let users decide.
We risk falling behind other platforms if we can't fully utilize the hardware. There are plenty of use cases where this would be helpful, for instance in education and healthcare, and for creating richer mixed reality experiences.
Privacy is crucial, but there's a balance to be found. Let's keep pushing for it!
10-18-2023 12:00 AM
I want to say the same thing. Imagine in the future when Meta comes out with AR glasses with great features and a lightweight design, so that more people can use them.
This would even be a boost for accessibility, for people who have low vision or are blind, for example by highlighting a certain type of product in a color (books, glasses, pencils, whatever) or other functions.
But even in the device's current state it could be done, and surely people would be encouraged to do it.
There are many people who pay $3,000 for an OCR device that reads text aloud with TTS; it's very sad that it's so expensive (see the sketch at the end of this post).
Being able to use the passthrough camera opens up endless possibilities. They could compete against the Apple Vision Pro.
Education and MR experiences are very worthwhile too.
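To illustrate how close this is: the OCR and TTS parts already exist as standard on-device Android libraries (ML Kit text recognition and Android's TextToSpeech). In the sketch below, the only missing piece is the passthrough camera frame itself, which apps can't currently get on the Quest:

```kotlin
// Sketch of an on-device "read what I'm looking at" helper.
// ML Kit text recognition and Android TextToSpeech are real, on-device APIs;
// the passthrough Bitmap is the part Quest does not currently provide to apps.
import android.content.Context
import android.graphics.Bitmap
import android.speech.tts.TextToSpeech
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.text.TextRecognition
import com.google.mlkit.vision.text.latin.TextRecognizerOptions

class PassthroughReader(context: Context) {
    private val recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)
    private var ttsReady = false
    private val tts = TextToSpeech(context) { status ->
        ttsReady = (status == TextToSpeech.SUCCESS)
    }

    // frame: a single passthrough camera image (hypothetical on Quest today).
    fun readAloud(frame: Bitmap) {
        val image = InputImage.fromBitmap(frame, /* rotationDegrees = */ 0)
        recognizer.process(image)
            .addOnSuccessListener { result ->
                val text = result.text
                if (ttsReady && text.isNotBlank()) {
                    tts.speak(text, TextToSpeech.QUEUE_FLUSH, null, "passthrough-ocr")
                }
            }
            .addOnFailureListener { /* OCR failed; stay silent or retry */ }
    }
}
```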
12-10-2023 06:48 PM - edited 12-10-2023 06:49 PM
Most useful Mixed Reality applications in the real world will require this. It boggles the mind that Meta is not allowing developers to do it - I don't understand how they can pin their hopes on these devices eventually replacing smartphones while blocking developers from experimenting and building the apps that would actually make that happen.
12-13-2023 08:00 AM
For added privacy awareness, the option to let apps use the cameras could also live in the settings. Please let us developers use the cameras.
06-19-2024 07:45 AM - edited 06-19-2024 07:47 AM
It would be great if Meta could look into this. We now see that the Vision Pro's 'Enterprise' mode has access to all of its sensors, and even to the NPU on Apple's M2 SoC for offloading certain compute. Why can't Meta give us camera access with the same kind of permission requests?
Here's a solid example: let's say I want to implement a simpler passthrough for my app, using my own CV code. The Vision Pro has Apple's R1 coprocessor, which handles sensor processing and fusion, so its passthrough is very fast. The Quest 3, however, has to take GPU resources away from the Adreno 740 in the XR2 Gen 2 SoC, which is not great. For my app I need very fast passthrough and I don't want Meta's synthesized depth reprojection, etc. So having access to the two front RGB camera feeds would be a godsend: I could do my own hacky / cheaper passthrough processing and free up some GPU time per frame (rough sketch below).
Granted, Meta's passthrough does look much better now, but for mobile apps it's hogging the GPU 😞
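For what it's worth, if those two feeds were ever exposed through the normal Android camera stack, the app side could be fairly boring Camera2 code. Everything below is a sketch under that assumption: the Camera2 calls are the standard Android API, but the camera IDs and resolution are invented, and nothing like this is available on Quest today:

```kotlin
// Sketch: grabbing one of the front RGB feeds, assuming (purely hypothetically)
// it showed up as an ordinary Camera2 device.
import android.annotation.SuppressLint
import android.content.Context
import android.graphics.ImageFormat
import android.hardware.camera2.CameraCaptureSession
import android.hardware.camera2.CameraDevice
import android.hardware.camera2.CameraManager
import android.media.ImageReader
import android.os.Handler

@SuppressLint("MissingPermission") // CAMERA permission is requested elsewhere
fun openEyeCamera(
    context: Context, cameraId: String, handler: Handler,
    onFrame: (ImageReader) -> Unit
) {
    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    // Low-latency YUV stream per eye; the resolution here is a guess.
    val reader = ImageReader.newInstance(1280, 1024, ImageFormat.YUV_420_888, 2)
    reader.setOnImageAvailableListener({ onFrame(it) }, handler)

    manager.openCamera(cameraId, object : CameraDevice.StateCallback() {
        override fun onOpened(camera: CameraDevice) {
            val request = camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW)
                .apply { addTarget(reader.surface) }
                .build()
            camera.createCaptureSession(
                listOf(reader.surface),
                object : CameraCaptureSession.StateCallback() {
                    override fun onConfigured(session: CameraCaptureSession) {
                        session.setRepeatingRequest(request, null, handler)
                    }
                    override fun onConfigureFailed(session: CameraCaptureSession) {}
                },
                handler
            )
        }
        override fun onDisconnected(camera: CameraDevice) = camera.close()
        override fun onError(camera: CameraDevice, error: Int) = camera.close()
    }, handler)
}

// Hypothetical usage: one stream per eye (the IDs are made up), composited by
// the app's own renderer instead of Meta's reprojection.
// openEyeCamera(ctx, "left", handler) { /* reader.acquireLatestImage() ... */ }
// openEyeCamera(ctx, "right", handler) { /* ... */ }
```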
Edits: Formatting