WebXR VR Teleoperation: Challenges with Stereo Vision and Passthrough Integration

addisionharry.2024
Honored Guest

I’m currently working on a VR teleoperation demo for controlling a robot, using Node.js for WebXR development. Due to project constraints, we are focusing on the following features:

  1. Stereo Vision in VR: We want to display the robot's stereo camera feed in a first-person VR view. Specifically, we want to show different images for the left and right eyes:

    • For the left eye, display the image captured by the robot's left camera.
    • For the right eye, display the image captured by the robot's right camera.
    • In addition, we want to have a few status indicators (lights) in the VR environment.

    We are currently referencing the WebXR First Steps demo to achieve this, but we’re having difficulty displaying a different image for each eye. Does anyone have suggestions or best practices for achieving this stereo view in WebXR?

  2. XR Mode with Passthrough: We also want to implement passthrough for the non-plane areas of the XR environment while displaying the stereo images on the planes. Essentially, we want to:

    • Show the stereo camera images on the planes (as described in point 1).
    • For the rest of the scene, integrate passthrough to allow users to see the real-world environment in XR.

    Are there any technical solutions or frameworks that could help us integrate passthrough with WebXR, or any specific methods for blending passthrough with other 3D content in XR mode?

Any guidance or suggestions would be greatly appreciated!

Accepted Solution

felixtrz96
Meta Employee

Hi Addisionharry,

To achieve stereo vision in VR (point 1), you can use WebXR's layers feature. If you're aiming for a first-person view from the robot's perspective, you can use an equirect layer with a stereo 180 or 360 video streamed live from your robot. This stereo video approach is demonstrated in this example, in which a side-by-side 180 stereo video is shown using a stereo equirect layer.
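
Here is a minimal sketch of that approach using the WebXR Layers API directly. It assumes `session` is an active immersive session requested with the 'layers' feature, `refSpace` is your XRReferenceSpace, and `video` is an HTMLVideoElement carrying the robot's side-by-side stereo stream (for example over WebRTC); all three names are placeholders:

// Create a media binding for the active immersive session.
const mediaBinding = new XRMediaBinding(session);

// Wrap the side-by-side 180 stereo video in an equirect layer.
// 'stereo-left-right' sends the left half of each frame to the left eye
// and the right half to the right eye.
const stereoLayer = mediaBinding.createEquirectLayer(video, {
  space: refSpace,
  layout: 'stereo-left-right',
  centralHorizontalAngle: Math.PI, // 180 degrees of horizontal coverage
});

// Composite the video behind any projection layer already in the render state,
// so regular 3D content (e.g. the status lights) draws on top of it.
session.updateRenderState({
  layers: [stereoLayer, ...(session.renderState.layers || [])],
});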

If your robot's camera doesn't support 180- or 360-degree views, consider using a stereo quad layer positioned in front of the user's eyes. You can refer to this example for guidance.
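
A similar sketch for the quad-layer variant, with the same placeholder names (`session`, `refSpace`, `video`):

const mediaBinding = new XRMediaBinding(session);

// A flat quad floating about two meters in front of the viewer,
// again splitting the side-by-side video between the two eyes.
const quadLayer = mediaBinding.createQuadLayer(video, {
  space: refSpace,
  layout: 'stereo-left-right',
  transform: new XRRigidTransform({ x: 0, y: 1.3, z: -2 }),
  width: 1.6,  // quad size; tune to match your feed's aspect ratio
  height: 0.9,
});

session.updateRenderState({
  layers: [quadLayer, ...(session.renderState.layers || [])],
});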

Regarding the status lights, I'm not entirely sure of your specific use case, but you could render them with three.js through a projection layer (three.js uses a projection layer by default in browsers where layers support is available). You can refer to the WebXR First Steps guide for a starting point.
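
For the lights themselves, ordinary three.js objects work, since they are rendered into the projection layer. A hypothetical sketch — `robotStatus` and its fields are placeholders for however your robot reports its state:

import * as THREE from 'three';

// Small unlit spheres that act as indicator LEDs, pinned in front of the user.
function createStatusLights(scene, count = 3) {
  const lights = [];
  for (let i = 0; i < count; i++) {
    const mesh = new THREE.Mesh(
      new THREE.SphereGeometry(0.02, 16, 16),
      new THREE.MeshBasicMaterial({ color: 0x444444 })
    );
    mesh.position.set(-0.1 + i * 0.1, 1.3, -0.5); // roughly eye height, half a meter ahead
    scene.add(mesh);
    lights.push(mesh);
  }
  return lights;
}

// Call from your render loop with the latest robot state (placeholder fields).
function updateStatusLights(lights, robotStatus) {
  lights[0].material.color.set(robotStatus.connected ? 0x00ff00 : 0xff0000);
  lights[1].material.color.set(robotStatus.motorsEnabled ? 0x00ff00 : 0x444444);
  lights[2].material.color.set(robotStatus.lowBattery ? 0xffaa00 : 0x444444);
}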

Enabling passthrough in XR mode (point 2) is quite straightforward: request "immersive-ar" as the session mode when initiating the WebXR session, and passthrough will be enabled automatically. The quad or equirect layers will cover some portions of the view, but users will be able to see their surroundings in the uncovered areas. You can see an example of this here.
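
As a rough sketch, assuming you are requesting the session yourself (inside an async init function) rather than through a helper like three.js's ARButton:

// An 'immersive-ar' session gives you passthrough automatically on Quest browsers;
// 'layers' keeps the stereo media layers from point 1 usable in the same session.
const session = await navigator.xr.requestSession('immersive-ar', {
  requiredFeatures: ['local-floor'],
  optionalFeatures: ['layers'],
});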
