Forum Discussion

john.j.kolb.v
18 days ago

WebGPU Compute into WebXR on Quest

Does anyone know the expected date for Meta Quest's WebGPU-WebXR layer?

I just purchased a Meta Quest 3, to complement my Quest 2, for WebXR development *with* WebGPU (a compute-shader-only voxel/SDF engine), and found that the Meta Browser doesn't support WebGPU-WebXR, a stable feature since Chromium 134. Surprising, since the Quest 3 is a "flagship" XR device (in terms of sales, popularity, and development).

Reference check here:
https://immersive-web.github.io/webxr-samples/webgpu/


I've web-searched extensively but haven't found a workaround, a flag to set, or anything else to do, other than the suggestion to copy WebGPU output into a WebGL context (wasting bandwidth/VRAM on copying the XR canvas?).
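For what it's worth, that suggested workaround amounts to something like the sketch below (untested on-device; `copyWebGPUFrameToGL`, `gpuCanvas`, and `glTexture` are my own illustrative names, not a documented Meta API; it assumes WebGPU is already rendering into an OffscreenCanvas and `gl` is the WebGL2 context bound to the WebXR session):

```javascript
// Sketch of the "copy WebGPU output into WebGL" workaround.
// Assumes: a WebGPU device already renders into an OffscreenCanvas
// (`gpuCanvas`), and `gl` is the WebGL2 context of the XR session.
// All identifiers are illustrative, not a Meta-documented API.

function copyWebGPUFrameToGL(gl, gpuCanvas, glTexture) {
  // This is the extra per-frame copy complained about above:
  // the browser resolves the WebGPU canvas, then WebGL re-uploads it.
  gl.bindTexture(gl.TEXTURE_2D, glTexture);
  gl.texImage2D(
    gl.TEXTURE_2D, 0, gl.RGBA,
    gl.RGBA, gl.UNSIGNED_BYTE,
    gpuCanvas // an OffscreenCanvas is a valid TexImageSource
  );
  // ...then draw a fullscreen quad with glTexture into the
  // XRWebGLLayer framebuffer for each eye.
}
```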

Am I missing anything?

Thx---

5 Replies

  • Mahjai_5
    Honored Guest

    It seems unlikely that Meta wouldn't expose this behind Chrome flags, especially for its main WebXR browser. Perhaps you've missed an update?

  • Definitely up to date on both device and browser, and I've tried other browsers (the old Firefox build and Wolvic) to no avail.

  • Mahjai_5
    Honored Guest

    I tried the webxr-samples/webgpu link above, and the samples did not work on my Meta Quest 3S either.

    Is there a forum specifically for WebGPU?

    Or is Meta discouraging it in favor of dedicated Meta Store apps, like Apple does with the App Store vs. web apps/PWAs?

  • So I'm rendering to a WebGPU offscreen canvas and loading it into a GL texture to present the WebXR visuals, and I've noticed that Meta Quest only offers the uint16 depth map with 'gpu-optimized' depthUsage, as a GL texture.

    So to run WebGPU compute on the headset's depth image with WebXR, specifically on Meta Quest devices, we have to:

    1) copy the depth buffer from a GL texture back to the CPU
    2) then upload it back to the GPU for WebGPU
    And to show the visuals, we have to:
    3) copy the WebGPU compute results back to the CPU
    4) upload them to GL as a texture
    5) then render into the WebXR framebuffer

    That's a lot of extra steps just to render and to access GPU depth...
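Spelled out as code, the round trip of steps 1-2 looks roughly like this sketch (untested; `gl`, `depthFbo`, and `device` are assumed to already exist, and the readable format is a guess, since WebGL can't readPixels a DEPTH_COMPONENT attachment directly, so the depth would first have to land in a color-readable texture). One hard constraint worth noting: WebGPU's buffer-to-texture copies require bytesPerRow to be a multiple of 256 bytes.

```javascript
// Sketch of the GL -> CPU -> WebGPU depth round trip described above.
// `gl`, `depthFbo`, and `device` are assumed to exist; this is the
// slow path spelled out, not a Meta-documented fast path.

// WebGPU requires bytesPerRow in buffer<->texture copies to be a
// multiple of 256 bytes; relevant when reading results back (step 3).
function alignedBytesPerRow(rowBytes) {
  return Math.ceil(rowBytes / 256) * 256;
}

function roundTripDepth(gl, depthFbo, device, width, height) {
  // 1) GL texture -> CPU. Assumes the uint16 depth was already blitted
  //    into an R16UI color attachment on depthFbo (format is a guess).
  const cpuDepth = new Uint16Array(width * height);
  gl.bindFramebuffer(gl.FRAMEBUFFER, depthFbo);
  gl.readPixels(0, 0, width, height, gl.RED_INTEGER, gl.UNSIGNED_SHORT, cpuDepth);

  // 2) CPU -> WebGPU texture for the compute pass.
  //    (writeTexture itself has no 256-byte row-alignment rule.)
  const tex = device.createTexture({
    size: { width, height },
    format: 'r16uint',
    usage: GPUTextureUsage.COPY_DST | GPUTextureUsage.TEXTURE_BINDING,
  });
  device.queue.writeTexture(
    { texture: tex },
    cpuDepth,
    { bytesPerRow: width * 2 },
    { width, height }
  );
  return tex;
  // Steps 3-5 (results -> CPU -> GL -> XR framebuffer) mirror this in
  // the other direction, via copyTextureToBuffer + mapAsync.
}
```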

    Meta must have an experimental feature I'm missing, if not a regular one?

    The "Meta Quest Developer Hub" (MQDH) doesn't offer much for WebXR, nor does the standalone Win11 emulator. The Chrome plugin behaves differently (it doesn't crash on any requestXYZ API calls, unlike the devices, which hang on requestLightProbe, etc.).

  • Mahjai_5
    Honored Guest

    I've been experimenting with putting the WebGPU left/right-eye textures into a buffer for readback, and with moving the texture into an offscreen canvas, but haven't found any fast paths to make this happen.

    The reverse approach, getting the WebGL texture that holds the Meta Quest 3S depth buffer back into WebGPU using 'writeTexture' or 'copyExternalImageToTexture', is another pathway in need of a fast path.
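For concreteness, the eye-texture readback experiment looks roughly like this sketch (untested on-device; `readbackEyeTexture` and its parameters are my own illustrative names, and the rgba8 format is an assumption):

```javascript
// Sketch of WebGPU eye-texture readback via copyTextureToBuffer +
// mapAsync: the standard (slow) WebGPU readback path, not a fast path.
// Assumes `eyeTexture` was created with COPY_SRC usage and rgba8 format.

async function readbackEyeTexture(device, eyeTexture, width, height) {
  // copyTextureToBuffer requires bytesPerRow aligned to 256 bytes.
  const bytesPerRow = Math.ceil((width * 4) / 256) * 256;
  const buf = device.createBuffer({
    size: bytesPerRow * height,
    usage: GPUBufferUsage.COPY_DST | GPUBufferUsage.MAP_READ,
  });

  const enc = device.createCommandEncoder();
  enc.copyTextureToBuffer(
    { texture: eyeTexture },
    { buffer: buf, bytesPerRow },
    { width, height }
  );
  device.queue.submit([enc.finish()]);

  // mapAsync waits for the GPU to finish: the stall that makes this
  // path unattractive for per-frame XR use.
  await buf.mapAsync(GPUMapMode.READ);
  const pixels = new Uint8Array(buf.getMappedRange()).slice();
  buf.unmap();
  return pixels; // rows are padded out to bytesPerRow
}
```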

    Not sure what Meta's Developer Relations considers the best path?
