Forum Discussion

PJBHL
Honored Guest
1 year ago
Solved

360 Video Streaming Performance on Quest 2 and Quest 3

  Hi, Our team is currently in the process of studying and porting a project previously developed in Unity to the new Meta Spatial SDK. This project involves streaming monoscopic 360° videos (not 3...
  • dav-s
    1 year ago

    Hi!

    Thanks for the detailed post! I'll break it down and try to address each part.

    Panel Performance

    The first thing I will say about panel performance is to utilize layers (https://developers.meta.com/horizon/documentation/spatial-sdk/spatial-sdk-2dpanel-layers). Layers will give you higher fidelity than regular panels and look better even at lower resolutions.

    For the best performance, you can utilize a panel configuration that directly connects your application to the compositor. To enable this in your case, I would make the following changes to your panel registration:

    return PanelRegistration(R.integer.sky_video_id) {
        layoutResourceId = R.layout.sky_video_layout
        config {
            layoutWidthInPx = 5760
            layoutHeightInPx = 2880

            // Render this panel as a compositor layer instead of a regular panel
            layerConfig = LayerConfig()
            // Shape the layer as an equirect sphere for the 360-degree content
            panelShapeType = PanelShapeType.EQUIRECT
            radiusForCylinderOrSphere = 300.0f
            // important! Disabling mipmap generation lets the compositor
            // consume the panel contents directly
            mips = 1

            // sceneMeshCreator no longer needed
        }
        ...
    }

    You can see we first enable layers by specifying a `LayerConfig`, and then set the panel's shape to match the 360-degree content with `PanelShapeType.EQUIRECT`. The final thing that allows this optimization to work is `mips = 1`. By default, we generate mipmaps so that panels do not look pixelated when viewed from far away. If we disable mipmap generation (i.e., set mips to 1), we can skip touching the contents of the panel entirely, and it is consumed directly by the compositor.

    Using this approach, you should get significantly better performance and quality (I have seen cases where 5K videos go from a stuttering mess to a rock-solid 90 fps).

    Using layers, however, comes with the drawback of not being compatible with custom vertex/fragment shaders, since the panel contents are forwarded directly to the compositor.

    Other Questions

    > Is there a way to detect programmatically (in Kotlin) whether the user is using a Quest 2 or Quest 3?

    It is not recommended to rely too heavily on platform specifics, as that may cause incompatibilities with other devices. However, I believe `android.os.Build` has information that will let you distinguish them: https://developer.android.com/reference/android/os/Build
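
    As a minimal sketch of that approach (the `isQuest3` helper and the exact `Build.MODEL` strings are assumptions, not a documented API, so verify them on real hardware):

    import android.os.Build

    // Hypothetical helper: the model strings reported by Build.MODEL
    // ("Quest 2", "Quest 3") are an assumption and should be verified on-device.
    fun isQuest3(): Boolean =
        Build.MODEL.contains("Quest 3", ignoreCase = true)

    // Example use: pick a stream resolution per device class.
    // The resolutions here are illustrative, not recommendations.
    fun pickStreamResolution(): Pair<Int, Int> =
        if (isQuest3()) 5760 to 2880 else 4096 to 2048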

    > Are there specific considerations for implementing shaders (.frag and .vert) for rendering high-resolution 360° videos?

    The biggest piece of advice is to switch to layers. That said, we are actively investigating speeding up panel performance in future updates.

    > What is the recommended video format and encoding for streaming monoscopic 360° videos, particularly to optimize performance on the Quest 2?

    We don't have any specific recommendations as of now (but thanks for the callout!). Tweaking the ExoPlayer settings will usually have an impact.
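
    As one illustrative example of what tweaking ExoPlayer settings might look like (a sketch assuming Media3 ExoPlayer; the function name and buffer values are assumptions, not recommendations):

    import android.content.Context
    import androidx.media3.common.MediaItem
    import androidx.media3.exoplayer.DefaultLoadControl
    import androidx.media3.exoplayer.ExoPlayer

    // Sketch only: buffer sizes below are illustrative, not recommended values.
    fun buildStreamingPlayer(context: Context, streamUrl: String): ExoPlayer {
        val loadControl = DefaultLoadControl.Builder()
            // Larger buffers trade memory for fewer rebuffers on high-bitrate 360 streams.
            .setBufferDurationsMs(
                /* minBufferMs = */ 15_000,
                /* maxBufferMs = */ 60_000,
                /* bufferForPlaybackMs = */ 2_500,
                /* bufferForPlaybackAfterRebufferMs = */ 5_000
            )
            .build()

        return ExoPlayer.Builder(context)
            .setLoadControl(loadControl)
            .build()
            .apply {
                setMediaItem(MediaItem.fromUri(streamUrl))
                prepare()
                playWhenReady = true
            }
    }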

    > Do I need stereoMode here?

    If your video is just a single feed of a 360 video, you shouldn't need any stereoMode.

     

    Again, thanks for the detailed feedback! Let me know if you have any other questions!