Forum Discussion

This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum.
ROBYER1
Expert Protege
6 years ago

Enable Foveated Rendering in Unity without Oculus Integration

From my understanding, to enable foveated rendering on the Quest/Go you must call something like "OVRManager.fixedFoveatedRenderingLevel = OVRManager.FixedFoveatedRenderingLevel.High" in your Unity project. This also requires having at least one OVRManager component on a GameObject in the scene.
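For context, the documented route looks roughly like this (a minimal sketch, assuming the Oculus Integration is imported and an OVRManager component already exists in the scene; "EnableFFR" is just an illustrative script name):

```csharp
using UnityEngine;

// Illustrative helper: requires the Oculus Integration (OVRManager) to be
// imported and an OVRManager component present somewhere in the scene.
public class EnableFFR : MonoBehaviour
{
    void Start()
    {
        // 'High' applies a strong resolution reduction in the periphery.
        OVRManager.fixedFoveatedRenderingLevel =
            OVRManager.FixedFoveatedRenderingLevel.High;
    }
}
```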

I would like to know if it is possible to enable foveated rendering without an OVRManager in the scene, or without importing the Oculus Utilities/Integration into the Unity project at all. For example, is there anything we can add to the Android manifest, or something we can call from the Oculus (Android) package available via Unity's Package Manager?

We simply want to use the Oculus (Android) and Oculus (Desktop) packages from the Package Manager in our application, without importing too much of the Oculus Integration asset from the Unity Asset Store, as it contains a lot of code that conflicts with other plugins we are using.

4 Replies

  • No, you must use the Oculus Integration for FFR.
  • ROBYER1
    Expert Protege

    I'm trying to get to the bottom of where and how to activate FFR using the Oculus scripts. It seems the bare minimum you need is the 'OVR Camera Rig' script, plus another script with:

    Code (CSharp):
    OVRPlugin.fixedFoveatedRenderingLevel = OVRPlugin.FixedFoveatedRenderingLevel.HighTop;
    in the Start function.

    But apart from that, I'm having no luck simplifying the FFR implementation, as we don't really want to use the full Oculus-specific camera rig and Utilities; our app supports other headsets as well. Everything seems to lead back to the OVRPlugin referenced in the scripts. See this example from 'OVRCameraRig.cs', which I am currently trying to get working without the rest of the code:


    Code (CSharp):
    if (OVRNodeStateProperties.GetNodeStatePropertyVector3(Node.Head, NodeStatePropertyType.Position, OVRPlugin.Node.Head, OVRPlugin.Step.Render, out pos))
        headPose.position = pos;
    if (OVRNodeStateProperties.GetNodeStatePropertyQuaternion(Node.Head, NodeStatePropertyType.Orientation, OVRPlugin.Node.Head, OVRPlugin.Step.Render, out rot))
        headPose.orientation = rot;
  • ROBYER1
    Expert Protege
    Just reporting back to say I cracked it. The only reason it wasn't working was that the camera on our VR rig wasn't using Unity's completely default camera settings (maybe HDR was set to 'off', or something like that). It would not work until I created a new Unity camera, copied the default values from its Camera component, and applied them to the camera on our VR rig.
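    Putting the thread's findings together, the minimal working setup described above can be sketched as follows (a sketch assuming OVRPlugin.cs is available in the project, e.g. via the Oculus Android package, and that the target camera uses Unity's default Camera component values; "MinimalFFR" is an illustrative name):

    ```csharp
    using UnityEngine;

    // Illustrative sketch: sets FFR directly through OVRPlugin, bypassing
    // OVRManager. Assumes OVRPlugin.cs ships with the project (Oculus
    // Android package / Integration).
    public class MinimalFFR : MonoBehaviour
    {
        void Start()
        {
            // 'HighTop' applies strong foveation weighted toward the top of the view.
            OVRPlugin.fixedFoveatedRenderingLevel =
                OVRPlugin.FixedFoveatedRenderingLevel.HighTop;
        }
    }
    ```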
    • blascoburguillos
      Explorer

      Has this situation remained the same? Why do we need to use exactly the same default camera parameters?