Performance hovering around 24 FPS on Quest, even though GPU and CPU utilization are low
This is a screenshot with OVR Metrics enabled so you can see some stats. The game feels smoother than the reported FPS, and the count is very consistent even in heavy areas. A few minutes later, however, the CPU level dropped to 2 and both GPU and CPU utilization were around 25% in this same spot, yet I was still stuck around 24 FPS. Walking over to more intensive scenes with nearly 1,400 batches didn't push the framerate any lower, and walking up to a wall to get down to 30 batches didn't improve it either. Setup: Unity 2023.2, OpenXR (aiming to build for both Steam and Meta), minimal to no use of Oculus/Meta packages, multiplayer using Photon Fusion 2. It's a pretty big game, but I'm turning off dynamic objects that aren't in the same part of the scene (the picture depicts a quarter of the scene). I've done a lot of optimization: the batch count is around 400 and the poly count around 1 million. I've used the Profiler, and the Update-loop scripts are as optimized as they can be, all running at 90+ FPS on PCVR. I've baked atlases and meshes, used texture compression, set everything I could to static, set up perfect culling, and used the recommended Quest build settings. (I was having a terrible problem where the game would run at 0.5 FPS whenever I tried to profile the Quest build, so I made a clean build with profiling turned off and that stopped it.)

Feature request for overlay layer: expose the native shader of the overlay mesh on the Android platform
First of all, I truly appreciate Oculus's overlay layer tech for media player developers, which makes video look sharper and judder less. My team has been developing a "view-based media player" whose main method uses 24 quadrilaterals combined into a mesh that exactly covers the camera's view. It works well when the 24 quads are rendered with a Unity mesh and shader, but we want to move this method into the overlay, letting the overlay layer render the 24 quads. I checked the newest SDK v1.40: the overlay layer can customize its mesh through vertices, UVs, and triangle indices via OVROverlayMeshGenerator, but there is no way to customize the overlay mesh's shader. As far as I know, there should be a program id created by GLES through glCreateShader that is used for overlay mesh shading. So, could Oculus staff update the Unity SDK in the future to support changing the overlay's mesh shader? Something like: I pass a GLES-style shader into an API and get a shader id back, so I can modify its parameters from a native .so library and render very flexibly.

```csharp
[DllImport("mylib")]
static extern void ModifyMyShader(int shaderId);

// Get the shader/program id through a (proposed) API in Unity.
void func()
{
    // Not a Unity shader -- we use native GLES shader source here.
    int shaderId = CreateOverlayProgram(vertexShaderSrc, fragShaderSrc);
    // Pass it into a native library that shares the OpenGL context, so we can modify it.
    ModifyMyShader(shaderId);
}
```

```cpp
// Modify the shader through its program id in C++.
void func_native(int id)
{
    GL_CHECK(glUseProgram(id));
    GLint texY = GL_CHECK(glGetUniformLocation(id, "TextureY"));
    if (texY != -1)
        GL_CHECK(glUniform1i(texY, 0));
}
```

Pick an image via Oculus Gallery / open the gallery in VR mode via Unity
I am using the Native Gallery plugin to pick an image from the gallery. It works fine on Pico and other VR devices, but on Oculus Go the gallery opens in non-VR/overlay mode and I am unable to click/pick the image.

(Android) Unity gets poor CPU performance compared with a native APK
Hello everyone, my team has been developing a "view-dependent media player", which uses a single thread to decode a 2048 * 1536 base layer with Android MediaCodec (call it BASE TEX) and four threads to decode 24 split blocks of 512 * 256, for the region the viewer is directly looking at, with FFmpeg (call it HD TEX), very similar to VR_5K_PLAYER. This framework makes CPU usage much heavier than GPU usage: the CPU has to fully decode the 2048 * 1536 frame and upload YUV from memory to video memory every frame, while the GPU only has to handle the 2048 * 1536 texture. After a few tests on a Samsung S8 and a Quest, I found that the S8 and Quest perform differently: the S8 gets a higher FPS because the Quest has less CPU availability than the S8. (PS: the S8 and Quest are both equipped with a Snapdragon 835. As far as I know, the 835 chip contains 8 cores, 4 of which are big cores responsible for heavy work.) To confirm my point of view, I made a test project using Unity and the latest Gear SDK, and added heavy CPU work on several threads:

```csharp
public class NewBehaviourScript : MonoBehaviour
{
    // Use this for initialization
    void Start()
    {
        for (int i = 0; i < 8; i++)
        {
            ThreadStart method = () => threadWork();
            Thread thread = new Thread(method);
            thread.Start();
        }
    }

    void Update()
    {
        long sum = 0;
        for (int i = 0; i < 1000000; ++i)
            sum = (int)((sum + i) * 2.0f / 2.0f);
    }

    void threadWork()
    {
        while (true)
        {
            long sum = 0;
            for (int i = 0; i < 100000000; ++i)
                sum = (int)((sum + i) * 2.0f / 2.0f);
            Debug.LogFormat("TestThread End name:{0} curr thread id:{1} cur timeStamp:{2}",
                sum, Thread.CurrentThread.ManagedThreadId, DateTime.Now.ToString());
        }
    }
}
```

The enclosed Python file uses "adb shell cat /proc/stat" to calculate the usage of all 8 cores. The first result above is the S8; the last four columns are the big-core usage, and 3 big cores run at full load. The second result is the Quest; again the last four columns are the big-core usage, and only 2 big cores run at full load.
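For reference, the per-core calculation that the enclosed Python file performs over "adb shell cat /proc/stat" can be sketched natively as well: take two snapshots of the same "cpuN" line and compute the busy share of the elapsed jiffies. This is a minimal sketch assuming the standard /proc/stat field order (user nice system idle iowait irq softirq steal ...); the function names are mine, not from the project.

```cpp
#include <sstream>
#include <string>

// Parse one "cpuN ..." line from /proc/stat into (busy, total) jiffies.
// idle time is conventionally taken as idle + iowait (fields 3 and 4).
static void parseCpuLine(const std::string& line, long& busy, long& total) {
    std::istringstream in(line);
    std::string label;
    in >> label;            // skip the "cpuN" label
    long v, idx = 0, idleTime = 0;
    total = 0;
    while (in >> v) {
        if (idx == 3 || idx == 4) idleTime += v;  // idle + iowait
        total += v;
        ++idx;
    }
    busy = total - idleTime;
}

// Per-core usage (%) between two snapshots of the same cpuN line.
double coreUsagePercent(const std::string& before, const std::string& after) {
    long b1, t1, b2, t2;
    parseCpuLine(before, b1, t1);
    parseCpuLine(after, b2, t2);
    long dTotal = t2 - t1;
    return dTotal > 0 ? 100.0 * (b2 - b1) / dTotal : 0.0;
}
```

Sampling the file twice with a fixed sleep in between, and running this per "cpuN" line, reproduces the per-core utilization columns described above.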
My team also made a native Android project (built with Gradle and Android Studio, without Gear VR) that loads my dynamic library through JNI and runs the view-dependent media player. The first result below is the S8's CPU usage with the native APK; the second is the S8 running the same dynamic library through Unity and the Gear SDK. The two tests show that the native APK gets better and more evenly spread CPU usage under heavy CPU work than the Unity/Gear APK. My questions are: Is there a way for a Unity/Gear APK to control the device's CPU availability? Why does the native APK perform differently from the Unity/Gear APK? And why does the Quest get fewer cores for CPU work than the S8?

How can I get the correct camera matrix in a native render plugin?
I want to mix in objects rendered by a Unity native render plugin, as below: the white box is rendered by Unity, and the colored terrain is rendered by the native plugin. As the screenshot shows, they are both at position (0, 0, 0). When I turn on VR rendering (Oculus), the terrain is rendered in both eyes' views, but the transform of the terrain mesh is wrong, and it moves with the Oculus's rotation. I guess the problem is caused by the view matrix. Here is the code I use in Unity: the component is attached to a camera, and the native render plugin is called (or queued) in the OnPostRender() method. So the question is: how can I render the terrain at the correct position on Oculus?
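In case it helps anyone hitting the same symptom: with VR rendering on, a single mono camera view matrix is no longer valid; each eye has its own view and projection, which the C# side can read with Camera.GetStereoViewMatrix / Camera.GetStereoProjectionMatrix and hand to the plugin per eye. Below is a minimal native-side sketch of composing the MVP from those per-eye matrices, assuming column-major storage (matching how Unity's Matrix4x4 fields are laid out when marshalled); all names here are illustrative, not part of any SDK.

```cpp
#include <array>

// Column-major 4x4 matrix: element (row r, column c) lives at index c*4 + r.
using Mat4 = std::array<float, 16>;

// r = a * b, column-major.
Mat4 mul(const Mat4& a, const Mat4& b) {
    Mat4 r{};
    for (int c = 0; c < 4; ++c)
        for (int rw = 0; rw < 4; ++rw)
            for (int k = 0; k < 4; ++k)
                r[c * 4 + rw] += a[k * 4 + rw] * b[c * 4 + k];
    return r;
}

// Per-eye matrices captured on the C# side (GetStereoViewMatrix /
// GetStereoProjectionMatrix) and passed into the plugin each frame.
struct EyeMatrices { Mat4 view; Mat4 proj; };

// The terrain must be transformed by the *eye's* view matrix, not by a
// head-pose or mono-camera matrix -- using the latter is what makes the
// mesh swim with head rotation.
Mat4 buildMvp(const EyeMatrices& eye, const Mat4& model) {
    return mul(eye.proj, mul(eye.view, model));
}
```

The practical point is that the plugin should receive these matrices from C# (updated every frame, once per eye) rather than reconstructing a view matrix itself from the camera transform.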