Achievement not logged from development build
I'm adding achievements for my game, and in a build on the Oculus Go the achievement doesn't seem to get logged to the dev panel yet. Is there some condition that has to be met before I can try it out in a build? I already have the Platform SDK running in the background fetching the player name. Here's the code that unlocks:

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Oculus.Platform;
using Oculus.Platform.Models;

public class Unlocker : MonoBehaviour
{
    public string id; // API name of the achievement to unlock

    // Use this for initialization
    void Start()
    {
        Oculus.Platform.Achievements.Unlock(id);
    }
}
```

Any idea?

Vulkan debug layer support
I can't get debug layers to work on Quest. vkEnumerateDeviceLayerProperties and vkEnumerateInstanceLayerProperties both return 0 results. Are validation layers not supported, or am I doing something wrong (e.g. loading the wrong .so)? See the code below. I notice the Vulkan Cubeworld sample refers to VK_LAYER_LUNARG_api_dump, but it seems it's compiled out, which is presumably why it doesn't fail in vkCreateInstance().

```cpp
mVulkanLoader = dlopen("libvulkan.so", RTLD_NOW | RTLD_LOCAL);
assert(mVulkanLoader != NULL);
vkGetInstanceProcAddr = (PFN_vkGetInstanceProcAddr)dlsym(mVulkanLoader, "vkGetInstanceProcAddr");
vkEnumerateInstanceLayerProperties = (PFN_vkEnumerateInstanceLayerProperties)dlsym(mVulkanLoader, "vkEnumerateInstanceLayerProperties");
vkEnumerateInstanceExtensionProperties = (PFN_vkEnumerateInstanceExtensionProperties)dlsym(mVulkanLoader, "vkEnumerateInstanceExtensionProperties");
vkCreateInstance = (PFN_vkCreateInstance)dlsym(mVulkanLoader, "vkCreateInstance");

uint32_t layerCount;
vkEnumerateInstanceLayerProperties(&layerCount, nullptr);
std::vector<VkLayerProperties> availableLayers(layerCount);
vkEnumerateInstanceLayerProperties(&layerCount, availableLayers.data());
VkLayerProperties* layerProps = availableLayers.data();
```

OVR Lint marks diffuse and unlit mobile shaders as if they weren't single pass
Why is OVR Lint telling me "Please use materials with 2 or less passes" and listing materials that already have single-pass shaders, such as Unity's Mobile/Diffuse or Mobile/Unlit? Is this a bug in OVR Lint, or should I be concerned about it? It looks like I can't find any shaders at all that wouldn't be reported by OVR Lint as having more than 2 passes. I'm using Unity 5.5.3f, platform Android (Gear VR).

Oculus browser changed to desktop mode by default
Hi, in the latest browser updates the default mode was changed to desktop. As stated: "...Oculus Browser now loads desktop sites by default. This mode is better for the large browser windows in VR, and desktop sites let you get more done. You can change the current site you're on back to mobile mode by clicking a button in the address bar..." My question is: how can we switch to mobile mode programmatically, using JavaScript from the loaded page? Is there any API method for that?

Native activity crashes on input event
Hi! I have an Android native activity which uses native_app_glue. Everything works fine, except when I am debugging it and VR mode is activated. In that case, a simple touch event crashes the application at AInputQueue_preDispatchEvent (in android_native_app_glue.c, in the process_input function) with "SIGILL (illegal instruction)". I have no clue what might cause it. I use Android Studio, and when I start the same debuggable application without the debugger attached, it works as expected. What am I missing?

Use of the Gear VR for professional applications - is there a "Kiosk" mode, etc?
Hello, This may be a question for Oculus as much as for the forum, but I can't really determine how a developer would get a direct line to Oculus, so I figured I'd start here. My company is using the Gear VR as an all-in-one tetherless VR platform for a new product currently under development. I was wondering if there's any possibility of using the Gear in some kind of "kiosk" mode. Basically, I want to grease the wheels a bit on the process of actually launching our product, so our customers don't have to deal with things like:
- Fitting the phone into the headset
- Dismissing Android notifications/messages
- Finding/updating/launching the app from the Oculus store

It is frustrating to require our customers to navigate the Oculus store just to use an app that is only available to them in a custom release channel. We're able to work around that using an Android shortcut app, but the other challenges are more difficult to work around. Is there a way we can package the S7 into the Gear as a single unit and disable as many extraneous processes as possible, so it functions primarily as an all-in-one headset for our application? We're currently using Unity, but if this is doable using native development, that works. Thanks

How I ported the PC/Rift version of my game to mobile VR using UE4: tips and optimization tricks to hit 60 fps
Hello guys, below is my journey of how I ported the PC Rift/Vive version of "StardustVR" to mobile. It wasn't an easy job, but hopefully these tips will be useful for you too. UE4 was never meant for mobile; it's really heavy, since most features are enabled by default, so you will have to disable them. I ran an empty level on my Galaxy S7 and was getting 30 fps, which made me wonder what kind of sorcery devs used to make their UE4 games run at 60 fps on Gear VR/Go/Quest. I didn't know this would get this long :smile: (maybe that's why I keep getting stuck on my games; I thought it would be 5 lines of tips), so grab a coffee before you start reading. Hopefully it will keep you from falling into the same mistakes I did, since I found most of these fixes only after hours or even days of trial and error.

Porting your game to mobile is a double-edged sword. It's really cool to see a game that used to need a Core i7 CPU and a GTX 1080 Ti run on the phone in your pocket. But at the same time, if you are very attached to your game, it will break your heart (literally), which is the main thing that might stop you: you may have spent months on one feature, particle effect, or material, and when you port to mobile you will have to cut it down. If you accept that, you will be able to port your game at 60 fps; if you don't, then forget about finding another way to hit 60 fps without a sacrifice.

Synopsis of how to reach 60 fps:
- Enable the forward renderer (don't ask).
- Reduce draw calls to below 100 and polys to below 100K.
- Reduce all of your texture sizes to 128~512, except the sky sphere.
- Make your game unlit. I don't know if it's possible to use physically based materials with UE4 and still hit 60 fps.
- Make all of your particle effects' materials unlit; lower their texture resolution, component count, complexity, and lifetime.
- Convert audio SFX from the original 44100 Hz sample rate to 22050 Hz for all of your sounds.
- Disable HDR/post-processing.
- Lower the number of components on your character.
- Use simple materials.
- Use ASTC, which works on all Gear VR devices and is fastest with the smallest size.
- No AA.
- Lower everything :)

----- Here is the detail:

1- You have to sacrifice and let go of your ego to hit that 60 fps. It's the hardest part, believe me. I had to disable a lot of things, mainly the bloom effect (HDR), and my game is all about the environment and the beauty of the graphics; it doesn't have much gameplay. Disabling my bloom (HDR) increased my fps from 40 to 60, but the game looked really dull, with no more shining light and glow effects on my creatures or my levels.

To hit 60 fps on Gear VR/Go you will have to stay under the limit of 100 draw calls and 100K polys, and believe me, it's really hard to stay under 100 draw calls. (In case you don't know, the draw-call count, in simple terms, is the number of separate static meshes times the number of materials on each static mesh. My gunfire alone cost more than 10 draw calls, so if I fired fast in an empty scene and had 10 bullets in the scene, that reached 100 draw calls, LOL.)

2- Get a phone with a Snapdragon and not a Mali GPU. For some reason, neither the UE4 profiler nor the Oculus profiler supports Mali GPUs (and probably Unity's doesn't either); even the official Mali GPU profiler couldn't show me a simple utilization percentage, so I was hunting my fps problem like a blind man. The Oculus Go has a Snapdragon, but I think it's harder to profile, as you have to put it on your head every time you want to check, unlike a Galaxy phone, which lets you run your game on the flat screen without the Gear VR once you enable Gear VR dev mode on the phone.

3- The Oculus Go has almost the same spec as the Galaxy S7, so if you're able to run your game on the Go, the only Gear VR devices you will lose are the S6 and maybe the Note 4, which are a small part of the market now. And if your game runs at 60 fps on the S7/Go, you can make it run at 60 fps on the S6 by simply lowering the resolution a bit.
4- Mobile phones are unlike PCs: you can set the mobile CPU and GPU clocks at any time with simple commands available in the Oculus SDK (there are 4 levels for CPU and GPU; 1 is the lowest clock and 4 is the highest). The mobile CPU and GPU are really powerful, but it looks like you can't set them to level 4 all the time; if you do, your phone will overheat within 10 minutes and hit the thermal limit, and once it does, Oculus forces the phone to run at low performance (30 fps) and you will not pass the Oculus store performance test. The combination that works best is CPU at level 2 and GPU at level 3. This is the default in UE4, and it runs fine on my Galaxy S7 for 30 minutes without a problem; it passed the Oculus performance test. So keep this in mind.

---------------

Threads...

If you remember, I mentioned the game thread before. A Snapdragon has 4 cores; Exynos devices (Mali GPU) have 8 cores, but that doesn't matter, since the phone only uses 4 of them at a time (the other 4 during low-power mode, etc.).

*$ The engine has access to 3 cores on the Oculus Quest (the 4th is reserved for TimeWarp); the Oculus Go gives access to 2 cores, and I believe it's the same with Gear VR. I don't know how the threads below are distributed across cores (since we have more threads than cores), so if you know, please comment about it, as I've looked everywhere.

From what I experienced, a UE4-based game uses 3 main CPU threads:
- the game thread (your game logic)
- the render thread (draw calls)
- the audio thread

All of these threads run separately, so if any one of them takes more than 16 ms (60 fps), you will have an fps problem. At the same time, if you optimize your game thread down to 14 ms (by simplifying your game logic) but your render thread takes 18 ms, your final frame time will be 18 ms. In short, you have to optimize each thread separately, and once a thread is below 16 ms, leave it and move on to the next one, since optimizing one thread further than the others won't benefit you.
The reason I explained this in such detail is that you will face a draw-call problem like I did :) and fixing other issues won't fix that bottleneck.

Try to learn the CPU profiling provided in UE4; it's really easy. You capture the data from your mobile with commands, bring the file back to the engine, open it, and analyze the bottleneck. There is no other way; the good thing is that it's simple and only takes a few minutes to learn. Read and watch these two links:
https://www.unrealengine.com/en-US/blog/how-to-improve-game-thread-cpu-performance?sessionInvalidated=true
https://www.youtube.com/watch?v=GaRLNcdmU4U&t=10s

As for GPU optimization, test with "vr.PixelDensity", e.g. "vr.PixelDensity 0.5". If you get some fps boost, you are GPU bound and will have to optimize your shaders. Since my game was unlit and I deleted all the fog and fancy shaders and disabled all post-processing, I didn't face much of a problem later; if you do, then learn how to optimize the GPU side.

-----Now-----

How to optimize draw calls (the CPU render thread):

Getting back to the main issue, the limits on draw calls and polys: polys weren't hard for me, since my game already uses low-poly meshes, so I only shrank a few high-poly ones inside the engine (one of UE4's features for reducing poly count). As for draw calls, here is what will help you lower them:
1- Merge all the actors into one big mesh.
2- Use Instanced Stereo Rendering / Multi-View for VR; this will cut your draw calls by 50%.
3- Reduce the number of materials.

For me, since my game is played standing in one place and the view is almost always the same, I merged my whole level into one mesh with 10 materials (after reducing my materials), so I got 10 draw calls for my level. Before, it was more than 300 meshes, each with ~4 materials.
How to optimize the CPU game thread: I found out that moving components is very toxic on mobile, so I lowered their number as much as possible by removing all the extra collision boxes and spheres and merging some static meshes into one. My game was already optimized; not many events run per tick/frame. UE4 has a problem with GPU particles (it's an engine bug; I tested 4.21 and I believe it's not fixed yet. I already opened a ticket with the Oculus team; hopefully they will fix it, or it's fixed in 4.22, which I haven't tested). This really hurt my project, so I had to convert all the particle effects to the CPU type and optimize them by reducing their components, lowering their texture resolution, and lowering their lifetime. And since I merged all of my actors into one big mesh, I had no need for occlusion culling, so I disabled it; it was costing me some frame time (~2 ms, probably).

How to optimize the CPU audio thread: this wasn't that hard. I converted most of my SFX from the original 44100 Hz sample rate to 22050 Hz. This reduced the size a lot and cut the time needed to read the SFX from the phone's slow memory, which had been causing spikes whenever an SFX played.

-----

Regarding GPU optimization, I didn't have many problems; I was able to run my game at 124% resolution, since my game already uses unlit materials with simple material instructions and fully rough materials. But I used ES2 with HDR off (which disabled all of my post-processing); otherwise it was impossible to run even an empty level with HDR on Gear VR.
Here are the links that helped me. Some of them are Unity and others are UE4, but the principles are all the same:
* https://developer.oculus.com/blog/squeezing-performance-out-of-your-unity-gear-vr-game/
* https://developer.oculus.com/blog/squeezing-performance-out-of-your-unity-gear-vr-game-continued/
* https://developer.oculus.com/blog/tech-note-profiling-optimizing-on-mobile-devices/
* https://developer.oculus.com/documentation/mobilesdk/0.5/concepts/mobile-power-overview/#mobile-power-overview
* the PDF and YouTube talk titled "How to scale down and not get caught"
* https://developer.oculus.com/documentation/unreal/latest/concepts/unreal-debug-gearvr/
* https://docs.unrealengine.com/en-us/Platforms/Mobile/Performance?fbclid=IwAR213O6tEh_rU6cM1TUmxvRvVcoBqvCV4r42-m4SjZKr8IbHVep87z7q4n4

These are a few videos I still haven't checked, but I'm dropping them here so I can get back to them later:
* https://www.youtube.com/watch?v=belVA1C013A&list=PLL2xVXGs1SP450I0CPTIyIYX3yHCY9GQs&index=11
* https://www.youtube.com/watch?v=I7dokp76hhE&list=PLL2xVXGs1SP7RjXUBwur43flR7tRcbYLD&index=8
* https://www.youtube.com/watch?v=-uDqC6SIB2A&feature=youtu.be&list=PLL2xVXGs1SP450I0CPTIyIYX3yHCY9GQs&t=1751
* https://www.youtube.com/watch?v=BZhOUGG45_o&list=PLL2xVXGs1SP450I0CPTIyIYX3yHCY9GQs&index=23
* https://www.youtube.com/watch?v=hcxetY8g_fs
* https://developers.google.com/vr/develop/
* https://software.intel.com/en-us/articles/vr-developer-tutorial-testing-and-profiling-the-premium-vr-game

As you can see, there are a lot of good resources on the Oculus sites, and I'm thankful to all the devs who shared their optimization tricks on the Oculus forums and blogs; without them I wouldn't have been able to publish my game. I will paste my DefaultEngine.ini rendering settings in the first comment.
*$ Source: https://developer.oculus.com/blog/down-the-rabbit-hole-w-oculus-quest-the-hardware-software/

Trouble Creating a Basic VRMenu
For the last week or two I've been going through the examples provided in the Oculus Mobile SDK and can't seem to get my own VRMenu to display. Are there any tips/tricks that aren't obvious? Or are there any samples/tutorials of just a simple GUI? Any help on how to accomplish this is much appreciated. Thanks in advance.

Oculus Mobile fixed foveated rendering— with textures?
I am developing an Oculus Mobile app using the C++ SDK. My target is the Oculus Go (I am not concerned with any platform older than the Oculus Go). A basic version of my code (based closely on the VrCubeWorld_NativeActivity sample from the SDK) is here: https://github.com/mcclure/lovr-oculus-mobile

I am interested in the Fixed Foveated Rendering feature of the Oculus SDK, described here: https://developer.oculus.com/documentation/mobilesdk/latest/concepts/mobile-ffr/ because I am using somewhat complicated fragment shaders. I tried inserting the recommended code "vrapi_SetPropertyInt( &Java, VRAPI_FOVEATION_LEVEL, 3 );" at the top of RenderThreadFunction in the NativeActivity.cpp file of my project. It did not appear to have any effect on framerate. I have three questions:

1. Is there a simple way to verify foveation is occurring, or to observe foveation occurring within the headset?

2. What exactly is the effect of setting VRAPI_FOVEATION_LEVEL, and for what kinds of draws is it set? That is, when I draw, I draw by attaching to a framebuffer provided by the mobile SDK swapchain. When I set VRAPI_FOVEATION_LEVEL, with which draws is the new foveation associated? Does it change framebuffers? Does it change framebuffers which are added to the swapchain after VRAPI_FOVEATION_LEVEL is changed?

3. Say I render to a texture and then render that texture to the screen, so that I can apply something like a post-processing effect. Is it possible to control whether the texture has a foveation level, or what foveation level is used when drawing to the texture?
The concern I have is this: if the point of foveation is that the fragment shader is called for a smaller number of fragments (because the fragments are "large"), then if I render to a non-foveated texture or framebuffer and then copy to the foveated eye, I will pay for fragment shades on all the additional non-foveated pixels even though they are not visible.

Oculus Mobile side-by-side stereo rendering— is it possible?
I am developing an Oculus Mobile app using the C++ SDK. My target is the Oculus Go (I am not concerned with any platform older than the Oculus Go). A basic version of my code (based closely on the VrCubeWorld_NativeActivity sample from the SDK) is here: https://github.com/mcclure/lovr-oculus-mobile

I am looking at the multiview rendering feature described here: https://developer.oculus.com/documentation/mobilesdk/latest/concepts/mobile-multiview/. As I understand code based on VrApi, the swapchain hands you two framebuffers, one for each eye; if you use the multiview OpenGL extension, it allows you to render to both framebuffers simultaneously (one draw call draws to both framebuffers).

However, in my exploration of VR on desktop, I have encountered three separate ways of doing stereo rendering. One is multipass stereo, where you simply render each eye one at a time. Another is view-based single-pass stereo rendering, which is what the GL_OVR_multiview extension does. The third option is side-by-side single-pass stereo rendering, using extensions such as GL_ARB_viewport_array and GL_AMD_vertex_shader_viewport_index. In this solution there is only one framebuffer but multiple viewports, and the single pass draws twice into the single double-wide framebuffer.

The project I work on (LOVR) has found that this final method, side-by-side single-pass stereo rendering, is more flexible than multiview stereo rendering and therefore preferable to us. (The big problem we encountered with multiview is the requirement of a declaration like "layout(num_views = 2) in;" at the top of every shader, whereas with viewport arrays a single shader can be used for both stereo and mono renders.)

My question is: is it possible to do side-by-side stereo rendering on any Oculus Mobile platform? Naively looking at VrApi, it does not seem to be possible, because the swapchain issues single framebuffers per eye…