Meta Quest Browser & BabylonJS = Slow FPS
I'm trying to learn about WebXR and I'm finding that my library of choice isn't playing well in the Meta Browser, locking at about 30 FPS. On the desktop with Chrome I see 60 FPS, and I know that's because Chrome is locked to vsync. I can pass Chrome some parameters to unlock the FPS limiting, at which point I see around 500 FPS. Chrome works great with Link, but I'm looking to work with the native Meta Browser inside the Quest. I've found other engines do better: the Wonderland Engine demonstrates 72 FPS in the Quest with their app Escape Artists (https://esc.art), and A-Frame applications run in the browser at 90 FPS. So my question is: is there some sort of parameter or command the WebGL lib should be passing to the browser to speed it up? Or is this likely to be some sort of bug in BabylonJS that is locking it to 30 FPS, with the browser not applying any artificial rate limiting?

Unable to travel
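One way to rule out an engine-side cap is to measure the raw requestAnimationFrame rate independently of BabylonJS; if rAF itself ticks at 30 Hz, the cap is the browser's, not the engine's. Below is a small TypeScript sketch of such a counter (the function name is my own, not a Babylon or WebXR API). On Quest you could also try asking the XR session for a higher refresh rate via `XRSession.updateTargetFrameRate()` where the browser supports it.

```typescript
// Estimate frames-per-second from a sliding window of rAF timestamps.
// Handy for checking whether the browser is capping rendering at
// 30 / 60 / 72 / 90 Hz independently of the engine.
function estimateFps(timestampsMs: number[]): number {
  if (timestampsMs.length < 2) return 0;
  const elapsed = timestampsMs[timestampsMs.length - 1] - timestampsMs[0];
  if (elapsed <= 0) return 0;
  // (n - 1) frame intervals over `elapsed` milliseconds.
  return ((timestampsMs.length - 1) * 1000) / elapsed;
}

// Browser usage (not run here): collect timestamps in a rAF loop.
// const samples: number[] = [];
// function tick(t: number) {
//   samples.push(t);
//   if (samples.length > 120) samples.shift(); // keep a ~2 s window
//   console.log(estimateFps(samples).toFixed(1), "fps");
//   requestAnimationFrame(tick);
// }
// requestAnimationFrame(tick);
```

If this reports ~30 inside the immersive session but the same page reports 72+ outside it, the limit is coming from the browser's XR compositor rather than from BabylonJS.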
I created a bot to play music in my world. I first tried my not-so-great desktop with 8 GB RAM and the Oculus desktop app, and Horizon runs fine; I can travel to worlds without a problem. My issue was that my music buffered too much with my desktop, so I then installed it on my Lenovo X1 gaming laptop with 64 GB RAM and all the bells and whistles, but no dice! Horizon loads up just fine but I can't travel to any worlds. I have installed and uninstalled, updated both, set to English, and can't get it to work. I even linked my Quest Pro to the laptop. It handled the graphics just fine and still it won't travel from the laptop Horizon. Is there any help anyone can provide me with? I've asked a few key people in the Metaverse and no luck yet. Sad little DJ Bot

Meta Avatars Appearing With Missing Head Or Body In The Unity Editor And On Windows Builds
UPDATE (01/03/2023): As of January 3rd 2023, loading avatars with federated app IDs simply doesn't work. Instead I get an error log saying 'Spec Request Failed'.

UPDATE (12/22/2022): I've managed to narrow this issue down to something to do with the way avatars are loaded using federated app credentials. I've made a new post about this here: https://forums.oculusvr.com/t5/Unity-VR-Development/Loading-Meta-Avatars-With-Cross-Play-using-Federated-App-Are/m-p/1010823/highlight/true#M21422

As of the 16th of December 2022 we have started seeing incompletely constructed avatars. The Meta Avatar SDK version we are using is 18.0; I have tried updating to 20.0 but still see the same problem. Here are some additional observations:

- This does not occur on a Quest build. It happens in the Unity Editor or in a Desktop build.
- The problem sometimes resolves itself after I make a change to my avatar's look in settings, save the changes, and then try reloading the avatar.
- The problem does not resolve itself by reloading the user. If a user has the problem, it persists each time the avatar is loaded.
- Mostly the avatar's head is missing; occasionally only the hands of the avatar entity's LOD4 are visible.

Help would be greatly appreciated, as it is impossible to launch our product with Meta Avatars if this is not resolved.

Opening Oculus Dash with Desktop from Game
In our games, we sometimes need players to interact with the desktop (e.g. after opening a website in the OS browser, to let the player interact with privacy settings only available via a web interface). With SteamVR, we can open the in-VR dashboard with the desktop view using a simple API call:

OpenVR.Overlay.ShowDashboard("valve.steam.desktop");

What this does is: a) show the SteamVR in-VR dashboard (equivalent to pressing the system button), and b) activate the desktop overlay. Is something like that also available when using the Oculus native SDK? I have tried finding it in the documentation but wasn't able to. If this is not possible yet, could you please add it? We'd like to offer the same features both when using SteamVR and when using Oculus natively.

Oculus Dash steals Input Focus on switch to VR
I have an application that runs in desktop and VR. If I start the app in desktop mode and put on my headset, the Oculus app launches and after a few seconds my app starts to render in the HMD. A few seconds later, Oculus Dash appears and steals input focus. If I select my application from Dash, it returns focus to my app. If the Oculus app is already running when I switch to VR, this doesn't happen (my app maintains input focus). I tried the same series of steps with an app like Quill: I see that Dash does appear when the Oculus app starts, but then focus is returned to Quill and Dash disappears. As far as I can tell, I'm following all the advice from https://developer.oculus.com/documentation/pcsdk/latest/concepts/book-dg/ Does anyone know why this might be happening?

Running standalone & Unity Editor instance together for multiplayer testing
I'm working on a multiplayer game and I would like to find a way to run a standalone build that does connect to the Oculus Rift alongside the Unity Editor instance that doesn't. Currently when I run both the standalone build and the Unity Editor, only one will run at a time; the other freezes until its window is clicked, at which point the view in my Oculus Rift switches to the other instance. I'm sure I could simply remove the OVR components and add a new camera, but I wanted to see if there is a better solution.

Applying for Oculus Touch Dev Kit
We would love to start implementing Oculus Touch for our VR desktop application "Hexkraft Haus", as the Leap Motion/Vive controllers simply do not feel like your hands touching screens and buttons. How can we apply for a dev kit? Please check out https://www.youtube.com/watch?v=LEw3Vfx9B38 and www.hexkraft.com for additional information. Appreciate any feedback :)

Is it possible to use MSAA in an OpenGL app? SDK 1.3.2
I am currently using SDK 1.3.2 to develop desktop apps in OpenGL. I would like to implement MSAA, because my renders are looking pretty chunky. In the old days we used to make our own eye FBOs and link them up; nowadays the SDK does all of that internally. When you call "ovr_CreateTextureSwapChainGL" you pass it a description:

typedef struct ovrTextureSwapChainDesc_ {
    ovrTextureType   Type;
    ovrTextureFormat Format;
    int              ArraySize;   ///< Only supported with ovrTexture_2D. Not supported on PC at this time.
    int              Width;
    int              Height;
    int              MipLevels;
    int              SampleCount; ///< Current only supported on depth textures
    ovrBool          StaticImage; ///< Not buffered in a chain. For images that don't change
    unsigned int     MiscFlags;   ///< ovrTextureFlags
    unsigned int     BindFlags;   ///< ovrTextureBindFlags. Not used for GL.
} ovrTextureSwapChainDesc;

The description clearly states "SampleCount; ///< Current only supported on depth textures". So does this mean you cannot use MSAA for color eye render targets? When I try anything other than 1, I get the error "Failed to create texture." Any help would be appreciated. Thanks
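Reading the header comment literally, color swap chains must use SampleCount = 1 in this SDK version; a common workaround is to render the eye views into your own multisampled FBO and resolve it (e.g. with glBlitFramebuffer) into the single-sample swap-chain texture each frame before committing. As a language-neutral illustration of the constraint itself, here is a hypothetical TypeScript validator that mirrors the rule the struct documents; none of these names are Oculus SDK APIs.

```typescript
// Illustrative only: a tiny stand-in for ovrTextureSwapChainDesc that
// encodes the documented rule "SampleCount only supported on depth textures".
type TextureKind = "color" | "depth";

interface SwapChainDesc {
  kind: TextureKind;
  width: number;
  height: number;
  sampleCount: number;
}

// Returns a list of validation errors; an empty list means the
// description would be acceptable under the documented constraints.
function validateSwapChainDesc(desc: SwapChainDesc): string[] {
  const errors: string[] = [];
  if (desc.width <= 0 || desc.height <= 0) {
    errors.push("Width and Height must be positive");
  }
  if (desc.sampleCount < 1) {
    errors.push("SampleCount must be >= 1");
  }
  if (desc.kind === "color" && desc.sampleCount > 1) {
    // This is the case behind the "Failed to create texture" error in the
    // post: MSAA sample counts are depth-only here, so color MSAA has to
    // happen in an app-side FBO that is resolved into the swap chain.
    errors.push("SampleCount > 1 is only supported on depth textures");
  }
  return errors;
}
```

In other words: keep SampleCount at 1 for the color chain, do the multisampled rendering in an FBO you own, and blit the resolved result into the swap-chain texture each frame.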