Why do I get this error? I deleted all coroutines but still get it
[OVRManager] OnDestroy OVRManager:OnDestroy() Coroutine couldn't be started because the the game object 'PrizmaPlusMove1' is inactive! Oculus.Interaction.InteractableColorVisual:UpdateVisual() Oculus.Interaction.Interactable`2:set_State(InteractableState) Oculus.Interaction.Interactable`2:Disable() [./Runtime/Mono/MonoBehaviour.cpp line -2131044792]

Where to get the libOpenCL.so for Quest 2?
Hello, I'm testing out a game concept that uses the Paralution C++ library for efficient matrix calculations on the GPU. I currently have the library exported as a shared library (only a .dll for Windows at the moment) that exposes some extern methods to C#, and I'm using that API within the Unity game engine to do the actual gameplay programming. Everything works fine when I run the game via Link/Editor, but I can't yet run on the Oculus Quest 2, which is where I want to launch the game if the prototype pans out. The reason is that I need to compile the library into an ARM64 Android plugin (an .so file) via Android Studio.

The nuance here, to my current understanding, is that the Oculus Quest 2 has the Qualcomm Adreno 600 GPU, which only supports OpenCL 2.0 as far as I know. The good news is that Paralution supports OpenCL-compliant chips, but I need the actual proprietary libOpenCL.so to compile in Android Studio, because it is a dependency of Paralution. This file usually resides in /system/vendor/lib/libOpenCL.so or /system/lib/libOpenCL.so on an Android device that supports OpenCL.

My problem is that there is no Google Play store on the Quest 2 where I could download a file manager that would let me dig into the Android file system to get this .so. I have done quite a lot of searching and have not found a solution. I am getting frustrated because I've been stuck for two weeks now, trying every day to port my prototype to the Quest 2. Does anyone have expertise on this topic that could help me reach my goal? I know what I'm trying to achieve is possible, but if I can't overcome the obstacle of getting access to the Adreno 600 version of libOpenCL.so, I will not be able to succeed. Thanks for reading, and may you have a wonderful rest of the day! - Dohn

Can't enable foveated rendering
I can't get it to work even in the provided sample (Oculus OpenXR Mobile SDK 38.0, XrCompositor_NativeActivity). I tried creating the foveation profile first and using it for all swapchains, and I tried creating one per swapchain and deleting it afterwards (just as in the sample, and as the OpenXR docs say is allowed). I tried setting it before enumerating images, and after. Of course the extensions are enabled, there is an XrSwapchainCreateInfoFoveationFB attached to the XrSwapchainCreateInfo, and so on; everything is done just as the docs say it should be, double-checked against the provided sample. The thing itself seems like it should be fairly easy to code and use, but it just doesn't work. With VRAPI I had some peculiarities: a glFlush I had added for testing was disabling foveated rendering, so I wouldn't be surprised if I did something else wrong in my own code. But why does the sample not work? And, more importantly, what should I check or do to make foveated rendering work?

Best way to find an Unreal / C++ coding team to develop an Oculus Quest App?
I have a team of excellent artists and Unreal Engine 3D modelers, and I am looking for an app development team to work with us to create an Oculus Quest app. I have looked on Upwork and have not found what we are looking for. Does anyone have any other suggestions?

How to correctly use SRGB on Oculus Mobile Native (C++)?
This is a Go + Quest question. I have an Oculus Mobile C++ app forked from the VRCubeWorld_NativeActivity example in the Oculus Mobile native SDK. I hooked my own code into `ovrRenderer_RenderFrame`, so right after the framebuffer and the eye matrices are set up I call my own rendering code. This rendering code is shared with desktop; it enables `GL_FRAMEBUFFER_SRGB` and it draws exclusively in the sRGB color space. I am trying to figure out how to convert VRCubeWorld_NativeActivity from RGB to sRGB, so that the sRGB output drawn by my rendering code is treated correctly by NativeActivity.cpp's framebuffer and EGL surface. I have found a way that "works" (I see the colors I expect) but I do not know if it is correct. You can see my exact NativeActivity code here (it is open source).

What I did:
- Pass `GL_SRGB8_ALPHA8` instead of `GL_RGBA8` in `ovrFramebuffer_Create`.
- Set `VRAPI_MODE_FLAG_FRONT_BUFFER_SRGB` in `parms.flags` when calling `vrapi_EnterVrMode()`.

What I did NOT do, because I tried these things and they had no effect:
- Call `setEGLattrib(EGL_GL_COLORSPACE_KHR, EGL_GL_COLORSPACE_SRGB_KHR);` after calling `eglCreateContext()` (I think we do this in our glfw version, and it is recommended in a long comment in VrApi_Types.h).
- Comment out the many `VRAPI_FRAME_LAYER_FLAG_INHIBIT_SRGB_FRAMEBUFFER` lines left over from the sample code.

What I need to know:
- Am I doing something wrong or unsafe, or incurring any penalty (like unnecessary RGB-to-sRGB conversions and back), by failing to explicitly set the `EGL_GL_COLORSPACE_SRGB_KHR` EGL attribute?
- What do the various `VRAPI_FRAME_LAYER_FLAG_INHIBIT_SRGB_FRAMEBUFFER` flags do?
- What does the "layer" code (the "black layer" and the "loading icon layer") do, and can I safely remove it? I have removed a lot of the code from the NativeActivity example because I was sure I did not need it, but there are other parts I did not understand, so I have left them in. Because there are not really any comments or documentation on the sample code, it is hard to tell what is necessary and what Oculus merely left for us as a convenience.

Unreal vs Unity?
For an experienced VR app developer with a high level of expertise in both Unreal and Unity, which engine is the better one for making an Oculus Quest 2 app? I am part of a team at the very beginning of developing a series of apps for the Oculus Quest 2. Our team is currently split between Unity with C# and Unreal with C++. What do you think?

OVR_CAPI functions for Boundary info don't work with Quest + Link
The functions for getting Boundary dimensions and geometry all return 1001 = ovrSuccess_BoundaryInvalid: "The call succeeded but the result is not a valid boundary due to not being set up." I tested these four functions:
- ovr_GetBoundaryGeometry()
- ovr_GetBoundaryDimensions()
- ovr_TestBoundaryPoint()
- ovr_TestBoundary()

I'm using LibOVR 1.43 in C++ on a Quest over Oculus Link. It seems the boundary data is not copied from the Quest to the Oculus driver on the PC. Is this a bug? Will these functions be implemented in a future version, and if so, when approximately? (If this should already work and I'm doing something wrong, please tell me.)

help with oculus rift pc sdk
Hello, I have been trying for a week now to write a simple app that displays an image on the Oculus Rift (the same image for both eyes). OculusTinyRoom is the only sample using OpenGL, and it is far too complicated to adapt. I am reaching the point where I am starting to doubt that this library is maintained any longer. Are there better alternatives? The Oculus SDK guide is outdated, and almost every sample I can find on the net is at least three years old and will not compile against the newest SDK; using an older SDK often results in blank screens or all kinds of errors. I am using C++ / OpenGL / Visual Studio 2017. Can anyone please help me with a link to an updated sample? Thank you!

UE4 switch eye rendered to spectator screen
Hi there, I'm working with a DK2 in Unreal Engine, and I really need to figure out how to change which eye is rendered to the single-eye spectator screen so that I can record some footage. There are a few modes which show both eyes, but all the single-eye modes default to the left eye, and for my application I absolutely need it to be the right eye. I saw a post on the Unreal forums about editing .cpp files and recompiling for the Vive, but I have no idea which files I would have to edit for the DK2, or how to recompile anything. I'm not allowed to post links, so the name of the thread is "Problem about monitor resolution display in vive vr mode" if anyone is curious. Essentially it just says to edit the line of code relating to the `DrawRectangle` function, changing the values so it selects a different area to crop. I would really appreciate it if someone could help me. Many thanks, H
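To illustrate what the `DrawRectangle` edit in that Vive thread amounts to: assuming the two eye views are packed side by side in one stereo texture (left eye in the left half, right eye in the right half), switching which eye is cropped to the spectator screen is just an offset of the source rectangle by half the texture width. A minimal sketch of that rectangle math follows; the `Rect` type, the `Eye` enum, and `EyeSourceRect` are illustrative names, not the actual UE4 API:

```cpp
// Simple axis-aligned rectangle in texture pixels
// (illustrative type, not the real UE4 structure).
struct Rect {
    int x, y, w, h;
};

enum class Eye { Left, Right };

// Given the full side-by-side stereo texture size, return the source
// rectangle that crops out a single eye: the left eye occupies
// [0, texW/2), the right eye [texW/2, texW).
Rect EyeSourceRect(int texW, int texH, Eye eye) {
    const int halfW = texW / 2;
    Rect r{0, 0, halfW, texH};
    if (eye == Eye::Right)
        r.x = halfW;  // shift the crop window to the right half
    return r;
}
```

For example, with a 1920x1080 side-by-side buffer, the left eye crops to x=0, width 960, and the right eye to x=960, width 960; editing the values passed to `DrawRectangle` to apply this kind of x-offset is what the Vive thread describes.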