FOV is really zoomed in on certain games?
Whenever I play some of my games, they are really zoomed in. I don't mean that the play areas are huge or small; I mean that everything is magnified on screen, as if the edges of the image were cut off and the remainder stretched to fill the display. It's really uncomfortable, and it's impossible for me to change it. HOWEVER, it only applies to games run through Oculus. If I run the same game on SteamVR, it's just fine, because I can actually change the FOV and world scale in the Steam menu (not that I needed to in the first place). What do I do about this?

Every third frame is frozen (sometimes two out of every 3 frames)
My OpenGL application is an emulator that displays on a virtual cinema screen. The emulated game runs on a different thread and OpenGL context than the VR cinema. It works perfectly (looking around the virtual cinema with a blank screen) until I start the emulated game while wearing the Rift. From then on, one (or sometimes two, if I'm really unlucky) of the three frames in the VR swapchain are frozen at whatever they were displaying at that moment and locked to my face. The remaining frame(s) render normally and respond to head tracking. Once it gets into this state it never recovers, no matter what I do, until I close my application and restart it. But if I take off the Rift to start the emulated game, then put the Rift back on after the game has started, everything works perfectly (looking around the virtual cinema with a screen displaying the emulated game). This only happens when I use the Oculus API; it works fine using the Rift with the SteamVR API. One of two bugs in the Oculus SDK must be happening:
1. Extra load on the CPU/hard disk/GPU during emulated-game startup, while the Oculus SDK functions are being called, is causing the SDK to malfunction.
2. Some OpenGL initialization I'm doing on the other thread (with shared display lists), while the Oculus SDK functions are being called, is causing the SDK to malfunction.
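If the second hypothesis is right, serializing the emulator thread's context setup against the render thread's SDK calls should make the freeze disappear, which makes it a cheap experiment. A minimal sketch of that pattern, with placeholder functions (`SubmitFrame`, `InitEmulator`) standing in for the real SDK and emulator work; these are not Oculus API calls:

```cpp
#include <cassert>
#include <mutex>
#include <thread>

// One lock guarding every call into the VR runtime, so the emulator
// thread's startup work can never overlap a frame submission.
std::mutex g_vrLock;
int g_framesSubmitted = 0;
bool g_emulatorReady = false;

void SubmitFrame() {                 // stands in for the per-frame SDK calls
    std::lock_guard<std::mutex> lock(g_vrLock);
    ++g_framesSubmitted;
}

void InitEmulator() {                // stands in for the emulator's GL setup
    std::lock_guard<std::mutex> lock(g_vrLock);
    g_emulatorReady = true;
}

void RunBoth() {
    std::thread emu(InitEmulator);   // startup work on a second thread
    for (int i = 0; i < 3; ++i) SubmitFrame();
    emu.join();
}
```

If the freeze vanishes with the lock in place, that points at the shared-context initialization racing the SDK rather than at load spikes.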
Oculus Rift S Controller bug on Unity?

When I use OVRInput.Get(OVRInput.Axis2D.SecondaryThumbstick), it usually returns the position of the SECONDARY thumbstick, but sporadically returns the PRIMARY controller's instead. The same problem occurs with the Oculus 'Distance Grab' demo, so it should be easy to reproduce. I'm using Oculus Integration 12.0 with Unity 2019.3.0f6 on the Oculus Rift S with the Touch controllers.

Cannot add test users
Hello, I am experiencing blocking problems with the test process: I cannot seem to create test users at all. I also tried adding my own account. I always receive the following message: "Sorry, we couldn't locate the user that you were looking for." Conversely, the page accepts IDs that are not emails (such as "aaa"). I can see those fake users in the list, but obviously I have no way of using them. Please help.

OculusSDK causing application crash on exit
Windows 10, Visual Studio 2010 C++
OculusSDK 1.8.0, firmware version 7.9
GeForce GTX 1070/PCIe/SSE2, OpenGL version 4.6.0 NVIDIA 398.11, GLSL version 4.60 NVIDIA

It seems that something has changed just in the past few days (as of 2018-08-23) that's causing our application to crash on exit. We wrap OculusSDK 1.8.0 in our own DLL that we load at run time via LoadLibrary, depending on application configuration options. (The application supports various VR tracking and display environments; the Oculus Rift is one option.) As of today, the application crashes on exit, but only if the OculusSDK has been opened, and only in the release build; the debug build does not crash. This happens whether or not we explicitly call FreeLibrary for our DLL at program exit. The OculusSDK logs these status messages in our console window:

[ At startup, we open with ovr_Initialize(nullptr) ]

23/08 19:07:20.832 {INFO} [Kernel:Default] [CAPI] LibOVR module is located at C:\Program Files\Oculus\Support\oculus-runtime\LibOVRRT64_1.dll
23/08 19:07:20.835 {INFO} [Client] Connected to the server running version (prod = 1).1.29.0(build = 651191) feature version = 0.
Client runs version (prod = 1).1.29.0(build = 0) feature version = 0
23/08 19:07:20.838 {DEBUG} [Kernel:Default] [HMDState] Using default profile default
23/08 19:07:20.838 {INFO} [Kernel:Default] IAD changed to 64.2mm
23/08 19:07:20.839 {DEBUG} [SharedMemory] Creating factory
23/08 19:07:22.178 {DEBUG} [D3D11_CliCompositorClient] CliD3D11CompositorClient::initialize 1
23/08 19:07:22.220 {DEBUG} [KMTSyncObject] Creating KMTHandle 0x0d2fc4d0
23/08 19:07:22.222 {DEBUG} [KMTSyncObject] Creating KMTHandle 0x0d2fc610
23/08 19:07:22.222 {DEBUG} [KMTSyncObject] KMTHandle::Create hDevice=0x80000380
23/08 19:07:22.222 {DEBUG} [KMTSyncObject] KMTHandle::Create hContexts[0] = 2147485504
23/08 19:07:22.222 {DEBUG} [KMTSyncObject] KMTHandle::Create hContexts[1] = 2147485824
23/08 19:07:22.286 {DEBUG} [D3D11_CliCompositorClient] CliD3D11CompositorClient::addGLRef 2
23/08 19:07:23.552 {INFO} [Kernel:Default] [HMDState] Detected the active window handle changed to 111af0ll
23/08 19:07:28.635 {WARNING} [Tracking:Filter] Prediction interval too high: 0.101367 s, clamping at 0.100000 s

[ The application runs well, with a good frame rate. I can move and resize the on-screen mirror window. Everything renders well in the Rift and on-screen. The test scene averages well over 100 frames per second, so the initial high prediction interval seems to be a startup fluke. ]

[ At exit, we close with ovr_Destroy(m_Session) and ovr_Shutdown() ]

23/08 19:07:31.115 {DEBUG} [D3D11_CliCompositorClient] CliD3D11CompositorClient::release 2
23/08 19:07:31.117 {DEBUG} [D3D11_CliCompositorClient] CliD3D11CompositorClient::release 1
23/08 19:07:31.117 {DEBUG} [D3D11_CliCompositorClient] Unblocking monitored fence
23/08 19:07:31.121 {INFO} [Kernel:System] Graceful shutdown: OnThreadDestroy
23/08 19:07:31.121 {INFO} [Kernel:System] Graceful shutdown: OnSystemDestroy
23/08 19:07:31.121 {DEBUG} [SharedMemory] Destroying factory
23/08 19:07:31.121 {DEBUG} [Kernel:Default] [Client] Disconnected
23/08 19:07:31.121 {INFO} [Kernel:System] Graceful shutdown: Stopping logger

At program exit, it dies in the Windows function __tmainCRTStartup(void) (in crtexe.c): "A problem caused the program to stop working correctly." All of our C++ wrapper-class close and destructor functions appear to return successfully. I have wrapped the entire body of our main function in try {} catch (...) {}, but it fails to catch this exception. I have put a 10-second sleep just before program exit, but that merely delays the crash; it does not avoid it. The console messages and the crash are both new behavior that I did not see (or notice) just a few days ago; the crash is definitely new, and this application has not changed during that time. Judging from those console messages, it seems that some Oculus resource isn't shutting down properly -- and that maybe somebody has been struggling with that. The fact that it crashes in the release build but not the debug build suggests that something has not been initialized properly (e.g., to zero): maybe the debug build initializes it, or maybe the lower degree of optimization avoids some memory corruption. Maybe there's a dangling reference to SharedMemory. Maybe the logging itself is causing the crash.
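That release-only symptom is classic indeterminate-memory behavior: debug runtimes typically fill memory with known patterns, while release builds leave whatever bytes were already there. A self-contained illustration of the defensive fix, using a stand-in struct rather than any real SDK type: value-initializing with `{}` guarantees every member is zero in both build configurations.

```cpp
#include <cassert>

// Stand-in descriptor, NOT a real Oculus type; illustrates why a struct can
// hold garbage in release builds while appearing fine in debug builds.
struct Desc {
    unsigned flags;
    unsigned bindFlags;
    void*    callback;
};

Desc MakeZeroedDesc() {
    Desc d{};    // value-initialization: every member is zero in ALL builds
    return d;    // plain `Desc d;` would leave the members indeterminate
}
```

Auditing every struct passed to the SDK for `{}` initialization is a cheap way to rule this hypothesis in or out.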
I did not request logging, and in fact tried to explicitly disable it with

ovrInitParams init_params = { 0, 0, nullptr, 0, 0 };
ovr_Initialize(&init_params);

but the logging messages still appear in the console window. The appearance of the logging messages correlates with the crash-at-exit behavior: they're both new. Please advise. Thanks. -- Ted Hall <twhall@umich.edu>
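One avenue worth trying: if zeroing ovrInitParams does not silence the runtime, registering a LogCallback may at least divert the messages away from the console, and would show whether the logger itself is implicated in the crash. A sketch under the assumption that the callback signature matches the LibOVR 1.x headers (`uintptr_t userData, int level, const char* message`); verify against your SDK's OVR_CAPI.h before relying on it:

```cpp
#include <cstdint>

// Signature assumed to match ovrLogCallback in the LibOVR 1.x headers.
typedef void (*LogCallback)(uintptr_t userData, int level, const char* message);

int g_swallowedMessages = 0;

// Counts and discards messages instead of letting them reach the console.
void QuietLog(uintptr_t /*userData*/, int /*level*/, const char* /*message*/) {
    ++g_swallowedMessages;
}

// In the real application this would be (sketch, untested against the SDK):
//   ovrInitParams params = {};
//   params.LogCallback = QuietLog;   // route messages here, not to stdout
//   ovr_Initialize(&params);
```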
ovr_CreateTextureSwapChainGL() failed

So every time I attempt to create the swap chains, I get the error code ovrError_InvalidParameter == -1005 with the following message returned by ovr_GetLastErrorInfo(): "BindFlags not supported for OpenGL applications." I am using an Oculus DK2 with SDK 1.20 and the latest NVIDIA drivers. This issue started many months ago, probably around SDK 1.12 or so. The code is a direct copy-paste from the OculusRoomTiny (GL) sample, or from the Texture Swap Chain Initialization documentation:

```cpp
ovrTextureSwapChainDesc desc = {};
desc.Type = ovrTexture_2D;
desc.ArraySize = 1;
desc.Format = OVR_FORMAT_R8G8B8A8_UNORM_SRGB;
desc.Width = w;
desc.Height = h;
desc.MipLevels = 1;
desc.SampleCount = 1;
desc.StaticImage = ovrFalse;
desc.MiscFlags = ovrTextureMisc_None;   // tried a direct assignment, but it still fails
desc.BindFlags = ovrTextureBind_None;   // tried a direct assignment, but it still fails

ovrResult res = ovr_CreateTextureSwapChainGL(session, &desc, &swapTextures);
// res is always ovrError_InvalidParameter
```

The initialization sequence is the following (I am skipping error checks for clarity):

```cpp
ovr_Detect(0);
ovrInitParams params = { ovrInit_Debug | ovrInit_RequestVersion, OVR_MINOR_VERSION, OculusLogCallback, 0, 0 };
ovr_Initialize(&params);
ovrGraphicsLuid luid;
ovr_Create(&hmdSession, &luid);

//------------------------------------
// in the function called at the WM_CREATE message:
//   <Create core GL context, make current>
//   <init GLEW library>
//------------------------------------

hmdDesc = ovr_GetHmdDesc(hmdSession);
ovrSizei ResLeft  = ovr_GetFovTextureSize(hmdSession, ovrEye_Left,  hmdDesc.DefaultEyeFov[0], 1.0f);
ovrSizei ResRight = ovr_GetFovTextureSize(hmdSession, ovrEye_Right, hmdDesc.DefaultEyeFov[1], 1.0f);
int w = max(ResLeft.w, ResRight.w);   // 1184
int h = max(ResLeft.h, ResRight.h);   // 1472
for (int eyeIdx = 0; eyeIdx < ovrEye_Count; eyeIdx++)
    if (!eyeBuffers[eyeIdx].Create(hmdSession, w, h))   // this function's code is provided above
    {
        ovr_GetLastErrorInfo(&err);
        log_error(err.ErrorString);   // "BindFlags not supported for OpenGL applications."
    }
```

So according to the error message, the SDK thinks that I assigned something to desc.BindFlags, while I did not. I tried to directly assign the ovrTextureBind_None value to it (which is just zero), but still no success. I traced all the variable values in the debugger; they are the same as in the OculusRoomTiny (GL) sample. The only difference in my code that I can see is that I am using the GLEW library to handle OpenGL extensions, while the sample uses OVR::GLE, which it initializes immediately after wglMakeCurrent():

```cpp
OVR::GLEContext::SetCurrentContext(&GLEContext);
GLEContext.Init();
```

Can this be the cause? But I don't want to switch to Oculus' extension library; my project is not Oculus-exclusive, and it supports the Vive and non-VR modes as well. If it is a bug inside libovr, I ask the Oculus team to fix it!
Unity Camera Preview different than Rift view (bug?)

I'm making a multiplayer Unity game using Photon Unity Networking to handle the multiplayer. I can re-create this issue by making a Room with one player (P1); their camera loads into position and rotation correctly, even if they load in off-kilter. Then a second player (P2) joins the Room, but they load in with their Rift pointing at an odd angle (physically pointing the HMD off-center or even sideways, not straight forward and level). Their camera's "center" will be that off-kilter view instead of a level view. I have no idea why this is. (I'm able to re-create this on my own computer by starting up two instances of my game, and on two separate computers loading the game with two HMDs.) There is only one OVRManager in the scene (each player loads in their own, but the OVRManager is not networked). The other players only load in the graphics of their player Avatar, with their camera disabled, so there aren't any conflicts. What I find odd is that if I do this in the Unity editor and click on the P2 active CenterEyeCamera, the "Preview" for that camera is correct and level, but what's piped out to the Game window and the Rift is off-kilter. Even the CenterEyeCamera rotation is level; it's like it got out of sync somehow with the in-game camera. Is there any way to access what the camera is outputting to the Rift and get that object's rotation? Or to manually recenter, other than UnityEngine.VR.InputTracking.Recenter(), as this just puts the CenterEyeCamera back to where it should be but doesn't fix the off-kilter output? Would it be better to manually add the Camera onto the CenterEyeCamera rather than just disabling/enabling it? Or is there some special magic that Unity and the Rift do to get it to output to the Rift display that's embedded in that camera's settings (i.e., what settings do I need to set on a camera object)? Apologies for the vague questions, but I'm not sure how else to debug this issue.

CV1 Bug relative tracking in fullmotion rigs
Not sure if this is the right place for bugs, but anyhow: since the DK2 the tracking has changed, and now I've got these two bugs:
- If the CV1 camera is mounted on a motion rig and roll angles exceed 10°, the camera loses tracking every few seconds (the vertical/horizontal offset flickers).
- The CV1 can't be set up with relative tracking (mounting the sensor on the moving part of the rig), because the roll, yaw, and pitch vectors are tracked by the HMD only.
- A workaround like disabling/configuring the internal sensors is not possible with the CV1 SDK.
All this as far as I know. Any hints, clues, or solutions?
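Until the runtime supports a sensor mounted on the moving platform, one partial workaround is to take the rig's own motion (from the motion platform's control data) and subtract it from the tracked pose before rendering. A minimal yaw-only sketch of that compensation; `MakeRelative` is a hypothetical helper, not an SDK feature, and a real implementation needs full 3-DoF rotation (e.g. quaternions):

```cpp
#include <cassert>
#include <cmath>

// Pose expressed relative to the moving platform: the position offset is
// rotated into the rig's frame, and the rig's yaw is removed from the HMD yaw.
struct RelPose { double x, z, yaw; };

RelPose MakeRelative(double hmdX, double hmdZ, double hmdYaw,
                     double rigX, double rigZ, double rigYaw) {
    double dx = hmdX - rigX, dz = hmdZ - rigZ;
    double c = std::cos(-rigYaw), s = std::sin(-rigYaw);
    return { c * dx - s * dz,    // offset rotated into the rig's frame
             s * dx + c * dz,
             hmdYaw - rigYaw };  // orientation relative to the rig
}
```

This only compensates rendering, not the tracker's own filter, so it would not fix the tracking-loss flicker above 10° roll; that part needs a runtime-level fix.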