FPS not getting more than 30
How can we increase the FPS of our Oculus Quest app? We are using the XR Interaction Toolkit. In an empty scene with just a camera I am getting 55 FPS at most, but in any real project I am getting just 30 FPS. I have also applied the usual URP settings, which you can go through in the attached photos, and I have applied occlusion culling as well.
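For context on why the number lands on a value like 30 rather than degrading smoothly: presentation in VR is locked to vsync, so once a frame misses the render budget, the app tends to drop to an integer fraction of the display's refresh rate. A minimal sketch of that arithmetic, in plain Python with illustrative function names (not part of any SDK):

```python
import math

def frame_budget_ms(refresh_hz):
    """Milliseconds available to render one frame at the given refresh rate."""
    return 1000.0 / refresh_hz

def effective_fps(refresh_hz, frame_time_ms):
    """Frame rate the app settles at when every frame takes frame_time_ms:
    presentation is locked to vsync, so a late frame waits for the next
    vsync interval and the rate drops to an integer fraction of refresh."""
    intervals = max(1, math.ceil(frame_time_ms / frame_budget_ms(refresh_hz)))
    return refresh_hz / intervals

# At 72 Hz the budget is ~13.9 ms; a 20 ms frame drops the app to 36 FPS.
```

The practical takeaway is that small optimizations may not move the displayed FPS at all until the frame time crosses back under a vsync boundary.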
Too Much Shooting! How about REAL LIFE?

Too many of these games require me to shoot things, even racing! For crying out loud, why can't I JUST RACE? NASCAR drivers don't go around making laps shooting at other drivers or coins. And when it comes to a lot of the shooting games, WHY 2 HANDS FOR EVERYTHING? That's not realistic. When I fire handguns in real life, it's only 1 gun; not a handgun in each hand, or a handgun in 1 hand and a shotgun in the other. For me, it takes the REALISM out of it. It's like you think every gunfight should be fought to dubstep. Can't you just keep some games simple and realistic to actual life?
Help Needed: LiveLink and Movement SDK Issues With Custom Avatar

Hi all - I'm attempting to follow Meta's implementation of the Movement SDK on a custom avatar, but am having difficulty getting the retargeting to function as intended. As you'll see in this OBS recording, many of the bones have broken rotations. Any advice would be appreciated!
MR-to-VR transition

Hi there, I have a simple question for anyone who can answer it. Are there examples of apps that are live right now that transition from Mixed Reality to Virtual Reality? Is this something that is permitted within the bounds of the current Unity All-In-One SDK? I have a client who wants to make a critical narrative moment out of starting an experience with a few objects in Mixed Reality, inviting you over to a certain vantage point outside of where you're currently standing but near you, and then switching to fully immersive VR once you enter the trigger point. Is this something that someone has done before? Is there a way to toggle between the passthrough of Mixed Reality and a fully rendered environment, like in the old days? Alternatively, has anyone faked this before by, e.g., toggling the passthrough visibility in scripting and then throwing up a skybox or something similar? I'm just starting the initial R&D phase of this project and am just getting started with the Presence Platform, but I have been doing both Unity- and Unreal-based Oculus development since the early days. I'm just trying to reach out to the community early on to get some insight into precedent as I start to dig into the SDK myself. Thanks!
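The "trigger point" idea above can be prototyped as a small piece of mode-switching logic, independent of any engine. A hedged sketch in plain Python: the `mode` string stands in for whatever engine-specific calls actually disable the passthrough layer and enable the skybox, and all names are made up for illustration:

```python
from dataclasses import dataclass
import math

@dataclass
class TransitionTrigger:
    """Switches from passthrough MR to fully rendered VR when the player's
    head enters a spherical trigger volume. One-way by design: once in VR,
    stepping back out does not revert to MR."""
    center: tuple
    radius: float
    mode: str = "MR"

    def update(self, head_position):
        if self.mode == "MR":
            if math.dist(head_position, self.center) <= self.radius:
                self.mode = "VR"  # here: disable passthrough, enable skybox
        return self.mode

trigger = TransitionTrigger(center=(2.0, 0.0, 1.0), radius=0.5)
trigger.update((0.0, 1.6, 0.0))  # head far from the trigger: still "MR"
```

In an actual build, `update` would be driven every frame from the tracked head pose, and the mode change would also kick off whatever fade or narrative beat covers the swap.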
Aligning world with Guardian Play Area

I seem to be at a loss; I have been working on this problem for the last couple of days. I am trying to create a game where the player walks through the environment by walking around their play area, given it meets the requirements (7 ft x 7 ft). However, I cannot seem to make the world align with the play area. I am able to get the play area and calculate where the player is inside of it, as well as calculate the angle by which the world needs to be rotated to become aligned with the play area. Nothing I have done has worked, despite the numbers I have gotten. Has anyone else done this who can give me pointers or a solution on how this might be done?
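Since the post already computes an angle, here is one way the alignment math can be sketched, assuming a rectangular play area reported as corner points on the horizontal XZ plane (function names are illustrative): measure the heading of one play-area edge, then rotate the world, or the tracking-space parent, by the negative of that heading.

```python
import math

def play_area_yaw(corners):
    """Heading (radians) of the play-area edge from corner 0 to corner 1,
    measured on the horizontal XZ plane with atan2(x, z)."""
    (x0, z0), (x1, z1) = corners[0], corners[1]
    return math.atan2(x1 - x0, z1 - z0)

def rotate_about_y(point, angle):
    """Rotate a 2D (x, z) point about the vertical axis by `angle` radians."""
    x, z = point
    c, s = math.cos(angle), math.sin(angle)
    return (c * x + s * z, -s * x + c * z)

# Align: rotate the world by the negative of the measured edge heading,
# so the play-area edge ends up parallel to the world's forward axis.
```

A common gotcha with this kind of alignment is applying the rotation about the world origin instead of about a point inside the play area, which rotates the world correctly but also translates it away from the player; rotating the parent transform around the play-area center avoids that.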
Why are my gameObject's mesh edges flickering in Oculus?

Hello guys, I am making a quite big world for my game in Unity for Oculus Rift. Initially, when I tried my game with default settings, it looked somewhat blurry and the mesh edges were flickering like hell (especially trees). So I changed the RenderScale value from 1 to 2. It made a huge difference in my game's quality, but FPS went below 60, so I adjusted the RenderScale value to 1.2; now my game quality is okay and FPS is also okay. But the problem I'm facing now is the flickering on mesh edges. I even tried 8x MSAA, but the flickering still exists. Does anyone know how to solve this? FYI: the flickering only appears on the Oculus screen; the PC monitor does not show that much flickering. Oculus version: Oculus Rift CV1. Current settings: Rendering Path -> Forward, AA -> 8x MSAA, VSync -> Disabled, Color Space -> Gamma, Stereo Rendering Method -> Single Pass. PC specs: RAM -> 32GB, Processor -> Intel(R) Core(TM) i7-6700K CPU @ 4.00GHz, Gfx -> NVIDIA GeForce GTX 1070. If you guys need additional info, I will provide it! :)
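One reason RenderScale 2 tanked the frame rate: the scale applies to both axes of the eye buffer, so fill-rate cost grows roughly with its square. A quick sketch of that arithmetic (the base resolution below is a made-up example, not the CV1's actual eye-buffer size):

```python
def eye_buffer_pixels(base_width, base_height, render_scale):
    """Pixels rendered per eye after applying the render scale to both axes."""
    return round(base_width * render_scale) * round(base_height * render_scale)

def relative_cost(render_scale):
    """Approximate fill-rate cost relative to render scale 1.0."""
    return render_scale ** 2

# With a hypothetical 1000x1000 eye buffer: scale 2.0 renders 4x the
# pixels of scale 1.0, while scale 1.2 renders about 1.44x.
```

This is why a modest value like 1.2 is often the sweet spot: it buys meaningful supersampling against edge shimmer for roughly a 44% pixel-cost increase instead of the 300% increase that scale 2.0 demands.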
How to achieve texture and model quality equal to Oculus Workrooms

I am working on a hospital room, which has all the required equipment. There are a lot of jagged edges that are easily visible. When I tried Workrooms on Oculus, the texture quality and the model quality of the room were excellent. I would like to know if there are any specific settings w.r.t. the model, texture, material, or project that are needed to achieve this type of quality.
Virtual Glasses

If all glasses do is warp the way that light hits our eyes to make our vision clearer, then do you think it would be possible to code a VR headset to take a glasses prescription and change the way it presents images to mimic the glasses and correct the wearer's vision in the virtual space? The fact that we have glasses prescriptions means that we know how those measurements affect our vision, so why can't we program the headset to make those changes to how it presents images to us? If you combine that with eye tracking and foveated rendering, then it doesn't seem like it should be impossible.
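The premise holds up as far as the arithmetic goes: a prescription's sphere value is given in diopters, which are reciprocal metres, so it converts directly into distances software could reason about. A hedged sketch of that conversion (illustrative code, not any headset's API):

```python
def far_point_m(sphere_diopters):
    """Far point of a myopic eye: the farthest distance (in metres) at
    which it can focus unaided. Diopters are reciprocal metres, so a
    -2.00 D prescription puts the far point at 1/2 = 0.5 m."""
    if sphere_diopters >= 0:
        raise ValueError("this far-point formula applies to myopic "
                         "(negative-sphere) prescriptions only")
    return 1.0 / -sphere_diopters
```

The practical obstacle is optical rather than computational: current headsets place the display at a single fixed focal distance set by their lenses, so rendering changes alone can shift where and how sharply things are drawn, but they cannot refocus the light reaching the eye the way a corrective lens does.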
Can't sync Unity "play mode" with Oculus Quest 2 for developing/testing

I am entering the first stages of becoming a VR developer, and I am currently stuck on a very important aspect of the development process. I followed this tutorial (google: Developing for VR with Quest 2 & Unity for the First Time) on Unity version 2020.2.3f1, using a new Quest 2. I reached the end successfully and was able to see my scene and red laser hands in my headset! Very exciting for me. However, this only works when I "Build and Run", which is a very tedious development strategy when working on a project and ensuring it works every step of the way. So I would like to be able to click the Play button in the editor and have my scene "play" on my headset; however, it doesn't appear to be so simple. Basically, I have no idea how to implement this important aspect of development. I have the Oculus app on my phone and the Oculus desktop app on my computer, but I can't find any clear instructions on how to click Play in the Unity editor and have it magically appear in my headset (like it does when I Build and Run). Any advice or links to good resources would be highly appreciated! Thanks for reading and have a nice day!
How to make the cursor in the center of the screen?

Hello everyone! I am developing an application in which I need to implement gaze-based control using a gaze cursor. I have tried two implementations:

1) A cursor that slides over surfaces, similar to the example from the Oculus Integration package. I ran into the problem that, when using IPD tracking and a camera for each eye, it is noticeable how the cursor jumps from distant surfaces to near ones. This is due to the distance between the eyes; the change in the size of the cursor when switching between near and far surfaces is unnoticeable only if you render a single image for both the left and right eye. I need to preserve IPD tracking to create the VR effect, so this implementation is not suitable.

2) An implementation using a culling mask, where we create a separate camera that draws the cursor in the center of the screen and superimposes it on the image from the other cameras. Here there was a problem of a double cursor, because when the cursor is drawn relative to the central camera, that center is located at different points for the left and right cameras.

Please suggest a suitable implementation or a modification of the above. Any help would be welcome :)
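One common approach for case 1 is to keep the cursor's angular size constant: raycast from a single center-eye anchor (midway between the two eye cameras) rather than from either eye, place the cursor in world space at the hit point, and scale it with hit distance. A sketch of the scaling math, with an arbitrarily chosen one-degree default (not a value from any SDK):

```python
import math

def cursor_world_scale(hit_distance, angular_size_deg=1.0):
    """World-space diameter that makes a gaze cursor subtend a constant
    visual angle regardless of how far away the surface it sits on is:
    diameter = 2 * distance * tan(angle / 2)."""
    return 2.0 * hit_distance * math.tan(math.radians(angular_size_deg) / 2.0)

# The cursor's world size grows linearly with distance, so its apparent
# (angular) size on screen stays the same across near and far surfaces.
```

Because both eyes then see the same world-space quad at the hit point, there is no double cursor, and the constant angular size keeps the jump between near and far surfaces from standing out while still preserving IPD tracking.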