Meta Quest 3 bitrate and latency for PCVR over Wi-Fi 6E, Air Link, and USB-C
Can anyone provide this information? My only interest in the Quest 3 is whether it can provide a better PCVR experience than the Quest 2. The Quest 3 has Wi-Fi 6E capability, but will it improve on the Quest 2's latency and video compression? Right now the Quest 2 averages 47 to 57 ms of latency, and that, combined with the video compression, is unacceptable for people used to PCVR headsets with zero compression and 11 ms of latency. I'm going to order one from Amazon so I can return it if it can't improve on the Quest 2, but any help beforehand would be appreciated 🙂

Quest 3 audio input latency?
We have an app that works fine on Quest 2; on Quest 3 we have sync issues. The feature records audio alongside existing audio (and syncs it so that playback matches the recording). What could be different about the Quest 3 in this regard? Thanks for any clue, cheers.

AMD / OpenXR / Motion Reprojection / AMS2
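One way to quantify a sync problem like this (a generic diagnostic sketch, not a Quest API call; the signals, sample rate, and offset below are made up for illustration) is to cross-correlate the recorded track against the reference track and take the lag with the highest correlation:

```python
def best_lag(reference, recorded, max_lag):
    """Return the lag (in samples) at which `recorded` best matches `reference`.

    Positive lag means the recording trails the reference (input latency).
    Brute-force cross-correlation; fine for short diagnostic clips.
    """
    best, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = sum(
            reference[i] * recorded[i + lag]
            for i in range(len(reference))
            if 0 <= i + lag < len(recorded)
        )
        if score > best_score:
            best, best_score = lag, score
    return best

# Hypothetical example: a click at sample 100 in the reference shows up
# at sample 148 in the recording, i.e. 48 samples late (1 ms at 48 kHz).
reference = [0.0] * 1000
recorded = [0.0] * 1000
reference[100] = 1.0
recorded[148] = 1.0
lag = best_lag(reference, recorded, max_lag=200)
print(lag)                    # 48
print(lag / 48000 * 1000)     # offset in milliseconds
```

Comparing the measured lag on Quest 2 vs. Quest 3 builds would show whether the Quest 3 path adds a constant offset you can compensate for.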
I'm trying to get Motion Reprojection working on a system with a 6900 XT GPU. Unfortunately, whenever I try to enable it, it simply crushes my fps. Under favourable circumstances (90 fps with 30-40% headroom), enabling it drops the fps to 20. I have tried every combination of settings: Enhanced / Original / Best frame rate, 1/2 frame rate, and Prefer frame rate over latency on/off. Nothing works; whenever it is enabled, my fps is crushed. I want to use Motion Reprojection as a buffer for certain conditions, but unfortunately I am unable to. It has always behaved iffy on AMD cards, but at least it used to work much better. SteamVR works flawlessly, but fidelity is worse.

Introducing artificial latency
Hello! I am working on an experiment for which I need to introduce latency into the headset's rendering, and I need to be able to control how much latency (somewhere between 100 ms and 2 seconds). This way, if the user performs an action, it is reflected in what they see only after the simulated latency has elapsed. My initial idea was to "take" the frames from the headset, accumulate them in some sort of queue, and pass them back to the headset after the specified delay. However, after doing some research online, I could not find any similar existing projects, and unfortunately I am not experienced enough to develop my own solution from scratch. Do you know a relatively straightforward way of achieving this setup? I would greatly appreciate anything, from an existing similar project to some simple pointers. Thank you for the help!

Is the source code for the Unity 5 Windows ovrplugin.dll available?
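The queue idea described above can be sketched generically (plain Python logic, not a Quest or OpenXR API; in a real pipeline the placeholder "frames" would be intercepted swapchain images or render-texture copies):

```python
import collections
import time

class DelayBuffer:
    """Hold frames for `delay_s` seconds before releasing them,
    simulating a fixed end-to-end latency between action and display."""

    def __init__(self, delay_s):
        self.delay_s = delay_s
        self._queue = collections.deque()  # (arrival_time, frame) pairs

    def push(self, frame, now=None):
        """Enqueue a frame, stamped with its arrival time."""
        self._queue.append((now if now is not None else time.monotonic(), frame))

    def pop_ready(self, now=None):
        """Return all frames whose delay has elapsed, oldest first."""
        now = now if now is not None else time.monotonic()
        ready = []
        while self._queue and now - self._queue[0][0] >= self.delay_s:
            ready.append(self._queue.popleft()[1])
        return ready

# Usage with explicit timestamps so the behaviour is deterministic:
buf = DelayBuffer(delay_s=0.5)      # 500 ms artificial latency
buf.push("frame-0", now=0.0)
buf.push("frame-1", now=0.011)      # roughly a 90 Hz frame cadence
print(buf.pop_ready(now=0.1))       # [] - nothing is 500 ms old yet
print(buf.pop_ready(now=0.6))       # ['frame-0', 'frame-1']
```

One caveat for a real headset: delaying rendered frames also delays the head-tracking they were rendered with, which is nauseating at these latencies. If only the hand/action latency matters for the experiment, it may be safer to delay the input poses feeding the scene rather than the final frames.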
I am trying to track down a problem where the touch controller positions reported in Unity 5 via OVRCameraRig.TrackingSpace.RightHandAnchor.transform are not predicted correctly during high-speed (10 m/s) hand-controller motions. The reported positions lag behind the real-world positions by 10 centimeters or more, corresponding to at least 10 milliseconds of latency. The rendering is very simple and no frames are being dropped. I would like to know whether the poses are being predicted for the next frame's display time, and to see exactly which Oculus SDK calls are being made (e.g. GetPredictedDisplayTime(), GetTrackingPoseState()). But those calls are made inside the Unity Oculus ovrplugin.dll, in a routine called ovrp_GetNodePose(), and I have not been able to find source code for that library.
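The latency figure above follows directly from dividing the position lag by the hand speed; a minimal sanity check using the numbers quoted in the post:

```python
def implied_latency_ms(lag_m, speed_m_per_s):
    """Latency implied by a position lag observed at a given motion speed."""
    return lag_m / speed_m_per_s * 1000.0

# 10 cm of lag during a 10 m/s swing implies about 10 ms of unpredicted
# latency, i.e. roughly one 90 Hz frame (~11.1 ms) of missing pose
# prediction - consistent with poses being sampled for the current frame
# rather than predicted forward to the next display time.
print(implied_latency_ms(0.10, 10.0))  # about 10 ms
```

If the plugin were predicting to the display time, the residual lag at constant velocity should be near zero, so this back-of-the-envelope check is a quick way to distinguish "no prediction" from "prediction with the wrong timestamp".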