Controller tracking and refresh rate
I'm looking for an in-depth understanding of Quest 2 controller tracking and the differences between the 72 Hz and 90 Hz headset refresh rates. Does the controller actually track any differently between these two settings? My understanding is that the controller tracks at 60 frames per second, so does that mean there is no difference in controller tracking quality between the 72 Hz and 90 Hz settings? I'm using a low-pass filter on the controller speed, and I can definitely feel that when my app is set to 90 Hz, I get higher speed readings on the controller than at 72 Hz. Any insight here would be appreciated, thanks.

Shared space in same physical location for multiple users -- Space Sense?
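Following up on the refresh-rate question above: the higher speed readings at 90 Hz are consistent with a smoothing coefficient that is applied once per frame rather than per unit of elapsed time, so the filter responds faster when more frames run per second. A minimal sketch of a time-constant-based exponential low-pass filter that behaves identically at any frame rate (function names and the `tau` value are illustrative, not from the post):

```python
import math

def smooth_speed(prev_filtered: float, raw_speed: float, dt: float,
                 tau: float = 0.05) -> float:
    """Exponential low-pass filter whose response depends on elapsed
    time dt (seconds), not on how many frames ran per second."""
    # alpha grows with dt, so 72 Hz and 90 Hz apps converge identically.
    alpha = 1.0 - math.exp(-dt / tau)
    return prev_filtered + alpha * (raw_speed - prev_filtered)

# Feed a constant speed for one simulated second at each frame rate.
s72 = s90 = 0.0
for _ in range(72):
    s72 = smooth_speed(s72, 1.0, 1 / 72)
for _ in range(90):
    s90 = smooth_speed(s90, 1.0, 1 / 90)
# Both filters end up at (nearly) the same value despite different rates.
```

If the existing filter uses a fixed per-frame blend factor (e.g. `filtered = 0.9 * filtered + 0.1 * raw`), switching to a `dt`-based alpha like the above should remove the 72 Hz vs 90 Hz discrepancy.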
Hi Everyone, We have an experience in which multiple people had been able to slowly walk through an orchestra as the sound changed. It worked very nicely for a large public. But: we had been using Space Sense to give people a sense of the others in the environment (which was being monitored), and it seems to have disappeared or failed last year. Sometimes one sees a bit of purple throughput, like a ball of tumbleweed, but basically it's not working. This was essential to our mode of presentation. Does anyone have any idea 1) what happened to Space Sense, or 2) what is the simplest, most robust replacement? Am I missing something simple?

Gaze/Head Tracking Combined w/ Limited Hand Tracking or Gamepad Controller (Ex. Apple Vision Pro UI)
This implementation could drastically increase the ease of use of the Quest 3 UI and bring it much closer to the usability of the Apple Vision Pro UI for a fraction of the cost, while still allowing the flexibility of trackable controllers, conventional hand tracking, and the use of gamepad controllers or other Bluetooth devices such as keyboards or mice. This post is directed to other developers who may have ideas to suggest as a solution, or who may be interested in incorporating some of these ideas into their projects.

Related post: https://communityforums.atmeta.com/t5/Get-Help/How-to-reposition-or-resize-windows-with-a-gamepad-controller/m-p/1190574#M300771 (How to reposition or resize windows with a gamepad controller? Xbox One controller)

"I love the gaze cursor feature when using a controller. It lets me quickly take actions when I don't want to use the Quest controllers. One thing bugs me though: I'm unable to drag using the gaze cursor. If I hold the A button on an Xbox controller it will rapidly select an item instead of selecting and holding on to it. Are there any tricks around this?"

This is directly copied from a Reddit post by Independent_Fill_570 a month ago, and it hasn't received any responses yet: https://www.reddit.com/r/MetaQuestVR/comments/1byjora/is_it_possible_to_resize_windows_with_a_contro...

I'm having the same issue. I love the ability to use gaze tracking for the cursor, but it restricts the ability to resize windows, reposition windows, and long-click (it only single-clicks repeatedly instead), so selecting text longer than a single word is also an issue. Gaze control seems to be the best substitute for the Apple Vision Pro's eye-tracking cursor. Is there any way of using gaze control to guide the cursor, but a hand-tracking gesture to select or click, without it engaging as conventional hand tracking while gaze tracking is enabled?
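The hold-versus-repeated-click behaviour described above is essentially a button-event interpretation problem: the system re-emits discrete click events while the button is down instead of treating press and release as separate edges. A minimal sketch of edge-based handling that would allow press, hold-to-drag, and release (all names are illustrative; this is not Quest system code):

```python
class DragAwareButton:
    """Turn raw per-frame button state into press/hold/release events,
    so holding the button drags instead of emitting repeated clicks."""

    def __init__(self):
        self.was_down = False

    def update(self, is_down: bool) -> str:
        if is_down and not self.was_down:
            event = "press"      # rising edge: start a click-and-hold
        elif is_down:
            event = "hold"       # keep dragging whatever was grabbed
        elif self.was_down:
            event = "release"    # falling edge: end the drag / complete the click
        else:
            event = "idle"
        self.was_down = is_down
        return event

btn = DragAwareButton()
events = [btn.update(s) for s in [False, True, True, True, False]]
# events == ["idle", "press", "hold", "hold", "release"]
```

With this interpretation, holding A while the gaze cursor moves would naturally map to a drag (reposition/resize/text selection) rather than a burst of single clicks.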
I've spent many hours now searching the internet, looking into potential sideloading options, and even poring over the Oculus PC source code, but I haven't really found anyone talking about this except for the one unanswered Reddit post I've linked to. The Xbox One controller has basically the same buttons available as the Quest 3 controllers do, minus the tracking, but with gaze tracking it would be wonderful to have the controller buttons mapped properly, and I can't seem to find a way to remap the gamepad keybinds without it being run through Steam and a PC link. I'd like to be able to do this natively on the Quest 3 standalone. Hand tracking only for selecting or clicking would also be great, but even just having the buttons mapped properly, so that pressing A clicks and holding A holds the click until released, would fix the issue. I am aware that pressing the select and menu buttons together toggles gaze tracking off and enables very limited use of the Xbox controller buttons, as with a keyboard or similar device, but that's not what I'm asking about here. Thanks in advance to anyone who has any helpful information to provide on this issue. If gaze tracking were combined with limited hand-tracking gestures like making a closed fist, I feel like the quality of this product could more easily rival the user interface of the Apple Vision Pro.

Travel mode is simple
Use the non-dominant controller as a point of constant reference. With multimodal input now live, you can let the user keep using the non-dominant controller while maintaining two points of control. The non-dominant controller can be set down or clipped securely onto a level surface, and its IR LEDs used for tracking could serve as a point of constant reference. (In driving there are too many variables outside, and it would require a partnership with car companies to make vehicles with certain specs, until AI becomes better.) Using the controller's IR trackers as a reference point, you could make the HMD effectively lock on to a position and maintain 6DoF. Think of it as inside-out/outside-in tracking: the HMD tracks the controller as its reference instead of the environment, and rather than relying on base stations, it just uses the one controller as a pseudo base station.

Although not as good as an IMU rotating with the cameras, you could utilise the cameras and some IMU AI magic to ignore a sharp or slow rotation: without visual data from the cameras saying the controller has moved position, smooth it out, or only take data from the controller's position. This could in theory work for planes, trains and automobiles... see what I did there? That was a great film. But seriously, this could allow passthrough features to work more smoothly without needing 6DoF to be turned off and 3DoF enabled: ignore directional IMU input, prioritise visual turning data, and keep a small amount of IMU for head turns. All this, and you still have one controller for accurate typing, hand controls, and tracking with multimodal.

I got the idea from how they set the horizon on the ISS: they use the controller in a jig to keep the horizon. Well, Meta could use the tracking from the controller and just reverse what moves what, and put a button on the hotbar so it's instant and doesn't need to be trawled through settings like "disable 6DoF".
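The core of the idea above, tracking the head relative to a stationary controller rather than relative to the world, is a change of reference frame: if the whole vehicle moves, the head's pose expressed in the controller's frame is unchanged, so the virtual scene stays locked. A minimal 2D sketch of that transform (illustrative names; a real implementation would use full quaternion or matrix poses):

```python
import math

def world_to_anchor_frame(point_xy, anchor_xy, anchor_yaw):
    """Express a world-space point in the frame of a stationary anchor,
    e.g. a controller clipped to a surface inside a moving vehicle."""
    dx = point_xy[0] - anchor_xy[0]
    dy = point_xy[1] - anchor_xy[1]
    # Rotate the offset into the anchor's local axes.
    c, s = math.cos(-anchor_yaw), math.sin(-anchor_yaw)
    return (c * dx - s * dy, s * dx + c * dy)

# The vehicle (head AND anchor together) translates 100 m along x:
head_before = world_to_anchor_frame((2.0, 1.0), (1.0, 1.0), 0.0)
head_after = world_to_anchor_frame((102.0, 1.0), (101.0, 1.0), 0.0)
# The anchor-relative head pose is identical before and after the move.
```

Only motion of the head relative to the controller would then drive the rendered 6DoF pose, which is exactly the "pseudo base station" behaviour described.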
I don't want royalties for my ideas, just a cake 🎂, and not a cake that's a lie, please. Haha, nah, I just want to see Meta do better than Apple, and I hope this is a good idea. I don't like the idea of Luckey's dream and Meta's build-for-the-customer ethos being crushed by a company that values cash over customers. Also, Apple can't do this and wouldn't be able to match the accuracy achieved by this tracking method.

External object tracking
Now that the Passthrough API is available, I was hoping I could use it to scan a QR code or AR tag. However, the camera layer is not available to us as developers, which I can understand, but this limits so much on the AR front. Why wouldn't there be an option that asks "This app can see your environment, do you want to continue?" rather than having it blocked by default? To get to the point: what would be another way to track an external object? For example, I have a box in my real-world environment and would like to be able to track its position in-game. Are Bluetooth trackers an option? Any ideas are welcome; the main goal is to translate a real-world object to an in-game position. Greetings, Smoothy101

Creating Tracked keyboard-like items
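On the external-object question above: since camera access is blocked, one commonly suggested workaround is a one-time calibration with a tracked controller, i.e. physically rest the controller on the real object, record that pose, and treat the object as static afterwards. A minimal sketch of that pattern (hypothetical names; it assumes the object does not move after calibration):

```python
class StaticObjectAnchor:
    """Record a real-world object's position by touching it once with a
    tracked controller, then reuse that position in-game."""

    def __init__(self):
        self.position = None

    def calibrate(self, controller_position):
        # User rests the controller on the real object and presses a button.
        self.position = tuple(controller_position)

    def in_game_position(self, world_offset=(0.0, 0.0, 0.0)):
        # Apply any play-space offset (e.g. after recentering).
        if self.position is None:
            raise RuntimeError("object not calibrated yet")
        return tuple(p + o for p, o in zip(self.position, world_offset))

anchor = StaticObjectAnchor()
anchor.calibrate((1.5, 0.8, -2.0))          # box's measured position
pos = anchor.in_game_position((0.0, 0.0, 0.5))
# pos == (1.5, 0.8, -1.5)
```

This obviously cannot follow a moving object the way marker tracking could; for that, external hardware (e.g. a spare tracked controller strapped to the object) is the usual fallback.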
Hello everyone! TL;DR: I am not a programmer! I am hoping to chat with someone about creating a tracked "keyboard" and to ask how difficult it is to do. I am a lighting designer. One way I wish to use VR is to pre-vis my shows. The workflow I am imagining is using Immersed or other similar programs to have huge screens showing my virtual environment while I listen to music and program my lighting shows. The issue is I can't see my lighting console in VR/AR space. Since I can see my tracked keyboard, I was thinking it should be possible to track the lighting console too. So, does anyone know how to do that, or how hard it is? Thanks! Bobby

Roomscale Tracking inconsistent
Hi Everyone, I developed an app where there are 3D models in VR which represent furniture in real life. The purpose of this app is to see the furniture in VR but feel it in real life. The problem is that the tracking of the real world has to be consistent and accurate, so that the position of the 3D model/scene is always exactly at the position of the furniture in real life. That works as long as the Quest 2 doesn't go into standby or get moved out of the guardian. If this happens, sometimes the guardian cannot be found, or it is positioned slightly differently when waking the Quest 2 from standby. Is there something like an external tracker, or some other solution, which can help keep the positioning/tracking as consistent and accurate as I need it for my app? Additional information: the app gets streamed via Air Link onto the Quest 2 and is built in Unreal Engine 4.27.2. Any hints welcome. 😉
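One common mitigation for a tracking origin that shifts between sessions is to re-register the scene against a single physical reference point: after each wake-up, the user touches one known corner of the furniture with a controller, and the whole scene is shifted by the measured drift. A minimal translation-only sketch (illustrative names; real guardian drift can also include rotation, which would need at least two reference points):

```python
def recalibrated_positions(model_positions, expected_ref, measured_ref):
    """Shift every 3D model by the drift measured at one physical
    reference point, correcting a uniformly translated tracking origin."""
    drift = tuple(m - e for m, e in zip(measured_ref, expected_ref))
    return [tuple(p + d for p, d in zip(pos, drift))
            for pos in model_positions]

# After standby, the reference corner reads 5 cm off along x:
models = [(0.0, 0.0, 0.0), (1.0, 0.5, 2.0)]
corrected = recalibrated_positions(models,
                                   expected_ref=(2.0, 0.0, 1.0),
                                   measured_ref=(2.05, 0.0, 1.0))
# Every model shifts by the same measured drift, restoring alignment.
```

In Unreal this would amount to offsetting the scene root (or the tracking origin) by the measured delta rather than moving each actor individually.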