Misaligned headset view position / body tracking
I have a Quest 3S and had no issues for the first few days after I got it, but now my avatar in games is hunched over, making it hard to see my hips or chest, and my view feels like it comes from in front of my head instead of where my head actually is. It has also thrown off my controller tracking and perspective in passthrough mode. Usually my hands and controllers line up perfectly, and the controllers get a mixed-reality overlay in passthrough showing battery life and a representation of the controllers, to the point that it's hard to tell reality from VR. Now the representations of my hands and controllers don't line up and are smaller than what I actually see. It's almost like everything is a bit bigger in passthrough but smaller in games. Long story short: can I reset the position of my body and head so they line up with my own? This is making it hard to interact with holsters and causing bad arm and hand tracking in game, apart from the jarring experience of seeing from a foot in front of my nose.
Unity 6000 - Head tracking issue

Hello, since the newest version of Unity 6000, I have an issue with head tracking. Every time I create a new project and import the Meta XR All-in-One SDK (currently v65.0, same with v64.0), everything works fine except head tracking. When I do exactly the same with an older Unity version (for example 2022.3.15f1), tracking works. A few things I can mention: during launch the frame rate is around 72 FPS, when normally it's 600+, and I get two errors and a warning. All I do is install the Unity editor (right now 6000.0.2f1), create an empty project (a clean project without anything), import the SDK and launch.

----------------------------------------- Error
SerializedObjectNotCreatableException: Object at index 0 is null
UnityEditor.Editor.CreateSerializedObject () (at <26627f99a7af447db68cd5961117dc4a>:0)
UnityEditor.Editor.GetSerializedObjectInternal () (at <26627f99a7af447db68cd5961117dc4a>:0)
UnityEditor.Editor.get_serializedObject () (at <26627f99a7af447db68cd5961117dc4a>:0)
OVRManagerEditor.OnEnable () (at ./Library/PackageCache/com.meta.xr.sdk.core/Scripts/Editor/OVRManagerEditor.cs:39)

----------------------------------------- Error
Saving Prefab to immutable folder is not allowed: Packages/com.meta.xr.sdk.interaction.ovr/Editor/Blocks/InteractableItems/Prefabs/[BB] Pokeable Plane.prefab
UnityEditor.AssetPostprocessingInternal:PostprocessAllAssets (string[],string[],string[],string[],string[],bool)

----------------------------------------- Warning
Your project uses a scriptable render pipeline. You can use Camera.stereoTargetEye only with the built-in renderer.
UnityEngine.Camera:set_stereoTargetEye (UnityEngine.StereoTargetEyeMask)
OVRCameraRig:EnsureGameObjectIntegrity () (at ./Library/PackageCache/com.meta.xr.sdk.core/Scripts/OVRCameraRig.cs:603)
OVRCameraRig:Awake () (at ./Library/PackageCache/com.meta.xr.sdk.core/Scripts/OVRCameraRig.cs:190)

Any clue?
Gaze/Head Tracking Combined w/ Limited Hand Tracking or Gamepad Controller (Ex. Apple Vision Pro UI)

This implementation could drastically increase the ease of use of the Quest 3 UI and bring it much closer to the usability of the Apple Vision Pro UI for a fraction of the cost, while still allowing the flexibility of trackable controllers, conventional hand tracking, and the use of gamepad controllers or other Bluetooth devices such as keyboards and mice. This post is directed at other developers who may have ideas to suggest as a solution, or who may be interested in incorporating some of these ideas into their projects.

Related post: https://communityforums.atmeta.com/t5/Get-Help/How-to-reposition-or-resize-windows-with-a-gamepad-controller/m-p/1190574#M300771 - How to reposition or resize windows with a gamepad controller? (Xbox One Controller)

"I love the gaze cursor feature when using a controller. Lets me quickly take actions when I don't want to use the Quest controllers. One thing bugs me though. I'm unable to drag using the gaze cursor. If I hold the A button on an Xbox controller it will rapidly select an item instead of select and hold on to it. Are there any tricks around this?"

This is copied directly from a Reddit post by Independent_Fill_570 a month ago, and it hasn't received any responses yet. https://www.reddit.com/r/MetaQuestVR/comments/1byjora/is_it_possible_to_resize_windows_with_a_contro...

I'm having the same issue. I love the ability to use gaze tracking for the cursor, but it restricts the ability to resize windows, reposition windows, and long-click (it only single-clicks repeatedly instead), so selecting text longer than a single word is also an issue. Gaze control seems to be the best substitute for the Apple Vision Pro's eye-tracking cursor. Is there any way of using gaze control to guide the cursor, but a hand-tracking gesture to select or click, without it engaging as conventional hand tracking while gaze tracking is enabled?
I've spent many hours now searching the internet, looking into potential sideloading options, and even poring over the Oculus PC source code, but I haven't really found anyone talking about this except for the one unanswered Reddit post I've linked to. The Xbox One controller has basically the same buttons available as the Quest 3 controllers, minus the tracking, but with gaze tracking it would be wonderful to have the controller buttons mapped properly, and I can't seem to find a way to remap the gamepad keybinds without running it through Steam and a PC link. I'd like to be able to do this natively on the Quest 3 standalone. Hand tracking used only for selecting or clicking would also be great, but even just having the buttons mapped properly, so that pressing A clicks and holding A holds the click until released, would fix the issue. I am aware that pressing the select and menu buttons together toggles gaze tracking off and enables very limited use of the Xbox controller buttons, for use with a keyboard or such, but that's not what I'm asking about here. Thanks in advance to anyone who has any helpful information to provide on this issue. If gaze tracking were combined with limited hand-tracking gestures like making a closed fist, I feel like the quality of this product could more easily rival the user interface of the Apple Vision Pro.
Launch app with tracking disabled.

Hi there, can a UnityXR app launch while headset tracking is disabled? One of our clients wants to use their Quest 1 headsets on ships for training purposes. Due to the ship's movement, the headset loses tracking frequently, which led them to disable tracking entirely. Our app doesn't require 6DoF, but it fails to launch if headset tracking is off. I've examined the YouTube app's manifest file, as it can launch with tracking disabled, and attempted to replicate its settings in our app's manifest. However, our app still won't launch when tracking is off. Is there a way to launch a UnityXR app with tracking disabled?

Unity 2021.3.25
Oculus XR Plugin 3.3.0 (latest that supports Quest 1)

Thanks.
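For anyone comparing manifests: the entry that usually governs this is, as far as I know, the android.hardware.vr.headtracking uses-feature, and declaring it not required is how apps like YouTube signal that they can run without 6DoF tracking. A sketch of what the entry might look like in a custom Assets/Plugins/Android/AndroidManifest.xml (an assumption to verify against current Meta documentation; note that Unity's manifest merging or the Oculus XR Plugin may overwrite it at build time):

```xml
<!-- Declares that the app does NOT require head tracking (3DoF OK).
     Assumption: this mirrors what 3DoF-capable apps declare; verify
     the exact attribute values against Meta's manifest docs. -->
<uses-feature
    android:name="android.hardware.vr.headtracking"
    android:version="1"
    android:required="false" />
```

If the merged manifest in the built APK still shows android:required="true", the plugin is likely re-injecting the feature, which would explain why copying the YouTube settings appeared to have no effect.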
Temporarily decoupling HMD sensors from OVR Camera Rig

Hello, I am developing an application for the Oculus Quest 2 that will eventually run as an experiment. For part of this experiment, I wish to mirror the rotation of the camera such that its y-axis (yaw) is inverted. This will make it so that when the user rotates right, they effectively see the opposite motion sequence, i.e. what they would have seen if they had rotated to the left. I want to be able to control when this mirroring happens, as it should only last for brief segments of the game. I have tried tinkering with the OVRCameraRig.cs file in the Unity Integration package (in the "if (updateEyeAnchors)" section) with no luck. Specifically, I have tried setting the localRotation of the centerEyeAnchor to its inverse y value. Because I don't know how to mirror yaw with quaternions, however, I do this in a roundabout way: by converting centerEyeAnchor to Euler angles and doing the manipulation with Vector3s. Is hacking the OVRCameraRig script possible in this way, or is some of this HMD-to-camera mapping inaccessible? I am open to any suggestions! Thanks in advance!
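The yaw flip itself is easy to sanity-check outside Unity. The sketch below (plain Python, just the arithmetic; the function names are made up for illustration) shows the Euler manipulation the post describes, plus one way to avoid editing OVRCameraRig.cs at all: parent the rig to an empty GameObject and set the parent's yaw to minus twice the tracked head yaw each frame, so the composed yaw comes out inverted. This is exact for pure yaw; with significant pitch/roll the Euler decomposition gets messier, which may be why the quaternion route felt awkward.

```python
def mirror_yaw(euler):
    """Negate the yaw (Y) component of an (x, y, z) Euler triple,
    leaving pitch and roll untouched -- the same manipulation the
    post describes doing on centerEyeAnchor via Vector3s."""
    x, y, z = euler
    return (x, -y, z)

def mirrored_parent_yaw(head_yaw):
    """Yaw (degrees) to apply to a parent of the camera rig so the
    combined world yaw (parent + head) equals -head_yaw.  This leaves
    the SDK's own HMD-to-camera mapping completely untouched."""
    return -2.0 * head_yaw
```

Since yaw rotations about the same axis compose additively, parent_yaw + head_yaw = -2y + y = -y, and toggling the mirroring on and off is then just enabling or disabling the per-frame parent update rather than patching the SDK script.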
Why can't I reset the Quest's headset position in Unity?

I'm developing in Unity for Quest and I've come across something disappointing and confusing. All I want to do is make sure that when I load a new scene, the headset resets its position and orientation to the place I've specified (so, you know, it doesn't put me outside of a scene, etc.). When you move around physically, your play-space position does not move with your headset, so these are two separate positions and orientations. Normally, if I move away from the play-space origin and teleport somewhere that loads the next scene, the headset is placed the same distance from where I put the origin in the next scene, making me appear in unintended places. I have tried using the code for resetting the camera position, UnityEngine.XR.InputTracking.Recenter(), at the beginning of the scene. This works as intended when previewing on Rift, but it does nothing on my Quest. I have tried absolutely everything, even manually changing the headset's transform to return it to the play-space origin, but again, it works only on Rift. I read somewhere that Oculus intentionally disabled this feature on Quest because a user can reset the play space manually by holding the Oculus button. If that's true, how in the world am I supposed to do something that should be so simple? Why can't I move the headset position like on the Rift? It uses the exact same Oculus SDK and everything. Here's some of my script. At the beginning of the scene, the headset is supposed to be reset to the play-space origin that I set in the middle of my scene. I commented out a lot, since I tried various things that all worked on the Rift but not Quest.
public class MainManager : MonoBehaviour
{
    private string sceneName;
    public OVRCameraRig _ovr;
    private Transform _OVRCameraRig;
    private Transform _centerEyeAnchor;

    // Start is called before the first frame update
    private void Awake()
    {
        sceneName = SceneManager.GetActiveScene().name;

        //_OVRCameraRig = _ovr.transform;
        //_centerEyeAnchor = _ovr.centerEyeAnchor;
        //_OVRCameraRig = VRTK_DeviceFinder.PlayAreaTransform();
        //_centerEyeAnchor = VRTK_DeviceFinder.HeadsetCamera();
        //_centerEyeAnchor = _OVRCameraRig;

        // This only works with Rift, not Quest, etc.
        // UnityEngine.XR.InputTracking.Recenter();

        // Use my custom function provided here to manually reset the position and angle
        //ResetCameraToPositionOrigin(_OVRCameraRig.position, _OVRCameraRig.rotation.eulerAngles.y);
    }
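A common workaround on Quest, since scripts can't recenter the HMD itself, is to move the rig instead: after the scene loads, compute where the head currently is and then yaw/translate the whole OVRCameraRig so the head lands on the spawn point. The arithmetic is sketched below in plain Python (names are illustrative; in Unity this becomes a yaw applied to the rig plus a position offset computed from centerEyeAnchor's world pose):

```python
import math

def rotate_y(p, deg):
    """Rotate point (x, y, z) about the vertical axis by deg degrees
    (Unity-style left-handed yaw)."""
    a = math.radians(deg)
    x, y, z = p
    return (x * math.cos(a) + z * math.sin(a),
            y,
            -x * math.sin(a) + z * math.cos(a))

def recenter_rig(rig_pos, head_pos, head_yaw, target_pos, target_yaw):
    """Return (new_rig_pos, yaw_delta) such that, after yawing the rig
    by yaw_delta about its origin and moving it to new_rig_pos, the
    head sits at target_pos (x/z) facing target_yaw.
    rig_pos/head_pos are world positions; yaws are in degrees."""
    yaw_delta = target_yaw - head_yaw
    # Head's current offset from the rig origin, in world space:
    offset = (head_pos[0] - rig_pos[0],
              head_pos[1] - rig_pos[1],
              head_pos[2] - rig_pos[2])
    # Yawing the rig rotates that offset too:
    rotated = rotate_y(offset, yaw_delta)
    # Only x and z are corrected -- height stays under tracking control:
    new_rig_pos = (target_pos[0] - rotated[0],
                   rig_pos[1],
                   target_pos[2] - rotated[2])
    return new_rig_pos, yaw_delta
```

In Unity terms that is roughly rig.Rotate(0, yaw_delta, 0) followed by assigning rig.position, run once at scene start; because the rig (not the camera) is moved, it works the same on Rift and Quest regardless of whether Recenter() is available.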
Is there any way to access CV1 tracking data and select what I need only?

I've seen a lot of similar questions in the Unity section, but I couldn't find a solution. Also, I'm pretty new to this, so please keep in mind that I'm not an expert. I'm using Unity 5.4.1f1, Oculus Utilities 1.9 and the CV1.

What I want: I want to basically stream motion-capture data to Unity and use my CV1 as (just) a screen -> only using mocap data for position & orientation. I want to achieve that with the best possible performance. My application should allow the user to juggle with pins or catch balls. Nothing too crazy.

What I've tried: It seems like turning off Oculus support and using the CV1 as a second screen results in a bad experience. Writing a script to cancel out the CV1 tracking data and set it as parent of the camera results in a lot of unnecessary traffic.

My question(s): Most of the stuff I read was older than 6 months. Is there a way now to turn off sensors/tracking data, or to select what I need? Can I achieve what I want in Unity? Can I achieve what I want outside of Unity? -> Is there any way at all, or is the CV1 simply not meant for this kind of development?

It's my thesis topic to use mocap data instead of the built-in tracking data, so please don't answer with "just use the built-in tracking data". I wouldn't mind downgrading if it works and is compatible with the other stuff. I appreciate any helpful comment on this, since I'm really stuck right now. :/
Cross-trainer with the OR

Hi Rifters, I just wanted to share an idea/project I am currently working on. Like many of you, I am still waiting for my devkit (Germany order ~ 8000), but I recently got a cross-trainer with a serial port, and a Razer Hydra. I wrote a little program in C# to control the cross-trainer (and read its data). My idea is to make a virtual track in Unity and use the Razer Hydra for positional head tracking and for tracking one of the hands. With this it should be possible to ski or ride an elliptical bike (see picture below) through a virtual world, which might be really cool :) . You could, for example, change the power of the cross-trainer when the terrain changes (uphill/downhill) or when you leave the track. The only problem is that I am good at C# but not really familiar with Unity or modeling. I tried it a little bit over the last few days, and I hope it will be fine after spending some time with it. Maybe there are already some models of elliptical bikes out there which I could use? It would be nice to know if someone else is also working on something like this, and I am looking forward to getting some ideas / criticism from you guys. (And sorry for the bad English and the crappy picture below... it's Google image search + paint.net ;D )
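The uphill/downhill idea boils down to mapping the track gradient under the rider to a resistance level written to the serial port each update. A toy sketch of that mapping (the 1-16 level range, base level and gain are invented for illustration; a real trainer's serial protocol and level range will differ):

```python
def resistance_level(gradient, base_level=4, levels=16, gain=40.0):
    """Map a terrain gradient (rise/run, e.g. 0.05 = 5% uphill) to a
    trainer resistance level.  base_level is flat ground; uphill raises
    the level, downhill lowers it, and the result is clamped to the
    trainer's 1..levels range.  All constants are illustrative, not
    taken from any real device."""
    level = round(base_level + gain * gradient)
    return max(1, min(levels, level))
```

Off-track penalties could reuse the same function by simply adding an extra offset to the gradient before the call, so all clamping stays in one place.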