Unity 6000 - Head tracking issue
Hello, after the newest version of Unity 6000, I have an issue with head tracking. Every time I create a new project and import the Meta XR All-in-One SDK (currently v65.0; same with v64.0), everything works fine except head tracking. When I do exactly the same with an older Unity version (for example 2022.3.15f1), tracking works. There are a few things I can mention: during launch, the frame rate is around 72 FPS when normally it's 600+, and I get two errors and a warning. All I do is install the Unity Editor (right now it's 6000.0.2f1), create an empty project (a clean project without anything), import the SDK, and launch.

----------------------------------------- Error
SerializedObjectNotCreatableException: Object at index 0 is null
UnityEditor.Editor.CreateSerializedObject () (at <26627f99a7af447db68cd5961117dc4a>:0)
UnityEditor.Editor.GetSerializedObjectInternal () (at <26627f99a7af447db68cd5961117dc4a>:0)
UnityEditor.Editor.get_serializedObject () (at <26627f99a7af447db68cd5961117dc4a>:0)
OVRManagerEditor.OnEnable () (at ./Library/PackageCache/com.meta.xr.sdk.core/Scripts/Editor/OVRManagerEditor.cs:39)

----------------------------------------- Error
Saving Prefab to immutable folder is not allowed: Packages/com.meta.xr.sdk.interaction.ovr/Editor/Blocks/InteractableItems/Prefabs/[BB] Pokeable Plane.prefab
UnityEditor.AssetPostprocessingInternal:PostprocessAllAssets (string[],string[],string[],string[],string[],bool)

----------------------------------------- Warning
Your project uses a scriptable render pipeline. You can use Camera.stereoTargetEye only with the built-in renderer.
UnityEngine.Camera:set_stereoTargetEye (UnityEngine.StereoTargetEyeMask)
OVRCameraRig:EnsureGameObjectIntegrity () (at ./Library/PackageCache/com.meta.xr.sdk.core/Scripts/OVRCameraRig.cs:603)
OVRCameraRig:Awake () (at ./Library/PackageCache/com.meta.xr.sdk.core/Scripts/OVRCameraRig.cs:190)

Any clue?

Gaze/Head Tracking Combined w/ Limited Hand Tracking or Gamepad Controller (Ex. Apple Vision Pro UI)
This implementation could drastically increase the ease of use of the Quest 3 UI and bring it much closer to the usability of the Apple Vision Pro UI for a fraction of the cost, while still allowing the flexibility of tracked controllers, conventional hand tracking, and the use of gamepad controllers or other Bluetooth devices such as keyboards or mice. This post is directed at other developers who may have ideas to suggest as a solution, or who may be interested in incorporating some of these ideas into their projects.

Related post: https://communityforums.atmeta.com/t5/Get-Help/How-to-reposition-or-resize-windows-with-a-gamepad-controller/m-p/1190574#M300771 - "How to reposition or resize windows with a gamepad controller? (Xbox One Controller)"

"I love the gaze cursor feature when using a controller. Lets me quickly take actions when I don't want to use the Quest controllers. One thing bugs me though. I'm unable to drag using the gaze cursor. If I hold the A button on an Xbox controller it will rapidly select an item instead of select and hold on to it. Are there any tricks around this?"

That quote is copied directly from a reddit post by Independent_Fill_570 a month ago, and it hasn't received any responses yet: https://www.reddit.com/r/MetaQuestVR/comments/1byjora/is_it_possible_to_resize_windows_with_a_contro...

I'm having the same issue. I love the ability to use gaze tracking for the cursor, but it restricts the ability to resize windows, reposition windows, and long-click (it only repeatedly single-clicks rapidly instead), so selecting text longer than a single word is also an issue. The gaze control seems to be the best substitute for the Apple Vision Pro's eye tracking cursor. Is there any way of using the gaze control to guide the cursor, but a hand tracking gesture to select or click, without it engaging as conventional hand tracking while gaze tracking is enabled?
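Inside a developer's own Unity app (not the Quest system shell, which can't be modified this way), the select-and-hold behavior asked about above can be approximated with the Meta SDK's OVRInput polling API, which distinguishes the initial press, the held state, and the release of a gamepad button. This is only a hedged sketch: the component name and the drag logic are illustrative assumptions, not SDK code.

```csharp
using UnityEngine;

// Illustrative sketch: poll the gamepad's A button each frame and treat
// "pressed this frame" / "still held" / "released this frame" as
// begin-drag / continue-drag / end-drag, instead of firing repeated clicks.
public class GazeDragSketch : MonoBehaviour
{
    private bool dragging;

    void Update()
    {
        // OVRInput.GetDown is true only on the frame the button goes down...
        if (OVRInput.GetDown(OVRInput.Button.One, OVRInput.Controller.Gamepad))
            dragging = true;   // begin a drag at whatever the gaze cursor hits

        // ...while OVRInput.Get stays true for as long as A is held.
        if (dragging && OVRInput.Get(OVRInput.Button.One, OVRInput.Controller.Gamepad))
        {
            // Move the grabbed window/object to follow the gaze ray here
            // (hypothetical drag logic, app-specific).
        }

        if (OVRInput.GetUp(OVRInput.Button.One, OVRInput.Controller.Gamepad))
            dragging = false;  // releasing A ends the drag
    }
}
```

The same GetDown/Get/GetUp pattern also works with hand tracking pinch or fist gestures as the "button", which is one way an app could pair a gaze-driven cursor with a gesture-driven click.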
I've spent many hours now searching the internet, looking into potential sideloading options, and even poring over the Oculus PC source code, but I haven't really found anyone talking about this except for the one unanswered reddit post I've linked to. The Xbox One controller has basically the same buttons available as the Quest 3 controllers do, minus the tracking, but with gaze tracking it would be wonderful to have the controller buttons mapped properly, and I can't seem to find a way to remap the gamepad keybinds without it being run through Steam and a PC link. I'd like to be able to do this natively on the Quest 3 standalone.

Hand tracking only for selecting or clicking would also be great, but even just the buttons being mapped properly, so that pressing A clicks and holding A holds the click until released, would fix the issue. I am aware that pressing the select and menu buttons together toggles gaze tracking off and enables very limited use of the Xbox controller buttons, for use with a keyboard or such, but that's not what I'm asking about here.

Thanks in advance to anyone who has any helpful information to provide on this issue. If gaze tracking were combined with limited hand tracking gestures like making a closed fist, I feel like the quality of this product could more easily rival the user interface of the Apple Vision Pro.

Temporarily decoupling HMD sensors from OVR Camera Rig
Hello, I am developing an application for the Oculus Quest 2 that will eventually run as an experiment. For part of this experiment, I wish to mirror the rotation of the camera such that its y-axis (yaw) is inverted. This will make it so that when the user rotates right, they effectively see the opposite motion sequence, i.e. what they would have seen if they had rotated to the left. I want to be able to control when this mirroring happens, as it should only last for brief segments of the game.

I have tried tinkering with the OVRCameraRig.cs file in the Unity Integration package (in the "if (updateEyeAnchors)" section) with no luck. Specifically, I have tried setting the localRotation of the centerEyeAnchor to its inverse y value. Because I don't know how to mirror yaw with quaternions, however, I do this in a roundabout way: by converting centerEyeAnchor's rotation to Euler angles and doing the manipulation with Vector3s.

Is hacking the OVRCameraRig script possible in this way, or is some of this HMD-to-camera mapping inaccessible? I am open to any suggestions! Thanks in advance!

Why can't I reset the Quest's headset position in Unity?
I'm developing in Unity for Quest, and I've come across something disappointing and confusing. All I want to do is make sure that when I load a new scene, the headset resets its position and orientation to the place I've specified (so, you know, it doesn't put me outside of the scene, etc.). When you move around physically, your play space position does not move with your headset, so these are two separate positions and orientations. Normally, if I move away from the play space origin and teleport somewhere that loads the next scene, the headset is placed the same distance from where I put the origin in the next scene, making me appear in unintended places.

I have tried calling the code for resetting the camera position, UnityEngine.XR.InputTracking.Recenter();, at the beginning of the scene. This works as intended when previewing on Rift, but it does nothing on my Quest. I have tried absolutely everything, even manually changing the headset's transform to return it to the play space origin, but again, that works only on Rift. I read somewhere that Oculus intentionally disabled this feature on Quest because a user can reset the play space manually by holding the Oculus button. If that's true, how in the world am I supposed to do something that should be so simple? Why can't I move the headset position like on the Rift? It uses the exact same Oculus SDK and everything.

Here's some of my script where, at the beginning of the scene, the headset is supposed to be reset to the play space origin that I set in the middle of my scene. I commented out a lot, since I tried various things that all worked on the Rift but not Quest.
public class MainManager : MonoBehaviour
{
    private string sceneName;
    public OVRCameraRig _ovr;
    private Transform _OVRCameraRig;
    private Transform _centerEyeAnchor;

    // Start is called before the first frame update
    private void Awake()
    {
        sceneName = SceneManager.GetActiveScene().name;

        //_OVRCameraRig = _ovr.transform;
        //_centerEyeAnchor = _ovr.centerEyeAnchor;

        //_OVRCameraRig = VRTK_DeviceFinder.PlayAreaTransform();
        //_centerEyeAnchor = VRTK_DeviceFinder.HeadsetCamera();
        //_centerEyeAnchor = _OVRCameraRig;

        // This only works with Rift, not Quest, etc.
        // UnityEngine.XR.InputTracking.Recenter();

        // Use my custom function provided here to manually reset the position and angle
        //ResetCameraToPositionOrigin(_OVRCameraRig.position, _OVRCameraRig.rotation.eulerAngles.y);
    }
}
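One workaround that sidesteps InputTracking.Recenter entirely is to move the OVRCameraRig itself: the rig root is an ordinary Transform, so rotating and translating it until the tracked centerEyeAnchor lands on a target pose behaves the same on Rift and Quest. The following is only a hedged sketch of that idea; RigRecenter, ResetToSpawn, and spawnPoint are illustrative names, not SDK API.

```csharp
using UnityEngine;

// Sketch: instead of recentering the HMD (which Quest may ignore),
// offset the rig root so the headset ends up at a chosen spawn point.
public class RigRecenter : MonoBehaviour
{
    public OVRCameraRig rig;       // the rig in the scene
    public Transform spawnPoint;   // where the head should end up (illustrative)

    public void ResetToSpawn()
    {
        Transform head = rig.centerEyeAnchor;

        // First apply a yaw-only correction so "forward" matches the
        // spawn point (pitch/roll stay driven by the real headset).
        float yawOffset = spawnPoint.eulerAngles.y - head.eulerAngles.y;
        rig.transform.Rotate(0f, yawOffset, 0f);

        // Transform changes propagate to children immediately, so the
        // head's world position is already updated; now translate the rig
        // so the head sits on the spawn position.
        Vector3 positionOffset = spawnPoint.position - head.position;
        rig.transform.position += positionOffset;
    }
}
```

Calling a method like this from Awake or a scene-loaded callback plays the role of the commented-out Recenter call above. Depending on the tracking origin mode, you may want to zero out the y component of positionOffset so the floor height stays governed by the real play space.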