How to extract/import/read the sensor data from the Oculus Rift S
Hello everyone, I am working on a college project in which we would like to develop a VR experience inside an aircraft cabin. To do that, we need to work with the headset's sensor information. I have searched a lot but could not find the right answer to my question, so I would appreciate any kind of help or hint! My question is: how can I import the sensor information into a programming tool like Matlab? Greetings, Preet
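One hedged workaround (there is, as far as I know, no official Oculus-to-Matlab bridge): log the headset pose to a CSV file from a Unity script, then load that file in Matlab with `readmatrix`. The file name, columns, and sample rate below are arbitrary choices for illustration, and the XR feature queries assume Unity 2019.1+ with an Oculus XR plugin active.

```csharp
// Hedged sketch: log headset pose from Unity to a CSV that Matlab
// can read with readmatrix("hmd_pose.csv"). Not an official workflow.
using System.IO;
using UnityEngine;
using UnityEngine.XR;

public class PoseLogger : MonoBehaviour
{
    StreamWriter writer;

    void Start()
    {
        // Write one CSV row per frame into the app's persistent data folder.
        writer = new StreamWriter(Path.Combine(Application.persistentDataPath, "hmd_pose.csv"));
        writer.WriteLine("time,px,py,pz,qx,qy,qz,qw");
    }

    void Update()
    {
        InputDevice head = InputDevices.GetDeviceAtXRNode(XRNode.Head);
        if (head.isValid &&
            head.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 p) &&
            head.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion q))
        {
            writer.WriteLine($"{Time.time},{p.x},{p.y},{p.z},{q.x},{q.y},{q.z},{q.w}");
        }
    }

    void OnDestroy() => writer?.Close();
}
```

In Matlab the resulting file can then be loaded as a plain numeric matrix (skipping the header row) for offline analysis.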
How to get angular velocity and acceleration of headset
Is there a way to get the angular velocity and acceleration of the VR user (the headset itself) in Unity? I am only looking for angular velocity and acceleration because the user will be stationary and only moving their head around. In the Oculus developer documentation there is a page titled 'Get Raw Sensor Data', but it doesn't actually show how. Thanks!
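A minimal sketch of one way to do this, assuming Unity 2019.1+ with its XR input system: query the head device for the angular-velocity and acceleration feature values. Whether each feature is actually reported depends on the runtime, so the boolean return values should always be checked.

```csharp
// Hedged sketch: read head angular velocity and acceleration via
// Unity's XR InputDevices API. Feature availability is runtime-dependent.
using UnityEngine;
using UnityEngine.XR;

public class HeadMotionReader : MonoBehaviour
{
    void Update()
    {
        InputDevice head = InputDevices.GetDeviceAtXRNode(XRNode.Head);
        if (!head.isValid) return;

        if (head.TryGetFeatureValue(CommonUsages.deviceAngularVelocity, out Vector3 angVel))
            Debug.Log($"Angular velocity (rad/s): {angVel}");

        if (head.TryGetFeatureValue(CommonUsages.deviceAcceleration, out Vector3 accel))
            Debug.Log($"Acceleration (m/s^2): {accel}");
    }
}
```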
Rift S SLAM Drift Correction in Warehouse Scaling
Hi everyone, I'm trying to correct Oculus Rift S SLAM drift at warehouse scale. I assume this drift is due to the XYZ (0, 0, 0) reference being set at the beginning of the experience, and to the impossibility of creating or adding anchors in the real world. So I would like to hang OpenCV ArUco markers at known spots in my warehouse and make the virtual world move slowly toward this reference so that it fits and matches the real world. Ideally I would also be able to use occlusion by real objects, rather than having to avoid them... Is there a way to access the Rift S SDK and use the buffer of the SLAM tracking camera? I would like to implement all of this in a UE4 project. Does anyone have ideas to share, or previous experience with this goal? Thanks.
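The "move the virtual world slowly toward a marker reference" idea is engine-agnostic; below is a hedged sketch of the blending step in Unity C# (the same per-tick logic translates to a UE4 actor). `OnMarkerObservation` is a hypothetical callback assumed to be fed by an external ArUco detection pipeline, which is not shown.

```csharp
// Hedged, engine-agnostic sketch: each time the (hypothetical) ArUco
// pipeline reports where the tracking-space origin *should* be, remove
// a small fraction of the error so the user never feels a sudden jump.
using UnityEngine;

public class DriftCorrector : MonoBehaviour
{
    public Transform trackingSpace;      // root of the camera rig
    public float correctionSpeed = 0.1f; // fraction of error removed per second

    // targetPos/targetRot: marker-derived pose of the tracking-space
    // origin, expressed in world coordinates (assumed input).
    public void OnMarkerObservation(Vector3 targetPos, Quaternion targetRot)
    {
        float t = correctionSpeed * Time.deltaTime;
        trackingSpace.position = Vector3.Lerp(trackingSpace.position, targetPos, t);
        trackingSpace.rotation = Quaternion.Slerp(trackingSpace.rotation, targetRot, t);
    }
}
```

The low correction speed is the design point: it trades instantaneous accuracy for comfort, which matters at warehouse scale where a hard snap would be very noticeable.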
Game levels based on scanned environment
Dear Oculus and community, can we receive any 3D data about the environment area where we play? We want to build a game level that surrounds the player, based on information about obstacles in reality, so that we can place the game's obstacles in the same locations. This is for the Rift S and Oculus Quest, using a modular environment.
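To my knowledge the Rift S/Quest runtime of that era did not expose a full 3D scan of the room; the closest available data is the Guardian boundary outline, accessible through `OVRBoundary` in the Oculus Utilities for Unity. A minimal sketch, assuming Oculus Utilities is imported and Guardian is configured:

```csharp
// Hedged sketch: fetch the floor-level Guardian play-area outline.
// This is a 2D polygon at floor height, not a scan of obstacles.
using UnityEngine;

public class BoundaryReader : MonoBehaviour
{
    void Start()
    {
        OVRBoundary boundary = OVRManager.boundary;
        if (boundary != null && boundary.GetConfigured())
        {
            Vector3[] points = boundary.GetGeometry(OVRBoundary.BoundaryType.PlayArea);
            Debug.Log($"Play area outline has {points.Length} points");
        }
    }
}
```

A modular level could then be assembled so that walls and props stay outside this polygon, even though individual real-world obstacles inside the play space are not reported.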
How to set the IPD (without using the Rift Sensor) correctly?
I'm trying to increase the area in which I can use my CV1. To achieve this, I've tried to get rid of the CV1's tracking and replace it with data I receive from tracking the CV1 with a motion capture system. I stream this data to Unity 3D using a Unity plug-in provided by the Motive developers. I've already given up on the rotation, since I'm missing too much of the CV1's correction software; besides, the rotation is not bound to the Rift sensor anyway, and cables can be extended. What's left now is the position of my CV1. I want to replace the position tracking data the sensor delivers with that of the mocap system. This works fine (I can just turn off positional tracking using the OVRManager script on the OVRCameraRig), but while I get the position of the WHOLE HMD (or, to be precise, its center), I have no information about the IPD (eye distance), which results in divergence. I've found out that I can interact with the position of the eye anchors, so I've tried to add a distance of 66 mm manually, but this won't do a thing once I put the HMD on (it works fine in the Unity preview though, which is confusing). The code looks like this:

    trackerAnchor.localRotation = tracker.orientation;
    centerEyeAnchor.localRotation = VR.InputTracking.GetLocalRotation(VR.VRNode.CenterEye);
    leftEyeAnchor.localRotation = monoscopic ? centerEyeAnchor.localRotation : VR.InputTracking.GetLocalRotation(VR.VRNode.LeftEye);
    rightEyeAnchor.localRotation = monoscopic ? centerEyeAnchor.localRotation : VR.InputTracking.GetLocalRotation(VR.VRNode.RightEye);

    trackerAnchor.localPosition = tracker.position;
    centerEyeAnchor.localPosition = VR.InputTracking.GetLocalPosition(VR.VRNode.CenterEye);
    leftEyeAnchor.localPosition = VR.InputTracking.GetLocalPosition(VR.VRNode.LeftEye) + new Vector3(-0.033f, 0, 0);
    rightEyeAnchor.localPosition = VR.InputTracking.GetLocalPosition(VR.VRNode.RightEye) + new Vector3(0.033f, 0, 0);

I've also tried to use the FakeTracking script provided here: https://forums.oculus.com/community/discussion/comment/450853/#Comment_450853 But nothing seems to work. Am I missing something? Is there more to the sensor than just position and IPD?

Unity version: 5.4.03f
Oculus Utilities version: 1.9
Mocap system: Motive Body 1.9
Plug-in: OptiTrack Unity Plugin
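One hedged guess at why the manual offsets work in the editor preview but not with the HMD on: with a headset active, the runtime re-poses the eye anchors every frame after ordinary `Update` scripts run, overwriting the manual offsets. Applying them in `LateUpdate`, after the camera rig's internal update, may let them survive. A minimal sketch (the 0.033 m half-IPD matches the value in the question; whether the compositor ultimately honors the modified anchors is not guaranteed):

```csharp
// Hedged sketch: re-apply manual eye offsets in LateUpdate so they land
// after the camera rig has updated the anchors from tracking data.
using UnityEngine;

public class ManualEyeOffset : MonoBehaviour
{
    public Transform leftEyeAnchor;   // assign from the OVRCameraRig
    public Transform rightEyeAnchor;  // assign from the OVRCameraRig
    public float halfIpd = 0.033f;    // half of the desired 66 mm IPD

    void LateUpdate()
    {
        leftEyeAnchor.localPosition  += new Vector3(-halfIpd, 0f, 0f);
        rightEyeAnchor.localPosition += new Vector3(halfIpd, 0f, 0f);
    }
}
```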
Is there a way of accessing camera tracking data on the CV1?
Sorry if this has been asked before. I searched, but didn't quite find the answer I needed. Okay, so, I made a decal that covers the entire front of my Rift, and the material is definitely translucent enough that at least *some* of the constellation IR is getting through, as I'm still being tracked with it on. I did, however, notice myself getting a little dizzy in an app where I don't normally notice any problems (I don't remember which one now), but I didn't really see any visible issues. I took the decal back off, and I *think* the problems went away (if they were even there to begin with). I can test whether the camera can still see the front constellation LEDs through the decal just by putting my hands in front of the Rift. Hands in front: no positional tracking (rotational only). No hands in front, even with the decal: positional and rotational tracking. Anyway, I'd like to find some way of quantifying tracking data for testing: brightness of the constellation LEDs, whether the camera is losing tracking on any of them (or whether the lower brightness is causing dropped tracking frames), etc. Any advice?
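Per-LED brightness and per-frame camera data are not, as far as I know, exposed by the public SDK, but dropout frequency and duration can be quantified from inside an app. A hedged sketch using the `TrackingLost`/`TrackingAcquired` events of `OVRManager` (Oculus Utilities for Unity), which should make it possible to compare dropout statistics with and without the decal:

```csharp
// Hedged sketch: timestamp tracking-lost/acquired events to measure
// how often and how long positional tracking drops out.
using UnityEngine;

public class TrackingDropoutLogger : MonoBehaviour
{
    float lostAt = -1f;

    void OnEnable()
    {
        OVRManager.TrackingLost += OnLost;
        OVRManager.TrackingAcquired += OnAcquired;
    }

    void OnDisable()
    {
        OVRManager.TrackingLost -= OnLost;
        OVRManager.TrackingAcquired -= OnAcquired;
    }

    void OnLost() => lostAt = Time.time;

    void OnAcquired()
    {
        if (lostAt >= 0f)
            Debug.Log($"Positional tracking lost for {Time.time - lostAt:F2} s");
        lostAt = -1f;
    }
}
```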