Forum Discussion

Ttakala
Honored Guest
13 years ago

Rift full-body avatar with Kinect, Razer Hydra, and PS Move

Over one month ago we released the downloadable TurboTuscany demo, with support for Oculus Rift combined with Kinect, Razer Hydra, or PS Move. Since then we have released a new video demonstrating full-body avatar control with the Kinect + PS Move combination:



Below I summarize what we learned while developing the TurboTuscany demo. Some of our findings are noteworthy, while others are common knowledge if you have developed for the Razer Hydra, Kinect, or PS Move before.

Latencies of the devices we used, from lowest to highest:
Oculus Rift < Razer Hydra < PS Move < Kinect

Body tracking with Kinect has easily noticeable lag and plenty of jitter, and the tracking fails often. Nevertheless, Kinect adds a lot to the immersion and is fun to play around with.
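For illustration, here is a minimal sketch (in Python, not the code our Unity demo actually uses) of the kind of exponential smoothing that trades a bit of extra lag for less jitter in a noisy joint position; the smoothing factor is just an example value:

```python
# Minimal sketch: exponential smoothing of a jittery Kinect joint position.
# A smaller alpha means less jitter but more lag; 0.3 is only an example.
def smooth(prev, raw, alpha=0.3):
    """Blend the new raw sample toward the previous filtered value."""
    return tuple(p + alpha * (r - p) for p, r in zip(prev, raw))

# Usage: feed each new (x, y, z) head-joint sample through the filter.
filtered = (0.0, 1.6, 2.0)  # initial head position estimate, in meters
for raw_sample in [(0.02, 1.62, 2.01), (-0.01, 1.58, 1.99)]:
    filtered = smooth(filtered, raw_sample)
```

Any such filter adds latency on top of Kinect's own, which is exactly the jitter-versus-lag trade-off mentioned above.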

Of all the positional head tracking methods available in our TurboTuscany demo, PS Move is the best compromise: a big tracking volume (almost as big as Kinect's) and accurate tracking (though not as accurate as Razer Hydra). Therefore the best experience of our demo is achieved with Oculus Rift + Kinect + PS Move. Occlusion of the Move controller from the PS Eye's view is a problem for positional tracking, though (not for rotational).
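One way to cope with the occlusion (a hypothetical sketch, not necessarily how our demo handles it) is to keep using the Move's inertial rotation while falling back to a coarser position source, or simply the last good position, whenever the optical tracking is lost:

```python
# Hypothetical occlusion fallback; 'move_visible' stands in for whatever
# visibility/validity flag the tracking library provides.
def head_position(move_pos, move_visible, kinect_head_pos, last_good):
    """Prefer the optical PS Move position; fall back when it is occluded."""
    if move_visible:
        return move_pos           # accurate optical tracking from the PS Eye
    if kinect_head_pos is not None:
        return kinect_head_pos    # coarser, laggier Kinect fallback
    return last_good              # hold the last known position
```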

The second-best head tracking is achieved with the combination of Oculus Rift, Kinect, and Razer Hydra. This comes with the added cumbersomeness of having to wear a Hydra controller around the waist.

My personal opinion is that VR systems with a virtual body should track the user's head, hands, and forward direction (chest/waist) separately, so that the user can look in one direction, point a hand-held tool or weapon in another, and walk in a third. In the TurboTuscany demo we achieve this with the combination of Oculus Rift, Kinect, and Hydra/Move.
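As a rough sketch of what this separation buys (illustrative Python, with names and structure that are mine rather than taken from the demo's code): the look direction comes from the Rift, the aim direction from the Hydra/Move, and locomotion follows the chest yaw from Kinect, so all three can differ:

```python
import math

# Illustrative decoupling of look, aim, and walk directions.
# head_yaw  - Rift orientation       (where the user looks)
# hand_yaw  - Hydra/Move controller  (where the tool/weapon points)
# chest_yaw - Kinect torso joint     (which way the avatar walks)
def yaw_to_dir(yaw):
    """Unit direction in the horizontal plane for a yaw angle in radians."""
    return (math.sin(yaw), math.cos(yaw))

def update_avatar(head_yaw, hand_yaw, chest_yaw, forward_input):
    look_dir = yaw_to_dir(head_yaw)
    aim_dir = yaw_to_dir(hand_yaw)
    walk_dir = yaw_to_dir(chest_yaw)   # walking follows the chest, not the gaze
    velocity = (walk_dir[0] * forward_input, walk_dir[1] * forward_input)
    return look_dir, aim_dir, velocity
```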

Latency requirements for positional head tracking

Razer Hydra's position tracking latency is relatively low and should be low enough for many HMD use cases. When viewing nearby objects, however, the Hydra's latency becomes apparent as you move your head. Unless STEM has some new optimization tricks, it will most likely have a different (higher?) latency than the Hydra because it is wireless.

If head position tracking latency is less than or equal to that of Oculus Rift's rotational tracking, it should be good enough for most HMD applications. Since this is not a scientific paper that I'm writing here, I won't cite earlier research that suggests latency requirements in milliseconds.

Because we had positional head tracking set up to track the point between the eyes, we first set Oculus Rift's "Eye Center Position" to (0, 0, 0); this setting determines a small translation that follows the Rift's orientation. But we found that the latency of our positional head tracking was apparent when moving the head close (closer than about 0.5 meters) to objects, even with Razer Hydra. Therefore we ended up setting "Eye Center Position" to the default (0, 0.15, 0.09), and viewing close objects while moving became much more natural. Thus, our positional head tracking has a "virtual" component that follows the Rift's orientation.
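In other words, the final eye position is the externally tracked head position plus a small offset rotated by the Rift's orientation, so that offset follows head rotation at the Rift's low latency even when the positional tracker lags behind. A minimal sketch of that composition (the quaternion rotation is written out by hand; the function names are mine, and the axes are assumed to match the Rift's local right/up/forward):

```python
# Sketch of the "virtual" positional component: tracked head position plus
# the eye-center offset rotated by the Rift's orientation quaternion.
def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    vx, vy, vz = v
    tx = 2.0 * (y * vz - z * vy)   # t = 2 * (q_vec x v)
    ty = 2.0 * (z * vx - x * vz)
    tz = 2.0 * (x * vy - y * vx)
    return (vx + w * tx + (y * tz - z * ty),   # v + w*t + (q_vec x t)
            vy + w * ty + (z * tx - x * tz),
            vz + w * tz + (x * ty - y * tx))

def eye_position(tracked_head_pos, rift_orientation, eye_offset=(0.0, 0.15, 0.09)):
    """Compose the laggier position tracker with the low-latency Rift rotation."""
    ox, oy, oz = rotate(rift_orientation, eye_offset)
    px, py, pz = tracked_head_pos
    return (px + ox, py + oy, pz + oz)
```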

12 Replies