Head tracker to rotation around Y-Z-X

HWiese
Explorer
Hi folks,

I've got a hardware camera head which has three axes moved by three servo motors. I'd like to couple that head with the head tracker of the Rift.

What I need to know for this purpose is what kind of data I can get from the head tracker or through the SDK. Is it only raw sensor data that I need to translate into whatever I need by myself or is there already some code that translates the sensor data to, let's say, yaw-pitch-roll in world space? Because that's what I need in particular to rotate the camera head simultaneously with the Rift: yaw-pitch-roll in this order.

I'll be grateful for any information, hint, advice, whatever you Rift pros can tell me. Thanks!

Cheers,
Hendrik
8 REPLIES

genetransfer
Explorer
Check out the documentation that comes with the SDK; you should be able to get all the data you need from the sensor fusion.
Depending on your coordinate system you may have to do some math magic on the returned values.
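For reference, this is roughly what that looks like with the DK1-era SDK (LibOVR 0.2.x). A minimal sketch only; exact class and method names may differ in other SDK versions:

// DK1-era (LibOVR 0.2.x) sketch: read the fused orientation and
// decompose it as yaw-pitch-roll. Names follow the 0.2.x docs;
// later SDK versions replaced this API entirely.
#include "OVR.h"
using namespace OVR;

int main()
{
    System::Init();
    Ptr<DeviceManager> manager = *DeviceManager::Create();
    Ptr<SensorDevice>  sensor  =
        *manager->EnumerateDevices<SensorDevice>().CreateDevice();

    SensorFusion fusion;
    if (sensor)
        fusion.AttachToSensor(sensor);

    while (true)  // replace with your own loop condition
    {
        Quatf q = fusion.GetOrientation();
        float yaw, pitch, roll;
        // Yaw about Y, then pitch about X, then roll about Z.
        q.GetEulerAngles<Axis_Y, Axis_X, Axis_Z>(&yaw, &pitch, &roll);
        // ...send yaw/pitch/roll to the servo controller...
    }

    sensor.Clear();
    manager.Clear();
    System::Destroy();
    return 0;
}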

HWiese
Explorer
Holy sh... That's almost perfect! The sensor fusion provides exactly the information I need. The only thing I have to take into account is that the Rift's three axes are rotated relative to mine. Yeah, some math magic is apparently all I have to do.
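In case anyone lands here looking for that "math magic": if you want to do the decomposition yourself rather than use the SDK's GetEulerAngles, one standard Y-X-Z (yaw-pitch-roll) extraction looks like the sketch below. Sign conventions depend on handedness and axis layout, so verify the output against your own rig:

#include <cmath>
#include <algorithm>

// One standard Y-X-Z (yaw-pitch-roll) decomposition of a unit
// quaternion (w, x, y, z). Sign conventions vary with handedness,
// so check the results against your hardware before trusting them.
void quatToYawPitchRoll(float w, float x, float y, float z,
                        float& yaw, float& pitch, float& roll)
{
    // Clamp the asin argument to guard against rounding pushing it
    // slightly outside [-1, 1] near the poles.
    float sinPitch = std::max(-1.0f, std::min(1.0f, 2.0f * (w * x - y * z)));
    pitch = std::asin(sinPitch);
    yaw   = std::atan2(2.0f * (x * z + w * y), 1.0f - 2.0f * (x * x + y * y));
    roll  = std::atan2(2.0f * (x * y + w * z), 1.0f - 2.0f * (x * x + z * z));
}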

Cool! Thanks a lot! 😄

ericeide
Honored Guest
Any chance you could share what you did? I'm a little new at this, and I'm trying to set up the same idea as you: I'm using it for a robot head with three servos, running on two Raspberry Pis. Thanks for any help you can give me.

HWiese
Explorer
I will, as soon as I have completed my bachelor thesis. It's implemented using Willow Garage's Robot Operating System (ROS), though, so there's not much to see regarding the transformation of the orientation quaternion into its Euler representation. It's just something like this (Python-like pseudocode):


import tf

def orientation_callback(quat):
    # quat is assumed to be a geometry_msgs/Quaternion;
    # euler_from_quaternion expects an (x, y, z, w) sequence and
    # returns (roll, pitch, yaw) with its default axis convention.
    (roll, pitch, yaw) = tf.transformations.euler_from_quaternion(
        [quat.x, quat.y, quat.z, quat.w])
    msg = HeadOrientation()
    (msg.r, msg.p, msg.y) = (roll, pitch, yaw)
    head_pub.publish(msg)


Something like that... The callback gets called by ROS whenever it receives an orientation message from the ready-to-use Rift node (https://github.com/otl/oculus.git). The rest is ROS magic... 8-)

Hope it helps nevertheless. /edit: alright, did it anyway, despite my thesis... after all... :mrgreen:

strainor
Honored Guest
Was there ever an update on this? I'm currently working on a school project that uses two cameras, controlled by two Raspberry Pis, sitting on a two-servo pan/tilt module. I'm running into exactly the same problems: A) Which functions of the SDK extract the orientation data? I see the quaternion used everywhere, but I wasn't sure whether that comes from the math library, the Oculus SDK, or the ROS libraries you've been using. B) I have no idea what ROS is; would it be a good idea to use those libraries (if I could get them to work on Windows or Debian) to do something similar to what you did?

Thanks in advance.

earth5worker
Honored Guest
I've got a two-servo gimbal connected through a ServoCity controller.
I want the Oculus Rift tracking to move the gimbal (pan/tilt).
I don't want to spend a lot of time programming; I'm eager to get on with the rest of my project.
Can someone please point me to a ready-to-use program that will do this for me?
I'm technically savvy, but getting the Rift tracking to drive the servos is unfamiliar territory for me.
Thanks.

earth5worker
Honored Guest
I'd like to connect my Oculus Rift to an Arduino Uno (via the USB Host Shield) so that the Arduino reads the tracking information and drives a 3-servo gimbal. Has anyone done this? If so, can you share the C++ code you loaded onto the Arduino? Thanks.
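As far as I know there's no ready-made Rift driver for the USB Host Shield, so a common workaround is to read the tracking on a PC and stream angles to the Arduino over plain serial. A minimal sketch of the Arduino side under that assumption; the pin numbers and the "yaw pitch roll" line format in degrees are made up for illustration:

// Arduino side of a PC -> serial -> servo bridge. Assumes the PC
// sends lines of "yaw pitch roll" in degrees; pins and protocol
// are illustrative, not taken from any existing project.
#include <Servo.h>

Servo yawServo, pitchServo, rollServo;

void setup() {
  Serial.begin(115200);
  yawServo.attach(9);
  pitchServo.attach(10);
  rollServo.attach(11);
}

void loop() {
  if (Serial.available()) {
    // parseFloat() blocks briefly; fine for a simple bridge.
    float yaw   = Serial.parseFloat();
    float pitch = Serial.parseFloat();
    float roll  = Serial.parseFloat();
    Serial.read();  // consume the trailing newline

    // Map +/-90 degrees onto the 0..180 servo range.
    yawServo.write(constrain(yaw + 90, 0, 180));
    pitchServo.write(constrain(pitch + 90, 0, 180));
    rollServo.write(constrain(roll + 90, 0, 180));
  }
}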

spooky_sergio
Honored Guest
I see there has been no response for earth5worker and strainor. I need something similar to what you built. If someone could hand me some code that reads the orientation, or even the raw data from the IMU, I would be grateful. I'm new to working with this, and I need the raw sensor data from the accelerometers and gyroscopes. How did you guys solve it?
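For the raw data specifically: in the DK1-era SDK (LibOVR 0.2.x), raw accelerometer and gyro samples arrive through a MessageHandler attached to the SensorDevice, rather than through SensorFusion. A rough sketch, again assuming that SDK version; field names follow the 0.2.x headers and may differ elsewhere:

// DK1-era (LibOVR 0.2.x) sketch: receive raw IMU samples via a
// MessageHandler instead of the fused orientation. Later SDK
// versions dropped this API.
#include "OVR.h"
#include <cstdio>
using namespace OVR;

class RawHandler : public MessageHandler
{
public:
    virtual void OnMessage(const Message& msg)
    {
        if (msg.Type != Message_BodyFrame)
            return;
        const MessageBodyFrame& frame =
            static_cast<const MessageBodyFrame&>(msg);
        // Acceleration in m/s^2, RotationRate in rad/s.
        printf("accel %.3f %.3f %.3f  gyro %.3f %.3f %.3f\n",
               frame.Acceleration.x, frame.Acceleration.y,
               frame.Acceleration.z, frame.RotationRate.x,
               frame.RotationRate.y, frame.RotationRate.z);
    }
};

int main()
{
    System::Init();
    Ptr<DeviceManager> manager = *DeviceManager::Create();
    Ptr<SensorDevice>  sensor  =
        *manager->EnumerateDevices<SensorDevice>().CreateDevice();

    RawHandler handler;
    if (sensor)
        sensor->SetMessageHandler(&handler);

    // ...run until done; samples arrive on the handler's thread...

    sensor.Clear();
    manager.Clear();
    System::Destroy();
    return 0;
}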