Forum Discussion
momothemonster
12 years ago · Honored Guest
Touch Designer
[Attached image: rift-touchdesigner.jpg]
I've created a GitHub repository in the interest of fixing / improving my attempt at Oculus Rift Support in Touch Designer.
https://github.com/momo-the-monster/MMMRiftChop
The repository includes:
- A Visual Studio 2012 project for modifying/compiling the DLL
- A TD 088 Composition with my attempt at implementing proper rendering
- A compiled DLL if you want to get straight to the action (it's in the TD folder with the Comp)
It's so close: the head tracking is fine and the 3D effect is there, but something's not right. I think it has to do with the offset for each eye (i.e., the ViewAdjust matrix to be applied after the projection matrix and barrel distortion). I've tried the values coming out of the Rift sensor that seem right (a 0.0395 offset for each eye, in my case), but I still see some sort of de-convergence in my periphery. I don't see it in the pre-compiled demos or when I render a scene in Cinder, so either I'm missing a step or I'm not applying some parameter correctly.
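For what it's worth, in the DK1-era SDK the per-eye offset is usually split across two places: a horizontal projection-centre offset pre-multiplied onto the projection matrix, and a ±half-IPD translation applied to the view matrix before projection rather than after it. A minimal numpy sketch of that composition, using placeholder HMD parameters (the real values should come from your sensor, not these constants):

```python
import numpy as np

# Placeholder HMD parameters -- substitute the values your Rift reports.
H_SCREEN_SIZE = 0.14976    # horizontal screen size in metres
LENS_SEPARATION = 0.0635   # distance between lens centres in metres
IPD = 0.064                # interpupillary distance in metres

def projection_offset():
    """Horizontal projection-centre offset in NDC units (DK1-style formula)."""
    view_center = H_SCREEN_SIZE * 0.25
    eye_shift = view_center - LENS_SEPARATION * 0.5
    return 4.0 * eye_shift / H_SCREEN_SIZE  # positive for the left eye

def translate_x(x):
    """4x4 translation matrix along x."""
    m = np.eye(4)
    m[0, 3] = x
    return m

def eye_matrices(proj, eye):
    """Per-eye matrices; eye = +1 for the left eye, -1 for the right.

    The projection-centre shift goes onto the projection matrix;
    the half-IPD view adjust is a separate translation applied to the
    view matrix (i.e. before projection, not after).
    """
    p = translate_x(eye * projection_offset()) @ proj
    view_adjust = translate_x(eye * IPD * 0.5)
    return p, view_adjust
```

If the sign flip at one matrix element is the only difference between your two eye matrices, it may be this projection-centre term showing up under a transposed storage convention; the half-IPD translation is the part that has to happen in view space.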
The basic flow I'm using is:
- Extract a 4x4 Projection Matrix from the Sensor for the Left Eye
- Flip the sign of the element at M[2][0] to make the matrix for the Right Eye
- Create one camera for each eye, use custom matrices generated above
- Use Render TOP to render each eye separately at 640x800
- Feed output of Render TOPs into GLSL Shaders created from example code
- Shaders are fed incoming Uniform parameters from Rift sensor
- Feed output of each shader into a Crop TOP, extending the left eye to the right and the right eye to the left
- Composite the Crop TOPs using Additive Blending
- Send final output to Window set to 2nd monitor, fullscreen
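For reference, the barrel-distortion step those GLSL shaders implement is a radial polynomial: each coordinate's distance from the lens centre is scaled by k0 + k1·r² + k2·r⁴ + k3·r⁶. A hedged Python sketch of that warp for a single UV coordinate (the k values below are commonly-quoted DK1 defaults; your sensor may report different coefficients):

```python
import numpy as np

# Commonly-quoted DK1 default coefficients -- replace with your sensor's values.
K = (1.0, 0.22, 0.24, 0.0)

def hmd_warp(uv, lens_center):
    """Barrel-distort a UV coordinate about the lens centre (radial polynomial)."""
    theta = np.asarray(uv, dtype=float) - np.asarray(lens_center, dtype=float)
    r_sq = theta[0] ** 2 + theta[1] ** 2  # squared distance from lens centre
    scale = K[0] + K[1] * r_sq + K[2] * r_sq ** 2 + K[3] * r_sq ** 3
    return np.asarray(lens_center, dtype=float) + theta * scale
```

Note the lens centre here is per-eye and offset from the viewport centre, which is one of the uniform parameters the sensor supplies; using the viewport centre instead is a common source of subtle convergence errors.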
I've tried adding a Transform TOP after the Crops to change the eye separation, but none of the values from the sensor fix the problem I see.
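That matches what the geometry predicts: a Transform TOP applies one fixed 2D shift after projection, but the on-screen displacement produced by a real camera translation depends on each point's depth, so no single post-render offset can reproduce it. A small numpy check (the FOV and aspect values here are illustrative, not the Rift's):

```python
import numpy as np

def perspective(fov_y_deg, aspect, near=0.1, far=100.0):
    """Standard OpenGL-style perspective matrix (illustrative parameters)."""
    f = 1.0 / np.tan(np.radians(fov_y_deg) / 2.0)
    return np.array([
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), 2 * far * near / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ])

def ndc_x(point, proj, eye_offset=0.0):
    """Project a view-space point, optionally shifted by a camera eye offset."""
    p = np.array([point[0] + eye_offset, point[1], point[2], 1.0])
    clip = proj @ p
    return clip[0] / clip[3]

proj = perspective(90.0, 640 / 800)
# Screen-space shift caused by a 0.032 m eye offset, at two depths:
near_shift = ndc_x((0, 0, -1.0), proj, 0.032) - ndc_x((0, 0, -1.0), proj)
far_shift = ndc_x((0, 0, -10.0), proj, 0.032) - ndc_x((0, 0, -10.0), proj)
# The shift shrinks with depth, so a fixed 2D translate cannot match it.
```

This is consistent with the symptom described above: a post-crop transform can make one depth plane converge, but objects nearer or farther still separate.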
We're so close! Looking forward to some input on this, hope someone else has a Rift they can test with!
4 Replies
- whoisonline · Explorer: Cool... I'll need to check it out, I'll get back to you.
- gionniz · Honored Guest: I am having issues with the OculusRift CHOP. What can be the problem?!
  https://www.derivative.ca/Forum/download/file.php?id=3466
- malcolmb · Honored Guest: Hey, to use the DK2 you need to install build 24241 or later. You can get it here:
  http://www.derivative.ca/temp/TouchDesigner088.24241.64-Bit.exe
- greenpattern · Honored Guest: So Windows 10 doesn't run the 1.3 runtime, if I understand well? Only people with Windows 7/8 can use it. Am I right?