Forum Discussion

MShekow
Honored Guest
12 years ago

Need Rift integration advice (Unity+external application)

Hi. We're building an AR application that uses Unity3d as a "dumb renderer". That means we have one external Windows 7 C++ application which accesses various sensors and writes their values to shared memory. This application, let's call it the "server", also has a GUI used for user interaction (the person who uses it is an "operator", not the same person who wears the HMD). The server application can record and replay sensor values, among various other things. On the other side, we wrote a Unity DLL plugin whose exported functions are called by our Unity C# script. That plugin/script does nothing more than read those sensor values from shared memory and assign them to Transform components in our Unity scene(s). This way, we've already integrated another HMD (that is, its head tracker), which is also an HDMI-connected display.
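To make the setup concrete, the shared record could look roughly like this (a simplified sketch with illustrative names, not our actual code; the seqlock-style sequence counter guards against the reader seeing a half-written sample):

```cpp
#include <atomic>
#include <cstdint>
#include <cassert>

// Hypothetical shared record for one tracker sample. In the real setup this
// would live in a shared-memory segment; here it is an ordinary object so the
// pattern can be demonstrated in a single process.
struct TrackerSample {
    std::atomic<std::uint32_t> seq{0};      // even = stable, odd = write in progress
    float qx = 0, qy = 0, qz = 0, qw = 1;   // orientation quaternion
};

// Server side: publish a new orientation without locks (seqlock pattern).
void writeSample(TrackerSample& s, float x, float y, float z, float w) {
    std::uint32_t v = s.seq.load(std::memory_order_relaxed);
    s.seq.store(v + 1, std::memory_order_release);  // mark "writing" (odd)
    s.qx = x; s.qy = y; s.qz = z; s.qw = w;
    s.seq.store(v + 2, std::memory_order_release);  // mark "stable" (even)
}

// Unity-plugin side: read a consistent sample, retrying on torn reads.
bool readSample(const TrackerSample& s, float out[4]) {
    for (int attempt = 0; attempt < 1000; ++attempt) {
        std::uint32_t before = s.seq.load(std::memory_order_acquire);
        if (before & 1) continue;           // writer is mid-update, retry
        float x = s.qx, y = s.qy, z = s.qz, w = s.qw;
        std::uint32_t after = s.seq.load(std::memory_order_acquire);
        if (before == after) {              // nothing changed while we copied
            out[0] = x; out[1] = y; out[2] = z; out[3] = w;
            return true;
        }
    }
    return false;
}
```

Because the reader retries instead of blocking, the writer is never slowed down by Unity's frame rate.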

Now we'd like to integrate the Oculus Rift. First of all, I was a bit disappointed to see that you don't provide sources for the UnityPlugin.dll. It remains unclear to me how much additional logic is inside that DLL. Obviously, many of the exported functions in the DLL simply route the calls/information to the available SDK functions.

Anyway, here's my question: is it possible to integrate the Rift the following way:

  • My server application reads the tracker orientation values (and writes them to shared memory)

  • I modify the Unity scripts provided by your unity-integration-package to use the values read by my own unity plugin (which reads them from shared memory)


One reason I could imagine why this might not work is the initialization of the Rift device. I know that my server application needs to initialize it in order to read the tracker orientation. Apart from retrieving some of the Rift's parameters (e.g. aspect ratio), which I could provide via the shared memory channel, I'm not sure whether there are any continuously called render-related functions of the Rift that would require it to be initialized from within Unity. The Rift obviously cannot be initialized from both Unity and my own application.

Best regards!
Marius

4 Replies

  • This sounds possible to me, though you would have to make a number of modifications to our Unity integration for it to work like this.

    For reading the tracking orientation, you could use the C++ SDK; this is not too difficult. Although you can only obtain tracker values in one app at a time, you should still be able to access the static properties from multiple apps (for example, the resolution, distortion constants, etc.). At least that is how it works with C++ programs.

    In terms of getting the tracking values into Unity, you would just need to rewrite the Unity scripts to use your shared memory code instead of reading values directly.

    So it seems entirely possible, though you will have to make some of the modifications yourself.
  • I made a C# script a while ago that uses pipes to send and receive data between a custom program and Unity. The idea was to use this for a hardware VR glove I'm working on, since Unity breaks when the serial code runs inside it (it hard-crashes without any error or warning).

    This same code could be used to send data to Unity from the Rift. It's called over 7,000 times a second on my PC. I don't know if this would be of much help to you since you are using C++, but maybe you could use the .NET pipes to send the data. Data is sent and received much like over a socket connection.

    O-o I don't know if there is a C++ equivalent to pipes, but it seems very fast.
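    In case it helps, the framing I use is just fixed-size binary messages over the stream. A rough sketch of the native side (POSIX pipe() is used here purely for illustration, and the field names are made up; the same fixed-size framing works over the .NET pipes):

```cpp
#include <unistd.h>   // POSIX pipe(), write(), read()
#include <cassert>

// A tracker sample as one fixed-size binary message, as it might be streamed
// between the server and Unity. Field names are illustrative.
struct PipeMessage {
    float qx, qy, qz, qw;   // orientation quaternion
};

// Send one fixed-size message over the pipe's write end.
bool sendMessage(int fd, const PipeMessage& m) {
    return write(fd, &m, sizeof m) == static_cast<ssize_t>(sizeof m);
}

// Receive one fixed-size message from the pipe's read end.
bool recvMessage(int fd, PipeMessage& m) {
    return read(fd, &m, sizeof m) == static_cast<ssize_t>(sizeof m);
}
```

    (As far as I can tell, the .NET pipe classes wrap the Win32 named-pipe API, so there should be a native counterpart a C++ server can open directly.)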
  • MShekow
    Honored Guest
    Thank you! I'll get my hands dirty with these modifications over the next few days and let you know if I have further questions.

    Currently I use Windows shared memory (actually, Boost's encapsulation of it), and I'll run some performance tests to measure the latency this mechanism imposes. So far, with the other HMD and its head tracker, we've not noticed any additional delay, regardless of whether Unity accessed the values directly (via a C++ plugin DLL) or via shared memory.
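    The test I have in mind is simple (a sketch with made-up names, not the final benchmark): the writer stamps each sample with a high-resolution timestamp, and the reader computes the sample's age when it picks it up.

```cpp
#include <chrono>
#include <cassert>

using Clock = std::chrono::steady_clock;

// Hypothetical timestamped sample, as the server might place it in shared memory.
struct StampedSample {
    float qx, qy, qz, qw;       // orientation quaternion
    Clock::time_point written;  // when the writer published this sample
};

// Writer side: record the publication time along with the data.
void publish(StampedSample& s, float x, float y, float z, float w) {
    s.qx = x; s.qy = y; s.qz = z; s.qw = w;
    s.written = Clock::now();
}

// Reader side: how old is this sample right now, in microseconds?
long long ageMicros(const StampedSample& s) {
    return std::chrono::duration_cast<std::chrono::microseconds>(
        Clock::now() - s.written).count();
}
```

    Within one machine, steady_clock values should be comparable across processes on common platforms, which is all a shared-memory latency test needs — though that is a platform assumption worth verifying.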

    I have one question right now, regarding the polling of the orientation data: it seems that the values are polled by the Unity script at 60 Hz (at least, OVR_GetSensor(Predicted)Orientation(...) is called only once per frame, either in OnPreCull or OnPreRender). Although the documentation doesn't mention it explicitly, it seems that a separate thread (started within the Oculus SDK) makes new sensor values available to the SensorFusion instance at 1000(?) Hz. A call to SensorFusion.GetOrientation() then simply returns the most recent value? And I suppose OVR_GetSensor(Predicted)Orientation(...) really does just that, i.e., it calls SensorFusion.GetOrientation() and returns the data.
    I'm just checking that this is correct, so that I can reproduce this behavior in my own program.
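    If that is right, I'd reproduce it with the usual "latest value wins" pattern: one thread keeps overwriting the current orientation, and the renderer polls whatever is newest each frame. A minimal stand-in sketch (the class below is my own mock, not the SDK's SensorFusion):

```cpp
#include <mutex>
#include <cassert>

// Stand-in for the SDK's sensor-fusion object: an internal thread would call
// setOrientation() at a high rate; consumers poll the most recent value.
class FakeSensorFusion {
public:
    void setOrientation(float x, float y, float z, float w) {
        std::lock_guard<std::mutex> lock(m_);
        q_[0] = x; q_[1] = y; q_[2] = z; q_[3] = w;
    }
    // Analogous to SensorFusion.GetOrientation(): returns the latest value;
    // it never blocks waiting for a "new" sample to arrive.
    void getOrientation(float out[4]) const {
        std::lock_guard<std::mutex> lock(m_);
        for (int i = 0; i < 4; ++i) out[i] = q_[i];
    }
private:
    mutable std::mutex m_;
    float q_[4] = {0, 0, 0, 1};  // identity quaternion until first update
};
```

    The key property to reproduce is that the getter never blocks: at 60 Hz polling against ~1000 Hz updates, most samples are simply never observed, and that is fine.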
  • MShekow
    Honored Guest
    Just wanted to report back that it was possible to integrate the Rift with my solution without problems. In my own application, I obtain a SensorDevice (instead of an HMDDevice) and use the SensorFusion class as explained in the SDK. I simply commented out various OVR_ functions in the OVRDevice.cs script and replaced them with my own implementations. The unityplugin.dll was still able to "init" the Rift and read out the render-specific configuration variables.