Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
DanB
Honored Guest
12 years ago

[Suggestion] Open Motion Tracking API

There will be a lot of motion tracking input devices & associated APIs out there by the time the Rift arrives, but it will be hard for any developer to support more than a few of them, so these devices will only work for a few games.

In my opinion, what is needed is an open motion tracking API that developers can target, allowing them to support many different devices through the one API & allowing users to use their devices in many more games.
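As a rough illustration of what such an abstraction layer might look like, here is a minimal C++ sketch (all type and function names are hypothetical, not from any real SDK): each vendor would ship an adapter implementing one common interface, and games would only ever see the interface.

```cpp
#include <string>
#include <vector>

// A tracked object: what it is, where it is, and how sure the device is.
struct TrackedObject {
    std::string label;    // e.g. "hand", "human", "qr_code"
    float position[3];    // position in a shared world frame
    float confidence;     // 0.0 (pure guess) .. 1.0 (certain)
};

// Device-agnostic interface; each vendor ships one adapter class.
class MotionTracker {
public:
    virtual ~MotionTracker() = default;
    virtual std::string deviceName() const = 0;
    virtual std::vector<TrackedObject> poll() = 0;
};

// Trivial placeholder adapter, useful for testing game code with no hardware.
class NullTracker : public MotionTracker {
public:
    std::string deviceName() const override { return "null"; }
    std::vector<TrackedObject> poll() override { return {}; }
};
```

Game code would then iterate over a list of `MotionTracker` pointers each frame, without caring whether an entry is a Kinect-like sensor, a Leap-like camera, or something newer.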

You might have cameras/input devices/APIs that can identify & locate objects such as hands, humans, QR codes, dogs, tables, keyboards, or user/developer-defined custom objects. It would be very useful if all the tracked data were accessible from the same API, perhaps allowing some re-use of images captured from cameras & error-correction between different models.

Some of the low level features a tracking API might need:
  • Different platforms specializing in certain tasks

  • Querying device capabilities (image size, image warping, depth/EM wavelengths measured)

  • Access to raw data/images

  • Information on (perhaps even control of?) the light emitted by certain devices

  • Positioning of capture devices & output devices (Possible to imagine a setup with a static Rift camera & a couple of Kinect-like sensors, multiple leap-like cameras on VR headset, other input devices attached)
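The capability-query bullet above could be sketched as a simple flags-plus-metadata structure (names are illustrative only, not from any real API), so a game can discover image size, depth/EM support, emitter control, and device pose before deciding how to use a device:

```cpp
#include <cstdint>

// Bitmask of what a capture device can report; assumed names for illustration.
enum Capability : std::uint32_t {
    CAP_RGB_IMAGE   = 1u << 0,  // raw colour frames
    CAP_DEPTH       = 1u << 1,  // per-pixel depth
    CAP_IR          = 1u << 2,  // EM wavelengths beyond visible light
    CAP_EMITTER_CTL = 1u << 3,  // emitted light can be controlled
};

struct DeviceCaps {
    std::uint32_t flags;        // OR of Capability bits
    int imageWidth, imageHeight;
    float pose[6];              // device position & orientation in a shared world frame
};

inline bool supports(const DeviceCaps& c, Capability cap) {
    return (c.flags & cap) != 0;
}
```

A Kinect-like sensor might advertise `CAP_RGB_IMAGE | CAP_DEPTH`, while a Leap-like camera on the headset would advertise IR instead; the pose field covers the static-camera-plus-headset-cameras setup described above.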


Some higher level features:
  • Database of known objects (hand, human, dog, QR code) & identification/positioning of them

  • Certainty of tracking data

  • Gestures: recording/detection/playback

  • Merging of tracked data to form a combined view of the world

  • Allow correction between platforms when one is more certain about tracked data than another is

  • Should probably be data-driven if possible rather than code-driven, preferably not requiring extra code to track new objects/gestures
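The "certainty of tracking data" and "merging of tracked data" bullets combine naturally: when two devices report the same object, the API could weight each report by its confidence. A minimal sketch of one possible fusion rule (a confidence-weighted average, chosen here purely for illustration):

```cpp
// One device's estimate of an object's position, with its self-reported certainty.
struct Estimate {
    float position[3];
    float confidence;  // 0.0 .. 1.0
};

// Confidence-weighted average of two estimates of the same object.
inline Estimate fuse(const Estimate& a, const Estimate& b) {
    Estimate out{};
    float w = a.confidence + b.confidence;
    if (w <= 0.0f) return out;  // neither source knows anything
    for (int i = 0; i < 3; ++i)
        out.position[i] = (a.position[i] * a.confidence +
                           b.position[i] * b.confidence) / w;
    // Keep the better source's confidence (a deliberately simplistic choice).
    out.confidence = a.confidence > b.confidence ? a.confidence : b.confidence;
    return out;
}
```

This also covers the correction-between-platforms bullet: a device that is very certain (confidence near 1) dominates the merge, effectively correcting a less certain one.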


Some usability issues:
  • Shouldn't just track humans

  • Should be customizable for non-typical people, such as those with missing limbs or conjoined twins.

2 Replies

  • getnamo
    Honored Guest
    I've been thinking about something like this. Initially, I think the most useful approach would be to provide an abstract skeleton to which different inputs can bind. Then the developer can use the skeleton and, based on the parts being tracked, adjust overall control schemes.

    Then adding new tracking devices would update accuracy or categories of tracking, but the downstream game code could remain unchanged.

    Will see if in the coming weeks I can't code up a plugin for UE4 that will be the middle point for hardware to forward skeletal information for VR input.
  • I've been considering this myself. I agree that a standard skeleton would be ideal. Preferably it would be as detailed as possible, with input devices controlling the appropriate bones, and different inputs could have different priority. For instance, you could have a STEM controlling your hand, but a Dexmo controlling the fingers. IK could deal with bones that aren't directly tracked. Speaking of Dexmo, the API should be able to forward collisions to appropriate force-feedback-capable hardware, with resistance levels, because you know people are going to figure that out.
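The per-bone priority scheme described in the replies could be sketched roughly like this in C++ (bone names, device names, and the `Skeleton` class are all hypothetical illustrations, not part of any shipping SDK):

```cpp
#include <map>
#include <string>

// One input device's claim on a bone; a higher priority wins the bone.
struct BoneBinding {
    std::string device;
    int priority;
};

// Abstract skeleton as a bone-name -> winning-binding map, so e.g. a STEM
// can drive the wrist while a Dexmo drives the fingers of the same hand.
class Skeleton {
    std::map<std::string, BoneBinding> bindings_;
public:
    // Bind a device to a bone only if it outranks the current owner.
    void bind(const std::string& bone, const std::string& device, int priority) {
        auto it = bindings_.find(bone);
        if (it == bindings_.end() || priority > it->second.priority)
            bindings_[bone] = {device, priority};
    }
    // Which device currently drives this bone ("" if none).
    std::string owner(const std::string& bone) const {
        auto it = bindings_.find(bone);
        return it == bindings_.end() ? "" : it->second.device;
    }
};
```

Downstream game code only reads the skeleton, so plugging in a new tracking device just adds or upgrades bindings, matching getnamo's point that game code could remain unchanged; bones nobody binds would fall through to IK.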