Forum Discussion
mptp
11 years ago · Explorer
[Hypothetical] Input Device API
I just had a thought... There seem to be two big issues with controllers for VR right now: 1. There are so damn many of them 2. Oculus have hinted very strongly that they will be releasing a con...
mystify
11 years ago · Honored Guest
"NeoTokyoNori" wrote:
mptp, It looks like you and I have been thinking about the same topics (c.f. the locomotion issue),
but you are more motivated to write up a post on the topic :D
In fact, I am quite surprised that Oculus has not announced a common API for interfacing with VR peripheral hardware. Given their position as pioneers and champions of the new VR era, I would think it is in their interest to create a large and vibrant ecosystem for hardware, not just software.
It should not matter what form factor or capture method a particular device uses.
They are all just trying to represent the position, orientation, and perhaps action, of a body part.
Therefore, each body part could be represented by its name and its current position in space, for example:
Right Hand Wrist (x, y, z, yaw, pitch, roll)
Right Hand Index Finger (x, y, z, yaw, pitch, roll)
Right Hand Middle Finger (x, y, z, yaw, pitch, roll)
Left Knee (x, y, z, yaw, pitch, roll)
Left Ankle (x, y, z, yaw, pitch, roll)
Navel (x, y, z, yaw, pitch, roll)
Seventh Cervical Vertebra (x, y, z, yaw, pitch, roll)
and so on...
for any part of the body which a particular device wishes to capture/emulate.
As long as Oculus defines the format of these data points in a common API,
it should just be a matter of the device makers formatting their data accordingly so that they can interface with it.
Game and software makers can then rest assured that they have a common data format representing the player's body parts, REGARDLESS of the particular device the player may be using.
It is then simply up to them to decide whether to use that body-part data as a control mechanism for their game/software, without worrying about whether a particular device is going to sell, or still be around next year.
I may be overlooking some technical or political issues standing in the way,
but I imagine, and hope, that this is an issue Oculus could quite easily solve. :)
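To make the proposal concrete, here is a minimal sketch of what such a common data format might look like. All names here (`BodyPart`, `Pose`, `InputFrame`) are hypothetical illustrations of the idea in the post, not any real Oculus API: each device driver reports poses for whichever body parts it can track, and software reads them without knowing which device produced them.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

# Hypothetical enumeration of trackable body parts, following the
# list in the post (it would extend to fingers, knees, ankles, etc.).
class BodyPart(Enum):
    RIGHT_WRIST = auto()
    RIGHT_INDEX_FINGER = auto()
    LEFT_KNEE = auto()
    NAVEL = auto()

@dataclass
class Pose:
    # Position plus orientation, as proposed: (x, y, z, yaw, pitch, roll).
    x: float
    y: float
    z: float
    yaw: float
    pitch: float
    roll: float

class InputFrame:
    """One frame of body-tracking data, shared by all devices."""

    def __init__(self) -> None:
        self._poses: dict = {}

    def report(self, part: BodyPart, pose: Pose) -> None:
        # Called by a device driver for each body part it tracks.
        self._poses[part] = pose

    def pose_of(self, part: BodyPart) -> Optional[Pose]:
        # Called by a game; returns None if no device tracks this part.
        return self._poses.get(part)
```

A driver would call `report()` each frame, and a game would call `pose_of(BodyPart.RIGHT_WRIST)` without caring whether the data came from a glove, a camera, or a wand.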
This type of API is even less useful. It only encompasses body tracking, not other forms of input, and its specifics are constraining. Each device will fill a different subset of those inputs, so as a developer I still couldn't build an input scheme with it that works with whatever devices people have.
Say I pick the STEM controller to develop for, use it to track hand position/orientation, and make use of all of its buttons; then someone with a hand controller that gives position/orientation but lacks those buttons is out of luck. Or if I design something around per-finger tracking, devices that give only coarse hand position won't work. The end effect is that you still have to design your program for specific devices, but instead of targeting the device directly and tuning for its properties, you are constrained by an API that neither the device creator nor the developer designed.
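The subset problem described above can be sketched in a few lines. The device capability sets here are hypothetical stand-ins for a STEM-style controller (hand pose plus buttons) and a glove-style tracker (hand pose plus per-finger poses, no buttons); the point is that a game built against one subset silently fails on hardware that fills a different one.

```python
# Hypothetical capability sets for two devices that both speak the
# same generic body-tracking API but fill different subsets of it.
STEM_LIKE = {"hand_pose", "buttons"}
GLOVE_LIKE = {"hand_pose", "finger_poses"}

def requirements_met(required: set, device_caps: set) -> bool:
    # A game developed against one device implicitly requires that
    # device's capability subset everywhere.
    return required.issubset(device_caps)

# A game tuned for a STEM-style controller works on it...
print(requirements_met({"hand_pose", "buttons"}, STEM_LIKE))   # True
# ...but breaks on a glove that tracks more, yet has no buttons.
print(requirements_met({"hand_pose", "buttons"}, GLOVE_LIKE))  # False
```

Either the API grows a capability-negotiation layer (at which point developers are back to handling devices case by case), or developers target a specific device anyway, which is the poster's objection.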