Forum Discussion
Sebbi
13 years ago
Honored Guest
Android Development
Hi there,
I don't have my Rift yet, but when I do I'm probably going to try to make it work with my Nexus 10 (Android tablet). It's one of those tablets that have both HDMI out and USB host support, which can be used to read the orientation data from the Rift's sensors (http://developer.android.com/guide/topics/connectivity/usb/host.html).
Has anyone tried this already? Or maybe just the stereo output of some rift screencap?
EDIT:
I'll update the first post with the development progress so others can more easily follow and find information about Android and the Rift.
Repository for the Android Rift library:
https://github.com/sebastianherp/riftlibrary
Get sample app here:
http://appsdoneright.net/files/rifttest_0.2.apk
Compatible (=tested) Android devices (Version 0.2):
- Nexus 10
- Nexus 7 (no HDMI out)
- Asus Transformer Prime
Screenshot (Version 0.2):
screenshot_0.2.png
The left side of the screen moves the player and the lower right side rotates the player (yaw). In the upper right, up/down changes the FOV and left/right changes the IPD. I tried to find good values for both, and they should be OK on a Nexus 10.
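The control layout described above boils down to a hit-test on where a touch lands. A minimal sketch in plain Java, assuming normalized coordinates and a 50/50 split between the regions (the real app's boundaries may differ; the class and enum names are made up for the example):

```java
// Sketch of the touch-region mapping described above (hypothetical
// boundaries and names; the real app's regions may differ).
public class TouchRegions {
    public enum Control { MOVE, YAW, FOV_IPD }

    // x and y are normalized screen coordinates in [0, 1],
    // with (0, 0) at the top-left corner.
    public static Control classify(double x, double y) {
        if (x < 0.5) return Control.MOVE;    // left half: move the player
        if (y < 0.5) return Control.FOV_IPD; // upper right: FOV / IPD tuning
        return Control.YAW;                  // lower right: rotate (yaw)
    }
}
```

On Android the same check would sit inside an `onTouchEvent` handler, dividing the event coordinates by the view's width and height first.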
26 Replies
- cybereality (Grand Champion): The video may work, although you will want to be sure you can select 1280x800 (the output might default to a standard resolution like 1080p, depending on the phone). Also, even though the phone may have a USB port, you will still need a device driver to interface with the tracker. Currently this is only available on Windows (with OS X coming very soon). I am not sure if (or when) we will be supporting Android.
- Sebbi (Honored Guest):
1) Well ... the Nexus 10 has a 16:10 screen, but it seems to output the full 2560x1600 over its HDMI port (I haven't tried it yet, but read it somewhere). If the Rift can handle that, we are good. Maybe somebody out there with both devices could try it out? ;-)
2) Android has an API for writing USB drivers, and it should be trivial to write an application that can communicate with a HID device. If nobody has tried this yet, I'll give it a shot as soon as the Rift (order #40xx) arrives here ...
- Thoth_The3x (Partner): Get the Rift working well in my Nexus 10 and maybe I'll make a VR android game for you xD
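Talking to a HID device from Android means issuing HID class control requests yourself (there is no HID driver layer in the USB host API). The constants involved come from the USB HID class specification; the sketch below computes the setup parameters you would hand to Android's `UsbDeviceConnection.controlTransfer(...)`, kept as pure byte math so it runs anywhere. The report id used in the test is an arbitrary example value, not necessarily the Rift's:

```java
// Setup parameters for a HID GET_REPORT (feature report) request, per
// the USB HID class spec. On Android these would be passed to
// UsbDeviceConnection.controlTransfer(requestType, request, value,
// index, buffer, length, timeout); only the byte math is shown here.
public class HidRequests {
    // bmRequestType: device-to-host (0x80) | class (0x20) | interface (0x01)
    public static final int GET_REPORT_TYPE = 0x80 | 0x20 | 0x01; // 0xA1
    public static final int GET_REPORT = 0x01;       // HID class bRequest
    public static final int REPORT_TYPE_FEATURE = 3; // HID report type

    // wValue = (report type << 8) | report id
    public static int wValue(int reportType, int reportId) {
        return (reportType << 8) | (reportId & 0xFF);
    }
}
```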
- Sebbi (Honored Guest):
"spire8989" wrote:
Get the Rift working well in my Nexus 10 and maybe I'll make a VR android game for you xD
Deal! I hope I have some time tomorrow to port the basic sensor reading functionality.
- Thoth_The3x (Partner):
"Sebbi" wrote:
"spire8989" wrote:
Get the Rift working well in my Nexus 10 and maybe I'll make a VR android game for you xD
Deal! I hope I have some time tomorrow to port the basic sensor reading functionality.
Disclaimer: I might not have anything awesome for you until I get my Rift. And as a post-Kickstarter backer, I'm looking at late May or sometime in June. :(
But maybe I'll work on some stuff anyway for you to test out. I do have a 100" 3D projector for gaming.
- Sebbi (Honored Guest): I got it working yesterday and made it an Android library. The code can be found here:
https://github.com/sebastianherp/riftlibrary
It is currently using the sensor fusion algorithm found in the official SDK, which I'll probably replace so the library can have a more liberal license.
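One common liberally licensable replacement for SDK-style sensor fusion is a complementary filter: integrate the gyro for short-term accuracy and pull the estimate toward the (noisy but drift-free) accelerometer angle. A minimal single-axis sketch, not the riftlibrary code:

```java
// Minimal complementary-filter sketch -- one liberally licensable
// alternative to the SDK's sensor fusion, NOT the riftlibrary code.
// Estimates pitch from a gyro rate (rad/s) and an accelerometer-derived
// pitch angle (rad).
public class ComplementaryFilter {
    private final double alpha; // weight of the gyro path, e.g. 0.98
    private double angle;       // current pitch estimate in radians

    public ComplementaryFilter(double alpha) { this.alpha = alpha; }

    public double update(double gyroRate, double accelAngle, double dt) {
        // Integrate the gyro, then blend in the accelerometer angle to
        // cancel the slow drift that pure integration accumulates.
        angle = alpha * (angle + gyroRate * dt) + (1.0 - alpha) * accelAngle;
        return angle;
    }
}
```

A full 3-DOF version would do the same blending on a quaternion, but the drift-correction idea is identical.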
Get a testbuild here:
(link in first post)
It is a bit crude, but should display the Rift's orientation and a screenshot of the Tuscany demo.
Next steps:
- Support all the configuration options
- Display a block with the Rift orientation (similar to SensorBoxTest in the SDK)
- Implement the warp
- Put sample app on play store
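The "implement the warp" step above refers to the Rift's barrel distortion, which radially scales each point by a polynomial in r². The coefficients below are the commonly quoted DK1 defaults; treat both them and the class layout as assumptions for illustration, since the real shader works per-pixel on the GPU:

```java
// Radial distortion scale for the Rift barrel warp:
//   scale(r) = k0 + k1*r^2 + k2*r^4 + k3*r^6
// Coefficients are the commonly quoted DK1 defaults (assumed here).
public class BarrelWarp {
    static final double K0 = 1.0, K1 = 0.22, K2 = 0.24, K3 = 0.0;

    // r is the distance from the lens center in post-projection units.
    public static double distortionScale(double r) {
        double r2 = r * r;
        return K0 + r2 * (K1 + r2 * (K2 + r2 * K3));
    }

    // Warp a point (x, y) given relative to the lens center.
    public static double[] warp(double x, double y) {
        double s = distortionScale(Math.sqrt(x * x + y * y));
        return new double[] { x * s, y * s };
    }
}
```

In practice this runs in a fragment shader on the distorted texture coordinates; the Java version is just the reference math.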
- Thoth_The3x (Partner): Dang, really wish I had my Rift to play with this. Keep it up though!
- geekmaster (Protege):
"cybereality" wrote:
The video may work, although you will want to be sure you can select 1280x800 (the output might default to a standard resolution like 1080P, depends on the phone). Also, even though the phone may have a USB port, you will still need a device driver to interface with the tracker. Currently this is only available on Windows (with OSX coming very soon). I am not sure if (or when) we will be supporting Android.
I have my Rift tracker working on x86 Linux and on my Raspberry Pi, using the "linux C" SDK version in another thread here. It does not support everything in the full SDK, but at least the tracker is returning rotation and acceleration values as expected.
It may work on Android with little or no changes needed.
viewtopic.php?f=20&t=667
- Sebbi (Honored Guest): Yeah, I used that code as a reference for the HID and message-decoding stuff. It's a little bit different on Android since Java has no unsigned types, but it works.
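The unsigned-type issue shows up when unpacking tracker messages: the C SDK packs three 21-bit signed samples into 8 bytes, and a Java port has to mask each byte with 0xFF before shifting, then sign-extend the result. The bit layout below is assumed to match LibOVR's `UnpackSensor`; verify against the C source before relying on it:

```java
// Java port of the 21-bit sensor-sample unpacking (bit layout assumed
// to match LibOVR's UnpackSensor; verify against the C source).
public class SensorUnpack {
    // Java bytes are signed, so mask to recover the unsigned value.
    private static int u(byte b) { return b & 0xFF; }

    // Sign-extend a 21-bit two's-complement value held in an int.
    private static int signExtend21(int v) { return (v << 11) >> 11; }

    // Unpack three 21-bit signed samples from 8 bytes.
    public static int[] unpack(byte[] b) {
        int x = (u(b[0]) << 13) | (u(b[1]) << 5) | ((u(b[2]) & 0xF8) >> 3);
        int y = ((u(b[2]) & 0x07) << 18) | (u(b[3]) << 10)
              | (u(b[4]) << 2) | ((u(b[5]) & 0xC0) >> 6);
        int z = ((u(b[5]) & 0x3F) << 15) | (u(b[6]) << 7) | (u(b[7]) >> 1);
        return new int[] { signExtend21(x), signExtend21(y), signExtend21(z) };
    }
}
```

The `(v << 11) >> 11` trick works because Java's `>>` is an arithmetic shift, which copies the sign bit back down.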
Today I managed to get an OpenGL view working on Android. It looks OK through the Rift, but I'd like to get warping done before I push those commits to GitHub. I guess you have to use OpenGL ES on the RasPi too?
- geekmaster (Protege):
"Sebbi" wrote:
I guess you have to use OpenGL ES on the RasPi too?
So far I have only used direct framebuffer access on the RasPi, and just touching all the pixels eats the CPU for breakfast. Perhaps I could free up a little CPU power if I stop using the threaded head tracker code and just do a single sample between frames (or make the thread a "one-shot" that I enable once per frame). But soon, I need to use the much more powerful GPU to do more than simple demos.
But yeah, fancy stuff will need to use GL. I just wanted to prove that you actually COULD do some interesting "old school" stuff with only the CPU.
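The "single sample between frames" idea above can be sketched as a small wrapper that replaces a free-running tracker thread with one explicit poll at the top of each frame, so every draw call in the frame sees a consistent pose (the interface and names here are hypothetical, not from geekmaster's code):

```java
import java.util.function.Supplier;

// Sketch of per-frame ("one-shot") tracker sampling instead of a
// free-running reader thread: poll once at the top of each frame and
// reuse that sample for the rest of the frame. Names are hypothetical.
public class FrameSampler {
    private final Supplier<double[]> tracker; // e.g. reads yaw/pitch/roll
    private double[] current = new double[3];

    public FrameSampler(Supplier<double[]> tracker) { this.tracker = tracker; }

    // Call once per frame; avoids a background thread eating CPU.
    public void beginFrame() { current = tracker.get(); }

    public double[] pose() { return current; }
}
```

The trade-off is latency: a reader thread can drain samples as they arrive, while one-shot polling only sees the state at frame start.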