Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
Murderink
Honored Guest
12 years ago

Controlling the AR Drone 2.0 with the Oculus Rift and the MYO

I have an idea to combine the Oculus Rift with two devices:
The first is the MYO gesture control armband and the second is the AR Drone 2.0 from Parrot.
The concept is to control up, down, left, and right with the Oculus Rift, and the forward, backward, and spin motions with the MYO armband. At first this would use the Oculus combined with the AR Drone's existing smartphone Wi-Fi control; then, once the MYO is released, the smartphone control would be replaced with the MYO. The objective is to simulate actually being the AR Drone 2.0.

I want to know if anyone has had a similar idea and, if so, whether they have had a successful sync with the Oculus Rift.

link for the MYO: https://www.thalmic.com/myo/

10 Replies

  • darren
    Honored Guest
    I also have the same idea, but I would like to control it using the Xbox 360 controller from Windows.

    One major hurdle I'm not sure you've thought of: you'll need dual cameras to achieve stereoscopic vision. You can't just use one camera and duplicate it for each eye, or you won't get the depth effect. This will require someone to physically mount two cameras at the correct interpupillary distance and either merge the video on the device and stream it back as one stream through the existing Wi-Fi channel, or stream two separate video streams (which would probably eat a lot of bandwidth; you don't really need two high-definition video streams, just one high-definition stream built from two cameras).

    What I usually do with these hobby projects is solve the most complicated and difficult problem first. So until that is out of the way, getting the VR flying drone of our dreams is not a possibility.

    Imagine though, how wonderful it would be to get stereoscopic vision of yourself flying above the ground using VR goggles. My god.
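The "merge two cameras into one stream" idea above can be sketched in a few lines. This is a toy illustration only (frames as plain Python row-lists, not real video): pack the left- and right-eye frames side by side into one frame for the single Wi-Fi stream, then split them apart again on the headset side.

```python
def merge_side_by_side(left_frame, right_frame):
    """Pack left- and right-eye frames into one side-by-side frame.

    Each frame is a list of rows; each row is a list of pixel values.
    The combined frame can be sent as a single video stream and split
    back into two eye views on the receiving end.
    """
    assert len(left_frame) == len(right_frame), "frames must have equal height"
    return [l_row + r_row for l_row, r_row in zip(left_frame, right_frame)]


def split_side_by_side(frame):
    """Recover the two eye views from a side-by-side frame."""
    half = len(frame[0]) // 2
    left = [row[:half] for row in frame]
    right = [row[half:] for row in frame]
    return left, right


# Two tiny 2x2 "frames" standing in for the left and right cameras.
left = [[1, 2], [3, 4]]
right = [[5, 6], [7, 8]]
packed = merge_side_by_side(left, right)   # one 2x4 frame for the single stream
assert split_side_by_side(packed) == (left, right)
```

A real build would do this on the drone's video pipeline before encoding, so only one stream crosses the Wi-Fi link, which is exactly the bandwidth trade-off described above.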
  • darren
    Honored Guest
    UPDATE: Some people are claiming that to create a 3D stereoscopic image, you can just take one image and shift it slightly left for the left eye and slightly right for the right eye.

    Can anyone confirm this? Why then couldn't you take just one rendering and shift it left/right, rather than rendering the entire scene twice, once for each eye camera?

    I am not buying it... I thought there was some difference in viewing angle between the eyes that produced the 3D effect.

    However if this is true then VR can be achieved on the Parrot with just the stock unit.

    Another problem arises, however... The Parrot connects over a Wi-Fi network, so you've only got 300 meters or so at best. If you could find a LIGHTWEIGHT public-IP 4G cellular data unit, you could mount it directly on the Parrot and control it over the Internet, using the cell network for connectivity to its control program. You could then literally launch the Parrot from home, use the Oculus Rift to visualize what it sees, and fly out over the countryside as far as the batteries will take you, provided you have cell coverage.

    How great would it be to fly above the real world, using your Oculus Rift to look around? I would like to put a microphone and speaker on it, just to hear and communicate with people if they get freaked out.

    What's more, they have a complete API, so you could program it to be autonomous and still watch what it's doing and take control if necessary. You could take pictures or record video, but I think it would be particularly interesting to fly places you can't go on foot and take otherwise impossible pictures, such as directly over water.
  • I actually wrote the Android ARDrone Flight Pro app, which has support for 3G connections over something like a MiFi.

    I have the HDMI out working on the OculusVR with more or less the correct distortion and even the HUD 3D model in 3D ;)

    I am currently working on the USB host input for head-tracking yaw control. I have sensor information, but the readings precess wildly, so I think my code isn't quite right yet. I might need to move the USB processing into the native code too...

    Of course Android isn't officially supported, so I'm working off the PC SDK source.

    But good fun anyway :)
  • Success :)
    http://youtu.be/J9zw8fx-pMM
    I moved to NDK so I could use the Oculus SDK with less hacking and I now have full USB head tracking.
    ARDrone Flight Pro already has support for gamepads, so I can use this to control the drone when using the Oculus.
    The slow rendering speed is because I am using a Tegra 2 Xoom - a modern device will be much better, but my Transformer Prime's HDMI out is broken...
    But this is all running on Android - rendering and USB input headtracking for yaw.
    I even got it switching to Oculus rendering when you plug in the USB...

    I think the Oculus needs a built-in webcam, so you can at least see what you are doing - so run in webcam mode, pick up the controller and see the real world, then switch to VR mode ;)

    It's definitely a case of "because I can", not because it makes sense, since I really need a co-pilot to keep the drone away from things. But my app does have a co-pilot mode, so it can be done on Android and you can fly the drone, though it really needs something like an NVIDIA Shield (shame they aren't available in the UK).
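As a rough illustration of how head yaw could drive the drone's spin when flying with the headset on, here is a hypothetical control mapping (not taken from ARDrone Flight Pro or any drone SDK): clamp the head angle into a normalized yaw-rate command with a small deadzone, so tiny head movements don't twitch the drone.

```python
def yaw_to_command(head_yaw_deg, deadzone_deg=5.0, max_deg=45.0):
    """Map a head yaw angle to a drone yaw-rate command in [-1, 1].

    Angles inside the deadzone give 0; angles beyond max_deg saturate.
    Hypothetical mapping for illustration, not any real SDK's API.
    """
    if abs(head_yaw_deg) <= deadzone_deg:
        return 0.0
    sign = 1.0 if head_yaw_deg > 0 else -1.0
    magnitude = (abs(head_yaw_deg) - deadzone_deg) / (max_deg - deadzone_deg)
    return sign * min(magnitude, 1.0)


assert yaw_to_command(0.0) == 0.0          # inside deadzone: no command
assert yaw_to_command(3.0) == 0.0
assert yaw_to_command(45.0) == 1.0         # saturated at full rate
assert yaw_to_command(-90.0) == -1.0
assert 0.0 < yaw_to_command(25.0) < 1.0    # proportional in between
```

The deadzone matters here for the same reason a co-pilot does: sensor noise and small involuntary head motions would otherwise feed straight into the drone's attitude.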
  • This is exciting stuff, but I fear latency is going to be a killer here. I have two AR Drones and a MYO on order. Details from MYO are very thin on the ground, but they've promised to release their SDK real soon now.
  • So much good information here. Still waiting to get my Oculus; I can't wait, but funds are running a bit low, so once I have it I will be back with an update.
  • I am currently scheming a multirotor / Oculus system on a different platform than the AR drone.

    There has to be a left and a right camera image for 3D. Differences between the left/right images are what cue your eyes to depth. But this can also mean two transmitters from the craft to your base station.

    I don't know a heck of a lot about this guy, but I am watching him closely. He makes a left/right camera and is working on a device that reworks a single (interlaced) video signal into Oculus-style output.

    http://www.emrlabs.com/index.php?pageid=0
  • Looks like an exciting project, I'll try to stay tuned. As for the stereo vision issue, I've been wondering if anyone has tried using a 2D-3D converter with the rift. Obviously it's not "true" stereo, but my converter (AdapSonic) has impressed me more than I thought it would, enhancing my experience for cartoons, nature programs, and 3D games without built-in stereoscopic support. No latency either. I encourage others to just give it a try.
  • Does anyone actually own a MYO yet? I have seen some very limited demos, and very little about the tech inside.