Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
Suprcow
Honored Guest
10 years ago

3D FPS input device plus virtual positioning system

Good day all. I've been thinking about input methods for VR HMDs and I might have come up with something new, or not... but I've never really come across anything like my idea, so I'll share it here in case there are people interested in this, I don't know, just want to help :) I have some hand-made drawings which are, well, not great haha, and I will add them later as I'm at work right now and don't have access to them. Hopefully I can share my idea in words for now. I've been thinking: wouldn't it be fun to have a 3D FPS controller for virtual reality? Now, I've seen someone post about a gun, but my idea is a sort of addition to those ideas perhaps, and I'd never seen it anywhere before registering here in the first place.

Alright. If the sim gun you hold in your hands has three dots on it that can be recognized "somehow" and calculated as a line, you could use this gun as a virtual mouse pointer to point at things in the virtual world. If this gun has a shoulder button (a button that is pressed when you put it against your shoulder), the game engine would know you were pressing the sim gun against your shoulder and could switch the in-game view (inside the HMD screen) to aim-down-sights mode (cool!). If it could also shake differently, with a built-in motor, to simulate different fire modes, that would rock as well :P sort of a rumble pack thingy for simulation purposes :)
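
For what it's worth, the "three dots forming a line" idea is actually enough to recover a full pose, not just an aim line, as long as the dots aren't collinear: two markers along the barrel give the aiming direction, and a third off-axis marker pins down the roll. A minimal sketch of the math in Python with NumPy (the marker names and positions below are just my own illustration, not anything from an actual tracker API):

```python
import numpy as np

def gun_pose(rear, front, side):
    """Recover an aiming ray and roll reference from three tracked markers.

    rear, front -- markers along the barrel; they define the aim line.
    side        -- an off-axis marker; it disambiguates roll.
    The three markers must not be collinear.
    """
    rear, front, side = (np.asarray(p, dtype=float) for p in (rear, front, side))
    forward = front - rear
    forward /= np.linalg.norm(forward)                 # aim direction
    up = np.cross(np.cross(forward, side - rear), forward)
    up /= np.linalg.norm(up)                           # perpendicular roll reference
    return rear, forward, up                           # ray origin, aim, up

# Barrel pointing along +Z, held at roughly shoulder height:
origin, aim, up = gun_pose([0, 1.4, 0], [0, 1.4, 0.3], [0.05, 1.45, 0.1])
```

With `origin` and `aim` in hand, the engine can cast a ray into the scene for the "virtual mouse pointer" part, and `up` tells it how the gun is rolled.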

Now comes the fun part. If we had a (cheap but solid) simple portable, foldable system, like eight simple telescoping poles forming a cube around the player in real life, that scans all sorts of points within its dimensions, it could be hardcoded into a game engine so that it knows where the player with the HMD is (plus his orientation information, based on scanning points on the HMD itself) for perspective, and the gun could also be tracked inside that real-world cube with known dimensions. If we divide it into heights (multiple scanning points on all corners plus in-between segments to add to the scanning resolution), like the waistline and the head, or use more points in the scanning cube, the engine could know whether a player stands, crouches or crawls (transposed in-game, of course). And actually a reload button, a walk button and a fire-mode-select button would be awesome as well.

OK, now if we had loose, separately available points that could be added to any real-life object, and if, for instance, in a ping-pong game we placed the scanning points that the VPS recognizes on a real-world ping-pong bat, we could have that ping-pong bat in-game (or even a really close 3D avatar, I don't know, depending on how small these scanning points would be and how many were included in a set). This can work for many objects, of course; this is just an example. If the first company that brings this to market makes deals with game engine companies, we could define these dimensions to work for all of virtual reality (open standards are probably much better; we all love VR and would love to see it make it this time). And if any object can be used like a mouse pointer, like the previously described 3D FPS controller, that would make making selections in in-game menus pretty easy as well.
I've read about a new company that Oculus just bought, which scans real-world spaces and virtualizes them. If that system were added to the VPS scanner somewhere, it could be used as a bonus to scan yourself (the player) for in-game avatar purposes as well...
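
The stand/crouch/crawl detection described above can be as simple as comparing the tracked head height against the player's calibrated standing height. A tiny sketch, where the threshold fractions are pure guesses and would need tuning per player:

```python
def body_pose(head_height_m, standing_height_m):
    """Classify posture from tracked head height.

    The 0.8 and 0.5 cut-offs are illustrative guesses: a head below
    ~80% of standing height reads as crouching, below ~50% as crawling.
    """
    ratio = head_height_m / standing_height_m
    if ratio > 0.8:
        return "standing"
    if ratio > 0.5:
        return "crouching"
    return "crawling"

body_pose(1.75, 1.80)  # "standing"
body_pose(1.10, 1.80)  # "crouching"
```

Extra scan heights (the waistline points mentioned above) would let the engine distinguish crouching from leaning or sitting, which a single head height cannot.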

If Oculus (and/or all other companies, perhaps, could agree on a standard...) were to bundle this (HMD + scanner box + world virtualizer?) VPS, wow... we could specify in-game and use virtually any real-world object in a virtual game engine, scanned at home in real time. If we standardize these dimensions (i.e. within human height range, so one size fits all), it would really help, and it could be very portable, lightweight and wireless too; plus nobody would want the space it takes up to go to waste when the HMD is not in use, hence foldable. Just define the real-world dimensions of the virtual positioning system in-game, and make it scan the points added to the HMD itself as well, so the player's orientation is known too. As I've said, I hope this gets my idea across. Hopefully I'm not bothering anyone with it if it was already thought of (again, I couldn't find anything); just trying to help :) If there's some interest I could add the drawings. I think I've had some more ideas, but I have no access to my original drawings now, so I'll add them later if I'm not booed off stage :)
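
The "use any tracked object as a mouse pointer" part of the idea maps onto a standard ray–plane intersection: cast the pointer's aiming ray, find where it crosses the menu plane, and read the hit point off in the plane's own X,Y axes, exactly like mouse coordinates on a screen. A rough sketch (the plane representation and parameter names are my own assumptions):

```python
import numpy as np

def pointer_on_plane(ray_origin, ray_dir, plane_origin, plane_x, plane_y):
    """Intersect an aiming ray with a menu plane; return (x, y) in the
    plane's own axes, or None if there is no hit in front of the player.
    plane_x and plane_y are assumed unit-length and orthogonal (they
    span the "virtual screen")."""
    o = np.asarray(ray_origin, dtype=float)
    d = np.asarray(ray_dir, dtype=float)
    p0 = np.asarray(plane_origin, dtype=float)
    px = np.asarray(plane_x, dtype=float)
    py = np.asarray(plane_y, dtype=float)
    normal = np.cross(px, py)
    denom = float(d @ normal)
    if abs(denom) < 1e-9:
        return None                       # ray parallel to the plane
    t = float((p0 - o) @ normal) / denom
    if t < 0:
        return None                       # plane is behind the player
    hit = o + t * d                       # 3D hit point ("red dot")
    return float((hit - p0) @ px), float((hit - p0) @ py)

# Aiming slightly to the right at a menu panel two metres ahead:
xy = pointer_on_plane([0, 1.6, 0], [0.1, 0, 1], [0, 1.6, 2], [1, 0, 0], [0, 1, 0])
# xy ≈ (0.2, 0.0)
```

The returned pair behaves like mouse X,Y on that panel, so the same routine works whether the panel is a fixed HUD overlay or a menu pinned to a wall in the virtual world.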

Or would this make a nice Kickstarter, since it could work globally for all VR systems?
It seems a VPS could globally specify the space around players, setting a standard for the dimensions of virtual reality used in game engines, and it could work perfectly together with existing technologies like the Sixense or PrioVR systems, or even the Cyberith and Omni if you placed it around the setup; when combined there's double precision for maximum skill-based immersive gameplay.

update:
Here is a simple zip file with 4 hand drawings; if you can look past the simplicity, it might make things a little more visual.
https://dl.dropboxusercontent.com/u/70805005/Vpsx4.zip

Ah man, don't mind the spelling either :P it was conceived at a late hour :)
Drawing 3 seems to demonstrate it best, I think. It could be very small, and perhaps an addition to all VR systems.

6 Replies

  • It would be much simpler to just add several tracker markers to such a "prop gun", so its position would always be accurate in VR and the game wouldn't need to do any "fake" aiming. You would aim like in real life. The perfect usage scenario for such a gun would be the Vive set, where the controllers calculate their position on their own. I've tested the current devkit and it's an incredible feeling when the controller has an identical mesh representation in VR: you know where it is, you just grab it and you feel it in your hands (haptic feedback out of the box).
  • Suprcow
    Honored Guest
    actually, I think what you're saying is pretty much what I was saying already... (are you saying this idea already exists? then please point me in the right direction)
    trackers on the prop gun and helmet for orientation... and using the prop gun as a mouse pointer
    relative to the virtual screen, of course... so depending on the orientation you could look around
    and aim the gun mid-air, seeing a red dot at the end of the virtual pointing device (but on-screen in the HMD), like a prop gun.

    update, added a zipfile to my original post.
  • "Suprcow" wrote:
    actually, I think what you're saying is pretty much what I was saying already... (are you saying this idea already exists? then please point me in the right direction)
    trackers on the prop gun and helmet for orientation... and using the prop gun as a mouse pointer
    relative to the virtual screen, of course... so depending on the orientation you could look around
    and aim the gun mid-air, seeing a red dot at the end of the virtual pointing device (but on-screen in the HMD), like a prop gun.

    update, added a zipfile to my original post.

    I don't understand this part. If you have the true orientation and position of your gun in VR, you don't have (or need) any "mouse pointer"; there is none, as the gun is part of the 3D scene. If you want to make it easier to know where you're pointing, you can render a virtual laser beam from the gun. There is no surface on which that "mouse pointer" would move, as you're immersed in the world.
  • Suprcow
    Honored Guest
    yes, we can render a laser from the gun (we should, that's the idea, to find a point on screen), but the complete laser need not even be rendered; it could also just be shown as a red dot (lighting up or down depending on simulated depth from collisions with 2D-overlay intersections) at a certain depth on a 2D plane inside the HMD in first-person shooters. If I aimed at a tree and then looked away to the left, the laser aim should be independent and still on the tree, as the virtual world is 360 degrees (no problem with a 2D overlay moving along steadily). It would be fun if menus could come up anywhere, actually: find the menu on a wall in the virtual world :) It doesn't matter whether it's the complete laser or just the dot for pointing at an in-game menu... that's why I said it could be used as a mouse pointer as well. The mouse pointer must be rendered virtually in the game engine, independent of which plane the real-world player is oriented toward. Define north and south, define and name all sides or planes in hardware, and define north and south in software so they can be synced; then the plane you aim at just acts as a 2D screen overlay in the virtual world all the time, with X,Y coordinates like a mouse has now, and orientation doesn't even matter this way, so we can even have different mouse speeds or accelerators. But you seem to understand what I mean now.
  • Ever heard of the Vive and Lighthouse?
    Though, your suggestion is not bad for mobile VR. I have experimented with a simple Bluetooth gyro mouse and the Durovis Dive Unity plugin, and I got pretty much the basic functionality you are describing. I will present it next week at our VR meetup, and maybe I will post more info here too.
  • Suprcow
    Honored Guest
    thanks, keep me posted, always interested in new developments :) mobile is a good option indeed; if you could take it with you it would be nice :) bring it to a friend's house :D