11-24-2023 04:33 AM - edited 11-24-2023 05:25 AM
I am building a VR game using the new Meta XR SDK (previously Oculus Integration). I want the user to be able to press an object (using ray interaction) and get teleported to that location with a given rotation. Then I want to disable all forms of movement, whether with controllers or hand tracking - the only movement possible should be physically walking around. I also want the player to be locked to the same y-position (and not fall due to gravity). To re-enable controller movement and hand teleportation, I want the player to ray interact with another object. Any idea how I can do this?
02-01-2024 04:58 PM
Hi @Haakonf,
I'll suggest some ideas that could work for your scenario. They'll require you to build on top of some existing features, but it'll still save you some time.
To press an object and then teleport to it with a given rotation, follow the Create Locomotion Interactions tutorial, and use the object you want to press as the Locomotion Interactable. To more closely resemble the ray interactor's raycast, you should be able to customize the teleport arc so it has a flatter trajectory; you can find reference info about the arc in the Locomotion Interactions doc.
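If you only need the core "snap the rig to a pose" behavior, the underlying transform change is simple. Here's a minimal sketch you could wire to a select event - note that the field names and the event wiring are assumptions for illustration, not SDK APIs, and the SDK's teleport interactables already handle this for you:

```csharp
using UnityEngine;

// Hypothetical sketch: snap the player rig to a destination pose when an
// object is selected. Wire Teleport() to the interactable's select event.
public class TeleportOnSelect : MonoBehaviour
{
    [SerializeField] private Transform playerRig;   // e.g. the OVRCameraRig root (assumption)
    [SerializeField] private Transform destination; // target position + facing direction

    // Call this from the interactable's select event in the Inspector.
    public void Teleport()
    {
        // Applies both the position and the given rotation in one step.
        playerRig.SetPositionAndRotation(destination.position, destination.rotation);
    }
}
```

In practice, prefer the SDK's TeleportInteractable so you also get arc rendering, hover reticles, and blocked-surface handling for free; the sketch above is just the end result of a teleport.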
However, if by "press an object" you mean gesturing like you're physically pressing a button, then it's a bit trickier. In that case, I'd still use the locomotion tutorial above to make the object a teleport location, then use the Create Ghost Hand Reticle tutorial to draw a ghost hand on the object whenever you're hovering or selecting. You'd also need to add a poke interaction so you have the ability to poke. Then you'd need to make that poke interaction the selection mechanism for the teleport, replacing the default arc. To do that, I think you can pass the poke interactor to the TeleportInteractor's Selector field, but I haven't tried it myself.
As for disabling hand tracking and controllers until the player ray interacts with another object, I don't think that's the best course of action, since you'd need to keep tracking the hands in order to register that ray interaction in the first place. Instead, you could visually freeze the hands by forcing the HandVisual prefabs to render in one spot, while the SyntheticHands (which track the location of your physical hands) keep tracking and responding to your input, like initiating a ray.
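The "visually freeze, keep tracking" idea could look something like this. This is a hedged sketch, not SDK code: `handVisualRoot` stands in for whatever transform drives your HandVisual prefab's rendered pose, and the exact wiring depends on your rig setup:

```csharp
using UnityEngine;

// Hypothetical sketch: pin a hand's visual root to a fixed pose while the
// underlying tracked hand data keeps updating and can still drive input.
public class FreezeHandVisual : MonoBehaviour
{
    [SerializeField] private Transform handVisualRoot; // assumed: the rendered hand's root transform

    private Pose frozenPose;
    private bool frozen;

    // Capture the current pose and start pinning the visual to it.
    public void Freeze()
    {
        frozenPose = new Pose(handVisualRoot.position, handVisualRoot.rotation);
        frozen = true;
    }

    public void Unfreeze() => frozen = false;

    // LateUpdate runs after the tracking system has posed the hand,
    // so we overwrite the rendered pose last.
    private void LateUpdate()
    {
        if (!frozen) return;
        handVisualRoot.SetPositionAndRotation(frozenPose.position, frozenPose.rotation);
    }
}
```

You'd call Freeze() when locking movement and Unfreeze() when the player ray interacts with the unlock object; meanwhile the interactors attached to the tracked (synthetic) hands remain active.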
To prevent the player from falling due to gravity, it may be possible to get the player's body position data via the Movement SDK, but I've never worked with the Movement SDK or attempted to get body position data, so I'm only guessing.
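A simpler alternative that doesn't involve body tracking at all: clamp the rig's world-space y each frame. This is a minimal sketch under the assumption that your rig root is what gravity (or a CharacterController) moves - component and method names here are my own, not SDK APIs:

```csharp
using UnityEngine;

// Hypothetical sketch: lock the camera rig to a fixed y level so the player
// neither falls nor rises while locomotion is disabled. Attach to the rig root.
public class YPositionLock : MonoBehaviour
{
    private float lockedY;
    private bool locked;

    // Capture the current height and start enforcing it.
    public void Lock()
    {
        lockedY = transform.position.y;
        locked = true;
    }

    public void Unlock() => locked = false;

    // LateUpdate runs after physics/locomotion have moved the rig this frame,
    // so the clamp wins over any gravity-driven displacement.
    private void LateUpdate()
    {
        if (!locked) return;
        Vector3 p = transform.position;
        transform.position = new Vector3(p.x, lockedY, p.z);
    }
}
```

If your rig uses a Rigidbody or CharacterController for gravity, it's cleaner to disable gravity on that component (e.g. set `Rigidbody.useGravity = false`) while movement is locked, rather than fighting it every frame.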
Hopefully this info helps get you started!
02-02-2024 09:43 AM
I misspoke about SyntheticHands: those are what you would use with HandVisual to change where the hands are rendered.