Forum Discussion
taciejones
5 months ago · Explorer
Scene Change interaction issues
I'm very new to this world and the learning curve is real. I've been teaching myself Unity all summer for an art exhibit in two weeks, where I plan to show this VR project. It's close, but I can't seem to get the controllers to interact with the UI Buttons and change scenes. I have four UI canvases (each with a UI Button child) that will connect to the four different scenes.
I started with Unity's XR Interaction Toolkit, then switched to Meta's Interaction SDK using Building Blocks, hoping it would simplify the process. I've read so many different ways to get the ray to change scenes, but so far none of them is working. At this point I think I have too many components on my GameObjects, so I'm not sure how to troubleshoot.
On the one UI Canvas I'm testing, I have a Graphic Raycaster, a Tracked Device Graphic Raycaster, and a Pointable Canvas. On the UI Button I have a Button component with OnClick wired to the scene via a SceneLoader script (which was working with a mouse click, but now isn't), a Pointable Unity Event Wrapper with When Release set to the scene it should go to, and a Ray Interactable. On the Event System I have an XR UI Input Module and a Pointable Canvas Module.
I'm also not sure whether I'm missing something in the Controller Tracking Settings. I also added an XR Ray Interactor to the Building Block GameObjects for the left and right controllers.
At this point I'd be happy to start from scratch on the UI scene with new UI Canvas GameObjects if it means getting this to work, but first I need to understand the most streamlined process to follow.
I'd be very grateful for guidance. Can anyone help?
9 Replies
Replies have been turned off for this discussion
- RiverExplorer · Start Partner
You are not alone. There are many ways to get UI buttons to work, and they are not compatible with each other. It is a pain and badly documented.
There does not seem to be a quick fix for it. I made a simple component I'll call MyRayInteractable.
Cast a ray out from the controller(s). If it hits something, you get the object it hit.
Then I do:
MyRayInteractable WhoIHit = TheHitObject.GetComponent<MyRayInteractable>();
If it comes back non-null, I call its WhoIHit.OnHover(...) (which you implement in the MyRayInteractable component). And if the trigger is pressed at the same time, I call its WhoIHit.OnClick(...).
Make sure the raycast has a layer mask that matches the UI objects (for example, the ray hits only the UI layer, and the buttons you care about are on the UI layer).
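A minimal sketch of that controller-ray flow (MyRayInteractable and its methods are the component described above; the controller transform field and the "Fire1" trigger binding are assumptions — on an actual Quest controller you'd read the trigger through OVRInput or the Input System):

```csharp
using UnityEngine;

// Hypothetical interactable component, per the description above.
public class MyRayInteractable : MonoBehaviour
{
    public bool IsGrabbable;
    public bool IsRayInteractive;

    public virtual void OnHover(RaycastHit hit) { /* e.g. highlight */ }
    public virtual void OnClick(RaycastHit hit) { /* e.g. press the button */ }
}

// Hypothetical interactor: add to the player's controller object.
public class MyInteractor : MonoBehaviour
{
    public Transform RayOrigin;   // the controller's transform
    public LayerMask UIMask;      // set to the UI layer in the Inspector
    public float MaxDistance = 10f;

    void Update()
    {
        // Cast a ray forward from the controller; only hit the UI layer.
        if (Physics.Raycast(RayOrigin.position, RayOrigin.forward,
                            out RaycastHit hit, MaxDistance, UIMask))
        {
            var whoIHit = hit.collider.GetComponent<MyRayInteractable>();
            if (whoIHit != null)
            {
                whoIHit.OnHover(hit);
                if (Input.GetButtonDown("Fire1")) // assumed trigger mapping
                    whoIHit.OnClick(hit);
            }
        }
    }
}
```

Note that Physics.Raycast only hits objects with colliders, which is why the collider tip later in this thread matters.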
As for the mouse: on mouse click you get a screen position. Convert that click position to a world-space ray (Camera.ScreenPointToRay does this) and cast from there. If it hits something, do the same as with the controller.
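The mouse side might look like this (MyRayInteractable is the hypothetical component named earlier in this reply; the layer mask and distance are assumptions):

```csharp
using UnityEngine;

// Hypothetical mouse interactor: add to a GameObject in the scene.
public class MyMouseInteractor : MonoBehaviour
{
    public LayerMask UIMask;
    public float MaxDistance = 10f;

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            // Convert the screen-space click into a world-space ray.
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            if (Physics.Raycast(ray, out RaycastHit hit, MaxDistance, UIMask))
            {
                var whoIHit = hit.collider.GetComponent<MyRayInteractable>();
                if (whoIHit != null)
                    whoIHit.OnClick(hit);
            }
        }
    }
}
```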
It is simple, and it also lets you define the parameters of the OnHover(), OnClick(), OnUnHover(), ... methods to suit your own needs, because you write them in your MyRayInteractable component.
My MyRayInteractable component has two bools, IsGrabbable and IsRayInteractive.
Then when you get the WhoIHit object, you can check those to see whether you can grab it, or do button I/O, as you want.
MUCH simpler than the overly complex XR or Meta methods. (Or perhaps they are simple and the docs suck)
I also wrote a MyInteractor component that does the controller or mouse side, so now I can just add that component to the player's mouse/controllers, and add MyRayInteractable to each object you care about.
Add a LineRenderer to the MyInteractor component to draw a line from your raycast origin along the same vector you used to cast the ray. You may have to adjust the origin and direction of the ray/line. One for each controller. I only show the line if the ray hits something I care about. I think I set the line width to 0.01 and changed the color of the line.
You need a raycast and LineRenderer for each controller, but no LineRenderer for the mouse, as the cursor is already over the point you care about.
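The pointer line could be sketched like this (the 0.01 width and the show-only-on-hit behavior follow the description above; the layer mask, distance, and component layout are assumptions):

```csharp
using UnityEngine;

// Hypothetical pointer line: one per controller, alongside the interactor.
[RequireComponent(typeof(LineRenderer))]
public class RayLine : MonoBehaviour
{
    public Transform RayOrigin;   // the controller's transform
    public LayerMask UIMask;
    public float MaxDistance = 10f;

    LineRenderer line;

    void Start()
    {
        line = GetComponent<LineRenderer>();
        line.startWidth = 0.01f;  // thin pointer line, per the post
        line.endWidth = 0.01f;
        line.positionCount = 2;
        line.enabled = false;
    }

    void Update()
    {
        // Only show the line when the ray hits something we care about.
        if (Physics.Raycast(RayOrigin.position, RayOrigin.forward,
                            out RaycastHit hit, MaxDistance, UIMask))
        {
            line.enabled = true;
            line.SetPosition(0, RayOrigin.position);
            line.SetPosition(1, hit.point);
        }
        else
        {
            line.enabled = false;
        }
    }
}
```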
For extra fun, you can also cast a ray from the center-eye position forward and do some head/eye control. I do that with pop-up menus. The user sees a ray that I start from the mouth position and end at the raycast hit position on menu objects (buttons). The OnHover() method highlights their selection. Then they can just click without the controller having to point at them. Your MyRayInteractable OnHover helps with that.
If you are using the Meta Avatars on the Quest Pro, you can use the eye position and direction for the ray.
If you still are confused or lost feel free to ask more. I am retired, a bit bored, and having fun.
- taciejones · Explorer
Thank you! I appreciate all your help! I agree it is fun, and getting less and less confusing each day. I'm working on all this and advice on the post below.
- RiverExplorer · Start Partner
I sent you in a PM my Interactor and Interactable. They will not just pop in and work. As they are tied to other stuff I have. But they should give you ideas.
- RiverExplorer · Start Partner
Oh, and in my implementation, I added a collider to each object I want to hit, with the Is Trigger checkbox on, and I only care if I hit trigger objects.
- taciejones · Explorer
This might be what's keeping it from working in the build. Trying next.
- taciejones · Explorer
Starting fresh and adding a mesh collider to the canvas did it! Thank you!
- jkeogh1413 · Meta Employee
I'd recommend starting with a clean scene and following these tutorials:
- Add a rig, which will give you the ability to do raycast interaction, among other things: https://developers.meta.com/horizon/documentation/unity/unity-isdk-add-comprehensive-interaction-rig
- Create a canvas with a UI button, and add raycast interaction to that canvas: https://developers.meta.com/horizon/documentation/unity/unity-isdk-create-raycast-ui
Then you should only need to wire up your button to your script that changes scenes.
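The original poster's SceneLoader script wasn't shared, but a scene-changing script wired to a Button's OnClick event could be as minimal as this sketch (the class and method names are assumptions; the scene must be registered in Build Settings):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical SceneLoader: attach to a GameObject, then in the Button's
// OnClick event select LoadSceneByName and type the target scene's name.
public class SceneLoader : MonoBehaviour
{
    public void LoadSceneByName(string sceneName)
    {
        // The scene must be listed under File > Build Settings > Scenes In Build,
        // or LoadScene will fail at runtime.
        SceneManager.LoadScene(sceneName);
    }
}
```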
- taciejones · Explorer
Thank you! I got the scene to change in a new test Canvas! It works in the Meta XR Simulator; now I'm trying to figure out why the interaction isn't translating to the build and the headset. I'm going to try adding a collider, but I think it might be in the build settings or controller settings. Do you have any thoughts on that?
- taciejones · Explorer
Thank you so much!