Simple UI interaction using the new all-in-one SDK [SOLVED]

MetaDevXR
Explorer
I am new to Quest development and have no prior experience with the now-deprecated SDKs.  I am rapidly prototyping a simple training concept demo using Meta's new all-in-one SDK.  Overall, it seems pretty good.  I was able to get passthrough, teleporting, and player movement working pretty easily, and I'm happy with the results.  But I still have not been able to get a simple left-wrist floating UI menu button highlight/press interaction working with the controller and pointer/lasers in world space, similar to how standard UI canvas buttons work with a mouse in flat-screen apps.

All the building block examples seem to demo only hand interactions.  I've pulled in the Unity samples from Git and can test those via Link, but I cannot seem to reproduce them in my own custom menu tests due to confusion between the old methods and the changes in the new SDK.  From researching and comparing the older docs and tutorials, it seems Meta renamed a lot of things and rebuilt some of the settings and the way the UI interaction scripts and prefabs work together?  I am fine with that, assuming they've improved things, but can anyone point me to updated docs, samples, or tutorials that explain how to create a simple world-space UI menu button interaction using the new SDK?
2 REPLIES

MetaDevXR
Explorer

Welp, turns out it's a lot easier than I thought it would be to make this work.  I think I overcomplicated the script assignments on the EventSystem and Canvas objects while testing out several different examples from older YT tutorials and other places.  My canvas script setup became messy and overcomplicated.  Even though I had no errors in the console, I suspect there were conflicts somewhere or incorrect script references assigned.  I finally found this page that shows how to simply add ray interactions to the canvas via Meta's handy right-click menu tool: [Right-click] -> [Interaction SDK] -> [Add Ray Interaction To Canvas].

Add an Interaction with QuickActions | Oculus Developers

So, I created an empty scene to see what would be added to the canvas GameObject for comparison.  The structure and script assignments were much simpler than what I had.  When I went back to my demo scene, created a new canvas, and used this tool, I could finally get a hover/press response from the button.  Thankfully all the controller ray interactors were already set up correctly, so I didn't have to clean anything up in that department.
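For anyone else comparing a hand-built canvas against what the QuickActions tool produces, a quick sanity check is to verify at runtime that the Interaction SDK wiring is actually present.  This is a hedged sketch, not official Meta code: it assumes the `Oculus.Interaction` namespace and the `PointableCanvas`, `RayInteractable`, and `PointableCanvasModule` component names from the Interaction SDK, and the script/GameObject names here are my own.

```csharp
using UnityEngine;
using Oculus.Interaction; // Interaction SDK namespace (assumption: package is installed)

// Attach to the Canvas GameObject after running
// [Interaction SDK] -> [Add Ray Interaction To Canvas] and check the Console.
public class CanvasInteractionCheck : MonoBehaviour
{
    void Start()
    {
        // The QuickActions tool is expected to add these to the canvas hierarchy.
        var pointableCanvas = GetComponentInChildren<PointableCanvas>(true);
        var rayInteractable = GetComponentInChildren<RayInteractable>(true);

        // The EventSystem should use the SDK's module instead of the
        // standard StandaloneInputModule.
        var canvasModule = FindObjectOfType<PointableCanvasModule>();

        Debug.Log($"PointableCanvas found: {pointableCanvas != null}");
        Debug.Log($"RayInteractable found: {rayInteractable != null}");
        Debug.Log($"PointableCanvasModule on EventSystem: {canvasModule != null}");
    }
}
```

If any of these log `False`, that component is the likely source of a dead button, and re-running the QuickActions tool on a fresh canvas (as described above) is the easiest fix.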

Big_Flex
Expert Protege

I'm glad you figured this out! You mentioned having a UI menu on the left wrist. If you need help getting the wrist position, you can check the left wrist bone position with the Get Hand Bone Position tutorial.
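As a rough illustration of that tutorial's idea, here is a minimal sketch of anchoring a menu to the left wrist using `OVRSkeleton`.  The component name, offset values, and serialized field names are my own assumptions; it assumes an `OVRSkeleton` is set up on the left `OVRHand` and that `OVRSkeleton.BoneId.Hand_WristRoot` identifies the wrist bone, so treat it as a starting point rather than a drop-in solution.

```csharp
using UnityEngine;

// Hypothetical helper: keeps a world-space menu attached to the left wrist.
// Assign the left hand's OVRSkeleton and the menu root in the Inspector.
public class WristMenuAnchor : MonoBehaviour
{
    [SerializeField] private OVRSkeleton leftHandSkeleton;
    [SerializeField] private Transform menuRoot;
    [SerializeField] private Vector3 localOffset = new Vector3(0f, 0.05f, 0f); // assumed offset

    void LateUpdate()
    {
        // Skip while hand tracking data is not yet valid.
        if (leftHandSkeleton == null || !leftHandSkeleton.IsDataValid)
            return;

        foreach (var bone in leftHandSkeleton.Bones)
        {
            if (bone.Id == OVRSkeleton.BoneId.Hand_WristRoot)
            {
                // Follow the wrist bone, offset slightly so the menu
                // floats above the back of the hand.
                menuRoot.position = bone.Transform.position
                                    + bone.Transform.rotation * localOffset;
                menuRoot.rotation = bone.Transform.rotation;
                break;
            }
        }
    }
}
```

With controllers instead of hand tracking, a simpler alternative is to parent the menu to the left controller anchor transform in the camera rig rather than reading skeleton bones.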