Pointable Canvas Event works with Quest Link, but not when built to headset.

Iseldiera
Protege

Hello fellow devs,

I am using the RayCanvasFlat prefab that comes with the Meta Interaction SDK samples in my project.

There is a wall object in an architecture scene, and I want to use the ray-and-pinch functionality to select this wall and fire an event that pops up a menu for interacting with it.

I took the RayCanvasFlat prefab, removed all text and buttons from it, duplicated it three times, and placed the copies on three sides of the exposed wall:

[screenshot: canvases placed around the wall in the Unity scene (screenshot unity.png)]

I added a pointable canvas Unity event wrapper and hooked my menu scaler into its Select event.
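For reference, the scaler is just a small MonoBehaviour whose public method is wired to the wrapper's Select UnityEvent in the inspector. The sketch below is a simplified, hypothetical version (the class, field, and method names are placeholders, not the exact script):

```csharp
using UnityEngine;

// Simplified sketch of a menu scaler called by the canvas Select event.
// Class, field, and method names here are placeholders.
public class MenuScaler : MonoBehaviour
{
    [SerializeField] private Transform menu;                 // the menu to pop up
    [SerializeField] private Vector3 openScale = Vector3.one; // scale when shown

    // Wire this to the wrapper's Select UnityEvent in the inspector.
    public void ShowMenu()
    {
        menu.gameObject.SetActive(true);
        menu.localScale = openScale;
    }

    public void HideMenu()
    {
        menu.localScale = Vector3.zero;
        menu.gameObject.SetActive(false);
    }
}
```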

Here are the PointableCanvasModule settings:

[screenshot: PointableCanvasModule settings (Screenshot 2024-08-21 051338.png)]

When I run the scene in the Unity editor and point and pinch at the canvases around the wall, the event is called successfully and the menu pops up. However, when I make a build and run it on Quest 3 untethered, pointing at the canvases still works (I see the cursor and the click effect), but the Unity event that scales up the menu does not fire.
Any ideas how I can debug or fix this? If there is also a way to use this ray interactor + pinch without a canvas, while still getting a cursor like the one shown on the canvas, I would be grateful for directions too.
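(For anyone hitting the same problem: one quick way to check whether the UnityEvent fires at all on the headset is to wire an extra logging handler to the same Select event and watch `adb logcat -s Unity` while pinching in the built app. This is just a generic debugging sketch; the class name is a placeholder.)

```csharp
using UnityEngine;

// Hypothetical helper: wire LogSelect to the same Select UnityEvent as the
// menu scaler, then watch the device log with:
//   adb logcat -s Unity
public class SelectEventLogger : MonoBehaviour
{
    public void LogSelect()
    {
        // Appears in logcat under the "Unity" tag when the event actually fires.
        Debug.Log($"[SelectEventLogger] Select fired on {gameObject.name} at {Time.time:F2}s");
    }
}
```

If the log line shows up but the menu still doesn't appear, the event is firing on device and the problem is in the scaler or the menu object itself (for example, it is inactive or scaled to zero). If the line never appears, the event wiring behaves differently in the build than in the editor.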

Thanks in advance!

1 REPLY

jiea
Explorer

I had the same issues and tried many things. This video helped me, and the result is somewhat consistent.
However, I still have random problems where buttons sometimes won't register clicks in the editor or in the built application, but it mostly works.
This Meta SDK is borderline functional.