Meta Quest Unity Real Hands Building Block not showing real hands
Hi all! I'm somewhat new to VR development, especially mixed reality. I'm trying to use Meta's Real Hands building block, but I can't get it to work. I have a very basic scene with some of the fundamental building blocks (camera rig, passthrough, passthrough camera access, interaction rig), along with the Real Hands building block and a single cube. When I build the project to my Quest (Meta Quest 3) and move my hands in front of the cube, I can only see the virtual hands - the occlusion does not reveal my real hands (i.e. the scene behaves exactly as it did before I added the Real Hands building block). Why is this, and how can I fix it?

Unity version: 6000.3.4f1
Meta Quest packages: Meta XR Core SDK (85.0.0), Meta MR Utility Kit (85.0.0), Meta XR Interaction SDK (85.0.0)

Steps to replicate:
1. Create a new empty scene.
2. Add the following building blocks: Camera Rig, Passthrough, Passthrough Camera Access, Interactions Rig, Real Hands.
3. Add a cube at (0, 0, 3).
4. Build the project and deploy to the Quest.
5. Wave your hands in front of the cube - only the virtual hands are visible, not the real hands.
Is there a way to make the player not see their own avatar?

Hello, I am building a Unity project with the Meta All-In-One SDK, using the Networked Avatar building block on top of Matchmaking, Hand Tracking, and Passthrough to create an experience where users can see other avatars, with their hand movements, against the real world in passthrough. This creates an effect where you can watch people walk around, talk, and interact with things in a room they are not physically in. My issue is that while the host can see the other players' avatars, and the other players can see the host's avatar, when the host looks down they also see their own avatar's arms connected to their hands and body, which I don't want. Is there a way for a player to see only their normal hand prefabs/meshes from their own perspective, while other connected players see their full avatar, and vice versa?
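One common Unity pattern for this kind of "hide my own avatar from me, but not from others" problem is layer-based culling: move the locally owned avatar's renderers onto a dedicated layer and remove that layer from the local camera's culling mask. A minimal sketch, assuming a user-defined layer named "LocalAvatar" exists in the project, and that `isLocalPlayer` is wired up from whatever networking layer is in use:

```csharp
using UnityEngine;

// Sketch: hide the locally-owned avatar from the local camera only.
// "LocalAvatar" layer and the isLocalPlayer flag are assumptions -
// wire them to your own project setup and networking code.
public class HideLocalAvatar : MonoBehaviour
{
    [SerializeField] private Camera localCamera;  // the player's head camera
    [SerializeField] private bool isLocalPlayer;  // set from your networking layer

    void Start()
    {
        if (!isLocalPlayer) return;

        int layer = LayerMask.NameToLayer("LocalAvatar");

        // Move every renderer of this avatar onto the dedicated layer...
        foreach (var r in GetComponentsInChildren<Renderer>(true))
            r.gameObject.layer = layer;

        // ...and stop the local camera from drawing that layer.
        // Remote clients never take this branch, so they still see the full avatar.
        localCamera.cullingMask &= ~(1 << layer);
    }
}
```

The hand prefabs/meshes stay on their normal layer, so the local player keeps seeing those while remote players see the networked avatar.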
Make GameObjects translucent to the passthrough?

For debugging, and for a few specific bits (I want to make a Leia-style hologram in my scene), I'd love to have GameObjects be translucent (alpha is fine) against the passthrough image. If I just set alpha, the GameObjects are transparent to each other, but they still block all of the passthrough. So: is there a way to make a hologram in your living room that works? (Solved)
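On Quest, passthrough is typically composited as an underlay behind the app's eye buffer, and Meta's passthrough samples make the real world visible through geometry by controlling the alpha the app writes into the eye buffer rather than by ordinary transparency. A built-in-pipeline ShaderLab sketch of that idea follows - the shader name, tint property, and exact blend setup are assumptions and may need adjusting per render pipeline and SDK version:

```shaderlab
// Sketch: let passthrough (composited as an underlay) show through a mesh.
// Where this mesh writes alpha < 1 into the eye buffer, the compositor
// can reveal the passthrough feed behind it. Values are illustrative.
Shader "Unlit/PassthroughTranslucent"
{
    Properties
    {
        _Color ("Tint (alpha = opacity vs passthrough)", Color) = (1,1,1,0.4)
    }
    SubShader
    {
        Tags { "Queue"="Transparent" "RenderType"="Transparent" }
        Pass
        {
            ZWrite Off
            // Blend colour normally, but replace destination alpha with this
            // material's alpha so the compositor can blend in passthrough.
            Blend SrcAlpha OneMinusSrcAlpha, One Zero

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            fixed4 _Color;

            struct v2f { float4 pos : SV_POSITION; };

            v2f vert (appdata_base v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                return _Color; // alpha < 1 => passthrough bleeds through
            }
            ENDCG
        }
    }
}
```

The key point is the separate alpha blend factors: plain alpha blending only affects how virtual objects mix with each other, while the underlay compositor looks at what ends up in the eye buffer's alpha channel.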
Passthrough randomly disables during runtime

UE 5.5.4, Meta XR plugin v78. I have an APK that I sideload onto several headsets. The app works fine while connected to my laptop for development - never an issue - and the packaged APK works fine on the headsets during testing. But then I'll launch the app and hand out 10+ headsets to customers, they walk around for a while, and eventually some of the headsets lose passthrough. The app is still running and tracking: you can see the virtual objects, but they sit on a field of black and you can't see any of the surroundings. Double-tapping the side of the headset simply pauses the app and takes me to the menu, and I can't get passthrough working again once this happens. I added Blueprint code that checks every 10 seconds whether the passthrough object is null and, if so, initializes another passthrough, but that didn't help. I've fired up 10 headsets, let them run, and walked around the room holding several of them, grabbing them all over (in case the customers are pressing some button), but I can't get the passthrough to stop working - hence I don't have a log file to attach. I'm at a loss as to what could be happening and how to recreate the issue. Any help or suggestions would be appreciated.
Why does an MRUK Room sometimes get loaded at an angle?

I'm working on an AR app in Unity which makes use of the global mesh of the room I'm currently in. It seems that on rare occasions the whole room gets created/loaded at an angle - not a rotation around the up axis, but an actual tilt. I can easily detect that this happens, but from there on out I seem to be stuck: from what I can tell there is no API to say "hey, please attempt to reload the rooms". I guess I could reload the Unity scene, triggering a reload of the MRUK prefab, and see if that fixes things, but that's not ideal. So my question is twofold: is this a known bug/issue, and when I detect it happening, how can I cause a specific room to get reloaded?

Unity version: 6000.0.56f1
Current Meta packages: 81.0.1 (although the behaviour dates from well before that)
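Recent MRUK versions do expose a programmatic reload path, so a watchdog along the following lines may work without reloading the whole Unity scene. This is a sketch under assumptions: `MRUK.Instance.GetCurrentRoom()` and the async `LoadSceneFromDevice()` exist in the SDK version in use (check your package's API surface), and the 2-degree tilt threshold is arbitrary:

```csharp
using UnityEngine;
using Meta.XR.MRUtilityKit;

// Sketch: detect a tilted room and ask MRUK to re-load scene data.
// API names assume a recent MRUK release; the threshold is illustrative.
public class RoomTiltWatchdog : MonoBehaviour
{
    const float MaxTiltDegrees = 2f;

    public async void CheckAndReload()
    {
        var room = MRUK.Instance.GetCurrentRoom();
        if (room == null) return;

        // A properly aligned room's up vector should match world up;
        // any significant angle between them is the tilt described above.
        float tilt = Vector3.Angle(room.transform.up, Vector3.up);
        if (tilt > MaxTiltDegrees)
        {
            Debug.LogWarning($"Room tilted {tilt:F1} deg - reloading scene data");
            await MRUK.Instance.LoadSceneFromDevice();
        }
    }
}
```

If the reload returns the same tilted anchors, the bad data likely comes from the OS-level scene capture itself, which would point at a platform bug rather than an app-side one.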
Stringscape: Turning Hand Distance into Pitch

I'm currently building a Quest experience called Stringscape, and I wanted to share the core idea and get feedback from other developers here. The concept is simple: you stretch a glowing "string" between your two hands, and the world-space distance between them controls the pitch - closer hands produce a higher pitch, hands farther apart a lower one. The experience is designed to be more of a creative playground than a structured music tool. I'd love to hear your thoughts. It's currently in Early Access on Quest as well, if anyone is curious to try it. Thanks!
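The distance-to-pitch mapping described above can be sketched in a few lines of Unity C#. The hand transforms would come from the hand-tracking rig; the distance range, the `AudioSource.pitch` target, and the one-octave span are all illustrative choices, not details from the actual app:

```csharp
using UnityEngine;

// Sketch of the mapping above: closer hands => higher pitch.
// Ranges and the +/- one octave span are illustrative assumptions.
public class StringPitch : MonoBehaviour
{
    [SerializeField] Transform leftHand;
    [SerializeField] Transform rightHand;
    [SerializeField] AudioSource voice;

    [SerializeField] float minDistance = 0.05f; // metres
    [SerializeField] float maxDistance = 1.2f;

    void Update()
    {
        float d = Vector3.Distance(leftHand.position, rightHand.position);
        float t = Mathf.InverseLerp(minDistance, maxDistance, d); // 0..1

        // Invert so small distance = high pitch. An exponential curve tends
        // to feel more musical than linear, since pitch perception is
        // roughly logarithmic: this maps the range to +/- one octave.
        voice.pitch = Mathf.Pow(2f, Mathf.Lerp(1f, -1f, t));
    }
}
```

Quantizing `t` to scale degrees before the `Pow` would be one way to trade the free-form playground feel for something more melodic.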
Is there a flag for World Locked vs Free?
I have MRUK world-locking seemingly everything by default... but not quite. Evidently zombies aren't world-locked, and neither is my DebugMenu, but my main menu is world-locked and I'd like it to reposition itself in front of me. How can you tell MRUK "free this"? And how do I know what is "in front of me, near eye level", so that when I spawn an orb it's findable? (Hunting the real world for virtual items is... odd.)
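The "in front of me, near eye level" part can be computed from the head transform alone (the centre-eye anchor on a Quest camera rig). A minimal sketch - the 1.5 m distance is an arbitrary choice:

```csharp
using UnityEngine;

// Sketch: compute a "findable" spawn pose in front of the user, near
// eye level, from the head transform (e.g. the centre-eye anchor).
public static class SpawnPlacement
{
    public static Pose InFrontOfUser(Transform head, float distance = 1.5f)
    {
        // Flatten forward onto the horizontal plane so the spawn point
        // doesn't fly up or down when the user looks at ceiling or floor.
        Vector3 flatForward =
            Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        if (flatForward == Vector3.zero) flatForward = Vector3.forward;

        // Head position already encodes eye level, so only push forward.
        Vector3 pos = head.position + flatForward * distance;

        // Face the spawned object back toward the user.
        Quaternion rot = Quaternion.LookRotation(-flatForward, Vector3.up);
        return new Pose(pos, rot);
    }
}
```

For the world-lock question itself, the general rule is that anything parented under (or anchored to) the tracking/anchor hierarchy stays world-locked, while objects you reposition relative to the camera each frame behave as "free"; which component enforces that depends on the rig setup.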
Accessibility Feature Request: Conversation Focus Mode for Ray-Ban Meta Display Glasses

Hi everyone! I'm a Ray-Ban Meta display glasses user who is hard of hearing and wears hearing aids daily. I'd love to see a conversation focus mode added that prioritizes voices directly in front of the wearer and reduces background noise. In busy environments, this would make a big difference for hearing-aid users and others who rely on clearer speech in real time. If this type of accessibility feature is ever developed, I would absolutely love to have it added to my glasses, and I'd be happy to provide feedback or participate in any beta or user-testing opportunities. I've also submitted this through support channels, but wanted to share it here in case the team is gathering feedback.
3D Raycast from hands... where is it coming from?

I have a Unity mixed-reality project and added the hands and Real Hands building blocks, which is cool. I tossed in a zombie so I can try to get him to follow me, or shoot him (finger-gun special). Now I want to fix up the raycast from the hands/controllers so I can interact with GameObjects from afar, but I'm not even sure where that ray is coming from, so I can't find the script. My change would be to have the ray extend 50 m and return a bunch of hits, each with a target class:

1. GameObject: yellow (put an explosion effect on that zombie - if the ray is hitting its mesh).
2. Interactable: a blue disc appears (press trigger to activate the object).
3. Something from the 3D depth (depth raycast): an orange disc appears (put a bullet hole on that).
4. A scene object (floor/wall): a green "grounded" disc appears. Note that this may not be the final terminus - there may be more models outside the window or wall, or maybe you're picking something on the other side of a table; code has to decide whether you can shoot through it.

All discs, of course, flat against the object and usable in code (you might be able to fire a laser through the window, but it won't go through a wall; code will see if that works). But I don't know where to look. The ray from the hands already does #3, but I don't know where in the asset tree it's coming from - finding it would probably also tell me how to make those discs (are they Gizmos, or GameObjects?). I figure I can add #1 and #2 myself (from the cubes, though I haven't quite figured them out yet either) and #4 via EnvironmentalRayCast [ERC], though I might have to iterate on that one because I don't see a "give me all the hits" option in the ERC.

Questions:
a) Where is this 3D ray coming from in the asset tree, so I can learn from it?
b) Is there a good way to scale the discs so they're always ~20 px in diameter no matter how far away they are?
c) It looks like I need to change the shader on my zombie, but I'm not getting the terminology. It occludes fine (eventually I want the real-world table to occlude it), but I need to say "oh, the user picked the bellybutton - spawn an explosion effect in his gut". And how do you change shaders anyway? I can change materials from the editor, but...? (Solved)
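Question (b) has a standard answer: apparent size on screen is roughly proportional to world size divided by distance from the eye, so scaling the disc linearly with its distance from the camera keeps its on-screen size constant. A minimal sketch - the base size would be tuned by eye to hit the ~20 px target:

```csharp
using UnityEngine;

// Sketch for (b): keep a hit-marker disc at a roughly constant apparent
// size by scaling it linearly with distance from the camera. Since
// apparent size ~ worldSize / distance, this cancels the shrink.
public class ConstantScreenSize : MonoBehaviour
{
    [SerializeField] Transform head;             // centre-eye camera transform
    [SerializeField] float sizeAtOneMetre = 0.02f; // tune to taste (~20 px)

    void LateUpdate()
    {
        float d = Vector3.Distance(head.position, transform.position);
        transform.localScale = Vector3.one * (sizeAtOneMetre * d);
    }
}
```

For (c), swapping a shader at runtime in Unity is generally done through the renderer's material instance, e.g. `GetComponent<Renderer>().material.shader = Shader.Find("Shader/Name");` - note that touching `.material` (rather than `.sharedMaterial`) clones the material for that renderer, which is usually what you want for a per-zombie effect.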