Meta Quest Unity Real Hands Building Block not showing real hands
Hi all! I'm somewhat new to VR development, especially in mixed reality. I'm trying to use Meta's Real Hands building block, but I can't seem to get it to work. I have a very basic scene with some of the fundamental building blocks (camera rig, passthrough, passthrough camera access, interaction rig), along with the Real Hands building block and a single cube. When I build the project to my Quest (Meta Quest 3) and move my hands in front of the cube, I can only see the virtual hands - the occlusion does not show my real hands (i.e. it works the same as it did before I added the Real Hands building block). Why is this, and how can I fix it?

Unity Version: 6000.3.4f1
Meta Quest Packages: Meta XR Core SDK (85.0.0), Meta MR Utility Kit (85.0.0), Meta XR Interaction SDK (85.0.0)

Steps to Replicate:
1. Create a new empty scene.
2. Add the following building blocks: Camera Rig, Passthrough, Passthrough Camera Access, Interactions Rig, Real Hands.
3. Add a cube at (0, 0, 3).
4. Build the project and deploy to the Quest.
5. Wave your hands in front of the cube - only virtual hands are visible, not real hands.
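A minimal diagnostic sketch, assuming the Real Hands block is driven by the Depth API (the EnvironmentDepthManager in Meta XR Core SDK): log whether depth occlusion can run at all on the device and whether the manager is present and configured. Note the Depth API only runs on Quest 3/3S and is not supported over Quest Link; property names are from my reading of the Depth API docs, so verify against your SDK version.

```csharp
// Untested sketch - assumes Meta XR Core SDK with the Depth API
// (Meta.XR.EnvironmentDepth), which backs hand/scene occlusion.
using Meta.XR.EnvironmentDepth;
using UnityEngine;

public class DepthOcclusionCheck : MonoBehaviour
{
    private void Start()
    {
        // False on Quest 2/Pro and in Quest Link - occlusion can't work there.
        Debug.Log($"Environment depth supported: {EnvironmentDepthManager.IsSupported}");

        var manager = FindAnyObjectByType<EnvironmentDepthManager>();
        if (manager == null)
        {
            Debug.LogWarning("No EnvironmentDepthManager in the scene - occlusion cannot run.");
            return;
        }
        Debug.Log($"Manager enabled: {manager.enabled}, " +
                  $"occlusion mode: {manager.OcclusionShadersMode}, " +
                  $"remove hands: {manager.RemoveHands}");
    }
}
```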
Make GameObjects translucent to the Passthru?

For debugging and for a few specific bits (I want to make a Leia hologram in my scene), I'd love to have the GameObjects be translucent (alpha is fine) to the passthru images. If I "just set alpha," the GameObjects are transparent to each other... but they still block all of the passthru... So... is there a way to make a hologram in your living room that works?
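A hedged sketch of one common approach: with passthrough rendered as an underlay, the compositor blends the eye buffer over the camera feed using the alpha the shader writes, so a material that outputs alpha < 1 should let the passthru bleed through. The property names below are URP/Lit's; whether a runtime switch like this suffices depends on your pipeline setup, and the camera must also clear to a color with alpha 0.

```csharp
// Sketch only - converts a URP/Lit material to alpha-blended transparency
// at runtime so the compositor can blend passthrough through it.
using UnityEngine;
using UnityEngine.Rendering;

public class MakeTranslucentToPassthrough : MonoBehaviour
{
    [Range(0f, 1f)] public float alpha = 0.4f;

    private void Start()
    {
        var mat = GetComponent<Renderer>().material;   // per-renderer instance

        mat.SetFloat("_Surface", 1f);                  // 0 = Opaque, 1 = Transparent (URP/Lit)
        mat.EnableKeyword("_SURFACE_TYPE_TRANSPARENT");
        mat.SetOverrideTag("RenderType", "Transparent");
        mat.renderQueue = (int)RenderQueue.Transparent;
        mat.SetInt("_SrcBlend", (int)BlendMode.SrcAlpha);
        mat.SetInt("_DstBlend", (int)BlendMode.OneMinusSrcAlpha);
        mat.SetInt("_ZWrite", 0);

        var c = mat.color;                             // maps to _BaseColor on URP/Lit
        mat.color = new Color(c.r, c.g, c.b, alpha);
    }
}
```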
Meta SDK Mixed Reality - shaky scene

I'm working on a project with the Meta SDK and mixed reality in Unity. I'm trying to find a way to stabilize the scene: when I shake my head or move side to side, it seems to reload often or shake back and forth. How do I fix this? I was told to use MRUK. Is it already installed as part of the complete SDK package?
Meta building blocks should not override settings

I've been using the Meta SDK for years, lately with the OpenXR backend in Unity. There are some major issues I've found after doing dozens of VR projects with hand tracking and other features. I don't know how other people are using building blocks, but BUILDING BLOCKS SHOULD NOT OVERRIDE SETTINGS, at all. I spend weeks on a project, with hands and features like poke, then I just want to drag in a building block for audio, for example, and suddenly things are changed in the OVR camera rig and stop working. If, for example, I have a Locomotor deactivated in my hierarchy, it should not be activated automatically - and the same goes for other settings. If a building block is designed to do one thing, it should only do that thing.

Also, after all these years with hand tracking and grabbing, it shouldn't still be so rudimentary that one thing - grabbing an object - needs so many components. It shouldn't need Grabbable, Hand Grab Interactable, and Grab Interactable just to grab an object. Then, from one update to another, you just want to move the grabbed object in a certain way, but scripts stop working and new mechanisms appear, like Grab Free Transformer, which doesn't cover all the use cases. The structure of the camera rig is way too complicated, with so many hand objects - hand anchors, hand visuals, and other objects for interactions, left and right. Also, Oculus hands are still activated by default in the editor while OpenXR hands are deactivated. Over time, things should become easier to use, not harder. Thank you.
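For readers hitting the component-count point above, a sketch of the minimal stack the post describes, wired in code via the ISDK's injection pattern. The Inject* method names are my assumption from that pattern and may differ between SDK versions, so treat this as illustrative rather than canonical.

```csharp
// Sketch: the minimal trio for a hand-grabbable object in the
// Interaction SDK - Rigidbody + Grabbable + HandGrabInteractable.
using Oculus.Interaction;
using Oculus.Interaction.HandGrab;
using UnityEngine;

public static class GrabSetup
{
    public static void MakeHandGrabbable(GameObject go)
    {
        var rb = go.AddComponent<Rigidbody>();
        rb.isKinematic = true;                        // let the grab system drive the transform

        var grabbable = go.AddComponent<Grabbable>(); // routes pointer events to the transform

        var handGrab = go.AddComponent<HandGrabInteractable>();
        handGrab.InjectRigidbody(rb);                 // injection-pattern calls - verify names
        handGrab.InjectOptionalPointableElement(grabbable);
    }
}
```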
Accessibility Feature Request: Conversation Focus Mode for Ray-Ban Meta Display Glasses

Hi everyone! I'm a Ray-Ban Meta display glasses user who is hard of hearing and wears hearing aids daily. I'd love to see a conversation focus mode added that prioritizes voices directly in front of the wearer and reduces background noise. In busy environments, this would make a big difference for hearing-aid users and others who rely on clearer speech in real time. If this type of accessibility feature is ever developed, I would absolutely love the ability to have it added to my glasses and would be happy to provide feedback or participate in any beta or user-testing opportunities. I've also submitted this through support channels, but wanted to share here in case the team is gathering feedback.
3D Raycast from hands... where is it coming from?

I have a Unity mixed reality project... and added the hands and Real Hands, and that's cool. Tossed in a zombie so I can try to get him to follow me, or I can shoot him (finger-gun special). Now I want to fix up the raycast from the hands/controllers so I can interact with game objects from afar... but I'm not even sure where that ray is coming from so I can see the script. My "change" would be to have the ray extend 50m and return a bunch of "hits" with a "target class":

1. GameObject: yellow [put an explosion effect on that zombie -- if it's hitting the mesh]
2. Interactable: a blue disc appears [press trigger to activate object]
3. Something from the 3D depth [depth raycast]: an orange disc appears [put a bullet hole on that]
4. A scene object [floor/wall]: a green [grounded] disc appears (note that that may not be the final terminus -- there may be "more models" outside the window or wall, or maybe you're picking something on the other side of a table... [code has to see if you can shoot through it])

All of course flat against the object, and usable in code (you might be able to fire a laser through the window, but it won't go through a wall; code will see if that works)... But... I don't know where to look. The ray from the hands does #3, but I don't know where in the asset tree it's coming from -- it will probably also tell me how to make those discs (is it a Gizmo, or a GameObject?). I figure I can add #1/#2 [from the cubes, but I haven't quite figured them out yet either] and #4 [EnvironmentalRayCast (ERC), but I might have to iterate on that one because I don't see a "give me all the hits" option on the ERC].

Questions:
a) Where is this 3D ray coming from in the asset tree, so I can learn?
b) Is there a good way to "scale" the discs so they're "always ~20px dia" no matter how far away they are?
c) It looks like I need to change the shader of my zombie, but I'm not getting the terminology -- it occludes fine (eventually I want the IRL table to occlude it), but I need to say "oh, user picked the bellybutton -- spawn an explosion effect in his gut..." And how do you change shaders anyway? I can change materials from the editor, but...?
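The depth raycast (#3) lives outside the physics world, but for the #1/#2/#4-style hits, a hedged sketch of the collect-everything-along-50m idea, plus the usual distance-proportional trick for question (b). The layer name "SceneMesh" and the component checks are placeholders to swap for your own setup.

```csharp
// Sketch: cast a ray from a hand/controller pose, gather ALL physics hits,
// and classify each one; plus a constant-apparent-size helper for the discs.
using System.Linq;
using UnityEngine;

public class HandRayClassifier : MonoBehaviour
{
    public Transform rayOrigin;      // e.g. the hand/controller pointer pose
    public float maxDistance = 50f;

    private void Update()
    {
        // RaycastAll returns every collider hit along the ray, unsorted.
        var hits = Physics.RaycastAll(rayOrigin.position, rayOrigin.forward, maxDistance)
                          .OrderBy(h => h.distance);
        foreach (var hit in hits)
        {
            // Illustrative classification - substitute your own layers/components.
            if (hit.collider.GetComponentInParent<Oculus.Interaction.RayInteractable>() != null)
                Debug.Log($"Interactable at {hit.point} -> blue disc");
            else if (hit.collider.gameObject.layer == LayerMask.NameToLayer("SceneMesh"))
                Debug.Log($"Scene wall/floor at {hit.point} -> green disc");
            else
                Debug.Log($"Plain GameObject {hit.collider.name} at {hit.point} -> yellow");
        }
    }

    // Question (b): scale a disc linearly with its distance from the eye so its
    // on-screen size stays roughly constant no matter how far away it lands.
    public static void KeepApparentSize(Transform disc, Transform eye, float sizeAtOneMeter)
    {
        float d = Vector3.Distance(eye.position, disc.position);
        disc.localScale = Vector3.one * sizeAtOneMeter * d;
    }
}
```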
Gray thing that blocks passthru?

I have a Q3, Unity 6, using the building blocks for Camera Rig, Passthru, MRUK, Raycast Viz, Passthru Camera Access, hands... and some cubes and things. And I did the "don't show guardian" thing in the manifest... When I run the app, it is fine... until I move my head somewhere... then I am surrounded by a gray blob that blocks everything (including passthru; you cannot see the scene, nor can you see the passthru). Hands do continue to work (palm-up Meta key) even though you can't see them [you do see the Meta glyph]. Sometimes it happens when I leave a room; other times I have no idea what it is or why it's there... but since it blocks everything, it quickly becomes dangerous because I can no longer see my surroundings. If I keep pushing... eventually I get to the other side, but... no idea what it is, why it is, or even where it is (not visible in the scene; doesn't belong to any of my GameObjects [cubes or dragon]). When seated, it's usually X=0, Y=.4-.6, Z at eye level (as in, if you move forward, it blocks you -- it seems to be spherical, r=.2, but... it's not visible in any way). Annoying when I lean forward to see what's on the debug screen...

Example of the last time, in a concrete-ish way [and this one doesn't seem to be a sphere...]: I went out to my deck, put on my Q3, and started the program. It immediately went to "we need to scan the room," so I did so. I went back to the sliding glass door [also labelled "door" on the planes], closed the program, and launched it again to line things up. It was "a little close," so I opened the sliding door and stepped through, and the gray blocker attacked... I shut the door (it's cold out there!), and shortly thereafter the cloud cleared and I could see that the models loaded and were okay. I then walked around to the back side of the deck and it was still there and mostly where I left it (yay! I figured it might lose the anchor), so I figured I'd go inside... As soon as I got "under" the room... the gray blob attacked... and didn't go away until I got inside the basement (into another "mapped room")...

Like I said, sometimes the blob attacks when I'm seated... in a mapped room, so it's not like I left... but it doesn't happen every time... If you double-tap your Q3, it goes to the "passthru, but game paused" mode; if you double-tap again, it resumes... still in the blob. So... anybody know what this is and how I can banish it? I want to be able to see my surroundings at all times...
Meta XR Simulator Standalone Help

I'm an educator teaching Unity & XR development using Quest 3 and Meta Building Blocks. But I have been really struggling because of the discrepancies in learning materials online (from Unity's end, and from content creators online from even just 3 months ago, let alone 1 year). The most current/pressing issue in my class is the lack of updated documentation and examples using the new standalone version of the Meta XR Simulator. Half the documents in the official Meta XR Simulator overview documentation are from 2024 and use the old interface (which had WAY more features and customization options). I have a bunch of students relying on mouse and keyboard controls trying to test behaviors like the Locomotion building block, but they don't work.

Current issues I would love suggestions or hints on how to solve (from just importing building blocks into a Unity Core 3D scene, nothing customized yet):

- I have duplicate controller models and ghosting (only in the simulator, visible when moving).
- I have weird graphical glitches sometimes that look like snow or fuzz (only in the Unity game view & simulator when running the simulator).
- I cannot get rays or aiming reticles to come from the controllers no matter where they or my mouse are pointing (but they work in the headset). Even with point-and-click on.
- Do the movement inputs (default WASD and arrow keys) simulate the left and right joystick? Or do they override/bypass those inputs? Some teleport control options involve aiming and pressing up on the joystick, and I'm not sure how to test that in the simulator (see the probe sketch below).
- Is there some way to add simulation input options that actually trigger the controller's inputs like the Unity package version used to?

I would also appreciate any general advice or resources on new/recent best practices, customization options, and debugging tips using the building blocks and interaction SDK.
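On the joystick question, a small probe one could drop into the scene, using OVRInput from the Core SDK: it logs what the SDK reports for the thumbstick axes while pressing the simulator's WASD/arrow keys. If the axes stay at zero while the rig still moves, the simulator is moving the camera directly rather than synthesizing joystick input.

```csharp
// Diagnostic sketch: print the thumbstick axes OVRInput sees each frame,
// so you can tell whether simulator keys reach the (virtual) controllers.
using UnityEngine;

public class SimulatorInputProbe : MonoBehaviour
{
    private void Update()
    {
        Vector2 left  = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);
        Vector2 right = OVRInput.Get(OVRInput.Axis2D.SecondaryThumbstick);
        if (left.sqrMagnitude > 0.01f || right.sqrMagnitude > 0.01f)
            Debug.Log($"L stick: {left}  R stick: {right}");
    }
}
```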
Game-Design for post-stroke patients, need help hacking the interactions SDK

Hi everyone, I'm currently working on a Unity game designed for post-stroke patients with hemiplegia undergoing motor rehabilitation. I want to use hand tracking to make the players grab different objects. Problem is: the impaired hand often has very reduced finger mobility, so a grabbing motion is out of the question. So I would like to know how to:

1. Trigger fake hand motions to simulate grabbing on one hand with a simple event (and ignore actual hand motion except for position).
2. Even better: simulate a grab from a different object that I could control independently for animations etc. (I also need this because eventually the game will have to work with a motorized rehabilitation glove that will be tracked via passthrough, because default hand tracking doesn't track it at all.)

I'm currently using the basic interaction building block and sample-scene prefabs for the objects (chess piece prefab: Touch Hand Interactable + Grabbable components). If you have any leads on how to approach this problem, I'll be very grateful!

Joseph, student gamedev
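One possible angle for (1), sketched against the Interaction SDK's ISelector interface (Oculus.Interaction): interactors can take a custom selector, so a selector fired from code - a UI button, a therapist's key press, or later the glove's own signal - can replace the finger-pinch/grab detection while hand position still comes from tracking. Wiring it into the hand interactor's "Selector" slot is done in the inspector; the slot name and wiring may differ between SDK versions, so treat this as a starting point.

```csharp
// Sketch: a selector the game can trigger manually instead of reading
// real finger motion. ISelector is the ISDK's grab/select signal source.
using System;
using Oculus.Interaction;
using UnityEngine;

public class ManualSelector : MonoBehaviour, ISelector
{
    public event Action WhenSelected = delegate { };
    public event Action WhenUnselected = delegate { };

    public void BeginGrab() => WhenSelected.Invoke();    // call from any event
    public void EndGrab()   => WhenUnselected.Invoke();
}
```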