ERROR_CLOUD_MESH_UPLOAD_FAILED(-400039)
Unity colocation build with MRUK. I'm getting this error in Logcat, plus "host can't share the room with group xx-xx..." in the Immersive Debugger. The user ID and user profile are active in the dashboard, the device has the spatial data permission enabled, the project is uploaded to the Alpha channel, my own user is invited as a tester, the build is keystore-signed, and I've tried four different Wi-Fi networks. No LLM has a clue. It was working before the Alpha upload. What else should I check?

MRUK QR Tracking suddenly not working
I am making a Quest 3/3S app that uses QR codes for alignment. I had the system working, but suddenly this week I'm getting 'MRUK Shared: queryCompleteEvent->result returned error code: -2' and QR tracking has stopped completely. The headset has also stopped asking me to confirm the environment setup before launching. I confirmed it's broken both on an old build and in a fresh project. I've spent most of a day troubleshooting; my best guess is that something on the headset changed, and I've done every troubleshooting step I can on both the headset and the Unity project. Any ideas would be extremely helpful. Ideally all I need is tracking; in fact, I'd love to remove the scene-setup step entirely, since the app is boundaryless and only uses the scene for QR code tracking support.

MRUK / Meta XR Core / Meta Interaction SDK / SDK Essentials all v85.0.0, Quest 3, OS V2.

Accessibility Feature Request: Conversation Focus Mode for Ray-Ban Meta Display Glasses
Hi everyone! I’m a Ray-Ban Meta Display glasses user who is hard of hearing and wears hearing aids daily. I’d love to see a conversation focus mode added that prioritizes voices directly in front of the wearer and reduces background noise. In busy environments, this would make a big difference for hearing-aid users and others who rely on clearer speech in real time. If this type of accessibility feature is ever developed, I would absolutely love to have it added to my glasses, and would be happy to provide feedback or participate in any beta or user-testing opportunities. I’ve also submitted this through support channels, but wanted to share here in case the team is gathering feedback.

3D Raycast from hands... where is it coming from?
I have a Unity mixed reality project, and I've added the hands (and real hands), which is cool. I tossed in a zombie so I can try to get him to follow me, or I can shoot him (finger-gun special). Now I want to fix up the raycast from the hands/controllers so I can interact with game objects from afar, but I'm not even sure where that ray is coming from, so I can't find the script. My change would be to have the ray extend 50 m and return a bunch of hits, each with a target class:

1. GameObject: a yellow disc appears (put an explosion effect on that zombie, if the ray is hitting its mesh).
2. Interactable: a blue disc appears (press trigger to activate the object).
3. Something from the 3D depth (depth raycast): an orange disc appears (put a bullet hole on that).
4. A scene object (floor/wall): a green "grounded" disc appears. Note that this may not be the final terminus: there may be more models outside the window or wall, or you may be picking something on the other side of a table, so code has to decide whether you can shoot through it.

All discs, of course, flat against the object, and usable in code (you might be able to fire a laser through the window, but it won't go through a wall; code will see if that works).

But I don't know where to look. The ray from the hands already does #3, but I can't find where it lives in the asset tree. Finding it would probably also tell me how those discs are made (is it a Gizmo, or a GameObject?). I figure I can add #1/#2 from the cubes (though I haven't quite figured them out yet either) and #4 via EnvironmentalRayCast (ERC), but I might have to iterate on that one because I don't see a "give me all the hits" call in the ERC.

Questions:
a) Where does this 3D ray come from in the asset tree, so I can learn from it?
b) Is there a good way to scale the discs so they're always ~20 px in diameter no matter how far away they are?
c) It looks like I need to change the shader on my zombie, but I'm not getting the terminology. It occludes fine (eventually I want the IRL table to occlude it), but I need to say "oh, the user picked the bellybutton: spawn an explosion effect in his gut..." And how do you change shaders anyway? I can change materials from the editor, but...?

Stylized passthrough: How can I retexture walls?
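For questions (a)/(b), a minimal sketch may help while you hunt for the SDK's own script (in the Meta Interaction SDK the ray usually comes from a RayInteractor component on the interaction rig). The class, field names, and layer setup below are hypothetical, not the SDK's implementation: it casts 50 m from a hand/controller transform, collects every hit along the way, and scales a disc so its apparent size stays roughly constant on screen.

```csharp
using UnityEngine;

// Hypothetical sketch: a 50 m multi-hit ray from a hand/controller anchor,
// with a reticle disc kept at a roughly constant apparent pixel size.
public class FarRaycaster : MonoBehaviour
{
    public Transform rayOrigin;       // hand or controller anchor (assign in Inspector)
    public Transform reticleDisc;     // a flat quad/disc instance
    public float maxDistance = 50f;
    public float pixelDiameter = 20f; // desired on-screen diameter

    void Update()
    {
        // RaycastAll returns every collider the ray passes through (unsorted),
        // so walls, windows, and the zombie can all be inspected in code.
        RaycastHit[] hits = Physics.RaycastAll(
            rayOrigin.position, rayOrigin.forward, maxDistance);
        System.Array.Sort(hits, (a, b) => a.distance.CompareTo(b.distance));

        if (hits.Length == 0)
        {
            reticleDisc.gameObject.SetActive(false);
            return;
        }

        RaycastHit first = hits[0];
        reticleDisc.gameObject.SetActive(true);
        // Lay the disc flat against the surface (tiny offset avoids z-fighting).
        reticleDisc.position = first.point + first.normal * 0.001f;
        reticleDisc.rotation = Quaternion.LookRotation(first.normal);

        // Constant apparent size: world size grows linearly with distance.
        // worldSize = 2 * d * tan(vFov/2) * (pixels / screenHeightPx)
        Camera cam = Camera.main;
        float d = Vector3.Distance(cam.transform.position, first.point);
        float worldSize = 2f * d * Mathf.Tan(cam.fieldOfView * 0.5f * Mathf.Deg2Rad)
                          * (pixelDiameter / cam.pixelHeight);
        reticleDisc.localScale = Vector3.one * worldSize;
    }
}
```

Sorting the hits yourself is what lets you implement the "can the laser pass through this?" rule: walk the sorted array and stop at the first hit your code classifies as opaque. For (c), swapping a material at runtime is just `GetComponent<Renderer>().material = someMaterial;`, where the material references whatever shader you want.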
The Meta Horizon documentation on Scenes gives this image as an example of basic stylized passthrough. It looks to me like a screenshot of a stylized hallway. How can such an effect be accomplished in Kotlin, without using Unity or Unreal? Can this effect also be achieved on Quest 2, or only on Quest 3(S)? The article mentions that assisted scene capture (available on Quest 3(S) only) shouldn't be used to create such an effect.

HorizonOS v81 update broke Passthrough in startup Splash Screens
Hi folks,

This is for a project on a C++ UE 5.5.3-0+oculus-5.5.3-release-1.109.0-v77.0 engine from the official Oculus branch. A recent update to Horizon OS, likely v81, seems to have changed the way splash screens are handled: the user is now sent to a black void with particles and wisps flying by. This has created a big problem for our MR application in that no passthrough is visible during the splash, which is causing us to fail VRC Functional.14. This was not an issue before the OS update.

These are the settings and image (a PNG with transparency) we are using: [screenshot]

Even forcing bAutoEnabled to True in DefaultEngine.ini does not help: [screenshot]

Setting the background to Black also gives the same result, the black void with a very low-resolution proxy of our PNG: [screenshot]

For reference, this was the behavior before the v81 update: [screenshot]

We also scouted other MR apps and it seems they face similar situations. Any help is appreciated, as we'd rather not request any waivers during our QA process. Thank you!

-Sebastian

Hiding the Scene Mesh for AR
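For anyone comparing configs, a DefaultEngine.ini sketch of the relevant flags is below. The section and key names are assumptions based on the OculusXR plugin's project settings and may differ by plugin version, so verify them against your plugin's Settings panel rather than copying verbatim:

```ini
; DefaultEngine.ini (sketch; section/key names assumed, check your plugin version)
[/Script/OculusXRHMD.OculusXRHMDRuntimeSettings]
; Splash auto-show, as the post above already forces:
bAutoEnabled=True
; Passthrough-capable initialization, so the compositor can blend
; passthrough as early as possible (pre-v81 this covered the splash):
bInsightPassthroughEnabled=True
```

If v81 genuinely ignores passthrough during the system splash phase regardless of these flags, the config can only confirm the app side is correct; the VRC question would then need a platform-side fix or waiver.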
I feel like I've tried everything, but I cannot get my AR project to load a scene without overlaying a giant box mesh over everything. That is, I want to raycast against walls, but have it look as though you're interacting with the actual walls of the room via the Quest 3 cameras. I took the setup from the MRUK sample. I've tried messing with the material for the GenerateProceduralSceneMesh call. I've tried asking the room for the GlobalMeshAnchor and setting the actor hidden. I found a tutorial that recommended adding an OculusXRSceneActor to the scene and changing the static mesh class types based on anchor type; it didn't seem to do anything, and apparently that class is now deprecated. Same with adding an OculusXRSceneGlobalMesh component and messing with its Visible toggle and Material field: also deprecated. No matter how I mess with materials or actor visibility, it always looks like this: [screenshot]

I'm able to use MRUKSubsystem->Raycast to interact with the walls; I just don't want to see them. What's the accepted way to hide them?

Link does not have passthrough enabled?!?
Hi guys,

I'm getting this error: "Link installed on the machine does not have passthrough enabled which is used in the project. Please enable it in Settings > Beta > Passthrough over Meta Quest Link."

It's happening because this code in OVRProjectSetupDeviceTasks is failing:

private static bool IsPassthroughEnabledOnLink()
{
    return GetLinkWindowsRegistryValue("PassthroughOverLink");
}

I'm a little embarrassed to say this, but... I don't have a Beta section in my Settings. Where should I look for this?

-Chilton
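Since the setup check above reads a Windows registry value, one workaround while the Beta section is missing is to inspect (or set) that value directly. The sketch below is an assumption, not the SDK's code: the exact key path is guessed (the Link desktop app keeps several settings under HKCU\Software\Oculus, but verify the real location with regedit before changing anything).

```csharp
using Microsoft.Win32;

static class LinkPassthroughCheck
{
    // Sketch only: reads the flag the project-setup check appears to look for.
    // The subkey path "Software\Oculus\RemoteHeadset" is an assumption;
    // confirm it in regedit on your machine first.
    public static bool IsPassthroughOverLinkEnabled()
    {
        using (RegistryKey key =
            Registry.CurrentUser.OpenSubKey(@"Software\Oculus\RemoteHeadset"))
        {
            object value = key?.GetValue("PassthroughOverLink");
            return value is int i && i != 0;
        }
    }
}
```

If the value exists but is 0, toggling the corresponding setting in the Meta Quest Link desktop app (it has moved out of "Beta" in newer app versions) should flip it; editing the registry by hand is a last resort.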