Recent Discussions
Interface for getting the Controller joystick's travel range
Hello, I'm fairly new to the Meta Spatial SDK, and while reviewing the Controller Input documentation I couldn't find any interface for retrieving the joystick's travel range. The examples I came across treat joystick directions as "Buttons" (like ButtonThumbRL). What I'm trying to do, however, is map the joystick's travel to an attribute like "velocity". Is there any way to access this information? If I've missed something, my apologies, and thank you!
xiayip · 16 hours ago · Honored Guest · 0 likes · 0 comments
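A minimal sketch of the mapping being asked about, in Kotlin. The readThumbstickAxes() function is a hypothetical placeholder for however normalized stick values would be obtained; only the dead-zone/velocity mapping is the point here, not a real Spatial SDK input API.

    import kotlin.math.abs

    data class Axes(val x: Float, val y: Float)

    // Hypothetical input source: assume the stick arrives normalized to [-1, 1]
    // on each axis, whatever the actual SDK call turns out to be.
    fun readThumbstickAxes(): Axes = TODO("replace with a real input source")

    // Map normalized stick travel to a velocity, with a small dead zone so the
    // mapped value stays at zero while the stick rests near its center.
    fun stickToVelocity(axes: Axes, maxSpeed: Float, deadZone: Float = 0.1f): Axes {
        fun shape(v: Float): Float =
            if (abs(v) < deadZone) 0f
            else ((abs(v) - deadZone) / (1f - deadZone)) * maxSpeed * (if (v < 0f) -1f else 1f)
        return Axes(shape(axes.x), shape(axes.y))
    }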
Run every frame vs. every tick?
Is there a way to run a method in an activity once every frame instead of once every tick? I'm aware that Systems exist, but they seem to be tied to the activity's tick rate. As a side note, my system also does not run consistently at 90 ticks per second, even though the contents of its execute() method consistently take less than 11 ms. I'm running a simple project with this system in it and nothing else, and the tick rate jumps between 45 and 90 sporadically.
PointyLlama527 · 39 days ago · Explorer · 0 likes · 1 comment
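One common way to cope with an uneven tick rate is to measure the elapsed time inside the system and scale any per-tick work by it. The sketch below assumes the SystemBase/execute() pattern used in the Spatial SDK samples; the package name and exact base class may differ by version.

    import com.meta.spatial.core.SystemBase

    // Measures the time between consecutive execute() calls so work can be
    // scaled by real elapsed time instead of assuming a fixed 90 Hz tick.
    class DeltaTimeSystem : SystemBase() {
        private var lastNanos = 0L

        override fun execute() {
            val now = System.nanoTime()
            val dtSeconds = if (lastNanos == 0L) 0f else (now - lastNanos) / 1_000_000_000f
            lastNanos = now

            // Use dtSeconds here, e.g. position += velocity * dtSeconds, so the
            // result looks the same whether execute() runs at 45 or 90 ticks/s.
        }
    }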
How to change the appearance (baseColorTexture) of a preloaded glb at runtime?
I'm a beginner in 3D development. I import a glb object into environment.env with predefined textures (baseColorTexture, normalTexture, occlusionTexture), and I try to change its appearance at runtime when clicking buttons. Here is my code, but it didn't change the texture (baseColor or baseColorTexture). Could anyone help me? Thank you very much.

    private fun changeMaterial(node: SceneObject) {
        val material = node.mesh?.materials?.get(0)
        var mesh = node.mesh
        var meshMaterial = node.mesh?.getMaterial(0) as SceneMaterial
        var test_texture = SceneTexture(activity.getDrawable(R.drawable.test_texture))
        //var texture_name = "baseColor"  //--> exception: baseColor not found
        var texture_name = "occlusion"    //--> worked and texture is changed
        meshMaterial.setTexture(texture_name, test_texture)
    }

Solved · BigMac177 · 19 days ago · Honored Guest · 0 likes · 1 comment
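Since the sticking point is which texture slot names the material accepts, a small probe can narrow it down. This sketch reuses the SceneMaterial/SceneTexture types and the setTexture call from the snippet above; the candidate slot names are guesses, not names confirmed by the SDK documentation.

    // Try a list of candidate slot names against setTexture and log which ones
    // this particular material accepts. Every name below except "occlusion"
    // (known to work from the post) is a guess.
    private fun probeTextureSlots(material: SceneMaterial, texture: SceneTexture) {
        val candidates = listOf("baseColor", "baseColorTexture", "albedo", "diffuse", "normal", "occlusion")
        for (name in candidates) {
            try {
                material.setTexture(name, texture)
                android.util.Log.d("TextureProbe", "slot '$name' accepted")
            } catch (e: Exception) {
                android.util.Log.d("TextureProbe", "slot '$name' rejected: ${e.message}")
            }
        }
    }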
How to set alphaMode on a custom material?
I only see alphaMode in the SceneMaterial constructor, not in the SceneMaterial.custom function, nor is there a settable property for it. I want objects with some of my custom materials to use the TRANSLUCENT_POST option.
Solved · waynecochran · 13 days ago · Explorer · 0 likes · 1 comment
Can't stack activities in a Panel in immersive anymore
Hi, on v74 it was simple to stack activities in an immersive Panel (just start new activities from the one in the panel). With v76 it doesn't behave like that anymore: it opens new activities in the home environment. Since v74 is based on Android 12 and v76 on Android 14, I assume launching activities on top of each other in a panel may have changed. Sadly I couldn't find a solution, nor any clear documentation. Can somebody give me a hint 😉 Thanks in advance.
Solved · ShadowyLynx1866 · 22 days ago · Explorer · 0 likes · 7 comments
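One thing worth checking (an assumption about the cause, not a confirmed v76 fix) is how the new activity is launched. With plain Android APIs, an Intent carrying FLAG_ACTIVITY_NEW_TASK starts the activity in a separate task, which a panel may no longer capture, while a plain startActivity from the panel's own activity keeps it in the same task:

    import android.app.Activity
    import android.content.Intent

    // Launch the next screen from the activity already rendered in the panel,
    // without FLAG_ACTIVITY_NEW_TASK, so it stacks in the same task. The
    // `next` class is a placeholder for whatever screen comes next.
    fun Activity.openNextScreen(next: Class<out Activity>) {
        startActivity(Intent(this, next))
        // Adding Intent.FLAG_ACTIVITY_NEW_TASK here would instead put the new
        // activity in its own task, which is one way it can end up outside
        // the panel in the home environment.
    }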
Binding objects in reality using spatial anchors
Hi, I'm new to development, so maybe my problem is too simple. But still, thanks in advance for any recommendations! I'm trying to create a very simple AR app for Meta Quest 3, following the official instructions on Meta's website. However, the result so far leaves much to be desired. My goal is to fix an object from a prefab onto a table in the real room. Later, I also plan to add UI buttons to the object to enable interactions, but that comes later.

I'm using Unity version 6000.1.6f1. I installed the Meta All-in-One SDK (v77) and decided to start by using the provided Building Blocks. I also installed the OpenXR plugin, as recommended. In Player Settings, under Active Input Handling, I selected Input System Package. In XR Plug-in Management, I selected OpenXR > Meta XR Feature Group. Under the OpenXR settings, in Enable Interaction Profiles, I selected Oculus Touch Controller, Meta Quest Touch Pro Controller, and Plus Controller Profile. In OpenXR Feature Groups, I enabled:
- Meta XR Feature
- Meta XR Foveation
- Meta Subsampled Layout
- Meta Quest Anchors
- Boundary Visibility
- Bounding Boxes
- Camera (Passthrough)
- Display Utilities
- Meshing
- Occlusion
- Planes
- Raycast
- Session

In the scene, I added the following Building Blocks: Camera Rig, Passthrough, MR Utility Kit, Find Spawn Positions, Controller Tracking Left, and Controller Buttons Mapper. I added my prefab object (currently without UI buttons) and connected it to Find Spawn Positions. I also configured labels like "table" and other placement-surface definitions. When I run the app, the room scan loads and the object appears as if it's placed on a table, but the position is different every time, and sometimes the object ends up on the floor instead of the table.

I also tried writing a simple script: when pressing the B button on the controller, a beam should be fired and an anchor placed where the beam intersects a surface (using my prefab object as the anchor). But nothing happened when I tried this, so I abandoned the idea and looked for something ready-made in the Building Blocks. Eventually, I found the Instant Content Placement Tool. I added all three recommended objects to the scene: Environment Raycast Manager, Cube, and Raycast Visualizer. However, when I launched the scene, the controller-based placement didn't work. The cube simply appeared at the player's origin point and stayed on the floor, without any movement or interactivity.

Of course, I was also hoping to make the placed object stick to real-world surfaces and persist across sessions by saving its anchored position in the room scan, but I understand that this might still be far off. Could you please suggest a clear algorithm or sequence of steps for achieving this? Placing and saving an object in a room seemed like it should be a simple task, but after spending a whole week trying to understand the Building Blocks and how everything connects, all I could manage was to place the object, and even that is inconsistent, with its position shifting slightly every time.
KovalevBoris · 28 days ago · Explorer · 0 likes · 4 comments
Is there a way to place an object in the hand instead of the controllers?
Hey, I am trying to make a shooting-range application, and I want the weapon in the game to appear in place of the controller. I don't have a beefy computer and only meet the bare minimum specs, so the Meta Spatial SDK is the only way I can code for Quest 3 at the moment. I see that the Spatial Editor has an AvatarBody component that allows you to attach things to it, but from what I've read it is intended for building your own playable avatar and attaching things to that avatar. There are no samples out there yet; has anyone figured out how to replace the controller model with another object?
simplecast · 28 days ago · Member · 0 likes · 0 comments
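One way to approximate this, sketched below, is a system that copies the controller entity's pose onto a weapon entity every tick. The query and component names follow the pattern used in the Spatial SDK samples (Controller, Transform, SystemBase), but the exact packages and signatures may differ in your SDK version, and weaponEntity is a placeholder you would create yourself.

    import com.meta.spatial.core.Entity
    import com.meta.spatial.core.Query
    import com.meta.spatial.core.SystemBase
    import com.meta.spatial.toolkit.Controller
    import com.meta.spatial.toolkit.Transform

    // Each tick, copy the pose of any entity that has a Controller component
    // onto the weapon entity, so the weapon model tracks the controller.
    class AttachWeaponSystem(private val weaponEntity: Entity) : SystemBase() {
        override fun execute() {
            val controllers = Query.where { has(Controller.id, Transform.id) }.eval()
            for (controller in controllers) {
                val pose = controller.getComponent<Transform>().transform
                // An offset could be composed in here to line the grip up with
                // the hand before applying the pose to the weapon.
                weaponEntity.setComponent(Transform(pose))
            }
        }
    }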
Custom Item Tracking using Spatial SDK
We are trying to implement a custom tracking system alongside Quest's tracking. We want the headset to be able to detect custom gloves that we put on our hands. Is it possible to train the current ML models, or does the Spatial SDK currently use only the pretrained model?
manrock7788 · 29 days ago · Honored Guest · 0 likes · 0 comments