Mixed Reality with Unity and Meta SDK Test
Hi, I have been developing in Meta Horizon since 2020 and have learned Unity XR/MR. I will graduate with a master's degree in Art and Technology in May 2026. For my final project I will be working on a Mixed Reality interaction for dyslexic learners with hand tracking, and I will be applying for the smart glasses grant for accessibility. I've been in education for the past 19 years, teaching students with dyslexia for the past ten. This video shows my first test; link and image below: Mixed Reality Test, Quest 3 (Unity and Meta SDK) by Tina Wheeler
Double Tap Hand Tracking Issue/Question
I just got the headset not too long ago and wanted to try out the hand tracking feature without setting down the controllers. There is a setting that allows double tapping the Touch Plus controllers together to enable it; however, after no more than five seconds the controllers regain control and hand tracking turns off. (I'm using the Velcro straps from Meta instead of the provided string straps.) Is there a way I can keep my controllers from regaining control while still technically having them in my hands? Or is this feature only allowed when the controllers are set down? A friend sent me a solution via a third-party attachment that enables finger tracking similar to the Vive, but I'd rather stick with what I have, and the attachment isn't even available yet anyway (March 2026). Solved
👊 Workshop: Master Advanced Hand Tracking
Build natural interactions into your VR experiences using advanced hand tracking features. In this workshop, Start Mentor Valem will cover hand pose detection, microgestures, grab poses, hand swipe, hand tracking locomotion, poke, and how to combine hand tracking with controllers. This session is focused on practical examples and patterns you can directly reuse in your own VR experiences. Join on Zoom
Outdated documentation has me stuck on making custom hand gestures in VR Unity
Hey there all, I'm still somewhat new to Meta Quest game dev, so I've been following some of the documentation from the Meta site to use their SDK within Unity. However, I've been stuck on the "Build a custom hand pose" tutorial, which uses an outdated OVRCameraRig that still has LeftHand and HandFeaturesLeft within OVRHands; this is not how it works in v83 of the SDK. Any help on how to do this now would be very appreciated! This is the link I've been using: https://developers.meta.com/horizon/documentation/unity/unity-isdk-building-hand-pose-recognizer/
➕ What's New: v83 Meta SDK Update
V83 of the Meta XR SDK introduces major improvements to core XR functionality. Join Start Mentor Valem to hear a rundown of the top features and changes you can implement in your project right away. See the new AI Building Blocks, enhancements to hand tracking, new locomotion options, physics-based hand interactions, and more. Join on Zoom
Problem with Quest 3 tracking mode
I have the following problem with spatial tracking on the Quest 3 headset. I noticed that during gameplay in the project I am working on, the headset seems to switch between two tracking modes. The project is about a climbing wall that rotates, and in VR we climb on it. Because the wall rotates and moves us downward, at the same time we move the virtual world (the player) upward. Here I noticed a problem: the wall moves at 2 cm/s, but the virtual world sometimes moves at 2 cm/s and sometimes at 20 cm/s. Depending on which tracking mode the headset switches to, the player (virtual world) is moved either correctly by 2 cm or incorrectly by 20 cm per second.

I tried monitoring various headset parameters via logcat that could indicate the headset switching to a different tracking mode, but I couldn't find anything. I should add that the headset operates during climbing in rather uncomfortable conditions (very close to the wall), so I take into account that these are not ideal operating conditions. However, I need to find out what exactly changes in the headset (at different moments) that causes the world offset to sometimes be 10× larger than it should be. If I were able to detect some variable that indicates switching to this second mode, I could correct the movement programmatically.

Therefore, I would appreciate any suggestions as to what might be changing in the headset. I monitored the tracking origin to see if it switches between Floor and Eye, but everything looks fine there. Setting the boundary to Stationary or Roomscale also makes no difference. I will also add that for the purposes of this project I disable Boundaries, because we climb quite high (3 m) and the headset goes outside the play area. I would be very grateful for any suggestions on where I could look for or monitor variables that would allow me to determine the tracking mode the headset is using.
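Along the lines of the programmatic correction mentioned above, one way to gather data is to log frames where the tracked head moves much faster than expected. This is only a sketch of that idea, not an official API: the class name, the use of the rig's center eye anchor, and the threshold values are all illustrative assumptions.

```csharp
// Hypothetical watchdog: flags frames where the tracked head position jumps
// by roughly an order of magnitude more than the expected 2 cm/s drift,
// which may indicate the suspected tracking-mode switch.
using UnityEngine;

public class TrackingJumpWatchdog : MonoBehaviour
{
    [SerializeField] private Transform centerEyeAnchor;   // e.g. from the OVRCameraRig
    [SerializeField] private float expectedSpeed = 0.02f; // metres per second (2 cm/s)
    [SerializeField] private float jumpFactor = 5f;       // flag speeds above this multiple

    private Vector3 lastPosition;

    private void Start()
    {
        lastPosition = centerEyeAnchor.position;
    }

    private void LateUpdate()
    {
        // Per-frame speed of the tracked head in world space.
        float speed = (centerEyeAnchor.position - lastPosition).magnitude / Time.deltaTime;
        if (speed > expectedSpeed * jumpFactor)
        {
            // On device this shows up in logcat under the Unity tag,
            // so it can be correlated with other logged parameters.
            Debug.LogWarning($"Possible tracking jump: {speed:F3} m/s at t={Time.time:F2}s");
        }
        lastPosition = centerEyeAnchor.position;
    }
}
```

Timestamps from such a log could at least be correlated with the moments the world offset becomes 10× too large.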
The project is based on the Meta XR plugin and Hand Tracking.
Who do I have to talk to, to get this done right?
Your art director may need to be replaced. Here's a common-sense suggestion that Meta needs to implement into the user interface: allow the user to adjust the opacity of the toolbars and menus, from 0% to 100%. If you decide to do things "your way," then make the main menus (the Horizon feed, the navigation bar, app backgrounds) 0% transparent, and Be Sure To ADD DROP SHADOWS behind every text element, div, box, and so on, to counter light colors in the far background such as walls or a television program. The main thing is to be able to see your background scenery in both MR and VR.

The second thing, which should have been the first thing, because you had it right in an earlier version: Meta used to allow us to carry a compact version of the menu around like a tablet. That was the greatest concept yet, in my opinion. Meta needs to allow users to adjust the menus and navigation bar through stretching; not just expanding, but adjusting the size of the actual navigation menus, the way they resize automatically when you move them away from you and towards you. The user should be allowed to adjust this manually. For those who wear corrective lenses, that would let them adjust the menus to a view they can see clearly while searching for their favorite app, or for just general reading of text. It's selfish and counterintuitive to be the developer and adjust the screen distance and size how you see fit, rather than giving the user full control of the distance and size of the bar and menus. Currently, if I want a larger menu screen and bar, I'd have to lean back if I'm sitting, stretch the screen, and then sit forward for the screen to be closer and larger, without it shrinking like it does when I pull it towards me. I want it to stay the same size I adjusted it to, just closer. I would like some feedback regarding this so I am assured that I am not just typing this for my health.
I hope I'm not here just wasting my energy sharing ideas that I believe you should have figured out a long time ago as a simple fix. You should be adding things, not replacing them, unless necessary. The original backgrounds were good, but then someone went left and entered their second childhood, it seemed. But again, I hope this is not a waste of time for me, if you're not even taking things like this into consideration but choosing to take advice from nine-year-olds using their parents' equipment. Stop preying on children; they should not be the focus. Children want to do what the adults do; that's why they gravitate to GTA. Baby stuff is simply for babies, and if you have a baby, that baby almost always wants your things over theirs. Cater to adults, modern-age people, or you'll be stuck on "67," which is brain rot. It's similar to rap music: this stuff is for a certain demographic, and they should be catered to, and as time goes on, focus on the same age group, as well as your first supporters, the ones that got you this far by spreading the word. Most people still haven't even tried VR. So again, I do expect feedback. You should be paying me for this advice, seriously. But I want to see you win, because if you win, in my mind I should be winning, being that I've been here since 2019. Horizon Worlds was a big waste of time, when Meta could have made this thing futuristic but instead focused on children that don't pay money, and then tried biting an idea from a newer but degenerate console when Meta already had the bread and butter. You just need to adjust what you already had/have, and that means putting three things at the user's disposal: transparent menus (with drop shadows under text and icons, and tinted boxes), manual adjustment of the actual size of the menu (not just expanding), and a compact version of the menu (or bring back the original compact model) like you had several updates back. I don't even know why y'all got rid of that.
That was a real slow move, because anyone complaining about that may have been trolling you or just complaining to complain. I used to do graphic design once upon a time, so it's weird for me to see a newer generation designing like degenerates. And I have to blame the art director and project management, because that's such a simple fix: you're just changing the opacity and/or allowing the user to adjust it. It's simple code. Even the stretching: instead of having it static, assign it to stretch. I went on a tangent, but I came back and doubled down. If I came off rude, sorry, but seriously, what's going on over there? I've been here since 2019. If you want a change, just update and enhance what you already have. Feedback please, even if it's to call me an A-hole 🙏
How to Polish your XR Title: Onboarding, Optimization and Game Feel
Hi guys, I recently ran a workshop for the Meta Horizon Start Competition focused on polishing an XR title. At that stage of the hackathon, with only a week left before submission, one question mattered more than any other: how do you make the biggest impact in the least amount of time? This post is a written summary of that workshop, aiming to answer this question around three topics:
- Player onboarding
- Optimization
- Game feel

But why these three? Because when you are close to shipping, it becomes critical to look at your application through the eyes of a player, not a developer. For example:
- The build can behave very differently on device compared to the editor, which makes optimization and debugging essential.
- If players do not understand what to do, onboarding needs improvement.
- If it works but feels flat or unsatisfying, game feel needs polish.

Together, these areas tend to give the highest return on investment when time is limited. Let's start with player onboarding, using the Meta All-in-One SDK.

Player Onboarding

Player onboarding is how you teach players to understand and interact with your game. A useful principle to keep in mind: the best tutorial is no tutorial. Ideally, players learn by doing. You introduce interactions one at a time, following the player's natural progression through the experience. That said, refining this approach requires time and playtesting, which is often limited near the end of a project. When time is short, these faster onboarding tools can help communicate essential information clearly:
- Tooltips: short, spatialized text pointing at an object or interaction. Best used to highlight a short action.
- UI Panels: the most basic solution, using Unity UI to display text, images, or short videos.
- Ghost Hands: very effective for hand tracking; you can visually demonstrate a hand pose or motion.
Tip: with hand tracking, you can enter Play mode, make the pose you want with your own hand, then drag the hand model from the Hierarchy to the Project window to save it as a prefab. The prefab keeps that pose, so you can reuse it directly to show players what to do.
- Voice Prompts: recording short voice lines can be faster than building UI and often feels more natural. Keep them short and contextual.

I provide a Unity package with a sample scene demonstrating each of these approaches here: https://drive.google.com/file/d/1n3IUzLMH6_60foStzgSdR5WignTkWUFh/view?usp=sharing

Optimization

Optimization is crucial. It is not only about meeting Meta VRC requirements so your app can ship, but also about player comfort and enjoyment. Below is a simple optimization workflow that works well even for beginners and usually resolves most common issues.

Step 1: Check your project settings

Many performance problems come from incorrect project setup:
- Use the Project Setup Tool from the Meta SDK
- Ensure Single Pass Instanced rendering is enabled
- Review URP settings and quality levels
- Bake your lights whenever possible

For a full setup checklist, see this video showing the Project Validation Tool in action: https://youtu.be/BeB9Cx_msKA?si=AjfnZdoPxH3jPxk-

Step 2: Check your scene complexity

Triangle budgets vary depending on your content, but a good general target is around 150,000 triangles visible at once. Draw calls are just as important, and often more critical, so keep them as low as possible (under 80). Ways to reduce complexity:
- Lower the triangle count per model
- Use frustum and occlusion culling
- Add LODs to complex meshes
- Use batching techniques where possible

For more information about target FPS, draw calls, and triangle count, check out this Meta documentation page: https://developers.meta.com/horizon/documentation/unity/unity-perf

Step 3: Use OVR Metrics

Install the OVR Metrics Tool and always test outside the Unity editor.
OVR Metrics gives you real performance data on device, with live graphs and indicators. The three most important metrics to track:
- FPS
- GPU and CPU usage
- Render Scale (must stay above 0.85 to meet VRC requirements)

You can download it here: https://www.meta.com/en-gb/experiences/ovr-metrics-tool/2372625889463779/

Play your build and look for frame drops, reduced render scale, or sustained high GPU or CPU usage.

Step 4: Find and remove your bottleneck

The previous step tells you when and where performance drops occur. Now you need to understand why. Use the right tools:
- Unity Profiler: identify CPU bottlenecks, scripts taking too long, physics spikes, or garbage collection issues.
- Frame Debugger: analyze draw calls and rendering passes to understand what is actually being rendered each frame.
- Meta Runtime Optimizer: helps identify XR-specific performance issues related to the runtime and rendering pipeline.

Once you know the bottleneck, you can make targeted changes instead of guessing.

Step 5: Last-resort performance boosts

When time is very limited, two features can significantly improve performance with minimal effort:
- Dynamic Resolution
- Fixed Foveated Rendering

Both offer configurable levels to balance performance and visual quality. Be careful when lowering values too aggressively; visual quality can degrade quickly. Always remember that Render Scale must stay above 85 percent to pass VRC.
Fixed Foveated Rendering applied to the eye texture

Game Feel

Game feel is often underestimated, but small improvements here can dramatically improve how polished your XR experience feels. Moreover, responsiveness helps guide the player to understand your game, which also supports onboarding.

Haptic feedback

Haptics add physical feedback to interactions and are extremely effective in XR. Triggering haptics is often just a single line of code. If you want more advanced effects, you can build layered patterns using a haptic tool or studio.
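As a concrete example, here is a minimal sketch of that "single line of code" using OVRInput from the Meta SDK. The class name, pulse values, and the choice of the right Touch controller are illustrative assumptions; only the SetControllerVibration call itself is the SDK API.

```csharp
// Minimal haptics sketch: a short vibration pulse on grab.
// Assumes a scene set up with the Meta XR All-in-One SDK (OVRCameraRig).
using System.Collections;
using UnityEngine;

public class GrabBuzz : MonoBehaviour
{
    // Hook this up to your grab event to give a short haptic pulse.
    public void PlayGrabHaptics()
    {
        StartCoroutine(Pulse(frequency: 0.5f, amplitude: 0.8f, duration: 0.1f));
    }

    private IEnumerator Pulse(float frequency, float amplitude, float duration)
    {
        // Start vibrating the right Touch controller...
        OVRInput.SetControllerVibration(frequency, amplitude, OVRInput.Controller.RTouch);
        yield return new WaitForSeconds(duration);
        // ...then stop it (zero frequency and amplitude ends the vibration).
        OVRInput.SetControllerVibration(0f, 0f, OVRInput.Controller.RTouch);
    }
}
```

Note that the vibration keeps running until it is explicitly stopped, which is why the coroutine sends the zero call after the pulse duration.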
Tutorial: https://youtu.be/RUUwWMkXFt0?si=1L92-NwIL9xy4CfJ

Grab poses

Hand grab poses let you enforce a custom hand pose when grabbing an object. The pose can vary depending on how or where the object is grabbed. For example, a cup grabbed by the body uses a different pose than one grabbed by the handle. This small detail greatly improves realism and comfort. Documentation: https://developers.meta.com/horizon/documentation/unity/unity-isdk-creating-handgrab-poses/

Sound design

Sound is often forgotten, yet incredibly important:
- Add sounds to every interaction
- Use pitch variation for natural randomness
- Adjust volume based on the intensity of the action

The Meta All-in-One SDK already includes UI-focused audio, but Meta also provides a free audio pack with more general sounds: https://developers.meta.com/horizon/downloads/package/oculus-audio-pack-1/

Outro

So here it is, guys: these are the most impactful steps you can take to polish an XR title when time is limited. I hope this breakdown is helpful. Feel free to share your own tips and tricks below on how you polish your XR titles.
Feature suggestion: Browser usage
Hello, since I could not find any suggestion submission board, I will put this here hoping someone from the dev team will see it and improve, or possibly vibe-code, this feature. Please add browser copy and paste by hand gesture (every desktop and mobile browser supports this; just copy what they do). The most intuitive would be a pinch, then an open hand onto another browser window. It would also be cool to copy and paste onto a virtual whiteboard, including desktop contents or screenshots, but that's the ideal state.